It turns out the real estate people have been right all along. The FTC’s settlement with InMobi, one of the world’s largest mobile ad networks, suggests that for consumers, it really is about location, location, location – or at least honoring consumers’ location privacy preferences and not tracking them without permission. The case is the FTC’s first action against the operator of a mobile ad network. What’s more, it includes an interesting COPPA count because among the people whose location data the company collected were kids using child-directed apps.

InMobi offers an advertising platform for app developers and advertisers. By using InMobi’s software development kit (SDK), developers can sell ad space in their apps. Advertisers, in turn, can target consumers who use any of the apps that have InMobi’s SDK built in. Consumers may not know what’s going on behind the scenes to create the ads that appear on their screen, but industry members understand it’s big business. InMobi bills itself as the “world’s largest independent mobile advertising company” with a network that has reached over one billion unique mobile devices and serves 6 billion ad requests per day.

Of course, advertisers want to target people most likely to buy their products and geolocation is a key data point. (Imagine, for example, a luxury car dealer who wants to target only people who live in the town’s poshest neighborhoods and have been visiting auto dealers lately.) So InMobi offered advertisers products called “geo-targeting suites” that could provide data about consumers’ locations. The options ranged from “Where are they right now?” to “Where have they been in the past two months?”

Let’s shift to the consumer side of the equation. The Android and iOS operating systems have application programming interfaces (APIs) that provide apps with a consumer’s current location. But to access that data, both systems require developers to get the consumer’s consent through permissions. Consumers can refuse to allow a certain app to get their location information or they can use the Android and iOS settings to invoke an across-the-board rule rejecting location requests. 

You’ve probably guessed the irresistible-force-meets-the-immovable-object problem posed when InMobi’s location-grabbing SDK came up against a consumer’s choice not to share geolocation information. That’s the genesis of the FTC’s case.

According to the FTC’s complaint, even if a consumer had denied access to the location API on their device – in effect, telling an app “No, you can’t have that data” – InMobi still tracked the person’s location and, in many instances, served geo-targeted ads until December 2015, when the FTC came calling. How did InMobi manage that? The company collected information about the WiFi networks the device was connected to or that were nearby and worked backwards to determine the consumer’s location.

The complaint explains in detail how InMobi sidestepped consumer choice, but it boils down to this. Depending on the operating system, InMobi grabbed network information – for example, the ESSID (network name), the BSSID (a unique identifier), and signal strength – from each WiFi network that a consumer’s device connected to or was nearby, fed this information into its geocoder database (which mapped WiFi networks to their latitude and longitude), and then inferred the device’s location. So even when a consumer had denied access to the location API, InMobi could still monitor their WiFi network connections to track their movements. Voila! A consumer’s geolocation – followed by location-targeted ads.
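The complaint doesn’t include code, but stripped to its essentials, the technique it describes looks something like the Kotlin sketch below. This is a simplified illustration, not InMobi’s actual software: `wifiToLatLng`, `loadGeocoderDatabase`, and `inferLocationFromWifi` are hypothetical names, with the map standing in for the geocoder database the complaint describes.

```kotlin
import android.content.Context
import android.net.wifi.WifiManager

data class LatLng(val lat: Double, val lng: Double)

// Hypothetical "geocoder database": BSSIDs mapped to previously observed coordinates.
val wifiToLatLng: Map<String, LatLng> = loadGeocoderDatabase()

fun inferLocationFromWifi(context: Context): LatLng? {
    // On Android versions of that era, reading the connected network's details required
    // only the ACCESS_WIFI_STATE permission, not a location permission - which is how a
    // location could be inferred even after the consumer refused the location API.
    val wifi = context.applicationContext.getSystemService(Context.WIFI_SERVICE) as WifiManager
    val info = wifi.connectionInfo ?: return null
    val bssid = info.bssid ?: return null   // unique identifier of the access point
    return wifiToLatLng[bssid]              // work backwards to latitude/longitude
}

fun loadGeocoderDatabase(): Map<String, LatLng> = emptyMap()  // placeholder for illustration
```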

The case offers several takeaway points for industry members. While the FTC primarily sues businesses for misleading claims made to consumers, the case against InMobi demonstrates that companies also can be held liable for deceptive statements made to other businesses when those misrepresentations ultimately affect consumers. In this case, the deceptive statements were in InMobi’s guide for app developers. InMobi said it tracked a consumer’s location and served geo-targeted ads only if the app developer provided access to the location API and the consumer gave opt-in consent. But using the WiFi method we just described, InMobi also secretly tracked location without permission. Since InMobi wasn’t honest about how its software worked, app developers weren’t able to give consumers accurate information about whether and how they would be tracked. Consumers, in turn, didn’t have facts that would have been material to their decision of whether to install or use an app.

The COPPA angle of the case merits attention, too. At first glance, InMobi’s Privacy Policy sounded all the right COPPA compliance notes – for example, “We do not knowingly collect any personal information about children under the age of 13.” In addition, the company included specifics about how it honored the July 1, 2013, amendments to COPPA, which extended liability to include ad networks that know they’re collecting personal information from child-directed apps or websites: “. . . InMobi is continuing to ensure that we do not collect and use information from children’s sites for behavioral advertising (often referred to as interest based advertising). We will continue to only use any data in the manner that COPPA prescribes.”

Following the 2013 COPPA amendments, InMobi introduced an option in its registration process where app developers could check a box to indicate the app was kid-directed: “My property is specifically directed to children under 13 years of age and/or I have actual knowledge that it has users known to be under 13 years of age.” Since then, thousands of app developers who use the InMobi SDK have checked that box.

The problem is that for kids’ apps, InMobi used the same surreptitious method for determining geolocation that it used in other apps. Not only that, but the company also collected location information directly from the location API when available. InMobi then combined all that location information with the device’s unique identifier, and served behavioral advertising within these kid-directed apps – all without parental consent. The upshot? Hundreds of millions of consumers downloaded thousands of kid-directed apps from which InMobi collected and used personal information, in violation of COPPA. According to the FTC, InMobi collected the data every time an app made a request to its network – typically every 30 seconds when an app was in use.

The complaint alleges that InMobi made false and misleading claims about its geo-targeting practices, in violation of the FTC Act. In addition, the FTC says InMobi violated multiple COPPA provisions.

The settlement includes a $4 million civil penalty for violations of COPPA, which is partially suspended based on InMobi’s financial condition, and prohibits misrepresentations related to InMobi’s privacy practices. The proposed stipulated order also requires the company to honor consumers’ location privacy preferences and establish a comprehensive privacy program subject to independent, biennial audits for the next 20 years.

Visit the Business Center’s Privacy and Security portal for compliance resources.

 
