FTC Finds Privacy Problems In Children’s Apps, But Suggested Changes Will Impact All

I believe the children are the future. (What, too soon?) But in the case of the new FTC report on mobile applications for kids, which examines the current data-handling practices of mobile developers, the children are the future. They're the future indicators of how our personal information needs to be handled in today's mobile app ecosystem.

Although the new report makes recommendations specifically for children's applications, there's obviously an undercurrent of outrage and violation at the moment (thanks mainly to addressgate). People, not just parents, need to be able to control and understand how and why their data is being collected, used, and shared, and what that really means. The question is, how is this done?

In the FTC report titled "Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing," the agency found that applications on both the Apple App Store and the Android Market did little to provide information about their data collection practices. The majority of the time, a parent reading an app's description has no way to make any sort of informed decision about whether it's appropriate for a child.

Apps are capable of accessing a variety of personal information, including a person's name, their contacts list, their call logs, unique identifiers, precise location and more. Apps are also capable of connecting to social media and often contain ads, both of which open up additional vectors for data collection.
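To make concrete just how low the bar is, here's a minimal sketch (in Java, with a hypothetical class name) of how an Android app of this era can read the entire address book. The only gate is a single READ_CONTACTS line in the app's manifest, accepted once at install time:

```java
import android.app.Activity;
import android.database.Cursor;
import android.os.Bundle;
import android.provider.ContactsContract;
import android.util.Log;

// Hypothetical sketch: with READ_CONTACTS declared in AndroidManifest.xml,
// an app can dump the user's entire address book in a few lines.
public class ContactsDumpActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Cursor cursor = getContentResolver().query(
                ContactsContract.Contacts.CONTENT_URI, null, null, null, null);
        if (cursor != null) {
            int nameColumn = cursor.getColumnIndex(
                    ContactsContract.Contacts.DISPLAY_NAME);
            while (cursor.moveToNext()) {
                // A real app could just as easily ship each row off to a server.
                Log.d("ContactsDump", cursor.getString(nameColumn));
            }
            cursor.close();
        }
    }
}
```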

As Alexia already pointed out, much of the problem with the current setup and the privacy issues it causes has to do with a general lack of enforcement by the mobile application stores themselves (i.e., Apple and Google). The FTC backs this up, basically saying that without proper enforcement of the stores' own guidelines, developers have little incentive to comply. And certainly, they have no incentive to go above and beyond to provide special disclosures to parents.

What’s more, the FTC suggests that mobile applications designed for use by children may even be in violation of COPPA, aka the Children’s Online Privacy Protection Act. This rule requires operators of online services, including interactive mobile apps, to provide notice and get parental consent prior to collecting information from children under 13. Over the next 6 months, the FTC will be reviewing apps to determine whether some are in violation. Likely, many are.

But the root of the concerns detailed in this report has to do with the fact that there is no easy, common, standard method to understand what an app does behind the scenes or why. The FTC plans to address this issue in 2012 through a public workshop where it will attempt to define how "mobile privacy disclosures" should work, including how they can be "short, effective, and accessible to consumers on small screens."

Already, Apple and Google have their own ways of alerting users to apps' access requests. iOS apps display pop-ups when an app wants to use your location, for example, while Android asks you pre-installation whether you agree to allow the app to access the data it requests, which is then listed below. Neither of these methods does enough to convey what data the app may actually be after (like the address book, as was the case with Path), let alone whether that data is shared, with whom, and for what purposes.
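To illustrate the Android half of that, here's a rough Java sketch (hypothetical class name) of what happens after a user taps "Install": the ACCESS_FINE_LOCATION entry in the manifest produces one line in the pre-install permission list, and from then on the app can read location with no further prompts and no explanation of where the data goes:

```java
import android.app.Activity;
import android.location.Location;
import android.location.LocationManager;
import android.os.Bundle;
import android.util.Log;

// Hypothetical sketch: declaring ACCESS_FINE_LOCATION in AndroidManifest.xml
// is what produces the "precise location" line in the pre-install permission
// list. Once the user accepts at install time, nothing else is ever shown.
public class LocationPeekActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        LocationManager lm =
                (LocationManager) getSystemService(LOCATION_SERVICE);
        Location last =
                lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
        if (last != null) {
            // No runtime dialog, and no disclosure of who receives this data.
            Log.d("LocationPeek",
                    last.getLatitude() + "," + last.getLongitude());
        }
    }
}
```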

The creation of such a "mobile privacy disclosure" as the FTC describes would clearly not be a kids-only venture in light of recent events. And if additional parental controls are needed to thwart potentially invasive data-sharing behaviors in children's apps, it's an obvious next step to make those same controls available to all.

That may not be a bad thing. After all, why shouldn't developers implement the technical solutions that prevent problems like addressgate from occurring in the first place? Why shouldn't apps be more careful about which data they request? The answer, of course, is that they should, and probably more of them would if rushing to market in the app gold rush weren't such a priority.
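For example, one mitigation widely floated after addressgate is hashing contact identifiers on the device, so a server can match mutual contacts without the raw address book ever leaving the phone. A rough Java sketch of the idea:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Rough sketch: hash each contact identifier client-side so the server
// receives an opaque digest instead of the actual email address.
public final class ContactHasher {
    public static String hash(String identifier) throws NoSuchAlgorithmException {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] bytes = digest.digest(
                identifier.trim().toLowerCase().getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : bytes) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        // The server gets this digest, never "alice@example.com" itself.
        System.out.println(hash("alice@example.com"));
    }
}
```

(Unsalted hashes of emails and phone numbers can still be brute-forced, so this narrows the exposure rather than eliminating it, but it's exactly the kind of low-effort safeguard the gold rush tends to skip.)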

In many cases, apps in violation aren't "evil" so much as they're a result of lazy coding, young developers, and mistakes getting overlooked. Put governmental requirements in their way, and the exponential app growth slows down. Code has to be checked again and again. The app review process gets longer. Maybe Google has to actually implement an app review process for Android apps. Apps will be submitted, rejected, and resubmitted over and over while code gets cleaned up. All in the name of privacy. Again, that may not be a bad thing, but it will definitely change the game.