A coalition of 35 consumer advocacy groups and 64 experts in child development has co-signed a letter asking Facebook to reconsider its plans to launch a version of Instagram for children under the age of 13, which the company has confirmed is in development. In the letter, the groups and experts argue that social media use is linked with several risk factors for younger children and adolescents, affecting both their physical health and overall well-being.
The letter was written by the Campaign for a Commercial-Free Childhood, an advocacy group that often leads campaigns against big tech and its targeting of children.
The group stresses how influential social media is on young people’s development, and the dangers such an app could bring:
“A growing body of research demonstrates that excessive use of digital devices and social media is harmful to adolescents. Instagram, in particular, exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers,” the letter states. “The platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing. Younger children are even less developmentally equipped to deal with these challenges, as they are learning to navigate social interactions, friendships, and their inner sense of strengths and challenges during this crucial window of development.”
Citing public health research and other studies, the letter notes that excessive screen time and social media use can contribute to a variety of risks for kids, including obesity, lower psychological well-being, decreased quality of sleep, increased risk of depression and suicidal ideation, and other issues. Adolescent girls report feeling pressured to post sexualized selfies to attract attention from their peers, the letter says, and 59% of U.S. teens report having been bullied on social media as well.
Another concern the groups raise is Instagram’s recommendation algorithm, which would shape what kids see and click on next; the letter notes that children are “highly persuadable.”
They also point out that Facebook knows children under 13 are already using Instagram after lying about their age, and that these users are unlikely to migrate to what they’ll see as a more “babyish” version of the app than the one they’re already using. That means the “kids version” is really aimed at an even younger group who don’t yet have an Instagram account.
Despite the concerns being raised, Instagram’s plans to compete for younger users are unlikely to be derailed by the outcry. Already, Instagram’s top competitor in social media today — TikTok — has developed an experience for kids under 13. In fact, it was forced to age-gate its app as a result of its settlement with the U.S. Federal Trade Commission, which had investigated Musical.ly (the app that became TikTok) for violations of the U.S. children’s privacy law COPPA.
Facebook, too, could be in a similar situation where it has to age-gate Instagram in order to properly direct its existing underage users to a COPPA-compliant experience. At the very least, Facebook has grounds to argue that it shouldn’t have to boot the under-13 crowd off its app, since TikTok did not. And the FTC’s fines, even when historic, barely make a dent in tech giants’ revenues.
The advocacy groups’ letter follows a push from Democratic lawmakers, who also this month penned a letter addressed to Facebook CEO Mark Zuckerberg to express concerns over Facebook’s ability to protect kids’ privacy and their well-being. Their letter had specifically cited Messenger Kids, which was once found to have a design flaw that let kids chat with unauthorized users. The lawmakers gave Facebook until April 26 to respond to their questions.
Zuckerberg confirmed Facebook’s plans for an Instagram for kids at a congressional hearing back in March, saying that the company was “early in our thinking” about how the app would work, but noted it would involve some sort of parental oversight and involvement. That’s similar to what Facebook offers today via Messenger Kids and what TikTok does via its Family Pairing parental controls.
The market, in other words, is shifting toward acknowledging that kids are already on social media — with or without parents’ permission. As a result, companies are building features and age gates to accommodate that reality. The downside to this plan, of course, is once you legitimize the creation of social apps for the under-13 demographic, companies are given the legal right to hook kids even younger on what are, arguably, risky experiences from a public health standpoint.
The Campaign for a Commercial-Free Childhood also today launched a petition that others can sign to push Facebook to cancel its plans for an Instagram for kids.
Facebook, reached for comment, offered the following statement:
We’ve just started exploring a version of Instagram for younger teens. We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it. In addition, we will not show ads in any Instagram experience we develop for people under the age of 13. The reality is that kids are online. They want to connect with their family and friends, have fun, and learn, and we want to help them do that in a way that is safe and age-appropriate. We also want to find practical solutions to the ongoing industry problem of kids lying about their age to access apps. We’re working on new age verification methods to keep under-13s off Instagram and we’re exploring an Instagram experience for kids that is age-appropriate and managed by parents.