TikTok and Snapchat need more parental controls, a letter signed by 44 attorneys general stated.
Yesterday, the National Association of Attorneys General (NAAG) sent a letter outlining a series of concerns to the two social apps, both of which are widely used among teens.
The attorneys general cited a range of problems they had with the social media apps, including, more broadly, the negative impact they can have on the physical, emotional and mental well-being of children and teens. They also noted that content depicting abusive sexual relationships can seriously harm a child’s view of a healthy relationship, helping to perpetuate domestic abuse and human trafficking. And the letter stressed that TikTok and Snapchat don’t effectively collaborate with third-party parental control apps to allow parents to monitor and restrict what their children can do on their platforms.
The NAAG cited a study from one such app, Bark, which analyzed 3.4 billion messages in 2021 across 30 apps to find that 74.6% of teens were involved in a self-harm/suicidal situation, 90.73% encountered nudity or sexual content online and 93.31% engaged in conversations about drugs or alcohol.
“Parental control apps can alert parents or schools to messages and posts on your platforms that have the potential to be harmful and dangerous. Apps can also alert parents if their child manifests a desire for self-harm or suicide,” the letter reads.
Snapchat already has some in-app parental controls, as does TikTok, but this group of attorneys general wants the platforms to be more compatible with third-party parental control apps, though they didn’t endorse a particular product. They suggested that parental control apps could access social media app features like private messaging, which the built-in parental controls don’t monitor. Plus, third-party apps could better filter the user-generated content that appears in the apps’ main feeds.
Still, third-party parental control apps raise their own set of concerns about the tactics they use to surveil children.
Though TikTok and Snapchat already had parental controls, their competitor Instagram did not. After a series of Senate hearings about the impact of social media on teen mental health, Meta recently began rolling out parental controls on Instagram, a long-overdue safety measure.
But teen safety on these platforms remains a concern for the U.S. government, regardless of whether parental controls are in place. President Biden even mentioned the threat of social media to teen mental health in his State of the Union address, which former Facebook employee and whistleblower Frances Haugen attended as an honored guest.
TikTok is more popular among teens than Instagram, which has poured money into its TikTok clone Reels to keep up. But as TikTok holds its ground as the fastest-growing social app, Meta has gone to even more dramatic lengths to maintain its prominence. The Washington Post reported today that Meta hired Targeted Victory, a Republican consulting firm, to turn the public against TikTok. In some cases, Targeted Victory attempted to influence public opinion by claiming that certain dangerous viral trends began on TikTok, though they actually started on Facebook. TechCrunch also reported in 2018 that Facebook had worked with Targeted Victory to help slow progress on legislation that would impact the platform’s political ad spending.
“Targeted Victory’s corporate practice manages bipartisan teams on behalf of our clients. It is public knowledge we have worked with Meta for several years and we are proud of the work we have done,” the firm’s CEO Zac Moffatt said in a statement.
In any case, there’s only so much that parental controls can do to keep apps safe for teens — the apps themselves also bear responsibility for making sure they aren’t serving teens unsafe content.
Update, 3/30/22 at 4:38 PM EST with statement from Targeted Victory.