In hearing with Snap, TikTok and YouTube, lawmakers tout new rules to protect kids online

Fallout from revelations around teen mental health on Instagram continues — and not just for Facebook. On Tuesday, policy reps from YouTube, Snap and TikTok faced Congress to talk about kids and online safety, marking the first time either of the latter two companies had appeared at a major tech hearing.

The hearing, held by the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, managed to stay on topic about half of the time. The committee’s Republican members were keen to steer their rare time with a TikTok executive toward questions about privacy concerns over the company’s relationship with the Chinese government.

Diversions notwithstanding, a few of the hearing’s more useful moments saw the three policy leads pressed to answer yes/no questions about specific policy proposals crawling through Congress. The hearing featured testimony from Snap VP of Global Public Policy Jennifer Stout, TikTok’s VP and Head of Public Policy Michael Beckerman and Leslie Miller, who leads government affairs and public policy at YouTube.

Both YouTube and TikTok called for the U.S. to create comprehensive laws around online privacy, with Beckerman deeming a legal framework for national privacy laws “overdue.” All three companies agreed that parents should have the ability to erase all online data for their children or teens, with Stout pointing out that Snapchat data disappears by design. Still, Snapchat’s own privacy page mentions that the company retains location data for “different lengths of time based on how precise it is and which services you use.”

Senator Ed Markey (D-MA), himself an unlikely TikTok sensation, pushed for what he calls a kids’ “privacy bill of rights for the 21st century” during the hearing. Markey pointed to his proposed update to the Children’s Online Privacy Protection Act (COPPA), the Children and Teens’ Online Privacy Protection Act, which would bolster protections for young social media users. That legislation would ban tech companies from collecting the data of users between 13 and 15 years old without explicit consent, implement an “eraser button” to make it easy to delete minors’ personal data and more broadly restrict the kinds of information that social media platforms can collect in the first place.

Markey pressed each company rep on whether they would support the COPPA changes. Speaking for TikTok, Beckerman said the company does support the proposal but views a standard method for platforms to verify the age of their users as just as essential, if not more so.

Snap wouldn’t commit to the COPPA proposal, and Markey derided Stout for playing “the old game” of tech companies refusing to commit to specifics. YouTube, which was previously hit with a record $170 million FTC fine for COPPA violations, didn’t explicitly commit to anything but pointed to “constructive” talks the company has had with Markey’s staff.

In the hearing, Markey and Senator Richard Blumenthal (D-CT) also highlighted their reintroduction of the KIDS (Kids Internet Design and Safety) Act last month. That bill would protect online users under 16 from engagement-juicing features like autoplay, push alerts and like buttons. It would also ban influencer marketing to kids under 16 and force platforms to create a reporting system for instances in which they serve harmful content to young users.