TikTok pushes bundle of teen safety measures internationally

TikTok is making a promotional push in Europe and Australia around a bundle of safety-focused features, some of which it announced in the US earlier this month, and which it says are aimed at protecting teenage users from dangerous challenges.

The company remains the target of a major consumer protection complaint in the region — which has led to active monitoring of its policies by the European Commission.

The measures being (re)announced by TikTok include a permanent in-app guide that pushes teens through a four-step process ("stop, think, decide, act") before taking part in online challenges; a dedicated policy category for dangerous acts and challenges in the reporting menu, to make it easier for users to flag problem challenges; and dedicated safety videos from curated creators, pushed to users under 18 via their 'For You' feeds, to further raise awareness of safety issues around challenges.

In a sample video from the #SaferTogether campaign, TikTok creator @maddylucydann sketches a scenario in which an emergency department medic is figuring out what to tell a young patient admitted with serious injuries after falling while attempting to imitate parkour moves seen in an online video, without the skills needed to pull off the tricks safely. The video puts heavy emphasis on making kids think before attempting something similarly reckless.

Also today, in what looks like a new announcement, TikTok said it’s making a financial contribution to Western Sydney University to support further research into online challenges, and sharing research data with the university’s Young and Resilient Research Centre to that end.

It says this data “formed the basis” of an earlier report, authored by Dr. Zoe Hilton and published by Praesidio Safeguarding.

“We believe these two contributions will help the Western Sydney University Young and Resilient Research Centre in their interdisciplinary approach to developing evidence-based policies and practices to strengthen the resilience of young people in the digital age,” TikTok said in a blog post attributed to Alexandra Evans, its head of safety public policy, Europe.

The blog post also quotes Amanda Third, co-director of the Centre that will benefit from the platform's financial and data-based largesse, who says TikTok's contribution will "assist it to explore the challenges involved in keeping young people safe online with real world data"; and "help us develop research to inform policies, programs and interventions to minimise the risks and maximise the benefits of the digital age for young people".

TikTok’s blog post does not specify how much money it is donating to the university, but when asked for the figure the company (and Third) told us it’s AUD 108,420 (around $78,000).

The video-sharing platform has been facing months of scrutiny by regulators in the EU following consumer protection and privacy complaints; and an emergency intervention in Italy last year related to concerns over a ‘blackout challenge’ which local media had linked to the death of a child.

In the latter case, TikTok denied any link with its platform but ended up removing more than half a million accounts in Italy that it had been unable to verify did not belong to children under the age of 13.

We reached out to the Italian data protection authority for an update on its monitoring of the company’s safety measures but at the time of writing it had not responded.

Update: Italy’s data protection authority has confirmed its monitoring of TikTok’s commitments following last year’s emergency procedure is ongoing — although it noted that the Irish Data Protection Commission is the lead authority for that and for the investigation of TikTok’s handling of children’s data, under the GDPR’s one-stop-shop mechanism for cross-border cases.

“Within this procedure TikTok shared further information relating to the technological tools it intends to deploy to ensure a better assessment of the age of its users,” the spokesperson also told us.

“Any decision will be adopted in the context of collaboration with the other European authorities,” it added, saying it’s unable to predict when a decision might be issued.

Guido Scorza, a member of the Board of the Italian Data Protection Authority, also suggested the new safety measures announced by TikTok are “likely” to be linked to its intervention last year, noting: “At the end of our meetings during May 2021, [TikTok] made the commitment to kick off new communication initiatives both in-app and through radio and newspapers in order to educate to a conscious and safe use of the platform and to remember that it is not suitable for an under 12 years old audience.”

We also contacted the European consumer protection umbrella association, BEUC, which declined to give an assessment of the specific measures TikTok has announced today — saying it prefers to wait for regulators to weigh in on its concerns.

“We prefer to wait for the assessment by the consumer protection authorities who are following up on our complaints that the video sharing platform is breaking consumer law,” Alexandre Biard, team leader for enforcement at BEUC, told us, adding: “We expect the authorities to take measures to make sure the platform respects consumer rights.”

We also contacted the Commission asking for a progress update on its scrutiny of TikTok’s ToS and will update this report with any response.

Update: A spokesperson for the Commission confirmed that it and authorities in the pan-EU Consumer Protection Cooperation Network (CPC) have “engaged” with TikTok since May 28, 2021, adding that they are “intensively working with the company to solve the consumer problems identified with their practices”.

“TikTok has been cooperative and the dialogue has proved useful but not all problems have been solved yet,” the Commission also told TechCrunch. “The CPC authorities expect TikTok to improve its proposals made so far before they decide on the next steps.”

TikTok declined to comment on the Commission’s remarks.

The Commission does not have direct enforcement powers in the area of consumer protection — so it will be up to Member States’ consumer protection authorities to decide whether to take enforcement action on any outstanding issues at the end of the CPC “dialogue” with the company.

Attention to online child safety has been dialling up in multiple jurisdictions in recent years.

In the US, tech execs from major platforms have been grilled by lawmakers on the issue — which has led to a number of bills being proposed, including most recently the Kids Online Safety Act.

Elsewhere in Europe, the UK is now enforcing a Children’s Code that aims to regulate platforms’ design choices and defaults by forcing them to prioritize privacy and safety.

The country also has much broader Online Safety legislation in the pipeline, again with a major focus on child safety, which will place a legal duty of care on platforms towards their users.

In Australia, meanwhile, another piece of Online Safety legislation, introduced at the end of 2021, similarly puts emphasis on tightening the law to protect children from cyberbullying and other online safety risks.

So there’s a clear, global consensus emerging around regulating platforms under a child protection rubric.