The European Commission has sent another couple of formal requests for information to major platforms subject to the bloc’s rebooted online governance and content moderation rulebook, the Digital Services Act (DSA).
The latest requests, which are focused on child safety, have been sent to TikTok and YouTube.
“The Commission is requesting the companies to provide more information on the measures they have taken to comply with their obligations related to protection of minors under the DSA, including the obligations related to risk assessments and mitigation measures to protect minors online, in particular with regard to the risks to mental health and physical health, and on the use of their services by minors,” the Commission wrote in a press release.
The EU has given the companies until November 30 to respond with the data. Regulators will then assess next steps — which could include opening formal investigations.
The DSA establishes a governance framework for platforms to respond to reports of illegal content or products. Larger platforms have additional responsibilities — including in relation to algorithm-driven features like recommender engines. This includes performing risk assessments and mitigations in respect of children’s safety and well-being.
The regulation also explicitly bans targeting minors with advertising.
Confirmed breaches of the DSA can attract fines of up to 6% of global annual turnover. Penalties can also be issued for failure to provide data on request.
It’s the second such information request the EU has sent TikTok since the regime began to apply to the company. Last month the EU asked the video sharing platform for “general aspects” concerning the protection of minors, as well as requesting information on its response to the Israel-Hamas war.
The Commission’s follow-up request to TikTok regarding child protection suggests it’s seeking more detail about how the platform is fulfilling its obligations to protect minors.
Last month the Commission sent information requests to Meta and X (formerly Twitter), following reports about the spread of terrorism content, hate speech and disinformation targeting the Israel-Hamas war. In Meta’s case the EU also asked for information on its approach to election security.
The DSA is already enforceable on so-called very large online platforms (VLOPs), a designation that applies to the aforementioned four tech giants.
The full list of 19 VLOPs and VLOSEs (very large online search engines) was announced by the Commission in April.
So far, the EU has not confirmed any formal investigations under the DSA. But the blitz of information requests since the regulation became enforceable on VLOPs at the end of August suggests it may be preparing to take that next step.
The Commission also recently called on designated Member State-level bodies — which, from early next year, will be responsible for DSA oversight of smaller digital services based in their countries — to help support its enforcement of the regime on tech giants.
TikTok and YouTube were contacted for comment on the latest Commission DSA information requests.
Update: A TikTok spokesperson said:
Our CEO had positive discussions with the European Commission this week and we’re pleased that our efforts to keep people on TikTok safe, engage with these important topics and comply with the DSA, have not gone unnoticed. We will continue to work closely with the Commission, including on this latest request.