UK publishes safety-focused rules for video-sharing platforms like TikTok

Video-sharing platforms that offer a service in the U.K. have to comply with new regulations intended to protect users and under-18s from harmful content such as hate speech and videos/ads likely to incite violence against protected groups.

Ofcom, the country’s comms, broadcast and — in an expanding role — internet content regulator, has published the guidance for platforms like TikTok, Snapchat, Vimeo and Twitch today.

Among the requirements for in-scope services is that they take “appropriate measures” to protect users from harmful material.

Terrorist content, child sexual abuse material, racism and xenophobia also fall under the “harmful content” bracket.

In a press release, the regulator said its research shows that a third of U.K. internet users say they have witnessed or experienced hateful content, a quarter claim they’ve been exposed to violent or disturbing content, and one in five have been exposed to videos or content that encouraged racism.

There is no prescriptive list of what measures video-sharing platforms must use to prevent users being exposed to such content.

But there are a number of recommendations — such as clauses in terms and conditions; functionality like the ability for uploaders to declare if their content contains an ad; and user-friendly mechanisms for viewers to report or flag harmful content, as well as transparent complaints procedures.

Age assurance systems are also recommended, as is the inclusion of parental controls, since the regulation has the specific aim of protecting under-18s from viewing videos and adverts containing restricted material.

Ofcom is also recommending “robust” age-verification for video-sharing platforms that host pornography in order to prevent under-18s from viewing adult material.

A list of video-sharing platforms that have notified Ofcom that they fall under the scope of the regulations can be found here. (As well as the aforementioned platform giants, it also includes the likes of OnlyFans, Triller and Recast.)

“We are recommending providers put in place systematic risk management processes to help providers to identify and implement measures that are practicable and proportionate,” Ofcom goes on to say in the guidance to video-sharing platforms.

“While we acknowledge that harmful material may not be completely eradicated from a platform, we expect providers to make meaningful efforts to prevent users from encountering it,” it adds.

“The VSP [aka video-sharing platform] Regime is about platforms’ safety systems and processes, not about regulating individual videos; however, evidence of a prevalence of harmful material on a platform may require closer scrutiny.”

The regulator says it will want to understand measures platforms have in place, as well as their effectiveness at protecting users — and “any processes which have informed a provider’s decisions about which protection measures to use”. So platforms will need to document and be able to justify their decisions if the regulator comes calling, such as following a complaint.

Monitoring tech platforms’ compliance with the new requirements will be a key new role for Ofcom, and a taster of what is to come under the incoming, far more broad-brush safety-focused digital regulations.

“Along with engagement with providers themselves, we expect to inform our understanding of whether users are being effectively protected, for example by monitoring complaints and engaging with interested parties such as charities, NGOs and tech safety groups,” Ofcom also writes, adding that this engagement will play an important part in supporting its decisions about “areas of focus”.

Ofcom’s role as an internet content regulator will be fleshed out in the coming years as the government works to pass legislation that will impose a wide-ranging duty of care on digital service providers of all stripes, instructing them to handle user-generated content in a way that prevents people — and especially children — from being exposed to illegal and/or harmful stuff.

A key appointment — the chair of Ofcom — has been delayed as the government decided to rerun the competition for the role.

Reports have suggested the government wants the former editor of the Daily Mail to take the post but an independent panel involved in the initial selection process rejected Paul Dacre as an unsuitable candidate earlier this year. (It is unclear whether the government will continue to try to parachute Dacre into the job.)

Ofcom, meanwhile, has been regulating video on-demand services in the U.K. since 2010.

But the video-sharing framework is a separate regulatory instrument, intended to reflect the lower level of editorial control involved, as video-sharing platforms provide tools that let users upload their own content.

However, this newer framework is itself set to be superseded by legislation under the incoming online safety regime.

So these regulations for video-sharing platforms are something of a placeholder and a taster as U.K. lawmakers grapple with laying down more comprehensive online safety rules which will apply much more widely.

Still, in the guidance Ofcom describes the VSP Regime as “an important precursor to the future Online Safety legislation”, adding: “Given the two regimes’ shared objective to improve user safety by requiring services to protect users through the adoption of appropriate systems and processes, Ofcom considers that compliance with the VSP regime will assist services in preparing for compliance with the online safety regime as described by Government in the draft Online Safety Bill.”

The U.K.’s data protection regulator is also already enforcing a set of “age appropriate” design requirements for digital services that are likely to be accessed by children.