UK revives age checks for porn sites

In a fresh addition to incoming U.K. legislation that will set sweeping internet content rules for platforms, the government has revived a long-standing ambition to put porn websites behind age gates — saying today that it will mandate the use of “age verification technologies” on adult sites to make it harder for children to access or stumble across pornography.

The Department for Digital, Culture, Media and Sport (DCMS) made the announcement to coincide with Safer Internet Day, as it dials up its populist rhetoric around child protection.

A legal duty on porn sites to prevent underage access will be brought in by bringing such sites into scope of the Online Safety Bill, DCMS said — further expanding what is already very expansive draft legislation that the government set out in May last year.

The draft bill was initially focused on a range of “harms” attached to user-generated content. However, child safety campaigners warned that excluding actual porn from the proposed legal duty of care would undermine the whole regime — and the government appears to have accepted the point.

In a statement, digital minister Chris Philp said: “It is too easy for children to access pornography online. Parents deserve peace of mind that their children are protected online from seeing things no child should see.

“We are now strengthening the Online Safety Bill so it applies to all porn sites to ensure we achieve our aim of making the internet a safer place for children.”

DCMS said the incoming legislation will not specify which age verification technologies porn sites must use for “robust” age checks to ensure their users are 18 years or older.

“This could include adults using secure age verification technology to verify that they possess a credit card and are over 18 or having a third-party service confirm their age against government data,” it suggested.

“The onus will be on the companies themselves to decide how to comply with their new legal duty. Ofcom may recommend the use of a growing range of age verification technologies available for companies to use that minimise the handling of users’ data,” the government also said, adding: “The bill does not mandate the use of specific solutions as it is vital that it is flexible to allow for innovation and the development and use of more effective technology in the future.”

While child safety campaigners have been pushing for the Online Safety Bill to go further, civil liberties and digital rights campaigners continue to warn over the risks to people’s rights and freedoms, as well as challenging the feasibility of an over broad plan to regulate all sorts of internet content — arguing the legislation could have both a chilling effect on online expression and create barriers to doing digital business in the U.K.

Albeit, on the latter point, the age verification industry at least looks set to do very well if the bill, as currently envisaged — with mandatory age checks for porn sites and a ‘duty of care’ regime that will nudge platforms towards using some form of age assurance to shrink their liability — does indeed become law.

In what looks like a bid to address some of the criticisms around privacy and civil liberties, DCMS’ press release claims the age verification technologies that porn sites will be required to implement “do not require a full identity check”.

It also stipulates that while users “may” need to verify their age using identity documents, the measures companies put in place “should not process or store data that is irrelevant to the purpose of checking age”.

“Solutions that are currently available include checking a user’s age against details that their mobile provider holds, verifying via a credit card check, and other database checks including government held data such as passport data,” it goes on, further emphasizing that “any age verification technologies used must be secure, effective and privacy-preserving”.

Companies that “use or build” age verification technology will, DCMS notes, be required to comply with U.K. data protection regulations — else it says they could face enforcement action from the Information Commissioner’s Office (ICO).

However it’s worth highlighting that the government is simultaneously consulting on weakening the national data protection regime — with ministers suggesting that lowering people’s privacy protections might somehow fuel digital business ‘innovation’; rather than considering the rather more obvious impact of green-lighting unfettered commercial data mining: massively expanded risks of abuse and misuse of people’s data, and a pipeline of security breaches and scandals that would surely boost public mistrust in digital services…

DCMS’ claim that existing U.K. privacy laws will be able to prevent any abusive tracking of porn/internet users — i.e. as a result of mandatory age/identity checks — also requires critical scrutiny given the ICO has so spectacularly failed to stop adtech’s abusive tracking of internet users. Quite a considerable suspension of disbelief is required to imagine an ICO volte-face that would have the caution-loving, business-friendly regulator pro-actively interrogating porn sites’ age verification standards in order to robustly defend U.K. porn users’ right to pleasure themselves privately, to put it politely.

The government press release doesn’t dwell on such details, though. Instead DCMS merely seeks to normalize the proposed expansion of online tracking by suggesting: “Online age verification is increasingly common practice in other online sectors, including online gambling and age-restricted sales.”

It follows that by saying it is “working with industry to develop robust standards for companies to follow when using age assurance tech, which it expects Ofcom to use to oversee the online safety regime”. (Or, put another way, the government line is basically ‘don’t worry about how industry might make use of age assurance tech because we’re working with industry to figure that out!’ so, er… )

The U.K. government has tried before to make porn sites carry out age checks. But it was forced to quietly drop the plan back in 2019 — following criticism of the technical and regulatory challenges of mandating age verification, as well as privacy concerns attached to requiring all users of adult websites to identify themselves. At the time it also said it would work to bake the concept into the more comprehensive approach to online harms it was already brewing.

In spite of this chequered history with ‘porn blocks’ — and the current controversy over the ever-increasing scope and expression-chilling sprawl of the Online Safety Bill — the department is doubling down on internet content regulation, by adding even more of it, rather than proposing any U-turns.

At the weekend, the secretary of state for DCMS, Nadine Dorries, did a round of media interviews to trumpet new additions to the list of criminal content — which she said would be added to the face of the bill to further beef it up — including: online drug and weapons dealing; people smuggling; revenge porn; fraud; promoting suicide; and inciting or controlling prostitution for gain.

Dorries also announced that new criminal offences would be added to the bill, including offences aimed at tackling domestic violence and threats to rape and kill (i.e. when they’re made online or through the use of digital devices).

The planned changes will greatly expand the types of content that in-scope platforms will be required to proactively identify and remove — rather than merely acting on user reports of problem content.

So, in short, it’s a sea change in the legal liability regimes platforms have enjoyed for decades — with huge implications for their operations and costs and for online speech.

“To proactively tackle the priority offences, firms will need to make sure the features, functionalities and algorithms of their services are designed to prevent their users encountering them and minimise the length of time this content is available,” DCMS said in a press release Friday, suggesting: “This could be achieved by automated or human content moderation, banning illegal search terms, spotting suspicious users and having effective systems in place to prevent banned users opening new accounts.”

The stick proposed in the bill for enforcing this highly regulated content moderation regime includes major financial penalties — of up to 10% of global annual turnover for platforms that fail to comply — and even, potentially, jail time for senior executives whose algorithms or processes are found failing to prevent people from being exposed to a growing list of harmful stuff, which now runs the gamut from terrorism and child sexual exploitation to suicide promotion and fraud.

The bill is also set to empower the U.K.’s new internet content regulator, Ofcom, to block sites from being accessible in the U.K. So, as well as fuelling age verification tech, it will likely put a growth rocket under VPN services.

In another regulation-fuelled niche business opportunity, the government has been trying to encourage the development of technologies that can be embedded into end-to-end encryption services to enable the law to reach content that’s robustly encrypted — and would therefore escape oversight by the service in question — by making it scannable for child sexual abuse material. (Albeit, once you can scan for one type of illegal content the political pressure to feature creep and bolt on checks for other criminal or just harmful content will likely be hard to resist.)

The attempt to take on and tackle so many highly nuanced and complex issues in a single piece of legislation (in fact there will likely be plenty of secondary legislation too) should, by rights, be political suicide. But the ease with which tech giants can be painted as irresponsible profiteers — thanks to never-ending content scandals — has created ideal conditions for the government to put overly simplistic ‘child protection’ spin on the plan and build populist support for a long overdue schooling of Facebook et al.

The ‘kitchen sink’ nature of the bill combined with fuzzily drawn definitions also makes it exceptionally hard to predict outcomes — which gives further cover to ministers to push the proposal forward since there is no clear steer available on what all the interlocking requirements will actually mean yet. 

Two parliamentary committees which scrutinized the Online Safety Bill before the latest additions have raised a number of concerns — including, most recently, the DCMS committee warning the proposal falls short on protecting speech and tackling harms.

In December another parliamentary committee also pressed for a series of changes — while supporting the broad aims of the government to end the era of big tech’s self regulation.

The government says it factored such feedback — along with wider concerns being expressed about porn site access by child protection campaigners — into the latest additions to the draft bill.