Australia fines X for failing to provide information about child abuse content

eSafety, the Australian regulator for online safety, issued a $386,000 fine against X (formerly Twitter) Monday for failing to answer “key questions” about the action the platform is taking against child abuse content.

The watchdog issued legal notices to Google, TikTok, Twitch, Discord and X (then still known as Twitter) under the country’s Online Safety Act in February. The notices required these companies to answer questions about how they tackle child sexual abuse material (CSAM).

While the monetary value of the fine is modest, the reputational damage comes at a bad time for X, which is already struggling to retain advertisers.

In a press release, eSafety said that X left some sections of responses “entirely blank” and others were incomplete or inaccurate. The Elon Musk-owned company was also criticized for not providing timely answers to the regulator’s questions.

Critically, the platform didn’t provide information on CSAM detection tech in livestreams and said it doesn’t use any tech to detect grooming.

The report also faulted Google for providing generic responses, which eSafety said were not adequate. However, the regulator issued a formal warning to Google instead of a fine, indicating that Google’s shortcomings were less serious.

eSafety Commissioner Julie Inman Grant criticized Twitter/X for failing to meet its own promises about battling CSAM.

“Twitter/X has stated publicly that tackling child sexual exploitation is the number 1 priority for the company, but it can’t just be empty talk, we need to see words backed up with tangible action,” she said in a statement.

“If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation they either don’t want to answer for how it might be perceived publicly or they need better systems to scrutinize their own operations. Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”

Last month, X removed an option for users to report political misinformation. An Australian digital research group called Reset.Australia wrote an open letter to X expressing concern that this move might “leave violative content subject to an inappropriate review process and not labeled or removed in compliance with your policies.”

After Musk took over, X/Twitter laid off many of the staff working on trust and safety issues. Last December, the company also disbanded the Trust & Safety Council, an advisory group that consulted for the platform on issues like the effective removal of CSAM. As part of its cost-cutting, the social media company closed its physical office in Australia earlier this year.

Earlier this month, India also sent a notice to X, YouTube and Telegram to remove CSAM from their platforms. Last week, the EU sent a formal request to X to provide details under the Digital Services Act (DSA) about steps the social media company is taking to tackle misinformation surrounding the Israel-Hamas war.