The PACT Act is a new bipartisan effort to reform Section 230, the crucial liability shield that enables internet platforms to exist, approaching the law’s shortcomings “with a scalpel rather than a jackhammer,” as Senator Brian Schatz (D-HI) describes it. It is a welcome alternative to the dangerous EARN IT Act and risible executive order also in the running.
Section 230 broadly protects online companies from liability both for content posted by their users and for good-faith decisions to moderate that content. Politicians have recently characterized it as an excuse for companies like Facebook and Twitter to control speech on their platforms while avoiding responsibility for shoddy or arbitrary moderation policies.
But the two most high-profile attempts to change this law, which arguably made the modern internet possible, are riddled with problems. The EARN IT Act is widely understood to be an end run around encryption by an impotent and furious Justice Department. President Trump’s recent executive order, in addition to plainly being retaliation against Twitter for fact-checking his tweets, doesn’t actually appear to do much of anything.
Yet there is growing consensus that Section 230, while it has served its purpose admirably for two decades, needs to be adjusted to accommodate a changed digital environment. To that end, Sen. Schatz and his colleague Sen. John Thune (R-SD), leaders of the Communications, Technology, Innovation and Internet Subcommittee, are proposing a reasonable alternative.
“The best thing we can do for the internet, and for the law that enabled the internet to happen, is to modify this law so that it works for another 20 years instead of pretending that it’s perfect just the way it is,” Sen. Schatz said in a call with press.
Their Platform Accountability and Consumer Transparency Act focuses more on exposing the moderation process than on changing it. Under the proposed law, companies relying on Section 230 would have to:
- Publicly document their moderation practices and issue a standardized quarterly report on actions they’ve taken and the complaints that prompted them.
- Make and report moderation decisions within 14 days of user reports, and allow appeals.
- Remove “court-determined illegal content and activity” within 24 hours, with some flexibility allowed for smaller platforms.
The act would also limit the scope of Section 230 protection for companies when they are facing action from federal regulators and state attorneys general, or when they are demonstrably aware of the illegal nature of the content.
It would not affect or involve changes to encryption, which is another tool companies have for distancing themselves from illegal content: if they can’t read the data, they can’t tell whether it’s illegal. But attempts to weaken encryption or reduce its use have been met with polite but firm rejection by the tech industry — it’s clear we have been traveling down a one-way street in that regard.
“This is not designed to attract people who want to bully tech companies into political submission,” said Sen. Schatz. “It’s designed to improve federal law.”
“Here’s why we think this bill is significant,” he continued. “First, because we believe it is the most serious effort to retain what works in 230, and try to fix what is broken about 230. Second, you have the chair and ranking member of the subcommittee introducing the bill, which is not a trivial matter. And third, because we do think there is an appetite to legislate here. Though the volume gets turned up when someone wants to beat up on the platforms via cable TV or Twitter, the serious work of the Commerce Committee has always been bipartisan.”
You can read the full text of the bill here. We’ll soon hear whether the senators’ effort bears any fruit.