A new piece of bipartisan legislation aims to protect people from one of the sketchiest practices tech companies use to subtly influence user behavior. Known as “dark patterns,” this dodgy design strategy often pushes users into unwittingly giving up their privacy and granting a company deeper access to their personal data.
To fittingly celebrate the one-year anniversary of Mark Zuckerberg’s appearance before Congress, Senators Mark Warner (D-VA) and Deb Fischer (R-NE) have proposed the Deceptive Experiences To Online Users Reduction (DETOUR) Act. While the acronym is a bit of a stretch, the bill would forbid online platforms with more than 100 million users from “relying on user interfaces that intentionally impair user autonomy, decision-making, or choice.”
While this particular piece of legislation might not go on to generate much buzz in Congress, it does point toward some regulatory themes that we’ll likely hear more about as lawmakers build support for regulating big tech.
The bill, embedded below, would create a standards body to coordinate with the FTC on user design best practices for large online platforms. That entity would also work with platforms to outline what sort of design choices infringe on user rights, with the FTC functioning as a “regulatory backstop.”
Whether or not the bill gets anywhere, the FTC itself is probably best suited to take on dark pattern design, issuing its own guidelines and fining companies that violate them. Last year, after a Norwegian consumer advocacy group published a paper detailing how tech companies abuse dark pattern design, a coalition of eight U.S. watchdog groups called on the FTC to do just that.
Beyond targeting dark pattern design, the bill also proposes prohibiting user interface designs that cultivate “compulsive usage” in children under the age of 13 and barring online platforms from conducting “behavioral experiments” without informed user consent. Under the bill’s guidelines, big online tech companies would have to organize their own Institutional Review Boards. These groups, more commonly known as IRBs, provide administrative oversight for scientific research involving human subjects.
“For years, social media platforms have been relying on all sorts of tricks and tools to convince users to hand over their personal data without really understanding what they are consenting to,” Senator Warner said of the proposed legislation. “Our goal is simple: to instill a little transparency in what remains a very opaque market and ensure that consumers are able to make more informed choices about how and when to share their personal information.”
The full text of the legislation is embedded below.
[scribd id=405606873 key=key-8FOvB4PbplLv0yxpiQ5f mode=scroll]