Danah Boyd: How Technology Makes A Mess Of Privacy and Publicity

Today at SXSWi, keynote speaker Danah Boyd took the stage to talk about privacy and publicity, and how they intertwine online. Boyd is a Social Media Researcher at Microsoft Research New England and has studied this space extensively for years. It was a compelling talk that challenged the notion that personal information falls into a simple binary of public or private. To underscore her points, she recalled and discussed a number of major privacy blunders from Facebook and Google. You can find my notes from the presentation below.

Boyd says that privacy is not dead, but that a big part of our notion of privacy relates to maintaining control over our content, and that when we don’t have control, we feel that our privacy has been violated. This has happened a few times recently.

How The Buzz Launch Failed

As a first example, Boyd brought up Google Buzz. She says that nothing about the launch was technically wrong — you could opt out of Buzz, elect to hide your friend list, and so on. But the service resulted in a PR disaster because Google made non-technical mistakes, doing things that didn’t meet user expectations:

  • Google integrated a public-facing system into one of the most private systems you can imagine. Lots of people thought Google was exposing their email to the world.
  • Google assumed users would opt out if they didn’t want to participate. “I can’t help but notice that more technology companies think it’s ok to expose people tremendously and then backpedal when people flip out,” she says.
  • You want to help users understand the proposition. You need to ease them in and invite them to contribute their content.

Boyd says that years ago, researchers noticed people in a chat room would often ask “A/S/L” (age, sex, location). So some services, looking to streamline things a bit, started building user profiles that included this information. What they failed to understand is that “A/S/L” was a sort of chatroom icebreaker. Users lost that, and putting the information in a profile — even if they would have shared it to answer that chat message — could creep them out.

With Buzz, Google found the social equivalent of the famous “uncanny valley” (where things seem almost natural, but aren’t quite close enough, so they’re creepy). Google collapsed articulated networks (email contacts) and assumed they represented a personal network.

Boyd then transitioned to talk a bit about the fuzzy lines between what is public and private. She says that just because people put material in public places doesn’t mean it was meant to be aggregated. And just because something is publicly accessible doesn’t mean people want it to be publicized.

The Facebook Privacy Fail
Boyd’s second case study was Facebook’s privacy changes in December, when Facebook made ‘everyone’ the default setting. We’ve written extensively on this fiasco, and it may take years for the full extent of the damage to become clear.

  • Facebook said 35% of users had read the new privacy documentation and changed something in their privacy settings. Facebook sees this as a good thing, but it means 65% of users accepted the new defaults, making their content public. Boyd has asked non-techie users to tell her what they thought their settings were. She has yet to find a single person whose actual privacy settings matched what they thought they were.
  • Boyd recounted a story of a young woman who had moved far away from an abusive father. The young woman talked with her mother (who had moved with her) about possibly joining Facebook. They sat down to make the content as private as possible, which worked well. But in December, the young woman clicked through Facebook’s privacy dialog (as most people did) and had no idea her content was public. She only found out when someone who should not have seen the content told her.

Boyd then discussed how different groups of people think about privacy. She says that teenagers are much more conscious about what they have to gain by being in public, whereas adults are more concerned about what they have to lose.

As an example, Boyd talked about a teenage girl who often put risqué, sometimes illegal content online. When Boyd asked why she’d want to do that, the girl replied, “I want to get a modeling contract just like Tila Tequila.” Her calculation wasn’t about what she could potentially lose, but rather what she stood to gain.

Boyd says that most techies think about personally identifiable information, but the vast majority of people are thinking about personally embarrassing information. People often share private information with their friends in part because it allows them to bond; it makes them somewhat vulnerable and establishes trust. But when that information gets exposed through technology (e.g. Facebook’s public-by-default setting), it’s a huge technology fail.

Boyd also called out the presence of racism in social media. On the night of the BET awards last year, all of the trending topics were dominated by terms relating to the event and the black community. In response, some Twitter users made very racist comments — clearly even these open communication platforms are still prone to hate.

To conclude the talk, Boyd pointed out some of the challenges we will continue to face with regard to privacy online. She asks whether teachers can be expected to maintain a professional, pristine presence online — something that is very difficult to do while leading a normal life.

Ultimately, she says, “neither privacy nor publicity is dead, but technology will continue to make a mess of both.” We’ve been looking at privacy and publicity as a black-or-white attribute for content, when really it’s defined by context and the implications of what we’ve chosen to share.

Image by Adam Tinworth