Like everyone else I am shocked, shocked!, to learn that Facebook’s latest policy change will lead to users posting more public data. This time it’s teenagers, who now “have the choice to post publicly on Facebook,” which they didn’t before. Why? Simple: as the NYT puts it, “fundamentally, Facebook wants to encourage more public sharing, not less … to attract consumer advertisers.”
I don’t intend to get all moral-panic on you here. Letting teens post publicly sounds pretty reasonable to me, as do Google+’s recent moves to surface your friends’ +1s in your stream and use their faces in ads. Both are good examples of how social media providers keep trying to make their users’ activities more visible, more public, and more intimately tied to the advertising that keeps the money rolling in. As Josh Constine recently put it, “you’re not just the product, you’re the ads.”
But there’s more to it than that. Advertisers, marketers, and Facebook and Google themselves are already analyzing all the data you’ve given them, to better target you with ads. Facebook has commissioned an “AI team” working on using deep-learning neural networks to find hidden patterns and insights in that data; Google, which already uses deep-learning technology in various domains, is presumably ahead of them.
What people don’t realize — yet — is that many of the things they haven’t explicitly told Facebook or Google can be deduced from the things they have. Got a medical condition, legal issue, sexual predilection, substance habit, or professional failure that you want kept private? It seems very likely that the ever-improving pattern-recognition systems Facebook and Google are building (and others will inevitably follow) will be able to extrapolate those secrets, with a high degree of confidence, from the subtle cues and nuances inherent in your visible online profile and chatter.
Put another way, you may think you’re sitting privately in your home, telling the online world only what you want it to see…but, in fact, every little morsel of seemingly innocuous knowledge you give the Internet is being used to build a window in the wall beside you, opaque to human eyes, but all too transparent to the deep-learning systems of the future. Like a one-way mirror.
People think ‘big data’ avoids the problem of discrimination, because you are dealing with big data sets, but in fact big data is being used for more and more precise forms of discrimination … it is possible to generate a detailed picture about a person’s health, including information a person may never have disclosed to a health provider.
Similarly, a Cambridge University paper reports:
Facebook Likes can be used to automatically and accurately predict a range of highly sensitive personal attributes including: sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender.
Oh, yes, and:
The volunteers completed a common personality questionnaire through a Facebook application and made their Facebook status updates available so that researchers could find linguistic patterns in their posts … the researchers built computer models that predicted the individuals’ age, gender, and their responses on the personality questionnaires with surprising accuracy.
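To give a concrete sense of the mechanics, here is an illustrative sketch, not the actual pipeline from either study: even a toy naive-Bayes classifier, trained on which pages a handful of people have Liked alongside a known attribute, can guess that attribute for a new user from their Likes alone. All page names, group labels, and data below are hypothetical.

```python
# A minimal naive-Bayes sketch: guess a hidden binary attribute from
# binary "Like" features. Synthetic data; hypothetical page names.
from collections import defaultdict
import math

def train(samples):
    """samples: list of (set_of_likes, label). Returns class priors and
    smoothed per-feature likelihoods."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(lambda: defaultdict(int))
    vocab = set()
    for likes, label in samples:
        class_counts[label] += 1
        for f in likes:
            feat_counts[label][f] += 1
            vocab.add(f)
    total = sum(class_counts.values())
    model = {"vocab": vocab, "classes": {}}
    for label, n in class_counts.items():
        prior = math.log(n / total)
        # P(feature present | label), with add-one smoothing
        likelihood = {f: (feat_counts[label][f] + 1) / (n + 2) for f in vocab}
        model["classes"][label] = (prior, likelihood)
    return model

def predict(model, likes):
    """Return the most probable label for a new user's set of Likes."""
    best, best_score = None, float("-inf")
    for label, (prior, likelihood) in model["classes"].items():
        score = prior
        for f in model["vocab"]:
            p = likelihood[f]
            score += math.log(p) if f in likes else math.log(1 - p)
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical training data: page Likes paired with a known attribute.
data = [
    ({"page_a", "page_b"}, "group1"),
    ({"page_a", "page_c"}, "group1"),
    ({"page_d", "page_e"}, "group2"),
    ({"page_d", "page_f"}, "group2"),
]
model = train(data)
print(predict(model, {"page_a"}))             # → group1
print(predict(model, {"page_d", "page_e"}))   # → group2
```

The real studies used far larger datasets and more sophisticated models, but the principle is the same: each individual Like carries little signal, yet in aggregate the signals compound into confident predictions.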
That’s all with today’s technology. Imagine that of a decade hence.
The conclusion seems clear. Any individual comment of yours, or tweet, or Like, might be meaningless outside of its context … but collectively, they can be used to determine almost everything about you, including whatever you don’t want other people to know. It’s very possible that you have already, accidentally, revealed your deepest secrets to the entire world.
Now, granted, the world was probably going to find out anyway. Mark Zuckerberg is right when he says we are moving towards a world which is more open and more connected — and therefore less private. In fact, between social media, deep learning, “big data,” and increasingly ubiquitous cameras and other sensors, on drones and embedded virtually everywhere, we’re moving headlong towards a world where privacy is a scarce and expensive commodity. Just as was foretold by the prophet Gibson, lo these many decades ago:
‘Hey, that’s fine by the Finn, Moll. You’re only paying by the second.’
They sealed the door behind him and Molly turned one of the white chairs around and sat on it, chin resting on crossed forearms. ‘We talk now. This is as private as I can afford.’
– William Gibson, Neuromancer
On a personal and cultural level, we could probably just adjust to this. I can envision people a generation or two hence reacting to sex tapes with indifferent shrugs rather than outraged shaming, whether or not celebrities or high-school students are involved. It’ll be heaven for stalkers, but that appears to be part of the social price that tomorrow’s technology demands.
On a professional level, however, things will get messy. What happens when would-be employers run tomorrow’s pattern-recognition analyses on the sum total of all your lifetime online activity to see if you’re a likely job-hopper, or your health is sketchy, or you’re “not a good cultural fit,” or you’re a binge drinker, or you’re pregnant?
Simple: people will inevitably wind up carefully tailoring their online lives to seem professionally desirable — meaning that real sharing, and especially public sharing, will wither away and die, because no one will want to risk inadvertently revealing far more about themselves than they intended. Which is of course the exact opposite of what Facebook and Google want. Facebook’s AI team may boost its ad sales now, until people begin to realize the consequences of that kind of analysis…but in the long run, that team could conceivably be carrying the seeds of Facebook’s demise.