Editor’s note: Tadhg Kelly writes a regular column about all things video game for TechCrunch. He is a games industry consultant, freelance designer and the creator of leading design blog What Games Are. You can follow him on Twitter here.
I seem to be on the wrong side of the Facebook experiment issue. I’m referring to the news which broke recently that Mark Zuckerberg himself (no, not really) conducted an experiment in 2012 to test whether influencing what a user sees in turn affects what they post. If they were shown more negative material, for example, did they become more negative? The answer is yes. The conclusion? Facebook seems able to influence our moods (well, sort of).
The news has resulted in many folks being up in arms. They feel invaded, manipulated, spied-upon or otherwise violated. This always happens with this kind of news though. A large degree of technical ignorance pervades our society which, when occasionally punctured by a meme like this, tends to shoot from the hip. The naturally Orwellian among us foresee terrible futures. The professionally outraged seize their chance to explain why the news is yet another tech industry travesty. Where some data scientists manipulated an algorithm and saw users respond with a couple of grouchier posts than normal, those who fear the worst see the worst and cry havoc.
I’m on the wrong side of all this because I don’t take it seriously. Because this stuff happens all the time, yet the sky conspicuously does not fall on our heads. It’s called metrics, analytics, finding product/market fit and increasing engagement. You and me and everyone we know (in the tech world) are largely in the business of increasing engagement and that means to some degree we are in the manipulation business.
Take, for example, Buzzfeed. From the title through to the word/image count, Buzzfeed articles are constructed to elicit laughs, and thus bring more visits and shares. There are clear formulae being used, and those would have been tested to see what worked.
Take, for another example, Google. Google spends a considerable amount of resources tweaking and shaping its famously secret search algorithm to get better results. Why? Because if users feel Google is better, they will always come back to Google. But do you really think that the company that split-tested 41 shades of blue isn't doing secret user tests all the time to see how the quality of results affects usage?
Take, for a third example, Amazon. Amazon legendarily stores huge amounts of data about its customers and uses that information to try to build a better shopping experience for them. The result is better service, selection and prices than its competitors, which keeps customers coming back. But to do so Amazon has clearly had to experiment with its users along the way to know for sure what the right path is.
Arthur C. Clarke famously said that any sufficiently advanced technology is indistinguishable from magic, and for most of us this is true. We live in an age of wonder in many respects, of devices and services that bring new kinds of joy into our lives, yet with little clue as to how they actually work. Tools that all of us use many times a day are essentially black magic boxes powered by dreams and fairy dust and fueled by phlogiston.
The results can seem numinous. In games numina are the gateway to the thaumatic experience, that unique sensation in interactive entertainment where immersion verges on belief. The numinous quality of technology also fuels the sense of possibility. From Oculus Rift through to smartwatches, we always seem on the verge of the truly limitless. We animatedly discuss what it will be like in five years' time when we're all hooked up to Internet III and traversing the datum planes. We're excitedly looking at glimpses through the veil of ignorance and speculating on what might be.
But equally the results can feel ominous. Trust and powerlessness become big issues. In the surveillance society, especially so. It's all too easy to make the leap of believing that there are entities behind the glass, great and powerful Ozzes manipulating the big head toward darker ends. Programming takes on the air of high sorcery. Silicon Valley becomes a technocrat idyll that must be resisted before it turns the rest of us into automata. The great and powerful Zuck is stealing our emotions with his flying data monkeys…
The difference in this case is that Facebook talked about its research, whereas most of the time such research is confidential. Had it not done so, you might argue, Facebook's veil of ignorance would have remained intact. Rather than opening their Top Stories feed this morning and eyeing it with newfound suspicion, users might have laughed at those latest baby photos that their sisters uploaded, or sent birthday wishes to their friends. And otherwise carried on none the wiser.
It is a little weird to be arguing for obfuscation, but I appear to be. What I'm talking about here is not whether it's right or wrong to be transparent, but rather how knowing affects customer relationships for better or worse. Customers "see" a product in a different light once they've had a peek behind the curtain (or think they have), and often they never view it the same way again. In games particularly it's important to realize that you can be too transparent.
Game companies experiment with their users all the time. They have to. Players are a moving target so developers always have to change their approach, but nevertheless they are still trying to entertain. So game makers are heavily invested in trying to affect players’ emotional state for many reasons. Perhaps the game maker wants to encourage her players to be better versions of themselves, to give them a fun experience or to suck them into her lair and then make them pay to stay. Mostly it’s simply because she wants them to keep playing.
Game companies work hard to get the emotions that they need (whether for good or evil) by studying players closely. They listen, observe, measure, adjust and redeploy. Whether they do so through judging the response to big graphics trailers, data science or even simple playtesting, the player is always a sort of scientific subject in the maker’s psychological lab. But if players become aware of how much they are toyed with, their game is often ruined.
For example, to walk the halls in Gone Home is to walk in another world and to see and feel as the older sister terrified that her sibling has done something to herself. The very fact that most of us don’t know how the game works gives it this power. Gone Home becomes so much less of an experience when you grok that it’s just a level with a few triggers and text assets that don’t really affect your progress.
Games lose their magic when players fully comprehend how they work, robbing them of their mystery. Apparently so do social networks.