Listen Carefully

The way technology shapes language is fascinating. And often colloquially cringeworthy (to wit: ‘I don’t have bandwidth right now’… ‘Let’s take this offline’… ‘I’d totally swipe that’…). At bottom it’s a reflection of how our tools influence and shape our social interactions.

The reverse is also true, of course. Technology platforms end up being moulded by their human users in ways their makers did not envisage or necessarily intend when they built the initial feature set and structure.

Hashtags on Twitter and the Retweet function were essentially user-generated, for instance. The people with early access to an MVP grabbed the baton and ran with it — and that combination of rough-and-ready technology plus interested early adopters resulted in something fascinating and genuinely useful being created. There’s a lesson there in these times of increasingly prescriptive algorithms.

And, yes, as a counterpoint, we have the gaming of social platforms by hate speech groups bent on using the reach of mainstream social media to amplify fringe agendas, and silence the targets of their abuse, via co-ordinated bullying. Even Twitter now thinks it needs to up its game to deal with that kind of corrosive service misuse.

Stepping out from there, there are also parallel lessons to be drawn from how people are talking about the technology services they are interacting with. Not just in general terms — whether they like or dislike a service and so on; but what their more subtle linguistic choices might be expressing on their behalf. Reactions that might be evident in language before users are consciously articulating their feelings as fully formed thoughts.

One fascinating current example involves dating apps, and the use of a particular phrase — “going dark” — to describe what happens when a match stops responding. That is to say, when the signaling between two users stops because one of them stops responding, despite a degree of previously stated mutual interest. (One wonders how Jane Austen might have interrogated such hinged moments.)

The developer behind a machine-learning Tinder bot, which hacks the dating app to automatically filter potential matches based on the user’s prior aesthetic preferences, used the phrase in a blog post about his project, noting that it was this characteristic of dating app interactions that encouraged him to add an additional feature to his bot. This feature auto-starts conversations by sending pre-set messages to test the waters. An automated interest primer, if you will.

Describing the auto-messaging feature, he wrote:

The advantage of this? It removes the time involved in filtering new Tinder matches since a lot of people tend to drop off and “go dark” early in the process. Extended conversation is a strong indicator of interest.
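The heuristic he describes — early silence means a match has “gone dark”, while extended conversation signals interest — could be sketched roughly as below. Everything here (the names, the reply threshold, the pre-set openers) is hypothetical illustration, not the bot’s actual code:

```python
from dataclasses import dataclass, field

# Hypothetical pre-set openers used to "test the waters" with new matches.
OPENERS = [
    "Hey! How's your week going?",
    "Hi there. Coffee or cocktails person?",
]

# Assumed cutoff: this many replies counts as "extended conversation".
REPLY_THRESHOLD = 3

@dataclass
class Match:
    name: str
    replies: list[str] = field(default_factory=list)

def send_opener(match: Match, opener: str) -> str:
    """Stand-in for the bot's messaging step; a real bot would call an API."""
    return f"-> {match.name}: {opener}"

def has_gone_dark(match: Match) -> bool:
    """A match who never replies to the opener has 'gone dark'."""
    return len(match.replies) == 0

def shows_interest(match: Match) -> bool:
    """Extended conversation as a crude proxy for genuine interest."""
    return len(match.replies) >= REPLY_THRESHOLD
```

The point of the sketch is how little the filter has to do: the social labor of rejection is reduced to a reply count.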

It’s a pitch-perfect engineer’s fix for what is actually a far more interesting social problem, one that dating apps are exposing to a greater degree of social scrutiny by increasing the frequency of encounters with this ‘dropping off’ behavior.

‘Going dark’ in fact hints at a kind of social breaking point which apps are stumbling into, at the cutting edge of human emotional interactions. It’s the digital equivalent of a mild snub or social slight — very slight, given it’s not public and is doled out to someone you don’t yet have any strong ties to.

The fact these types of apps make it easier to meet new people means it necessarily follows that their users are going to encounter a higher volume of explicit social rejections than they normally would. And so “going dark” is the phrase they have come up with to describe an antisocial byproduct of using a socially ‘catalyzing’ service — and, perhaps, to take some of the sting out of all those ‘I’m not/no longer interested’ signals.

Since dating apps enable surface connections to be made more quickly, it logically follows that they are also responsible for speedier social disconnections. Hence the need to coin a phrase for this now more commonly experienced behavior — ergo: “going dark”.

Going dark is, in my view, a pretty great descriptor for this dating app foible, since it characterizes other users as so many pixels that can either light up or not, when you ping them, rather than as flesh and blood humans who have stopped responding and are therefore actively snubbing you. Which of course they (probably) are. But the point is it shows users taking a sophisticated approach to what can be very throwaway interactions — on account of these apps making it so easy to engage with randoms.

In essence, the humans are defusing all the subtle disconnections that such a high volume of digital interactions inevitably entails. After all, virtual social snubs aren’t really all that. And digital dating relations are necessarily cheap, given they aren’t (yet) deep.

If there is any cause for concern here, it’s that “going dark” entering the dating app user lexicon may also signify interaction overload. That people are being swamped with so many signals they feel unable to respond, politely or otherwise, to each and every match — and so are resorting to switching off the messaging channel to enable them to move on and keep using the app. Aka they are taking the path of least resistance.

(Silence may also be the most tactful way to say ‘no thanks’ to a prospective date — rather than having to explain at length that while they seem(ed) interesting, their follow-up signals have not struck such an interesting chord. Another example of this sort of subtle social signaling at play in the digital space is the way Twitter users deploy the star feature as a word-free meta-communication method. A way to say something without actually saying anything.)

Through a lens, darkly

The underlying concept of “going dark” also brings to mind the latest Black Mirror episode — Black Mirror being Charlie Brooker’s satirical TV series which imagines how technology tools might warp our social interactions in a not-too-far-flung future. In the holiday special, screened before Christmas, a fictional story unfolds involving people who have had embedded cameras (called Z-Eyes) implanted in their eyes, enabling them to activate advanced augmented reality functions such as filtering out the sight and sound of particular individuals.

When activated, instead of a person they see a fuzzy silhouette, and hear garbled noises rather than actual words. Brooker calls this feature “being blocked”. It’s like an extreme manifestation of going dark, but one that absolutely stings, given it distorts actual human face-to-face interactions. Brooker’s point is that applying digital social management tricks at the point of flesh and blood contact absolutely crosses a line. Or to put it another way, it’s a man-in-the-middle attack on human communications.

Another linguistic signifier for technology that fails to appreciate the nuance of social interaction is the pejorative term ‘Glasshole’ — used to brand, and mark out for social censure, users of Google’s (not at all fictional) face computer, Glass.

Like the fictional Z-Eyes, Glass is a hybrid device which uneasily straddles the digital and physical worlds. The social unease it generates has been obvious, and amply explains why consumers have generally shunned it. (And why Google has been forced into a major rethink.) No one is coining sensitive wording such as “going dark” to describe Glass’ influence on their social lives, because consumers aren’t allowing it into their lives. The problems inherent in the technology are writ large in the language that does, and does not, coalesce around it.

The crux here is developing products that are not isolated from their social context. And the tl;dr? Linguistic alienation can be a symptom of a far greater disconnect.

Listen carefully to the language your product generates and it might, as in the case of all those dating app users flicking the kill switch on matches, flag up potential service stress points. Or even — in the case of Glass and its surface tension of ‘Glassholes’ — illuminate the antisocial crux of your abject failure.