A Journey To Intelligent Design

“Did you win your sword fight?”
“Of course I won the fucking sword fight,” Hiro says. “I’m the greatest sword fighter in the world.”
“And you wrote the software.”
“Yeah. That, too,” Hiro says.
— Neal Stephenson, Snow Crash

If you’re looking for zeitgeist biblical metaphors for the tech sector, you can’t do much better than the Tower of Babel. A hundred years after the flood, human civilization is monolingual and gathered in a city called Babel. Its people become possessed with the mission of building a tower to the heavens. Which they begin…

Until their control-challenged sky God, understanding that humans are capable of anything with one unified language and a will for transcendence, curses them with dialects. (I’m not Christian but am a disciple of narrative.)

The Tower of Babel by Lucas Van Valckenborch

The Lord said, “If as one people speaking the same language they have begun to do this, then nothing they plan to do will be impossible for them. Come, let us go down and confuse their language so they will not understand each other.” So the Lord scattered them from there over all the earth, and they stopped building the city. (Genesis 11:6-8)

This is God as benevolent systems architect-turned-royalist-hacker, dumping toxic malware onto the human social matrix. And we’re still living in it.

Sure, wired humans have been united into a “global village” across shared social platforms in the new millennium. And while we have Twitter and Facebook to thank for the global reach of the Arab Spring, #EricGarner, and the Veronica Mars film crowdfunding campaign, this is not a return to evolutionary Tower-building by an empowered world citizenry. Nor is this technology being developed to unchain us from the holy hack by dissolving linguistic barriers.

Quite the opposite. These platforms exist primarily (post-IPO) to service enterprises, institutions and government agencies that place a premium on our written words.

In the modern paradigm, companies like Facebook, Twitter and Google are the primary beneficiaries of Babel. After all, it’s the machine-driven analytics of our words and their nuanced sentiments  —  which are used to compartmentalize, analyze and ultimately engineer user behaviors  — that make us the product that they sell to advertisers and other paying customers.

In other words, they have a large incentive to maintain the Old Testament status quo because it bestows considerable insight. And power.

Engineering contagion

Last year, a minor outrage erupted when members of Facebook’s Core Data Science Team published the results of a secret study conducted on their users. The study, which manipulated the News Feeds of nearly 700,000 people in order to test the impact of emotions on friend networks, concluded that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”

In total, over 3 million posts were analyzed, containing over 122 million words, 4 million of which were positive (3.6 percent) and 1.8 million negative (1.6 percent).

The network effect of this contagion was studied, naturally, “via text-based computer-mediated communication” and was limited to English-speaking users.

This is all made possible by sentiment analysis: the deployment of algorithms that mine text for meaningful signals about the writer. Now, as many people pointed out after the initial wave of (media-driven) indignation, sentiment analysis is still in its infancy. It’s neither accurate nor effective enough to help artificial intelligence systems learn and predict human behavior.
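
To get a feel for what word-level sentiment mining looks like in practice, here is a toy, lexicon-based scorer in Python. It is a deliberately simplified sketch of the word-rate counting described above, not Facebook’s actual pipeline; the word lists and sample posts are invented purely for illustration.

# Toy lexicon-based sentiment scoring: an illustrative sketch, not a production system.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def sentiment_rates(posts):
    """Return the share of positive and negative words across all posts."""
    total = pos = neg = 0
    for post in posts:
        for raw in post.lower().split():
            word = raw.strip(".,!?")
            total += 1
            if word in POSITIVE:
                pos += 1
            elif word in NEGATIVE:
                neg += 1
    return pos / total, neg / total

sample_posts = [
    "I love this wonderful city",
    "Feeling sad and lonely today",
    "Great game last night, so excited!",
]
pos_rate, neg_rate = sentiment_rates(sample_posts)
print(f"positive words: {pos_rate:.1%}, negative words: {neg_rate:.1%}")

Crude as it is, this kind of word-rate counting is the basic move behind the percentages quoted above; the ambition of the field is to go far beyond it.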

But it won’t be that way for long.

Facebook and Google are leading a new wave of R&D in the field of deep learning, an area of AI research that MIT Technology Review defines as:

“software [that] attempts to mimic the activity in layers of neurons in the neocortex, the wrinkly 80 percent of the brain where thinking occurs. The software learns, in a very real sense, to recognize patterns in digital representations of sounds, images, and other data.”
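
To make the “layers of neurons” idea concrete, here is a minimal from-scratch sketch in Python (nothing that Facebook or Google ships): a two-layer network trained with backpropagation to recognize the XOR pattern, something a single layer of neurons cannot represent. The layer sizes, learning rate and iteration count are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Four input patterns and their XOR labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two stacked layers of "neurons", each with its own weights and biases
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(20000):
    h = sigmoid(X @ W1 + b1)        # layer 1: learned intermediate features
    out = sigmoid(h @ W2 + b2)      # layer 2: prediction built on those features
    # Backpropagate the squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # typically converges toward [[0], [1], [1], [0]]

Real deep-learning systems stack many more layers and train on vastly more data, but the principle in the quote is the same: each layer learns patterns in the output of the layer below it.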

Over the past couple of years, Google has acquired a few high-profile deep learning labs, including one run by Geoffrey Hinton, who is widely considered, to quote WIRED, the “central figure in the deep learning movement.”

Meanwhile, Facebook has created its own AI research lab, led by Yann LeCun, a former postdoc of Hinton’s and the guy who taught computers how to read handwritten digits (early AI tech that is now licensed to banks for ATM check-scanning).

Predictably, there’s been a backlash against the corporatization of these computer scientists. But there’s no point in blaming visionary geniuses like LeCun and Hinton. They’re being funded to code-replicate the human brain, which is a powerful enticement.

The motivation for Silicon Valley’s C-suiters is less… romantic. Whatever their future aspirations, the foundation of Babel’s corporate revenue structure is still built on an old business model: “delivering relevant, cost-effective online advertising,” to quote a recent Google annual report. And whatever technological advances are made in the AI field will remain subordinate to those that advance their shareholders’ cause: profit.

At least, if push came to shove.

Conditional kingdoms

As CEO of the world’s largest social platform, Mark Zuckerberg has one prime directive from his shareholders: Expand your market base.

So was it any surprise that the latest PR narrative developed by his Sun Tzu-wielding strategists was to showcase his command of Mandarin to Chinese millennials? In learning an entirely new language to court his company’s largest unpenetrated market, Zuck proved he’s not only the ultimate citizen of Babel; he’s officially its mayor.

But his is a conditional kingdom. To keep the market happy, Zuck has to mine Facebook for every byte of value it has. Which, in the current social network paradigm of text-based programming, means developing the most aggressive artificial intelligence (deep artificial neural nets) and proxy agents to track and tag our behavioral patterns while accurately projecting, and guiding, future impulses.

Search and social network data scientists succeed by fragmenting and classifying us as individuals. They are the high priests of demographics and psychographics. But when their algorithms use our language to predict and direct our behavior before we, ourselves, have consciously made those choices, they cross the line into the zone of social engineering.

With the accelerated development of computational technology that can process and analyze big data in unprecedented ways, this poses a heavy existential threat to human civilization. Because even if the current social networks have the best intentions for their communities, billions of dollars have been staked on the economies generated by mining and exploiting that data.

And any diversion from those objectives would be ruthlessly punished by the market. This isn’t conspiracy theory, it’s market theory 101.

The tragic end of this story could well be that of a group of brilliant technologists forced into a Faustian deal: one that takes the global communication platform once envisioned as evolutionary tech for our species and hacks it for a myopic, profit-driven end-game.

In short, like Old Testament gods, their baser instincts could lead them to fuck with the natural order of things. Another civilizational hack.

Which brings me back to the Bible.

A prophecy

There’s a little-known corollary to the Tower of Babel story. It comes at the other end of the Bible… thousands of years later, in the New Testament. And it takes the form of a prophecy:

Humankind is again gathered, “all in one place,” when suddenly:

…there came a sound from heaven as of a rushing mighty wind, and it filled all the house where they were sitting. And there appeared unto them cloven tongues like as of fire, and it sat upon each of them. And they were all filled with the Holy Ghost, and began to speak with other tongues, as the Spirit gave them utterance… Now when this was noised abroad, the multitude came together, and were confounded, because that every man heard them speak in his own language. And they were all amazed… (Acts 2:2-4, 6-7)

The Coming of the Holy Spirit at Pentecost by Charles Nicolas Cochin II

Sticking with the zeitgeist Biblical-tech metaphor theme, this is a profound narrative arc that suggests the re-congregation of the divided tribes of Babel on a single (technological) platform. One with the potential to return us to our original vision and mission, of building a Tower out of this place: our self-authored extinction algorithm.

And companies like Facebook, Google and Twitter may well represent that Tower-building potential. But we have to ask the question: does the DNA of a thing constrain its future iterations and utility?

Put less symbolically, if these platforms profit through observing and tracking and engineering our behavior, can they simultaneously initiate in us the kind of “evolutionary moment” which is in some ways oppositional to the interests that they serve? Because the market isn’t interested in kumbaya technologies that liberate us from false desire and competition. You don’t need to be Jerry Seinfeld to know that.

It thrives on, and exists for, those impulses.

I have no doubt that Facebook acquired Oculus for the potential it has for humans to assemble virtually, post-linguistically, across space-time. The only question is whether they’ll neutralize that potential by gating it with Facebook Connect and fortressing it inside the confines of a system-controlled UX. Because evolution is anarchic. And that kind of enclosure is Orwellian.

I don’t have answers to the questions I am musing about here. I do have a lot of ideas about what a platform like Facebook could become if it moved beyond its linguistic limitations and toward a generative object-based system.

But I would like to suggest that the Tower of Babel is not a story of our past, but rather one of our future. The divine kibosh that the “creator” put on our transcendent project wasn’t about killing our desire to be gods. It was about seeding the ultimate narrative: a collective hero’s journey to reconnect as a human community despite the extreme trauma of our separation, and despite those who naturally find themselves in positions to exploit it.

That’s what I call intelligent design.

Editor’s note: Stephen Marshall is the co-founder and product lead for ORA, a Seattle/London-based start-up innovating in the realm of dimensional data visualization, and a portfolio company of the DataElite accelerator.