An Inconvenient Proof

Nobody expects The Spanish Inquisition!

I still remember the first time I watched this Monty Python sketch in my basement, recorded off PBS onto my VCR. As a Christian, I suppose I could have been offended that these guys were mocking some aspect of the church. But it was too funny not to enjoy. And too true. Killing in Christ’s name is the equivalent of binge eating for Gandhi. It’s off brand. It dishonors the words and actions attributed to Jesus. And in my studies of New Testament scripture, I have yet to find any translation of Matthew 28:19 in Latin, Greek or Aramaic that states, “Go and make disciples of all the nations…or kill them.”

I went to college planning to become a minister and majored in history. I fell in love with research, especially of New Testament scripture. Ironically, this fascination initially stemmed from my lack of faith. Like so many individuals clinging to religiosity rather than honest introspection, I felt that if I memorized enough archaeological details I could prove via empirical fact that Christ was the Son of God. And scientifically, there is a great deal of validation for the historicity of many of the books of the New Testament. But in the same way I might want to introduce two good friends to each other in hopes they might fall in love, I’ve come to realize I can’t force the decision for someone to believe in God.

That’s up to them.

Free Will’s A Bitch

An Inconvenient Proof refers to the fact that personalization algorithms proselytize via code. Designed to scrutinize our lives, they’re also programmed to influence our behavior. Created by humans, every algorithm is imbued with the biases, business goals and personal agendas of its makers. This doesn’t make artificial intelligence malevolent. But unless individuals are allowed to control their personal information, the algorithm economy is a data dictatorship. There’s no free will involved when you’re clandestinely tracked and subconsciously manipulated.

Our ID Is Their IP

I used to work as an EVP in a top-10 public relations firm, so I can say the following statement from experience:

No marketing funnel ends in abstinence. 

I was never in a client meeting with a global CMO where someone pointed to a chart and said, “It’s at this point in the customer journey where we leave them alone.” Nope. I remember a major ad buy we once did with a client where men introduced to a new product would be tracked online via ads that appeared wherever they surfed for up to six months until they clicked on our spot.
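The retargeting mechanics described above are simple enough to sketch in a few lines. Here is a minimal, purely hypothetical model of that rule: keep serving the spot until the user clicks, or until six months pass after first exposure. The function name and window constant are illustrative, not any ad platform’s actual API.

```python
from datetime import datetime, timedelta

# Hypothetical retargeting rule: ads follow the user wherever they
# surf until they click, or until six months elapse.
RETARGET_WINDOW = timedelta(days=182)  # roughly six months

def should_retarget(first_seen: datetime, clicked: bool, now: datetime) -> bool:
    """Return True while the user is still inside the retargeting window."""
    if clicked:
        return False  # conversion achieved; stop showing the spot
    return now - first_seen <= RETARGET_WINDOW

# A user first tracked in January, still unclicked in March, keeps
# seeing the ad; by August the window has lapsed.
start = datetime(2016, 1, 1)
print(should_retarget(start, clicked=False, now=datetime(2016, 3, 1)))  # True
print(should_retarget(start, clicked=True, now=datetime(2016, 3, 1)))   # False
print(should_retarget(start, clicked=False, now=datetime(2016, 8, 1)))  # False
```

The point of the sketch is how little logic it takes: one timestamp and one boolean per person, and the funnel never ends in abstinence until the clock or the click says so.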

Fact: Today, our individual personal data is a commodity. We’ve been trained to give our data away, whether in exchange for “free” services or simply out of convenience. But the fact that our data is so easy to utilize is a huge boon for artificial intelligence. Studying human behavior en masse has never been simpler.

Fact: An organization’s data is their intellectual property. And because our data is so freely available, the insights generated from our unique identities are becoming the property of whatever organization created the devices we use.

John Deere recently cemented this precedent by claiming that farmers buying its computer-laden tractors don’t own the vehicles, but instead receive an implied license to operate them for the life of the vehicle. This means any actions farmers take within the tractors can be used as a form of free Research & Development for John Deere to improve its vehicles. While this data will ostensibly be used to improve tractors for everyone’s benefit, farmers aren’t compensated for the monetary value their insights provide.

Now move this model beyond tractors to autonomous cars and companion robots. Throw in ubiquitous corporate facial recognition, unchecked by any federal law regarding the harvesting of personal identification. Myriad personalization algorithms, controlled by organizations we may or may not know, harvest our actions willy-nilly, and our personal data is a commodity we can’t control or even fully access.

Fact: Whatever the noble aspirations of artificial intelligence, the algorithm economy is built on this model of data obfuscation by design.

Our Chief Weapon Is Surprise

Unless individuals are offered personal clouds or methodologies that provide privacy by design, it’s time to recognize that keeping people from controlling their personal data means we remove their ability to control their identity. This goes beyond issues of privacy to a person’s sense of agency and mental well-being. It’s one thing if we’re dealing with a single personalization algorithm, wondering how it’s affecting our opinions and sense of choice. It’s another when we’re confronted by thousands of algorithms, invisible yet influential. Soon we’ll risk losing our sense of subjective truth about who we are because we’ll have so many outside opinions on the subject.

We need an ethical standard of artificial intelligence for the algorithm economy. It’s not Christian, Buddhist, Atheist, technological or Luddite in nature. It’s human. We need to create a technological framework for the exchange of affective (emotion-based) and personal data that allows every individual to determine what data they share, with whom, and for how long. This is the equivalent of a free and open society versus a dictatorship.
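What might that framework look like in practice? One minimal sketch is a consent record: what data, which recipient, until when. Everything below is hypothetical and illustrative; no such standard exists yet, and all names are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical consent record: the individual, not the organization,
# decides what data is shared, with whom, and for how long.
@dataclass
class ConsentRecord:
    subject: str        # the person the data describes
    category: str       # e.g. "location", "affective" (emotion-based)
    recipient: str      # the one organization granted access
    expires: datetime   # consent lapses automatically at this moment

    def permits(self, recipient: str, category: str, now: datetime) -> bool:
        """An organization may use data only under a live, matching grant."""
        return (recipient == self.recipient
                and category == self.category
                and now < self.expires)

# One person grants one company access to their emotion-based data
# for a fixed period; every other use is denied by default.
grant = ConsentRecord("alice", "affective", "ExampleCorp",
                      expires=datetime(2017, 1, 1))
print(grant.permits("ExampleCorp", "affective", datetime(2016, 6, 1)))  # True
print(grant.permits("OtherCorp", "affective", datetime(2016, 6, 1)))    # False
print(grant.permits("ExampleCorp", "affective", datetime(2018, 1, 1)))  # False
```

The design choice that matters is the default: absent an explicit, unexpired grant, the answer is no. That inversion, deny by default rather than harvest by default, is the difference between a free and open society and a data dictatorship.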

This won’t hinder the development of artificial intelligence. Quite the opposite. Obfuscation by design means we eventually don’t need humans in the mix to analyze their data. We’ll already know what they’re going to do. Letting humans retain control over their data means we’ll still be tracked, but we’ll retain the ability and infrastructure to speak our truth.

Our messy, glorious, human truth.

Whether it’s regarding God or Google, free will can’t be forced or controlled to be real. While it may be inconvenient to provably align artificial intelligence to human values, it’s the only way to move forward in good faith.