Crunch Network

The Ultimate Interface Is Your Brain

Editor’s note: Ramez Naam is the author of five books, including the Nexus trilogy of sci-fi novels.

The final frontier of digital technology is integrating into your own brain. DARPA wants to go there. Scientists want to go there. Entrepreneurs want to go there. And increasingly, it looks like it’s possible. You’ve probably read bits and pieces about brain implants and prostheses. Let me give you the big picture.

Arkady flicked the virtual layer back on. Lightning sparkled around the dancers on stage again, electricity flashed from the DJ booth, silver waves crashed onto the beach. A wind that wasn’t real blew against his neck. [Adapted from Crux, book 2 of the Nexus Trilogy.]

Neural implants could accomplish things no external interface could: Virtual and augmented reality with all five senses; augmentation of human memory, attention, and learning speed; even multi-sense telepathy – sharing what we see, hear, touch, and even perhaps what we think and feel with others.

Sound crazy? It is… and it’s not.

Start with motion. In clinical trials today there are brain implants that have given men and women control of robot hands and fingers. DARPA has now used the same technology to put a paralyzed woman in direct mental control of an F-35 simulator. And in animals, the technology has been used in the opposite direction, directly inputting touch into the brain.

Or consider vision. For more than a year now, we’ve had FDA-approved bionic eyes that restore vision via a chip implanted on the retina. More radical technologies have sent vision straight into the brain. And recently, brain scanners have succeeded in deciphering what we’re looking at. (They’d do even better with implants in the brain.)

We’ve been dealing with sound for decades, sending it into the nervous system through cochlear implants. Recently, children born deaf and without an auditory nerve have had sound sent electronically straight into their brains.

Nor are our senses or motion the limit.

In rats, we’ve restored damaged memories via a “hippocampus chip” implanted in the brain. Human trials are starting this year. Now, you say your memory is just fine. Well, in rats, this chip can actually improve memory. And researchers can capture the neural trace of an experience, record it, and play it back any time they want later on. Sounds useful.

In monkeys, we’ve done better, using a brain implant to “boost monkey IQ” in pattern-matching tests.

We’ve even emailed verbal thoughts back and forth from person to person.

Now, let me be clear. All of these systems, for lack of a better word, suck. They’re crude. They’re clunky. They’re low resolution. That is, most fundamentally, because they have such low-bandwidth connections to the human brain. Your brain has roughly 100 billion neurons and 100 trillion neural connections, or synapses. An iPhone 6’s A8 chip has 2 billion transistors. (Though, let’s be clear, a transistor is not anywhere near the complexity of a single synapse in the brain.)

The highest-bandwidth neural interface ever placed into a human brain, on the other hand, had just 256 electrodes. Most don’t even have that.
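To put those figures side by side, here’s a quick back-of-envelope comparison using only the rough numbers quoted above (they’re illustrative orders of magnitude, not precise measurements):

```python
# Back-of-envelope scale comparison: the human brain vs. today's
# best neural interfaces, using the rough figures cited in the text.

neurons = 100e9          # ~100 billion neurons in the human brain
synapses = 100e12        # ~100 trillion synaptic connections
a8_transistors = 2e9     # transistors in an iPhone 6's A8 chip
electrodes = 256         # most electrodes ever implanted in a human brain

# Each neuron averages about a thousand synaptic connections.
print(f"Synapses per neuron:    {synapses / neurons:,.0f}")

# Even a 2-billion-transistor chip is dwarfed by the neuron count --
# and a transistor is far simpler than a synapse.
print(f"Neurons per transistor: {neurons / a8_transistors:,.0f}")

# A 256-electrode interface leaves each electrode "covering"
# hundreds of millions of neurons -- hence the low resolution.
print(f"Neurons per electrode:  {neurons / electrodes:,.0f}")
```

The last ratio is the heart of the bandwidth problem: with hundreds of millions of neurons per electrode, today’s interfaces can only read and write in the crudest strokes.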

The second barrier to brain interfaces is that getting even 256 channels in generally requires invasive brain surgery, with its costs, healing time, and the very real risk that something will go wrong. That’s a huge impediment, making neural interfaces only viable for people who have a huge amount to gain, such as those who’ve been paralyzed or suffered brain damage. This is not yet the iPhone era of brain implants. We’re in the DOS era, if not even further back.

But what if at some point, technology gives us high-bandwidth neural interfaces that can be easily implanted?

The implications of mature neurotechnology are sweeping. Neural interfaces could help tremendously with mental health and neurological disease. Pharmaceuticals enter the brain and then spread out randomly, hitting whatever receptor they work on all across your brain. Neural interfaces, by contrast, can stimulate just one area at a time, can be tuned in real-time, and can carry information out about what’s happening.

We’ve already seen that deep brain stimulators can do amazing things for patients with Parkinson’s. The same technology is on trial for untreatable depression, OCD and anorexia. And we know that stimulating the right centers in the brain can induce sleep or alertness, hunger or satiation, ease or stimulation, as quick as the flip of a switch. Or, if you’re running code, on a schedule. (“Siri, put me to sleep until 7:30, high priority interruptions only. And let’s get hungry for lunch around noon. Turn down the sugar cravings, though.”)

Implants that help repair brain damage are also a gateway to devices that improve brain function. Think about the “hippocampus chip” that repairs the ability of rats to learn. Building such a chip for humans is going to teach us an incredible amount about how human memory functions. And in doing so, we’re likely to gain the ability to improve human memory, to speed the rate at which people can learn things, even to save memories offline and relive them – just as we have for the rat.

That has huge societal implications. Boosting how fast people can learn would accelerate innovation and economic growth around the world. It’d also give humans a new tool to keep up with the job-destroying features of ever-smarter algorithms.

The impact goes deeper than the personal, though. Computing technology started out as number-crunching. These days the biggest impact it has on society is through communication. If neural interfaces mature, we may well see the same.

What if you could directly beam an image from your thoughts onto a computer screen? What if you could directly beam that to another human being? Or, across the Internet, to any of the billions of human beings who might choose to tune into your mind-stream online? What if you could transmit not just images, sounds and the like, but emotions? Intellectual concepts? All of that is likely to eventually be possible, given a high enough bandwidth connection to the brain.

That type of communication would have a huge impact on the pace of innovation, as scientists and engineers could work more fluidly together. And it’s just as likely to have a transformative effect on the public sphere in the same way that email, blogs and Twitter have successively changed public discourse.

Digitizing our thoughts may have some negative consequences, of course.

With our brains online, every concern about privacy, about hacking, about surveillance from the NSA or others, would all be magnified. If thoughts are truly digital, could the right hacker spy on your thoughts? Could law enforcement get a warrant to read your thoughts? Heck, in the current environment, would law enforcement (or the NSA) even need a warrant? Could the right malicious actor even change your thoughts?

The ultimate interface would bring the ultimate new set of vulnerabilities. (Even if those scary scenarios don’t come true, could you imagine what spammers and advertisers would do to an interface of your neurons, if it were the least bit non-secure?)

Everything good and bad about technology would be magnified by implanting it deep in brains. Is the risk of brain-hacking outweighed by the societal benefits of faster, deeper communication, and the ability to augment our own intelligence?

For now, we’re a long way from facing such a choice. In fiction I can turn the neural implant into a silvery vial of nano-particles that you swallow, and which then self-assemble into circuits in your brain. In the real world, clunky electrodes implanted by brain surgery dominate, for now.

That’s changing, though. Researchers across the world, many funded by DARPA, are working to radically improve the interface hardware, boosting the number of neurons it can connect to (and thus making it smoother, higher resolution, and more precise), and making it far easier to implant. They’ve shown recently that carbon nanotubes, a thousand times thinner than current electrodes, have huge advantages for brain interfaces.

They’re working on silk-substrate interfaces that melt into the brain. Researchers at Berkeley wrote a proposal for neural dust. And the former editor of the journal Neuron has pointed out that carbon nanotubes are so slender that a bundle of a million of them could be inserted into the blood stream and steered into the brain, giving us a nearly 10,000-fold increase in neural bandwidth, without any brain surgery at all.

Even so, we’re a long way from having such a device. We don’t actually know how long it’ll take to make the breakthroughs in the hardware to boost precision and remove the need for highly invasive surgery. Maybe it’ll take decades. Maybe it’ll take more than a century, and in that time, direct neural implants will be something that only those with a handicap or brain damage find worth the risk to reward. Or maybe the breakthroughs will come in the next 10 or 20 years, and the world will change faster. DARPA is certainly pushing fast and hard.

Will we be ready? I, for one, am enthusiastic. There’ll be problems. Lots of them. There’ll be policy and privacy and security and civil rights challenges. But just as we see today’s digital technology of Twitter and Facebook and camera-equipped mobile phones boosting freedom around the world, and boosting the ability of people to connect to one another, I think we’ll see much more positive than negative if we ever get to direct neural interfaces. In the meantime, I’ll keep writing novels about them. Just to get us ready.

Featured Image: StudioSmart/Shutterstock