Facebook will assemble an independent Ethical, Legal and Social Implications (ELSI) panel to oversee its development of a direct brain-to-computer typing interface, which it previewed today at its F8 conference. Regina Dugan, head of Facebook’s R&D division Building 8, tells TechCrunch, “It’s early days . . . we’re in the process of forming it right now.”
Meanwhile, much of the work on the brain interface is being conducted by Facebook’s university research partners like UC Berkeley and Johns Hopkins. Facebook’s technical lead on the project, Mark Chevillet, says, “They’re all held to the same standards as the NIH or other government bodies funding their work, so they already are working with institutional review boards at these universities that are ensuring that those standards are met.” Institutional review boards ensure test subjects aren’t being abused and research is being done as safely as possible.
Facebook hopes to use optical neural imaging technology to scan the brain 100 times per second to detect thoughts and turn them into text. Meanwhile, it’s working on “skin-hearing” that could translate sounds into haptic feedback that people can learn to understand like braille. Dugan insists, “None of the work that we do that is related to this will be absent of these kinds of institutional review boards.”
So at least there will be independent ethicists working to minimize the potential for malicious use of Facebook’s brain-reading technology to steal or police people’s thoughts.
During our interview, Dugan showed she was cognizant of people’s concerns, repeating the opening of her keynote earlier today: “I’ve never seen a technology that you developed with great impact that didn’t have unintended consequences that needed to be guardrailed or managed. In any new technology you see a lot of hype talk, some apocalyptic talk and then there’s serious work which is really focused on bringing successful outcomes to bear in a responsible way.”
In the past, she says, safeguards have been able to keep pace with the speed of invention. “In the early days of the Human Genome Project there was a lot of conversation about whether we’d build a super race or whether people would be discriminated against for their genetic conditions and so on,” Dugan explains. “People took that very seriously and were responsible about it, so they formed what was called an ELSI panel . . . By the time that we got the technology available to us, that framework, that contractual, ethical framework had already been built, so that work will be done here too. That work will have to be done.”
In the span of a single week, Facebook went from being criticized for not innovating and merely copying Snapchat, to being accused of using its social network monopoly to squash the innovation of others, to innovating so far into the future that it scares us and conjures dystopian thoughts.
Worryingly, Dugan eventually appeared frustrated in response to my inquiries about how her team thinks about safety precautions for brain interfaces, saying, “The flip side of the question that you’re asking is ‘why invent it at all?’ and I just believe that the optimistic perspective is that on balance, technological advances have really meant good things for the world if they’re handled responsibly.”
Facebook’s domination of social networking and advertising gives it billions in profit per quarter to pour into R&D. But its old “Move fast and break things” philosophy is a lot more frightening when it’s building brain scanners. Hopefully Facebook will prioritize assembling the ELSI ethics board Dugan promised and be as transparent as possible about the development of this exciting-yet-unnerving technology.