An intelligent piece by BuzzFeed’s Mat Honan said that Google’s self-driving car would bring about the “ugly end of driving.” He’s correct in almost every way; cars are “giant, inefficient, planet-and-people-killing death machines,” killing, as he cited, 1.2 million people a year. That figure alone makes removing a whole bunch of drivers from the road a bloody good idea.
Uber’s goal is to remove 1 million of them from New York City with UberPOOL alone (or is it 400,000?). If nobody drove at all and we were all ferried around in magical little boxes, that would be a better future. It may even be the one that comes true: Google originally wanted to do what Tesla has since done and build an autopilot system that takes care of your highway driving (a system I’ve now “driven” a few hundred miles with). Except humans relied on it too much, so Google now plans to go whole hog and make cars that drive themselves the entire way.
Just to be clear, the technology Google originally planned is currently in a vehicle you can purchase, provided you have (or can get a loan for) around $75,000 and the right charging gear.
However, according to a study cited in a great Deadspin article on autonomous driving, 96.2 percent of people polled want to retain the steering wheel, pedals and other controls in their cars. Deadspin (rightly) argues that’s the wrong way to think about autonomous cars, but the number shows that people aren’t quite ready to give up control to a vehicle.
Uber Over Google
While Travis Kalanick would certainly love to replace all those accursed, costly humans, there’s a human connection that I believe makes Uber or Lyft more likely than an automated vehicle to replace someone’s willingness to own a car. That personal connection, the feeling that a real-life human is there to talk to, direct and communicate with (regardless of personality or driving skill), inspires more trust than a machine does.
Every human being reading this has at some point had a computer fail them: a blue screen of death, a driver error. That mental block will, I believe, stymie people’s trust in self-driving cars and their willingness to ride in one. As The New York Times reported, there are serious trust issues, and the first death related to an autonomous vehicle will trash trust in the entire industry. It’s sadly inevitable (and not just because death is certain), given that a car may one day have to choose between hitting an impossible-to-avoid pedestrian and wrecking itself, sacrificing its occupant.
This isn’t to say autonomous cars won’t be successful. Far from it. They could replace cabs entirely, removing a ton of commuter traffic in cities like London and San Francisco, lowering the cost of travel and making everyone’s lives safer. The potential 10 million self-driving cars on the road by 2020 would be a boon to the entire human race, each one a fallible human driver, and therefore a potential accident, removed from the road.
No driving test, however stringent, can rein in the human brain’s wondrous capacity to fuck up; an autonomous vehicle is a glorious, safer future. The Tesla Model S and X’s autosteering doesn’t just remove the monotonous, at times frustrating, elements of highway driving; it also operates as a great failsafe, much like an airplane’s autopilot. If we can make a car that drives better than the average human, that’s going to make our world better, even if a great many people don’t trust it.
In the end, people trust other people to drive them, which is why a study by the Eindhoven University of Technology found that people trust autonomous vehicles that mimic human tendencies (driving style, how the car looks, etc.). While letting my Tesla take the wheel, I found it turned the wheel very smoothly, even when changing lanes, in a way that still resembled how I’d do it, only better. However, my passenger didn’t trust it. She was worried. Maybe she’ll get used to self-driving cars. Maybe she won’t.
As humans, we’re accustomed to someone else controlling a vehicle we’re riding in. The Docklands Light Railway, London’s autonomous railway system built in the late 1980s, has had only two accidents in 10 years (in one, someone tragically fell onto the tracks during an epileptic fit). Crucially, it runs on fixed tracks, and it still has an attendant to watch the train, apply the emergency brakes if necessary and take tickets.
Parts of London’s Underground operate in a similar, if less sensor-based, way, but with drivers who stay at the front, control the doors and watch the train’s movements for potential accidents. It’s scary to completely give up control; even if the vehicle is being driven by a “perfect” computer, that computer was still programmed by an imperfect human. And we’d rather that imperfect human were there to blame, or to react, or to not malfunction (though humans, as you know, do), or to not crash (we do that too), or simply to “do their best.” One human’s failure doesn’t make us distrust every driver; one vehicle’s failure may make us distrust the entire fleet.
That’s why it’s taking so long. Many of us are used to the breakneck speed of the Valley, but you cannot move fast and break things when the things in question could be bones and organs. That’s why it’s silly to complain about the government being slow to move on legislation. The stakes here aren’t a shitty CRM application or an app that crashes; they’re careening, several-ton robots. It’s not impossible to do, and it’s getting closer to being real, but it will require both insane amounts of testing and an inherent change in the psychology of what “driving” means.
And that’s before you get to sites and magazines like Jalopnik (more than 10.1 million unique monthly readers according to Quantcast) and Car and Driver (more than 9 million readers), which heartily prove that there’s a large minority of people who like driving.
In the end, we’re not going to become the fat people from Wall-E because of self-driving cars. People like to drive. People crave control. People love banal tasks done for them, but the transfer to a computer of a task that can threaten lives with one wrong turn will take a long time.
I’m even skeptical that we’re five years from seeing millions of totally autonomous vehicles legally swamping our cities. It’s an exciting idea, a realistic idea, but the scariest idea in the world to rush.