Waymo’s UX challenge is getting people to enjoy the ride

Driverless mobility requires insights into human factors and behavioral psychology

Google has been working on autonomous vehicles — one of the biggest challenges in AI — for more than a decade, but it’s learning that the hardest part might just be getting people to enjoy the ride.

“This is an experience that you can’t really learn from someone else,” Waymo’s Director of Product Saswat Panigrahi told TechCrunch, while explaining the work he oversees around driverless development. “This is truly new.”

The sheer novelty of designing a UX (user experience) for driverless mobility has drawn Waymo away from the hard, science-based technologies where tech giants often feel most comfortable. Instead of data, sensor and neural net development, Waymo finds its driverless development gated by painstaking research into human factors and behavioral psychology. Though the company once made critical decisions precisely to avoid delving into the mysteries of human behavior and interactions, it is finding that such research is an unavoidable challenge on the road to driverless mobility.

“User research has always been a big part of the development process,” said Ryan Powell, the company’s head of UX Research and Design.

In 2012, when the Google Self-Driving Car program was “dogfooding” a highway-only driver assistance system called “AutoPilot,” its in-car cameras found that employees were over-relying on the limited automation in dangerous ways. As a result of videos showing Googlers putting on makeup, using multiple devices and even falling asleep while using the system that they’d been told required constant observation, the decision was made to cancel AutoPilot product plans and focus on fully autonomous driving. “That was a big moment for the user research team because we had a big impact on the work that we were doing at Waymo in terms of making that commitment to Level 4 autonomy,” Powell recalls.

Waymo’s first in-house vehicle design after the decision to go fully driverless, the low-speed and human-control-free “Firefly,” marked the debut of several key features that would be included in all future Waymo UX designs. A “pull over now” button, which ends the ride as safely as possible, as well as a “help line” button that connects the rider with a human-staffed assistance hotline, have become mainstays of Waymo’s current in-vehicle and app-based interface. Developed for the current Chrysler Pacifica fleet with the Waymo One mobility service in mind, this more mature app- and interior screen-based user interface shows how central the UX research team’s work has become to Waymo’s product.

Because of the novelty of driverless vehicles, Waymo couldn’t simply follow the example of other ride-hailing apps, which reduce user engagement to a bare minimum in order to deliver the lowest possible friction for users. To reassure riders of the driverless system’s capabilities, Waymo wanted to highlight the huge amount of sensor data that feeds its algorithms without overwhelming riders with a firehose of information.

The compromise they landed on was a simplified rendering of sensor data that shows higher levels of detail in a regular pulse, and brings forward the most important elements of the driving scene to the rider’s attention. Things like the status of the traffic light a vehicle is stopped at, or the presence of pedestrians (a relative rarity in the Phoenix suburbs) and construction barriers, are presented in more eye-catching ways to show riders that the self-driving system is aware of not just their presence, but their importance as well.

As the UX team began including research from driverless rides, they found that showing not just what the car could see but explaining its decisions became increasingly important. “At least on the first couple of driverless rides, there is definitely a heightened rider engagement,” says Naomi Guthrie, a Waymo UX researcher. Without the automatic correlation between conditions outside the car and the actions a human driver takes, it becomes increasingly important to highlight the conditions and actors that determine changes in the vehicle’s behavior. “We try to think of these moments where someone might look up from what they’re doing and make sure we’re showing the information people want to have when they look up,” Powell says.

The in-vehicle user interface is just one of the elements of Waymo’s driverless mobility service that requires intensely human-focused study. Understanding the best pick-up and drop-off zones for different locations, situations and even temperatures opens another Pandora’s box of complexity. Then, there’s the leading use of Waymo’s help line: how to seamlessly reroute a trip on the fly, something even the most slovenly ride-hailing driver can usually accomplish with only a few words.

There’s more than a hint of irony in the fact that Waymo avoided any kind of driver assistance program rather than delve into the complexities of human behavior, only to find that full autonomy required at least as much insight into the mind and society to become commercially viable.

Though they all emphasized that the plan is to roll out more and more features to their vehicles over time (for example, Google Play Music is currently the only entertainment option), Waymo’s product experts say there are no immediate plans to try to solve these problems purely with technology like a Google voice assistant. For the moment, at least, self-driving cars seem to be a surprisingly human-focused business.