The most serious crash to date involving a self-driving truck might have resulted in only moderate injuries, but it exposed how unprepared local government and law enforcement are to deal with the new technology.
On May 5, a Class 8 Waymo Via truck operating in autonomous mode with a human safety operator behind the wheel was hauling a trailer northbound on Interstate 45 toward Dallas, Texas. At 3:11 p.m., just outside Ennis, the modified Peterbilt was traveling in the far right lane when a passing truck and trailer combo entered its lane.
The driver of the Waymo Via truck told police that the other semi truck continued to maneuver into the lane, forcing Waymo’s truck and trailer off the roadway. She was later taken to a hospital for injuries that Waymo described in its report to the National Highway Traffic Safety Administration as “moderate.” The other truck drove off without stopping.
While Waymo’s autonomous semi truck was not at fault in the hit-and-run, the incident highlights gaps in reporting mechanisms and raises questions about how ready the public and law enforcement are to cope with heavy, fast-moving vehicles that have no human driver.
The stakes for the autonomous trucking industry, which is still in its infancy, couldn’t be any higher. One crash, even if the company is not at fault, could tarnish public perception of the technology.
Waymo’s trucking origins
Waymo started testing its driverless technology with semi trucks in 2017, beginning in California and Arizona. At the time, it was in the middle of an epic legal battle with Uber over technology allegedly taken from Waymo by engineer Anthony Levandowski, and subsequently purchased by Uber as part of self-driving truck startup Otto.
Waymo’s self-driving trucks, which are part of a delivery and logistics division the company calls Waymo Via, rely on the same core technologies as its robotaxis: a suite of sensors, including cameras, radars and lidars, and powerful on-board computers. All have qualified truck drivers — known as autonomous specialists — in the driver’s seat.
In 2018, Waymo began hauling freight in Georgia, and it branded its delivery business Waymo Via in 2020. It then expanded into New Mexico and Texas, and inked deals with logistics companies like J.B. Hunt, UPS and C.H. Robinson. Earlier this month, it committed to a long-term strategic partnership with Uber and announced a pilot delivery program with home goods e-tailer Wayfair.
That pilot is due to start in July on the same stretch of I-45 highway where the May crash occurred.
Inside the crash
Using reports from local police and the Department of Transportation, and data supplied by Waymo to NHTSA, TechCrunch has attempted to reconstruct the worst self-driving truck crash on U.S. roads to date.
According to Waymo, the Peterbilt 579 truck was not carrying freight for any customers or partners; it was conducting “standard” testing with a weighted load.
Behind the wheel was a 40-year-old autonomous specialist with a decade of truck driving experience; there was also a software operator on board. Like many workers in Waymo vehicles, both were actually employed by Transdev, a multinational transit and mobility company.
Although the ultimate aim of automated trucks is to eliminate, or at least greatly reduce, staffing costs, self-driving truck startups today operate with a safety driver and an engineer or technician on board.
Waymo reported that its truck was driving in autonomous mode at 62 miles per hour, slightly below the speed limit, when the other truck entered its lane and forced it off the road.
Waymo told TechCrunch that the safety operator did not take control of the truck from its autonomous system.
“The technology was not a factor, as this collision was caused by a human driver of another vehicle when they crossed the lane line and collided with the cab of Waymo’s vehicle and continued driving,” spokesperson Katherine Barna wrote in an email.
Ennis PD photos, obtained under public records laws, show the Waymo truck and trailer by the side of the highway. They appear to have been prevented from sliding onto a parallel suburban road by a crash barrier. An Ennis police officer noted the truck itself sustained only minor damage: one picture shows damage to the truck’s lidar laser-ranging sensor.
The driver, however, was taken to a nearby hospital with unspecified but moderate injuries. The attending officer classified the incident as a hit-and-run. Waymo told TechCrunch that it understands the driver is doing well following her injuries. The driver did not respond to a request from TechCrunch for comment.
Because the automated system was active during at least some of the 30 seconds preceding the collision, Waymo was required to report the crash to NHTSA under the agency’s Standing General Order on Crash Reporting for automated vehicles.
Gaps in the system
There are no checkboxes on a Texas Department of Transportation crash report to record whether the vehicles involved are operating with full or partial automation, and that information was not recorded in the narrative section of the report on the Waymo crash.
Ennis PD Detective Paul Asby, who later investigated the incident, told TechCrunch that he did not know the truck was operating autonomously at the time of the collision.
At the hospital, the Waymo driver told police the hit-and-run vehicle belonged to Helwig Trucking, a local carrier with about 15 trucks. (Waymo also confirmed that the truck’s cameras captured enough details to identify the other vehicle.) Helwig did not respond to a request for comment.
The driver left her phone number with the police and was released from the hospital, and the Waymo truck was towed away. Waymo also provided a contact number to the police. Detective Asby was assigned to the case, and quickly established that the crash was the fault of the Helwig driver. He contacted the company to get its side of the story, and its insurance details. But when it came to Waymo, Asby met a wall of silence.
“I was going to speak to the driver because she was taken to the hospital but I’ve tried to contact her cell phone and it says it’s not a valid number,” he said. “The same thing for the passenger who was in there with her.”
Subsequent calls to Waymo itself went unanswered. “They never did return my calls. I inactivated the case, but the insurance information is in there if they want it,” he said. “Maybe they’re so rich they don’t care.”
Waymo told TechCrunch that it is not aware of any attempt by Ennis PD to contact it for information, and that it did not have any need to contact the department itself.
How it’s going
The Ennis crash is not the only one to have involved a Waymo semi truck. In February, a similar Waymo Peterbilt 579 traveling southbound on Interstate 10 near Sacaton, Arizona, was struck by a box truck in the adjacent lane that had just hit a motor coach. The Waymo vehicle was traveling at 50 mph in a 75 mph zone. TechCrunch was not immediately able to source a police report detailing the crash; there were no reported injuries.
If Waymo had not been required to report the crashes to NHTSA, there is a chance they might never have come to light. The official crash reports gathered by Texas, which has welcomed multiple self-driving truck operations to its highways, appear insufficient to fully record incidents involving driverless vehicles. Local law enforcement has historically been similarly ill-equipped to deal with driving systems instead of driving humans.
Waymo is trying to close those gaps, says Barna. “Waymo has built the Waymo Driver to interact with First Responders; and has worked closely with public safety officials to ensure the safe introduction of our technology in every market that we operate in,” she told TechCrunch. “We have a team with decades of law enforcement experience that has provided training to hundreds of officers and firefighters in California, Arizona and Texas detailing best practices for safe interactions with Waymo vehicles.”
“We’ve got a mountain of work to do integrating these things into society,” said Steve Viscelli, a sociologist at the University of Pennsylvania who studies trucking, and acts as an advisor to Aurora’s self-driving truck effort. “We need to talk a lot more about what they mean for supply chains, for workers and for the highway. There are a lot of people who are going to do stupid and aggressive stuff around them because they don’t like self-driving vehicles.”
Waymo has told the U.S. Department of Transportation that it has 47 trucks, which have driven more than 1.6 million miles. It would not disclose to TechCrunch how many of those miles were driven under some level of automated control.
Automated trucking companies have “got the basic driving stuff down,” says Viscelli. “It’s what happens with the family on vacation and the tire’s off, or when there’s construction that changes the shape of the road, or debris on the highway. It’s when you have confidence in those issues that’s going to determine when they’re on the road. But I would not be surprised to see trucks without drivers on lanes next year.”
Updated: TechCrunch updated the article to reflect that Waymo also provided a contact number to the police.