Cameras help autonomous vehicles read street signs and traffic-light colors. But LiDAR (light detection and ranging) systems do the important work of sensing obstacles and helping cars avoid them, whether that’s a fallen tree, a drunk driver, or a child running into the road.
Now a startup called Luminar Technologies Inc. is unveiling a high-resolution LiDAR sensor that was five years in the making. The startup, which has raised $36 million in seed-stage funding so far, built its LiDAR systems from scratch: rather than incorporating off-the-shelf components, the company engineered its own lasers, receivers, chips, packaging and more.
Investors in Luminar include Canvas Ventures, GVA Capital, and the Peter Thiel-backed 1517 Fund. According to Luminar’s CEO and cofounder, Austin Russell, the company’s LiDARs give cars the ability to see obstacles ahead in much greater detail, and at much greater distances, than any other system on the market today. The LiDARs also work in inclement weather, including fog and dust. The company has not yet disclosed the price it will charge per sensor.
Predecessors and competitors offering LiDAR for self-driving vehicles include Quanergy, Velodyne, and Alphabet-owned Waymo, among others. At twenty-two years old, Russell is confident his technology has them all beat in terms of performance.
At the age of two, Russell had already memorized the periodic table; by twelve, he had filed his first patent; and, perhaps predictably, he later dropped out of Stanford with a $100,000 Thiel Fellowship.
He said, “My role models were primarily in physics and engineering. I admire the classics like Newton, Tesla and Einstein. They all came up with a new way of thinking about a problem in order to be able to advance an entire field, rather than making incremental improvements to some specific technology.”
The problem most LiDAR makers are trying to solve, Russell points out, is one of affordability. Many auto-grade LiDARs still cost tens of thousands of dollars. Instead of simply making easier-to-manufacture, cheaper versions of existing tech, Luminar sought to build a higher-performing sensor, even if that meant using exotic materials and operating at a different wavelength than industry standards.
Luminar invited TechCrunch to see a demonstration of its systems at San Francisco’s Pier 35, a terminal with a long straightaway. There, employees had arranged mannequins of different heights dressed in different colors, as well as life-sized decoy deer, car tires, and signs marking the distance every 25 meters or so. At a distance of 200 meters, Luminar placed a large canvas painted the same hue as a common black car.
The company had purchased vehicles straight from BMW, Mercedes-Benz, Tesla and other dealerships, equipped them with its own LiDAR sensors, and was driving those cars around Pier 35. Monitors in the cars displayed maps of the world around the moving vehicle, drawn in real-time from the LiDAR data.
The Luminar system, operated by CTO and cofounder Jason Eichenholz, clearly rendered a bicyclist weaving in and out of the road at 100 meters and beyond; a small pigeon that suddenly scurried about 40 meters in front of the car; the human forms of the mannequins, even those dressed in dark garb; and the black-painted canvas at the end of the pier.
By contrast, data streaming in from other LiDAR makers would show only a few dots or a meager line indicating that an object lay ahead. Other systems could not identify objects, or even detect dark walls and the mannequins dressed in darker clothes. And earlier LiDAR sensors only showed what was happening within about a 35- to 50-meter range.
If you’re not familiar with how LiDAR works, TechCrunch has a handy primer here. But at the highest level, LiDAR sensors send out beams of light, which bounce back and generate data points that build a picture of the car’s surroundings and of the distance and density of objects ahead.
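The distance measurement described above comes down to timing those bounces. Here is a minimal sketch of the time-of-flight principle, purely illustrative and not Luminar’s implementation: a pulse travels out and back at the speed of light, so the one-way distance is half the round trip.

```python
# Minimal sketch of time-of-flight ranging, the principle behind LiDAR
# distance measurement (illustrative only; not Luminar's implementation).

C = 299_792_458.0  # speed of light, in meters per second


def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target from a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the round-trip time multiplied by the speed of light.
    """
    return C * round_trip_seconds / 2.0


# A pulse returning after ~1.33 microseconds hit something ~200 m away --
# about the range of the black canvas in Luminar's demo.
print(round(distance_from_round_trip(1.334e-6)))  # → 200
```

Repeating this measurement millions of times per second across a scanned field of view is what produces the dense point clouds shown on the demo cars’ monitors.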
Luminar’s LiDARs send out millions of laser pulses per second. They give the vehicle a 120-degree range of view. They also allow optical zoom, or more concentrated analysis of any obstacles near a car.
Russell said, “If you notice, a lot of self-driving vehicles are limited to operating at or below 25 miles per hour. These are things like delivery robots, or robots used in warehouses. We can see objects at 200 meters, while driving 75 miles per hour. That means you have about 7 seconds of time for your car to react. Other lidars only give one second of reaction time.”
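The reaction-time claim is a straight distance-over-speed calculation, simplified here in that it ignores sensor processing latency. Run with the figures Russell cites, it lands at roughly six seconds, in the same ballpark as his estimate:

```python
# Sketch of the reaction-time arithmetic: time available to react
# equals detection range divided by vehicle speed.


def reaction_time_s(range_m: float, speed_mph: float) -> float:
    """Seconds available to react, given detection range and speed."""
    speed_m_per_s = speed_mph * 1609.344 / 3600  # miles/hour -> meters/second
    return range_m / speed_m_per_s


# 200 m of detection range at 75 mph:
print(round(reaction_time_s(200, 75), 1))  # → 6.0
```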
Luminar investor Rebecca Lynn, a partner at Canvas Ventures, said that now that the company is putting its systems on the market, she expects Luminar to become a major strategic player in automotive. Indeed, the CEO said car companies have already offered to buy up, at any price, all the LiDARs that Luminar plans to produce in its first run, starting this year.
The startup is using some of its significant seed funding to build a 50,000-square-foot factory in Orlando, Fla. Its headquarters office, however, is in Portola Valley, Calif.
While some argue that self-driving cars will be able to rely on cameras alone for perception, Lynn said: “LiDAR will be able to save more lives than any technology being developed today, from genomics to AI. There are many things that will make our lives easier or better. But looking at tech that can truly save lives, it is autonomous driving. And you cannot have fully safe autonomous driving without LiDAR.”