With Intel Mobileye’s newest chip, automakers can bring automated driving to cars

Intel subsidiary Mobileye plans to bring a new supercomputer to market designed to give passenger cars, trucks and SUVs autonomous driving powers.

On Tuesday at the 2022 CES tech trade show, the company introduced a new system-on-chip called EyeQ Ultra that is purpose-built for autonomous driving. The company said the first silicon for the EyeQ Ultra SoC, which is capable of 176 trillion operations per second (TOPS), is expected at the end of 2023, with full automotive-grade production in 2025.

The company also introduced at CES its next-generation EyeQ system-on-chip for advanced driver-assistance systems called EyeQ6L and EyeQ6H. The EyeQ6L is designed to support so-called Level 2 ADAS and is expected to reach the start of production by mid-2023. The EyeQ6H, which won’t go into production until 2024, will support ADAS or partial autonomous vehicle capabilities. This higher-performing chip will be able to provide all advanced driving assistance functions, multi-camera processing (including parking cameras), and will host third-party apps such as parking visualization and driver monitoring.

Mobileye is perhaps best known for supplying automakers with the computer vision technology that powers advanced driver assistance systems. Its first EyeQ chip launched in 2004 and was used in vehicles to prevent collisions. It’s been a booming business for Mobileye, which shipped its 100 millionth EyeQ SoC late last year.

In recent years, the company has pursued what seemed like a dual strategy of supplying automakers with the chips they need for advanced driver assistance systems while it developed and tested its own autonomous vehicle technology. In 2018, Mobileye even expanded its focus beyond being a mere supplier to becoming a robotaxi operator.

Those two paths are now converging, fulfilling a longtime strategy of Mobileye president and CEO Amnon Shashua, who describes consumer AVs as the “end game for the industry.”

Mobileye has been developing automated vehicle technology for several years now. Its full self-driving stack — which includes redundant sensing subsystems based on camera, radar and lidar technology — is combined with its REM mapping system and a rules-based Responsibility-Sensitive Safety (RSS) driving policy.

Mobileye’s REM mapping system crowdsources data by tapping into consumer and fleet vehicles equipped with its EyeQ4, or fourth-generation system-on-chip, to build high-definition maps that can support both ADAS and autonomous driving systems. That data is not video or images but compressed text amounting to about 10 kilobits per kilometer. The mapping technology, which has informed the development of the new EyeQ Ultra chip, is accessed via the cloud to provide, in real time, up-to-date information on the drivable paths ahead.

Mobileye has agreements with six OEMs, including BMW, Nissan and Volkswagen, to collect that data from vehicles equipped with the EyeQ4 chip, which powers their advanced driver assistance systems. On fleet vehicles, Mobileye collects data through an aftermarket product it sells to commercial operators. Today, more than 1 million vehicles are harvesting REM data, now at a rate of over 25 million kilometers per day, according to Mobileye.
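Taken together, the company’s own figures (roughly 10 kilobits of compressed map data per kilometer, and over 25 million kilometers driven per day) imply a surprisingly modest daily upload volume across the whole fleet. A rough sketch, assuming “kilobit” here means 1,000 bits:

```python
# Back-of-envelope estimate of daily REM data volume, from the figures above.
KBITS_PER_KM = 10          # ~10 kilobits of compressed map data per kilometer driven
KM_PER_DAY = 25_000_000    # over 25 million kilometers harvested per day

bits_per_day = KM_PER_DAY * KBITS_PER_KM * 1_000  # kilobit assumed to be 1,000 bits
gigabytes_per_day = bits_per_day / 8 / 1e9        # 8 bits per byte, decimal gigabytes

print(f"{gigabytes_per_day:.2f} GB/day")  # → 31.25 GB/day
```

That is on the order of tens of gigabytes per day for the entire million-vehicle fleet, which helps explain why Mobileye sends compressed text rather than raw video or images.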

The EyeQ Ultra builds on previous generations of Mobileye’s SoC architecture, cramming the processing power of 10 EyeQ5s into a single package, according to the company. The company said the EyeQ Ultra, which is engineered with Mobileye software, is paired with additional CPU cores, ISPs and GPUs, and is capable of processing input from two sensing subsystems (one camera-only, the other combining radar and lidar) as well as the vehicle’s central computing system, the high-definition REM map and the RSS driving policy software.

Automakers keen on selling consumers cars, trucks and SUVs capable of autonomous driving would theoretically use this yet-to-be-available chip to execute that goal. The EyeQ Ultra doesn’t include sensors like cameras, radar and lidar; instead, it processes all of the incoming information from them. It’s up to the automaker customer to decide exactly how the EyeQ Ultra chip might be used. For instance, one automaker might offer new vehicles capable of autonomous driving only on highways, while another might focus on automation in urban areas.