This is how a self-driving car ‘sees’ the road

Ever wonder what an autonomous vehicle ‘sees’ via its sensors, on-board computing and sensor fusion system? This video from Civil Maps, the high-definition maps technology company backed by Ford, reveals some of how the company combines detailed 3D maps with sensor data gathered from LiDAR, optical cameras, radar and other on-board hardware to take stock of the world around an autonomous vehicle.

Civil Maps Product Manager Anuj Gupta explains how its technology localizes a car in six degrees of freedom (a term you may be familiar with if you follow the virtual reality industry): along the x, y and z translation axes as well as the roll, pitch and yaw rotational axes. This helps a car focus its sensors exactly where they need to be paying closest attention on the road at any given moment.
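To make the idea concrete, here is a minimal sketch of what a six-degrees-of-freedom pose looks like in code: three translation values (x, y, z) plus three rotation angles (roll, pitch, yaw), which together let you map a point seen in the vehicle's frame into the map's frame. The function names and angle convention below are illustrative assumptions, not Civil Maps' actual implementation.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Build a 3x3 rotation matrix from roll (x), pitch (y), yaw (z) in radians.

    Uses the common Z-Y-X (yaw-pitch-roll) convention: R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    """
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def to_world(pose, point):
    """Transform a vehicle-frame point into the map frame: p_world = R @ p + t."""
    x, y, z, roll, pitch, yaw = pose  # the six degrees of freedom
    R = rotation_matrix(roll, pitch, yaw)
    return tuple(
        R[i][0] * point[0] + R[i][1] * point[1] + R[i][2] * point[2] + (x, y, z)[i]
        for i in range(3)
    )

# A vehicle 10 m east of the map origin, turned 90 degrees (yaw only):
pose = (10.0, 0.0, 0.0, 0.0, 0.0, math.pi / 2)
# A LiDAR return 5 m straight ahead of the vehicle lands 5 m "north" of it on the map.
print(to_world(pose, (5.0, 0.0, 0.0)))
```

Once a car knows its pose this precisely, it can project the map's annotations (lane edges, signs, signals) into its sensors' fields of view and concentrate processing there.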

This saves computational power and reduces sensor load, which is huge when you're an automaker trying to balance the cost of producing an autonomous driving system against putting something safe and effective on the road.

Of course, Civil Maps is trying to prove the usefulness of its product, which it wants to sell to autonomous technology companies for obvious reasons. The proof is in the pudding, however, and that's why the team shot the video below, which shows the localization and mapping tech in use in a test vehicle driving at speeds up to 70 mph on a major highway in Michigan.