When it comes to self-driving cars, the general axiom for sensors is “the more the merrier.” The safest systems are the ones that use a multiplicity of sensors, such as cameras, radar, ultrasonic sensors, and LIDAR. That redundancy is the whole point: if one sensor fails, the rest of the suite can help navigate the car to safety.
Mobileye, a company that specializes in chips for vision-based autonomous vehicles, believes in redundancy, but it also believes in the power of its camera-based system. At the Consumer Electronics Show in Las Vegas this week, the Intel-owned company demonstrated how one of its autonomous test vehicles navigated the complex streets of Jerusalem using cameras only.
The vehicle’s sensor suite includes 12 cameras... and that’s it. No radar, no ultrasonic sensors, and no LIDAR. LIDAR, which stands for light detection and ranging, is a laser-based sensor that most tech and car companies see as an essential component of self-driving cars. The sensors are typically mounted on the roofs, sides, and grilles of autonomous vehicles, where they send out thousands of laser points to map the surrounding environment.
But for Mobileye, it’s all about the cameras. In the video, a Mobileye employee says that the two-dimensional information coming from the cameras is extracted into a 3D model of the environment using “a chain of algorithmic redundancies based on multiple computer vision engines and deep networks.”
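Mobileye hasn’t published the details of that pipeline, but the underlying idea of recovering 3D structure from flat camera images is well established. As a rough illustration only, not Mobileye’s method, here is a minimal stereo-depth sketch in Python using OpenCV; the image file names, focal length, and camera baseline are assumed placeholders.

```python
# Minimal sketch: estimating per-pixel depth from two 2D camera views
# via stereo disparity. This is a textbook technique, not Mobileye's
# proprietary system. File names and camera parameters are hypothetical.
import cv2
import numpy as np

# Hypothetical rectified frames from two forward-facing cameras.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds, for each pixel, the horizontal shift
# (disparity) between the two views; nearer objects shift more.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# OpenCV returns fixed-point disparities scaled by 16.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Similar triangles give depth = focal_length * baseline / disparity.
focal_px = 700.0    # assumed focal length, in pixels
baseline_m = 0.3    # assumed spacing between the two cameras, in meters
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]
```

A production system layers far more on top of this, which is presumably where Mobileye’s “multiple computer vision engines and deep networks” come in, but the sketch shows why two flat images are enough, in principle, to measure distance without lasers.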