Intel/Mobileye Announce Advanced Sensor Technology
Jan. 20, 2021—Automakers and technology companies are racing to develop the latest autonomous vehicles, but Intel and Mobileye have taken the path less traveled, improving existing technologies and offering a sustainable safety framework for the industry.
During the Consumer Technology Association’s 2021 Consumer Electronics Show, Amnon Shashua, senior vice president of Intel and chief executive officer of Mobileye, discussed the connections between advanced driver assistance systems and the autonomous technologies that power them.
The presentation, which was recorded from a Mobileye garage lab in Israel, outlined the companies' future goals as they advance autonomous vehicle technology with respect to Mobileye's Responsibility-Sensitive Safety (RSS) framework.
Shashua announced that Intel and Mobileye have developed new sensors for autonomous vehicles in an effort to bring down system costs and make autonomous technology more affordable.
When creating the sensors, Shashua said Mobileye started with camera sensors only.
“Don’t combine all the sensors at the beginning,” Shashua warned. “Solve the difficult problem using an end-to-end camera system, then add radar and LiDAR and so on.”
Shashua also noted that the company is now developing radar systems that are software-defined, as opposed to analog.
SlashGear reports, "A so-called LIDAR SoC will put active and passive laser elements onto a silicon chip, with the resulting 'photonic integrated circuit' having 184 vertical lines of scanning, moved through optics."
Shashua said the new LiDAR system-on-chip could appear in Mobileye's autonomous vehicles as soon as 2025. He noted that the company believes by 2025, "one front-facing LiDAR would be sufficient for a level 4 autonomous vehicle."
Responsibility-Sensitive Safety Framework
During the final half of the presentation, Shashua credited Intel's Responsibility-Sensitive Safety framework, which he called one of the organization's crown jewels.
"Regulators will not approve autonomous vehicles if they have the same crash rate as human drivers," Shashua said. "What is an acceptable failure rate for a system?"
The RSS answers this question by mathematically defining "what it means to be careful," Shashua said. Humans follow implicit rules that they understand through body language and common sense, whereas the RSS defines what those implicit rules should look like for autonomous vehicles.
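One concrete example of that mathematical definition is the minimum safe longitudinal following distance from the published RSS paper (Shalev-Shwartz, Shashua, and Shammah, 2017). The sketch below is an illustration of that formula only, not Mobileye's implementation; the parameter values (response time, braking and acceleration bounds) are placeholder assumptions, since real values are set per vehicle and jurisdiction.

```python
def rss_safe_longitudinal_distance(
    v_rear: float,                # rear (following) vehicle speed, m/s
    v_front: float,               # front (lead) vehicle speed, m/s
    response_time: float = 0.5,   # rear vehicle response time rho, s (assumed)
    a_max_accel: float = 3.0,     # worst-case acceleration during response, m/s^2 (assumed)
    a_min_brake: float = 4.0,     # rear vehicle's guaranteed minimum braking, m/s^2 (assumed)
    a_max_brake: float = 8.0,     # front vehicle's worst-case braking, m/s^2 (assumed)
) -> float:
    """Distance the rear car must keep so that, even if the front car brakes
    as hard as possible, the rear car can still stop without a collision."""
    rho = response_time
    # Worst case: the rear car keeps accelerating for rho seconds, then brakes.
    v_after_response = v_rear + rho * a_max_accel
    d = (
        v_rear * rho
        + 0.5 * a_max_accel * rho ** 2
        + v_after_response ** 2 / (2 * a_min_brake)
        - v_front ** 2 / (2 * a_max_brake)
    )
    return max(d, 0.0)  # the paper clips negative values to zero
```

With both cars at 20 m/s (about 45 mph) under these assumed parameters, the formula yields a required gap of roughly 43 m, which shows how the "careful distance" Shashua describes becomes a checkable number rather than a judgment call.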
"Safety should not be a secret sauce," said Shashua. That is why Intel has deliberately made the RSS openly available to anyone in the industry.