Mentor in Robo-Car Race with Mobileye, Nvidia (EETimes)

As more automakers start integrating different sensors into ADAS/autonomous cars, they often justify the decision to apply sensor fusion as “critical to the safety” of highly automated driving.
Often left unsaid, though, are details on the data — raw or processed — they are using and the challenges they face in fusing different types of sensory data. As Ian Riches, director of the global automotive practice at Strategy Analytics, confirmed, “Sensor fusion today is not done on the raw sensor data. Each sensor typically has its own local processing.”
Mentor Graphics Corp. will come to SAE World Congress in Detroit this week to demonstrate how “raw data fusion” in real time from a variety of modalities – radar, lidar, vision, ultrasound, etc. – can provide “dramatic improvements in sensing accuracy and overall system efficiency.”
Mentor is rolling out an automated driving platform called DRS360, designed to “directly transmit unfiltered information from all system sensors to a central processing unit, where raw sensor data is fused in real time at all levels,” the company said.
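To make the contrast concrete, here is a minimal numerical sketch of the two architectures the article describes: conventional object-level fusion, where each sensor does its own local processing and only the per-sensor estimates are combined, versus centralized raw-data fusion, where all raw samples reach one processing unit that can weight them by sensor noise. Everything here — the sensor models, noise figures, and averaging scheme — is an illustrative assumption, not Mentor's actual DRS360 algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
true_pos = 10.0  # true distance to an obstacle, in metres (made-up scenario)

# Two sensors observe the same obstacle with different (assumed) noise levels.
radar_samples = true_pos + rng.normal(0.0, 0.5, size=100)  # radar: noisier range
lidar_samples = true_pos + rng.normal(0.0, 0.1, size=100)  # lidar: precise range

# Object-level fusion: each sensor processes locally, then only the
# per-sensor estimates are combined at the fusion stage.
radar_est = radar_samples.mean()
lidar_est = lidar_samples.mean()
object_level = (radar_est + lidar_est) / 2.0

# Raw-data fusion: all raw samples reach a central unit, which can
# weight each sample by its sensor's noise (inverse-variance weighting).
samples = np.concatenate([radar_samples, lidar_samples])
weights = np.concatenate([np.full(100, 1 / 0.5**2), np.full(100, 1 / 0.1**2)])
raw_level = np.average(samples, weights=weights)

print(f"object-level estimate: {object_level:.3f} m")
print(f"raw-data estimate:     {raw_level:.3f} m")
```

The point of the sketch is architectural, not numerical: once local processing discards the raw samples, the central fuser can no longer weight evidence by per-sample quality, which is the kind of information-loss argument the article attributes to raw-data fusion.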

04/04/2017
