Sensor Fusion

Real Time Embedded Data Fusion for Automated Vehicles

Highly automated and connected vehicles of the future will require safe, robust and ultra-efficient computing systems. In today's cars, these are called Advanced Driver Assistance Systems (ADAS) and are essentially closed boxes, each with a dedicated set of sensors for a specific purpose. Adaptive Cruise Control (ACC), which uses long-range radar to maintain a safe distance from the vehicle ahead, is a popular example; so is Automatic Emergency Braking (AEB), which uses a LIDAR sensor to detect obstacles up to 100m in front of the vehicle.
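To make the distance-keeping idea behind ACC concrete, the sketch below implements a simple constant time-gap policy: the desired gap grows with speed, and the speed command is nudged toward closing the gap error. The time gap, minimum gap, gain, and all numbers are illustrative assumptions, not values from the article or any specific ADAS product.

```python
def acc_speed_command(own_speed_mps, gap_m, time_gap_s=1.8, min_gap_m=5.0, gain=0.5):
    """Adjust the speed command using a constant time-gap policy (illustrative).

    own_speed_mps: current vehicle speed in m/s
    gap_m: radar-measured distance to the vehicle ahead in m
    """
    # Desired gap grows linearly with speed (constant time-gap policy).
    desired_gap = min_gap_m + time_gap_s * own_speed_mps
    # Negative error means the vehicle is closer than the desired gap.
    error = gap_m - desired_gap
    # Proportional correction; never command a negative speed.
    return max(0.0, own_speed_mps + gain * error / time_gap_s)

# At 25 m/s (90 km/h) the desired gap is 5 + 1.8 * 25 = 50 m:
cmd_ok = acc_speed_command(25.0, 50.0)    # gap on target -> hold speed
cmd_close = acc_speed_command(25.0, 40.0)  # gap too small -> slow down
```

A production ACC controller would of course add filtering of the radar track, acceleration limits, and comfort constraints; this sketch only shows the core distance-keeping rule.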

Image: CEA/Léti
Functional partitioning of a decision structure in a car.

To achieve a higher level of automation, the car architecture is evolving from a distributed architecture, in which every ADAS function is implemented by a dedicated system, to a centralized architecture, in which all sensors are connected to a central system. This central system can be compared to the vehicle's brain. It monitors the entire vehicle and its surroundings and makes appropriate decisions to navigate safely through real-time traffic. To do this, the central system is connected to the vehicle's entire set of sensors and performs environment perception in a demanding procedure called sensor fusion. In parallel, the vehicle state and its precise localization are computed from GPS data, speed monitors, and/or remote communication. Precise knowledge of the environment and the vehicle state is crucial for comprehensive scene analysis: is the vehicle in the correct lane? Where are the other vehicles? Are safety distances maintained? Based on this analysis, the central system decides how to operate the vehicle, including controlling its actuators.
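A minimal sketch of what sensor fusion means at the measurement level: two independent sensors (say, radar and LIDAR) each report the distance to the vehicle ahead with a known noise variance, and the fused estimate is their inverse-variance weighted mean, whose variance is lower than either sensor's alone. The sensor pairing and all numbers are assumptions for illustration; real automotive fusion stacks operate on full object tracks, typically with Kalman-filter-style estimators.

```python
def fuse(measurements):
    """Fuse (value, variance) pairs by inverse-variance weighting."""
    # Each measurement is weighted by the inverse of its noise variance,
    # so more reliable sensors contribute more to the fused estimate.
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    # The fused variance is always smaller than the smallest input variance.
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Hypothetical readings: radar sees 52.0 m (variance 4.0),
# LIDAR sees 50.0 m (variance 1.0).
value, variance = fuse([(52.0, 4.0), (50.0, 1.0)])
```

Because the LIDAR reading is four times more precise here, the fused distance lands much closer to 50 m than to 52 m, and the combined uncertainty drops below the LIDAR's own.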

CEA / Léti

This article appeared in inVISION 1 2017 - 08.03.17.
For further articles, visit www.invision-news.de