

ANSYS BLOG

December 17, 2019

Optical Simulations Validate ADAS and Self-Driving Car Sensors

Advanced driver assistance systems (ADAS) enable cars to perceive the world around them. In a sense, these systems act as a co-pilot, taking over the wheel when they sense dangers the driver cannot see.

To learn more about ADAS, visit ANSYS at CES.

Cars from the big four and other automotive companies offer a plethora of ADAS features to help drivers.

Ensuring that these systems are safe takes a lot of effort. Simulation can validate the accuracy of the sensors that inform these systems. However, it is a challenge to reproduce all of the road conditions those sensors will face, in both day-to-day and extreme circumstances.

For instance, engineers must gather the optical properties of objects on and around the road to build these simulations properly. One source of this data is the Ansys Speos road library for sensor simulations.

How the Optical Properties of the Road Differ

Various objects on the road have different optical properties; a tree will not reflect light the same way as a stop sign.

To recreate a roadside scene such as road workers in safety vests standing near a stop sign, simulations need to know the optical properties of the vests, the road and the sign.

In fact, that stop sign and other road signs carry retroreflective films that direct incoming light back toward its source, such as a passing car's headlights. These films are classified by reflectivity, from least to most reflective:

  • Engineering grade
  • High intensity
  • Diamond grade

To properly model the world around the car, engineers need to know these retroreflective properties as well as the optical properties of other objects.
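
As a rough, self-contained sketch (written in Python, not tied to any Speos API), the classification can be linked to the coefficient of retroreflection R_A, which retroreflective sheeting standards define as the luminous intensity returned toward the source per unit illuminance per unit area (cd·lx⁻¹·m⁻²). Higher grades carry higher R_A values; the numbers themselves come from sheeting datasheets or measured libraries and are deliberately not reproduced here.

```python
from enum import IntEnum


class SheetingGrade(IntEnum):
    """Retroreflective sheeting grades, ordered from least to most reflective."""
    ENGINEERING_GRADE = 1
    HIGH_INTENSITY = 2
    DIAMOND_GRADE = 3


def retroreflected_intensity(r_a: float, illuminance_lux: float, area_m2: float) -> float:
    """Luminous intensity (cd) a sign face sends back toward the headlights.

    r_a:             coefficient of retroreflection (cd·lx⁻¹·m⁻²) for the sheeting
                     grade, observation angle and entrance angle; taken from a
                     datasheet or a measured library, not hard-coded here.
    illuminance_lux: illuminance of the sign face from the headlights (lx).
    area_m2:         illuminated sign area (m²).
    """
    return r_a * illuminance_lux * area_m2
```

A brighter grade simply means a larger R_A at the same geometry, so the same headlight illumination produces a stronger return at the sensor.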

Once this virtual world is created, engineers can use it to digitally test how ADAS and autonomous vehicle sensors would react to various scenarios.

Simulating Self-Driving Car Sensors

Using the Ansys Speos road library for sensor simulations, engineers can model the optical physics of the road.

An optical simulation of safety vests and road signs using Ansys VRXPERIENCE.

For instance, the library contains data describing how the surfaces of signs reflect visible and infrared light toward a car's cameras, lidar or passengers.

The library contains firsthand, measured optical data for the following (a simplified sketch of one such record appears after the list):

  • Road markings
  • Road signs
  • License plates
  • Safety vests
  • Car paint
  • Vegetation
  • Metals
  • Asphalt

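One way such records might be organized is sketched below. This is a hypothetical Python structure for illustration only, not the actual Speos road library format; it simply captures the idea that each surface carries separate reflectance data for the visible band (cameras) and the near-infrared band (lidar), plus a flag for retroreflective behavior.

```python
from dataclasses import dataclass


@dataclass
class SurfaceOpticalRecord:
    """Hypothetical per-surface optical record (illustrative only;
    not the Ansys Speos road library file format)."""
    name: str                   # e.g. "road marking", "license plate", "asphalt"
    visible_reflectance: float  # fraction of visible light reflected (0-1), seen by cameras
    nir_reflectance: float      # fraction of near-infrared light reflected (0-1), seen by lidar
    retroreflective: bool       # True for sign sheeting, license plates, safety vests


def reflectance_for_sensor(record: SurfaceOpticalRecord, sensor: str) -> float:
    """Pick the wavelength band a given sensor responds to.

    'camera' uses the visible band, 'lidar' uses the near-infrared band;
    a real simulation would instead evaluate angle- and wavelength-resolved
    measurements rather than a single scalar per band.
    """
    if sensor == "camera":
        return record.visible_reflectance
    if sensor == "lidar":
        return record.nir_reflectance
    raise ValueError(f"unknown sensor type: {sensor}")
```

A real library holds far richer, angle-resolved measurements; the sketch only conveys the shape of the problem.
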
To create a virtual environment for testing these sensors, engineers can use Ansys VRXPERIENCE. The software can also iterate through virtual driving scenarios to verify that ADAS react properly under various conditions, as sketched below.
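
To make "iterating through scenarios" concrete, here is a minimal, hypothetical sweep loop in Python. It does not use the VRXPERIENCE API; the condition lists and the run_virtual_test callable are assumptions standing in for an actual simulation run.

```python
from itertools import product
from typing import Callable, Dict, Tuple

# Hypothetical condition axes; a real test plan would cover many more.
WEATHER = ["clear", "rain", "fog"]
TIME_OF_DAY = ["noon", "dusk", "night"]


def sweep_scenarios(
    run_virtual_test: Callable[[str, str], bool]
) -> Dict[Tuple[str, str], bool]:
    """Run one simulated drive per combination of conditions.

    run_virtual_test stands in for the actual simulation call: render the
    scene under the given conditions, feed the synthetic camera and lidar
    frames to the ADAS stack, and report whether it reacted correctly
    (for example, whether it detected the stop sign).
    """
    return {
        (weather, time_of_day): run_virtual_test(weather, time_of_day)
        for weather, time_of_day in product(WEATHER, TIME_OF_DAY)
    }
```

The resulting pass/fail map points engineers at the combinations of conditions that need closer inspection.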

Integrating optical sensors into ADAS and autonomous vehicles will be a point of discussion at CES 2020.

For instance, there will be an invitation-only panel featuring senior management from Analog Devices, Ansys, McKinsey & Company and more. The discussion will center on consumer adoption, sensor/software/hardware technology, regulations and the startups driving development.