Advanced driver assistance systems (ADAS) enable cars to perceive the world around them. In a sense, these systems act as a co-pilot, intervening when they sense dangers the driver has not seen.
Cars from the big four and other automotive companies offer a range of ADAS features to help drivers:
- Stay in their lane
- Park their car
- See into blind spots
- Avoid accidents
Ensuring that these systems are safe takes considerable effort. Simulations can verify the accuracy of the sensors that inform these systems. However, reproducing all of the road conditions these sensors will face, in both day-to-day and extreme circumstances, is a challenge.
For instance, engineers must gather the optical properties of objects around the road to build these simulations properly. One source of this data is the ANSYS SPEOS road library for sensor simulations.
How the Optical Properties of the Road Differ
Various objects on the road will have differing optical properties — a tree will not reflect light the same way as a stop sign.
In fact, that stop sign and other road signs carry retroreflective films that direct light back toward a passing car. These films are classified by reflectivity, from least to most reflective:
- Engineering grade
- High intensity
- Diamond grade
To properly model the world around the car, engineers need to know these retroreflective properties as well as the optical properties of other objects.
Once this virtual world is created, engineers can use it to digitally test how ADAS and autonomous vehicle sensors would react to various scenarios.
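The practical difference between sheeting grades can be illustrated with a back-of-the-envelope calculation. The sketch below uses the coefficient of retroreflection, R_A (in cd·lx⁻¹·m⁻²), which scales the light a sign returns toward its source: I = R_A × E × A, where E is the illuminance on the sign and A its area. The R_A figures here are rough, hypothetical values for white sheeting at a small observation angle; they are not data from the SPEOS road library.

```python
# Hypothetical R_A values (cd.lx^-1.m^-2) for three sheeting grades,
# roughly ordered as in the article: engineering < high intensity < diamond.
SHEETING_R_A = {
    "engineering grade": 70.0,
    "high intensity": 250.0,
    "diamond grade": 500.0,
}

def returned_intensity(r_a, illuminance_lx, sign_area_m2):
    """Luminous intensity (cd) returned toward the light source:
    I = R_A * E * A."""
    return r_a * illuminance_lx * sign_area_m2

# A 0.6 m x 0.6 m sign face (0.36 m^2) lit by headlights at 10 lx:
for grade, r_a in SHEETING_R_A.items():
    intensity = returned_intensity(r_a, 10.0, 0.36)
    print(f"{grade}: {intensity:.0f} cd returned toward the car")
```

Under these assumed numbers, a diamond-grade sign returns several times the light of an engineering-grade one, which is why sensor simulations must know which grade a given sign uses.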
Simulating Self-Driving Car Sensors
Using the ANSYS SPEOS road library for sensor simulations, engineers can model the optical physics of the road.
For instance, the library contains data describing how the surfaces of signs reflect visible and infrared light toward a car’s cameras, lidar or passengers.
The library contains firsthand optical data for:
- Road markings
- Road signs
- License plates
- Safety vests
- Car paint
To create a virtual environment in which to test these sensors, engineers can use ANSYS VRXPERIENCE. The software can also iterate through virtual driving scenarios to ensure ADAS systems react properly under varied conditions.
Implementing optical sensors into ADAS and autonomous vehicles will be a point of discussion at CES 2020.
For instance, there will be an invitation-only panel featuring senior management from Analog Devices, ANSYS, McKinsey & Company and more. The discussion will center on consumer adoption; sensor, software and hardware technology; regulations; and the startups driving development. Attendance is by invitation request.