For autonomous driving to be safe, engineers need to design systems and sensors that can detect, interpret and react to hazards on the road.
That is why AEye is using simulation to create an Intelligent Detection and Ranging (iDAR™) platform that mimics how human eyes focus on the road.
The company will be using ANSYS SPEOS to model the optics of its sensor platform and ANSYS VRXPERIENCE to test and validate it within a realistic virtual environment.
The iDAR Platform Helps Autonomous Systems Assess the Road
Traditionally, engineers have had to test and validate their sensor systems using physical prototypes, which consumes significant time and budget.
However, with the iDAR platform, a sensor system can be tested virtually over millions of miles in a few days.
Thanks to this autonomy on demand, AEye and its OEM and Tier 1 customers will be able to address use cases systematically and gain more intelligence from the edge.
This systematic approach is important because you can't put an autonomous system on the road and expect to test it against every potential scenario it could experience. The road is an unpredictable place, so it's impossible to tell when an edge case could pop up. Setting up these scenarios with physical prototypes can be expensive, dangerous or outright impossible.
As customers adopt iDAR, ANSYS pervasive simulation software will be there to fully validate its performance, helping to reduce development time and optimize autonomous implementation.
To see how this autonomous driving technology works, visit ANSYS at CES.
To learn more about iDAR, read: AEye and ANSYS Accelerate Autonomous Driving Safety.