Test and Validate Sensor Perception for Self-Driving Cars
To accelerate the development of safe autonomous systems while reducing the amount of physical testing required, Ansys VRXPERIENCE Sensors enables engineers to create high-fidelity, physics-based sensor models to test and analyze sensor performance. In the virtual world, use realistic driving scenarios to investigate radar, lidar and camera sensor perception in a MIL, SIL or HIL context.
VRXPERIENCE Sensors readily integrates physics-based simulation of camera, radar and lidar sensor types and delivers accurate outputs that enable you to virtually assess complex ADAS systems and autonomous vehicles by connecting perception, fusion and control functions in a single driving simulator.
This webinar demonstrates how VRXPERIENCE Sensors helps mitigate the risk that real-world tests pose to other drivers and pedestrians, enabling you to measure the limits of sensor performance and customize your drive tests with a variety of scenarios and road conditions to provide a safer drive. Additionally, we will discuss how to speed the development of safe autonomous driving technologies with high-performance simulation solutions that deliver realistic sensor testing and validation.
Lastly, our expert will present the new real-time radar sensor simulation, based on Ansys HFSS SBR+, and real-time, closed-loop simulation in Ansys VRXPERIENCE, which empowers ADAS engineers to simulate advanced edge-case scenarios.
Speaker: Kishor Ramaswamy