
Navigate Edge Cases with Physics-Based Sensors Simulation for AVs: Camera 

Join us to hear about the challenges of obtaining accurate real-life data for edge case evaluation and how synthetic data can fill this gap. We'll highlight the camera edge cases that become relevant when simulated with a synthetic data approach.

Time:
May 28, 2024
11 a.m. EST

Venue:
Virtual

Sign Up


Overview

Training and evaluating perception systems requires a large variety of accurate, annotated data to ensure the safety of Autonomous Vehicles (AVs) within their Operational Design Domain (ODD). Acquiring enough real-life data for sensor perception edge cases or dangerous (critical) driving situations is challenging. Synthetic data can fill this gap. However, a physics-based simulation approach is necessary to obtain trustworthy synthetic data.

This synthetic data is generated by simulating multispectral light and electromagnetic wave propagation in 3D synthetic scenes, combined with sensor modeling. In this webinar, we will highlight the camera edge cases that become relevant when simulated with this approach and show why accurate synthetic data is necessary to address them.
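To make the idea of physics-based camera sensor modeling more concrete, below is a minimal, illustrative sketch in Python (NumPy) of one small piece of such a pipeline: integrating a multispectral irradiance cube against camera spectral sensitivity curves and applying a simple noise model. All curves, noise parameters, and scene values are invented for illustration only; this is not the AVxcelerate camera model presented in the webinar.

    # Illustrative toy example only: convert a multispectral irradiance cube
    # into a noisy RGB image. Spectral sensitivities, noise parameters, and
    # scene data are invented assumptions, not the webinar's sensor model.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical scene: per-pixel spectral irradiance, H x W x num_bands,
    # sampled from 400 nm to 700 nm in 10 nm steps.
    wavelengths = np.linspace(400, 700, 31)            # nm
    scene = rng.uniform(0.0, 1.0, size=(64, 64, 31))   # toy irradiance values

    # Assumed Gaussian spectral sensitivity curves for the R, G, B channels.
    def gaussian(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    sensitivities = np.stack([
        gaussian(wavelengths, 600, 40),  # R
        gaussian(wavelengths, 550, 40),  # G
        gaussian(wavelengths, 460, 40),  # B
    ], axis=-1)                          # shape: (31, 3)

    # Integrate irradiance against each channel's sensitivity (Riemann sum).
    d_lambda = wavelengths[1] - wavelengths[0]
    signal = scene @ sensitivities * d_lambda          # H x W x 3

    # Toy noise model: shot noise (Poisson) plus Gaussian read noise.
    gain, read_noise = 100.0, 2.0
    electrons = rng.poisson(signal * gain) + rng.normal(0.0, read_noise, signal.shape)

    # Normalize and quantize to 8 bits as a simplistic stand-in for the ISP.
    image = np.clip(electrons / electrons.max(), 0.0, 1.0)
    image_8bit = (image * 255).astype(np.uint8)
    print(image_8bit.shape, image_8bit.dtype)

A physics-based pipeline of the kind discussed in the webinar would, in addition, simulate the light transport that produces the spectral irradiance in the first place and use validated sensor characteristics rather than the assumed curves above.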

What attendees will learn

  • Safety by validation: the necessity of accurate synthetic data
  • End-to-end real-time physics-based approach for accurate synthetic data generation
  • Bridging the gap between Tier 1s and OEMs through enhanced camera sensor simulation
  • Highlights of challenging edge cases for camera perception testing and simulation

Who should attend

All AVxcelerate users, ADAS and AD engineers, and HiL/SiL test engineers should attend to learn why synthetic data is necessary for safety by validation, especially when evaluating edge cases.

Speakers

Lionel Bennes