Multimodal Synthetic Sensor Data for Safe Flight Operation
Leonardo presents an approach to developing advanced environmental perception functions that improve the external situational awareness of pilots and autonomous flight mission management systems, enhance safety, increase survivability, and expand the envelope of operations achievable by its vehicles. The approach relies on an Ansys-based simulation environment capable of simulating multiple sensor modalities (such as visible camera, LiDAR, and RADAR), supported by sensor data acquired in the real world.
Leonardo’s ultimate goal is to understand the world around the vehicle, enabling better decision-making in everyday and critical situations under all-weather conditions.
What you will learn
- Why synthetic data is valuable for developing autonomy functions
- Why Leonardo chose AVXcelerate to generate synthetic sensor data
- How Leonardo interfaces its algorithms with AVXcelerate
- Examples of generated synthetic data
Who should attend
- Technical managers who lead research and engineering teams devoted to autonomy, to understand how they could exploit synthetic data to iterate faster in their development
- Technical specialists working on autonomy, in particular environmental perception, for the same reason
Speaker
- Marco Ciarambino, Leonardo Company S.p.A.