
Ansys AVxcelerate Sensors
Test and Validate Sensor Perception for Autonomous Vehicles

Ansys AVxcelerate provides accurate sensor simulation capabilities, enabling you to test your autonomous systems, including sensor perception, faster than relying only on actual driving or recorded data.


Realistic Driving Scenarios using the Driving Simulator of Your Choice

To reduce the need for costly physical prototypes while speeding up the design process, the Ansys AVxcelerate Sensors solution lets you exercise sensors virtually to test and analyze their performance. In the virtual world, use realistic driving scenarios to investigate radar, lidar, and camera sensor perception in a MiL, SiL, or HiL context.

  • Realistic Driving Scenarios
  • HiL, MiL and SiL Connectivity
  • Radar, Lidar and Camera Sensors

Quick Specs

Discover high-fidelity, physics-based sensor simulation with ground-truth information for autonomous vehicles. Run multiple sensor simulations within realistic driving scenarios to ensure the physical prototype meets expectations.

  • Ground-Truth Sensors
  • Multi-Sensor Simulation
  • MiL Connectivity
  • Camera Sensor
  • Lidar Sensor
  • SiL Connectivity

January 2024

What's New

This release covers notable enhancements in overall integration and connectivity, radar sensor simulation, and virtual testing against industry-mandated headlamp regulations.

Radar Sensor – Beam Forming

Users can now dynamically simulate, control, and manipulate radar beamforming behavior in real time, steering the beam to the desired orientation to maximize detection.
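As an illustration of what beam steering involves, the sketch below computes the per-element phase weights that point a uniform linear array at a desired angle. This is generic phased-array math, not the AVxcelerate API, and the array size and spacing are made-up values.

```python
# Generic phased-array beam-steering sketch; illustrative only, not an Ansys API.
import numpy as np

def steering_weights(n_elements: int, spacing_wavelengths: float, steer_angle_deg: float):
    """Complex weights that point the main lobe of a uniform linear array at steer_angle_deg."""
    n = np.arange(n_elements)
    phase = -2 * np.pi * spacing_wavelengths * n * np.sin(np.deg2rad(steer_angle_deg))
    return np.exp(1j * phase)

# Steer an 8-element, half-wavelength-spaced array 20 degrees off boresight.
weights = steering_weights(n_elements=8, spacing_wavelengths=0.5, steer_angle_deg=20.0)
```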

Radar Sensor – Arbitrary Waveform

Within the simulation environment, users can now create radar sensors that emit unique sequences built from both arbitrary FMCW and arbitrary pulse-Doppler waveforms. Radar makers can use their final DSP, which accounts for arbitrary-waveform filtering, directly in simulation, thereby minimizing the likelihood of false positives.
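For intuition, the sketch below builds an "arbitrary" FMCW chirp sequence in plain NumPy, with a different bandwidth and duration for each chirp. The sampling rate and chirp plan are illustrative values, and the code is not the AVxcelerate waveform interface.

```python
# Generic arbitrary-FMCW sequence sketch; values and structure are illustrative only.
import numpy as np

fs = 50e6                 # complex baseband sampling rate (Hz), assumed value
chirp_plan = [            # per-chirp (bandwidth_Hz, duration_s): the "arbitrary" part
    (20e6, 20e-6),
    (15e6, 15e-6),
    (10e6, 25e-6),
]

segments = []
for bandwidth, duration in chirp_plan:
    t = np.arange(0.0, duration, 1.0 / fs)
    slope = bandwidth / duration                 # chirp rate in Hz/s
    phase = 2 * np.pi * (0.5 * slope * t**2)     # linear frequency ramp starting at 0 Hz
    segments.append(np.exp(1j * phase))          # complex baseband chirp

waveform = np.concatenate(segments)              # full transmit sequence for the DSP chain
```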

FMVSS 108 ADB Virtual Regulation

AVxcelerate Headlamp provides a digital twin of the test track to verify the requirements for compliance with the FMVSS 108 regulation virtually, saving time and cost and improving the efficiency of Adaptive Driving Beam (ADB) headlamp system development.

Platform & Connector Compatibility

For SiL, the solution is compatible with NVIDIA DRIVE Sim via Omniverse and OpenUSD. For HiL, it is compatible with NI test benches using RDMA, IPG CarMaker on Xpack4, dSPACE SCALEXIO targets, and dSPACE ASM.

Validate Autonomous Vehicle Safety

Integrated simulation scenarios accelerate and improve the validation of safety features throughout vehicle development, including for autonomous vehicles.

Autonomous vehicles combine sensors and software to control, navigate, and drive safely. To ensure prototype vehicles are real-world ready, sensors must be designed and validated against an immense variety of test cases.

Autonomous driving systems rely upon sensors and embedded software for localization, perception, motion planning and execution. They can only be released to the public after developers have demonstrated their ability to achieve high levels of safety. 

Today’s hands-off autonomous driving systems are largely built with deep learning algorithms that can be trained to make the right decision in nearly every driving situation. These systems, however, lack the detailed requirements and architecture that have been used until now to validate safety-critical software, such as the kind that controls commercial airliners. Road testing is clearly an essential part of the development process, but billions of miles of road testing would be required to validate the safety of autonomous driving systems and software. Simulation makes verification and validation of autonomous vehicle operation a practical effort.

Applications


Autonomy System Development

Model-based safety and cybersecurity assessments using Ansys simulation help to accelerate autonomous system development and certification.


Autonomy Sensors

Ansys provides a comprehensive autonomous vehicle sensor simulation capability that includes lidar, radar and camera design and development.


High-Performance Simulation Solutions for Realistic Sensor Testing and Validation

AVxcelerate Sensors integrates the simulation of ground-truth, camera, radar, lidar, and ultrasonic sensors. Its accurate outputs enable you to assess complex ADAS systems and autonomous vehicles virtually by connecting perception, fusion, and control functions to the driving simulator of your choice, such as IPG Automotive CarMaker or CARLA.

 

Key Features

Benefit from powerful ray-tracing capabilities to recreate sensor behavior and easily retrieve sensor outputs through a dedicated interface.

  • Driving Scenarios using the driving simulator of your choice
  • Ray Casting Ideal Sensor
  • Camera Sensor
  • Lidar Sensor
  • Radar Sensor
  • HiL, SiL and MiL Connectivity 

To test sensors in scenarios, add several cars to create complex situations, such as following a car while simultaneously monitoring the path of a crossing car. Each vehicle in a scenario can be either static or automatic, enabling evaluation at a specific point of interest or along a predefined trajectory. Sensor simulation follows the ego vehicle's dynamic motion.
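A minimal sketch of such a scenario description is shown below. The Vehicle and Scenario classes and their fields are hypothetical stand-ins, not the AVxcelerate scenario API; they only illustrate the mix of an ego vehicle, trajectory-driven traffic, and a static vehicle described above.

```python
# Hypothetical scenario description; class and field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    name: str
    static: bool = False
    trajectory: list[tuple[float, float]] = field(default_factory=list)  # (x, y) waypoints in m

@dataclass
class Scenario:
    ego: Vehicle
    traffic: list[Vehicle] = field(default_factory=list)

    def add_vehicle(self, vehicle: Vehicle) -> None:
        self.traffic.append(vehicle)

# Ego follows a lead car while a crossing car cuts the intersection and a parked car stays static.
scenario = Scenario(ego=Vehicle("ego", trajectory=[(0, 0), (50, 0), (100, 0)]))
scenario.add_vehicle(Vehicle("lead_car", trajectory=[(20, 0), (70, 0), (120, 0)]))
scenario.add_vehicle(Vehicle("crossing_car", trajectory=[(60, -30), (60, 0), (60, 30)]))
scenario.add_vehicle(Vehicle("parked_car", static=True))
```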

Sensor data outputs, such as hit points and material properties, and vehicle parameter inputs and outputs, such as position, orientation, speed, and steering wheel angle, are all available. Thanks to the consistent combination of data from multiple sensors, the simulation enables you to validate the model of a smart sensor's behavior or its fusion algorithms. Deterministic and real-time modes are both supported.
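As one example of how consistent multi-sensor and ground-truth data can be used, the sketch below matches fused detections against a ground-truth object list by distance. The data values and matching tolerance are invented for illustration and are not tied to any AVxcelerate output format.

```python
# Hedged illustration: comparing fused detections against a ground-truth object list.
import math

def match_detections(ground_truth, detections, tolerance_m=1.0):
    """Return (matched, missed) ground-truth object IDs based on nearest detection distance."""
    matched, missed = [], []
    for gt_id, (gx, gy) in ground_truth.items():
        distances = [math.hypot(gx - dx, gy - dy) for dx, dy in detections]
        if distances and min(distances) <= tolerance_m:
            matched.append(gt_id)
        else:
            missed.append(gt_id)
    return matched, missed

ground_truth = {"lead_car": (52.0, 0.0), "crossing_car": (60.0, -4.0)}   # from an ideal sensor
fused_detections = [(51.6, 0.2), (60.3, -3.7)]                           # from the fusion stack
print(match_detections(ground_truth, fused_detections))  # (['lead_car', 'crossing_car'], [])
```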

The software allows you to simulate the actual camera model in edge-case driving situations. It simulates all components of a camera, such as the lens system, imager, and pre-processor. For automotive front-facing cameras, the windshield can also be considered in the simulation. The optical and spectral properties of the environment in the visible range are taken into account, along with the optical properties of the lens system and the optoelectronic properties of the imager. With the addition of plugins, the simulation can manage dynamic adaptation. Camera simulation creates raw images, which are used to test and validate perception algorithms in model-in-the-loop, software-in-the-loop, or hardware-in-the-loop setups.
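The sketch below gathers the camera-model elements named above (lens system, imager, pre-processor, optional windshield) into one structured description. It is a hypothetical configuration layout, not the AVxcelerate camera file format, and the parameter values are placeholders.

```python
# Hypothetical camera-model description; names and values are placeholders only.
from dataclasses import dataclass

@dataclass
class LensSystem:
    focal_length_mm: float
    f_number: float
    distortion_model: str              # e.g. "polynomial"

@dataclass
class Imager:
    resolution: tuple[int, int]        # (width, height) in pixels
    pixel_pitch_um: float
    spectral_range_nm: tuple[int, int]

@dataclass
class CameraModel:
    lens: LensSystem
    imager: Imager
    preprocessor: str                  # e.g. "demosaic+gamma"
    windshield: bool = False           # include windshield optics for front-facing cameras

front_camera = CameraModel(
    lens=LensSystem(focal_length_mm=6.0, f_number=1.8, distortion_model="polynomial"),
    imager=Imager(resolution=(1920, 1208), pixel_pitch_um=3.0, spectral_range_nm=(400, 700)),
    preprocessor="demosaic+gamma",
    windshield=True,
)
```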

You will benefit from powerful ray-tracing capabilities to recreate sensor behavior and can easily retrieve sensor results through a dedicated interface. The IR emitter, the IR properties of the world model, and the receiver electronics are all considered in the simulation, which can output anything from raw signals (waveforms) to point clouds. This solution provides a unique way to collect virtual sensor information during real-time drives and use it to develop autopilot code.
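To make the step from raw waveform to point cloud concrete, the sketch below peak-detects a single simulated return and converts its time of flight to a 3D point. This is generic lidar geometry, not the AVxcelerate output pipeline, and the sampling period and threshold are assumed values.

```python
# Generic lidar illustration: one return waveform -> one point via time of flight.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def waveform_to_point(waveform, sample_period_s, azimuth_rad, elevation_rad, threshold=0.5):
    """Peak-detect one return waveform and convert it to an (x, y, z) point, or None."""
    peak_idx = int(np.argmax(waveform))
    if waveform[peak_idx] < threshold:
        return None                                   # no return above the noise floor
    tof = peak_idx * sample_period_s                  # round-trip time of the echo
    r = 0.5 * C * tof                                 # divide by two: out and back
    x = r * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = r * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = r * np.sin(elevation_rad)
    return (x, y, z)

# A synthetic echo 400 samples after emission at 1 ns sampling -> roughly 60 m range.
wf = np.zeros(1024); wf[400] = 1.0
print(waveform_to_point(wf, 1e-9, azimuth_rad=0.1, elevation_rad=0.0))
```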

The GPU Radar feature provides the capability to perform full physics-based radar scenario simulations in real time at frame rates greater than 30 frames per second. The simulations consider multi-bounce reflections and transmissions from dielectric surfaces. Multichannel and MIMO radars can be simulated using the linear scalability of GPU Radar. With the addition of GPU Radar, AVxcelerate Sensors provides the ability to perform ADAS and autonomy scenario simulation with full-physics models of all key sensors: cameras, radars, and lidars. The data collected from the radar model is used to stimulate the radar ECU's digital signal processing algorithms, quickly improving the accuracy and robustness of automotive radars in edge cases. The solution comes with a library of objects with defined dielectric properties.
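The kind of ECU signal processing such raw radar data can stimulate is sketched below: a standard range-Doppler map computed with FFTs over fast time and slow time. This is textbook FMCW processing, not an Ansys interface, and the frame dimensions are arbitrary.

```python
# Generic range-Doppler processing sketch (standard FMCW radar DSP, not an Ansys API).
import numpy as np

def range_doppler_map(iq_cube):
    """iq_cube: complex array of shape (n_chirps, n_samples) for one antenna channel."""
    range_fft = np.fft.fft(iq_cube, axis=1)                          # fast time  -> range bins
    rd_map = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)  # slow time -> Doppler bins
    return 20 * np.log10(np.abs(rd_map) + 1e-12)                     # magnitude in dB

# Random data standing in for one simulated raw-signal frame of 128 chirps x 256 samples.
frame = np.random.randn(128, 256) + 1j * np.random.randn(128, 256)
print(range_doppler_map(frame).shape)  # (128, 256): Doppler bins x range bins
```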

Model- and Software-in-the-Loop (MiL, SiL): Perform massive scenario variation by leveraging cutting-edge testing, on premises and in the cloud. Assess perception performance by varying parameters across countless driving scenarios.
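A minimal sketch of massive scenario variation is shown below: a parameter grid over fog density, traffic speed, and sun elevation fanned out to parallel workers. The run_scenario function is a placeholder for launching one simulation, not an AVxcelerate call, and the swept parameters are examples.

```python
# Hedged sketch of scenario parameter sweeping; run_scenario is a placeholder, not an Ansys API.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def run_scenario(params):
    """Placeholder: configure one scenario variant, run it, and return a perception metric."""
    fog_density, target_speed_kph, sun_elevation_deg = params
    return {"params": params, "detection_rate": 0.0}   # would be filled in by the real run

if __name__ == "__main__":
    grid = list(product(
        [0.0, 0.05, 0.10],      # fog density
        [30, 50, 80, 120],      # target vehicle speed (km/h)
        [5, 30, 60],            # sun elevation (degrees)
    ))
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_scenario, grid))
    print(f"{len(results)} scenario variants evaluated")
```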

AVxcelerate Sensors Resources & Events

Featured Webinars

Ansys AV Simulation
Ansys 2023 R2: Ansys AV Simulation (AVxcelerate) What’s New

Join us to hear about the new capabilities and features in the 2023 R2 release of Ansys AV Simulation (AVxcelerate). With this release, we introduce improvements in the camera, thermal camera, and radar models and in the simulation ecosystem, allowing users to perform more accurate, higher-fidelity simulations.

On Demand Webinar
Ansys Webinar
Sensor and HMI Development & System Validation

Developing AD systems with a sustainable business model requires an intensive trade-off between performance and safety. Learn how Ansys solutions address critical technical challenges in areas such as sensor and HMI development and system validation.

On Demand Webinar
Ansys Webinar
Physics-Based Sensor Simulation for In-Cabin Sensing Systems Development

In-cabin sensing system requirements are increasingly becoming an essential part of government policies and car safety rating programs. Learn about in-cabin sensing system requirements and watch physics-based sensor simulation applied to the development and validation of in-cabin monitoring systems.


White Papers & Articles


Physics-Based Radar Modeling: Driving Toward Increased Safety

An innovative solution from Ansys, AVxcelerate, is designed to speed up the sensor development process — without compromising safety — by simulating product performance under varying road and environmental conditions.




