VRXPERIENCE Tests Automotive and Autonomous Vehicles within Virtual Reality

VRXPERIENCE offers engineers virtual tools to test automotive and autonomous vehicle designs.

The ANSYS 2019 R1 release of ANSYS VRXPERIENCE provides engineers with the tools they need to develop advanced driver-assistance systems (ADAS) and autonomous vehicles.

Specifically, the latest release enables engineers to develop models and virtual environments that test:

  • Sensors.
  • Human-machine interfaces (HMI).
  • ADAS.

New Sensor Models Focus on Software and Hardware-in-the-Loop

The latest release of VRXPERIENCE integrates new camera models that perform hardware-in-the-loop (HiL), software-in-the-loop (SiL) and model-in-the-loop (MiL) testing.

A VRXPERIENCE model that is set up for HiL camera testing

The first model is a high-fidelity physics-based camera. It can be set up for HiL camera testing to generate a raw image that is processed by the imager of a real camera.

The model is able to validate the camera’s perception in various lighting conditions, making it easier to test nighttime and edge-case scenarios.

By simulating the lens system and imager of the camera, the model accurately replicates:

  • The blooming effects of bright lights.
  • The image distortions created by windshields.
  • The dynamic management of a camera’s image signal processor (ISP).

Engineers can use this model to test and validate the perception algorithms of a smart camera in virtual reality.
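The blooming effect mentioned above can be illustrated with a toy imager model. This is a minimal sketch, assuming a simple energy-spill rule and a hypothetical saturation threshold; it is not the physics-based model in VRXPERIENCE:

```python
import numpy as np

def apply_blooming(frame, threshold=0.9, spill=0.25):
    """Toy blooming model: pixels above a saturation threshold spill a
    fraction of their excess energy into their four neighbours, roughly
    as a bright headlamp would on a real imager. Illustrative only."""
    excess = np.clip(frame - threshold, 0.0, None)
    spread = np.zeros_like(frame)
    # Push the excess energy into the up/down/left/right neighbours.
    spread[1:, :] += excess[:-1, :]
    spread[:-1, :] += excess[1:, :]
    spread[:, 1:] += excess[:, :-1]
    spread[:, :-1] += excess[:, 1:]
    return np.clip(frame + spill * spread, 0.0, 1.0)

# Synthetic night scene: dark frame with one saturated "headlamp" pixel.
frame = np.zeros((5, 5))
frame[2, 2] = 1.0
bloomed = apply_blooming(frame)
```

After the call, the saturated pixel's neighbours pick up a small amount of spilled light, which is the kind of artifact a perception algorithm has to be tested against.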

The second VRXPERIENCE model interfaces with ANSYS SCADE’s model-based development software. The model is set up for SiL and MiL testing of headlamp control laws.

Because SCADE creates embedded code that is tested and certified to automotive standards, such as AUTOSAR and ISO 26262, the model helps engineers rapidly prototype intelligent lighting systems, ADAS and autonomous vehicle control laws.
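To make the idea of a headlamp control law concrete, here is a toy Python sketch of the kind of decision logic such a model might exercise in SiL testing. The beam states, lux thresholds and decision rule are hypothetical, invented for illustration; they are not taken from SCADE or VRXPERIENCE:

```python
from enum import Enum

class Beam(Enum):
    OFF = 0
    LOW = 1
    HIGH = 2

def headlamp_law(ambient_lux, oncoming_vehicle):
    """Toy headlamp control law (hypothetical thresholds):
    lights off in daylight, low beam at dusk or when another
    vehicle approaches, high beam only on a dark, empty road."""
    if ambient_lux >= 1000:                     # daylight
        return Beam.OFF
    if oncoming_vehicle or ambient_lux >= 10:   # dusk, or traffic ahead
        return Beam.LOW
    return Beam.HIGH                            # dark and empty road
```

In SiL or MiL testing, a virtual environment would drive the inputs of a law like this through day/night transitions and traffic scenarios and check that the commanded beam state is always safe.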

HMI Tests in Virtual Reality Place the Human-in-the-Loop

Testing a car’s HMI with a human-in-the-loop

VRXPERIENCE also has virtual reality tools that put the human in the loop. These tools help engineers ensure the safety and reliability of a car’s HMI.

For instance, physics-based ray tracing helps engineers determine how HMI systems interact with light in virtual reality.

The simulation detects undesirable lighting effects, such as reflections and washout on HMI screens, that can affect driver safety. Engineers who detect these effects early are better equipped to eliminate them using ANSYS SPEOS optics simulation software.
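A crude stand-in for that detection step is to flag the share of screen pixels driven near saturation in a rendered luminance map. The threshold and the synthetic scene below are assumptions made for illustration; the actual check runs on physics-based ray-traced images, not on a toy array:

```python
import numpy as np

def washout_fraction(luminance, limit=0.95):
    """Fraction of screen pixels at or above a near-saturation limit,
    a simple proxy for washed-out regions on a rendered HMI screen.
    The limit is a hypothetical threshold, not an ANSYS criterion."""
    return float(np.mean(luminance >= limit))

# Synthetic 4x8 HMI screen, normalized luminance in [0, 1].
screen = np.full((4, 8), 0.4)   # normally lit display content
screen[:, :2] = 1.0             # glare washes out the left edge
frac = washout_fraction(screen)
```

A result above some acceptance level would tell the engineer which viewing conditions need rework, for example by repositioning the display or changing its anti-reflective treatment.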

VRXPERIENCE also enables engineers to get into the virtual cockpit of a vehicle. The engineer can then test the driving experience in various virtual environments. These experiences give engineers a better understanding of how to optimize the vehicle’s field of view and lighting performance.

These virtual environments also give engineers the ability to experiment with a vehicle’s HMI. Engineers can track how people interact with buttons, head-up displays, tactile displays and the embedded software. This helps ensure the fluidity of the user interface, which significantly improves driver safety.

To learn more about the ANSYS 2019 R1 release of VRXPERIENCE, watch the webinar: ANSYS VRXPERIENCE 2019 R1 Update.