ANSYS VRXPERIENCE HMI

Prepare and Appraise Virtual Prototypes of a Cockpit HMI in a Real-Time, Immersive VR Environment

With ANSYS VRXPERIENCE, you can test and validate a full cockpit HMI design, including virtual displays and actuators, through visual simulation, eye and finger tracking, and haptic feedback. ANSYS VRXPERIENCE delivers a complete, virtual-reality-based HMI evaluation for next-generation vehicles. Because most design evaluation is performed on virtual prototypes, the tool reduces design time and cost, dramatically decreasing the number of expensive physical prototypes needed to validate the product. ANSYS VRXPERIENCE offers collaborative driving scenarios built on virtual HMIs, taking human factors analyses and cognitive workload into account. A test driver can interact directly with the virtual interfaces, from touchscreens to switches, thanks to a high-resolution finger-tracking system. As the system records the driver’s behavior and displays driving and infotainment information, it identifies and interprets the driver’s actions and automatically triggers the appropriate HMI reaction. This makes it easy for you to evaluate the relevance of the displayed information, in real time, for a safer drive.

Experience your design visually in virtual reality

Visualize your virtual HMI prototype in a real-time, physics-based simulation. Apply optical properties and lighting to your prototype in a virtual environment using immersive devices, such as CAVEs and head-mounted displays (HMDs). Thanks to the physics-based results of VRXPERIENCE, you can perform accurate, comprehensive design reviews and lighting studies in VR, leading to optimal design decisions.

Interact with your cutting-edge HMI system

Within this immersive environment, you can efficiently evaluate the pilot’s or driver’s responses to new intelligent systems or advanced proactive safety systems, to ensure that important information will be instantaneously understood. Working with everything from advanced finger-tracking to full-body tracking systems, you can interact naturally with the HMI of your cockpit and accurately test user interactions in a variety of scenarios, without endangering lives or damaging expensive equipment.
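
The sketch below illustrates, in very simplified form, the kind of logic behind turning finger-tracking data into an HMI interaction: a tracked fingertip is tested against virtual touchscreen buttons, and a press event is dispatched when a touch is detected. This is not the VRXPERIENCE API; the class and function names (Button, hit_test) and the geometry values are hypothetical, for illustration only.

```python
# Minimal sketch (not the VRXPERIENCE API): map a tracked fingertip to a
# virtual touchscreen button and dispatch an HMI event. All names here
# (Button, hit_test) and all numeric values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Button:
    name: str
    x: float          # button centre on the virtual screen plane, metres
    y: float
    half_size: float  # half the button footprint, metres

def hit_test(finger, buttons, touch_threshold=0.01):
    """Return the button containing the fingertip, if the finger touches the screen.

    finger          -- (x, y, distance_to_screen) of the tracked fingertip, metres
    touch_threshold -- maximum fingertip-to-screen distance counted as a touch, metres
    """
    fx, fy, dist = finger
    if dist > touch_threshold:          # finger is hovering, not touching
        return None
    for b in buttons:
        if abs(fx - b.x) <= b.half_size and abs(fy - b.y) <= b.half_size:
            return b
    return None

# Example: one frame of tracking data triggering the adapted HMI reaction.
buttons = [Button("nav_zoom_in", 0.10, 0.05, 0.02)]
pressed = hit_test((0.105, 0.048, 0.004), buttons)
if pressed:
    print(f"HMI event: {pressed.name} pressed")
```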

Test your embedded software

Eliminate physical prototypes from your system design process by running the HMI’s embedded software inside your virtual prototype. Interact with your embedded software in VR thanks to out-of-the-box compatibility with ANSYS SCADE Suite and ANSYS SCADE Display.
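
To make the idea of "running the embedded software inside the virtual prototype" concrete, here is a minimal sketch of the cyclic pattern typically used to drive embedded HMI logic at the VR frame rate: initialize once, then step the logic every frame with inputs from the driving simulation and feed its outputs to the virtual display. The EmbeddedHmi class is a hypothetical stand-in, not SCADE-generated code and not the VRXPERIENCE integration layer.

```python
# Minimal sketch: cyclic execution of embedded HMI logic inside a virtual
# cockpit. EmbeddedHmi is a hypothetical placeholder for the real embedded
# code; the input names and frame rate are illustrative assumptions.
import time

class EmbeddedHmi:
    """Placeholder for the embedded HMI logic (normally generated code)."""
    def init(self):
        self.speed_kph = 0.0

    def step(self, inputs):
        # One synchronous cycle: read inputs, compute the display state.
        self.speed_kph = inputs["vehicle_speed_kph"]
        return {"speedometer_text": f"{self.speed_kph:.0f} km/h"}

def run_in_virtual_cockpit(hmi, frames=3, period_s=1 / 60):
    """Drive the embedded logic at the VR frame rate and feed the virtual cluster."""
    hmi.init()
    for frame in range(frames):
        sim_inputs = {"vehicle_speed_kph": 50.0 + frame}   # from the driving simulation
        display_state = hmi.step(sim_inputs)
        print(f"frame {frame}: virtual cluster shows {display_state['speedometer_text']}")
        time.sleep(period_s)

run_in_virtual_cockpit(EmbeddedHmi())
```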


ANSYS VRXPERIENCE HMI in Action:

Fine-tune your global cockpit architecture

Fine-tune your global cockpit architecture by exploring a variety of cockpit layouts. Compare and optimize the reachability and ease of use of any cockpit interface using the finger-tracking system. Switch instantly between cockpit layouts, from classic to disruptive designs, by moving displays and actuators and swapping display contents until you arrive at a suitable cockpit architecture.
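
As a rough illustration of what a layout comparison can quantify, the sketch below scores two hypothetical cockpit layouts by the distance from the driver's shoulder to each control and checks it against a comfortable reach envelope. The layouts, positions and reach limit are assumed values for illustration, not tool data or an ergonomic standard.

```python
# Minimal sketch: compare the reachability of two cockpit layouts by measuring
# the distance from a seated driver's shoulder to each control. All positions
# and the reach limit are illustrative assumptions.
import math

SHOULDER = (0.0, 0.55, 0.30)          # driver shoulder position, metres
REACH_LIMIT = 0.70                    # assumed comfortable reach envelope, metres

layouts = {
    "classic":    {"hvac_knob": (0.35, 0.40, 0.55), "media_btn": (0.30, 0.45, 0.60)},
    "disruptive": {"hvac_knob": (0.25, 0.50, 0.45), "media_btn": (0.55, 0.35, 0.70)},
}

for name, controls in layouts.items():
    worst = max(math.dist(SHOULDER, pos) for pos in controls.values())
    verdict = "within" if worst <= REACH_LIMIT else "outside"
    print(f"{name}: farthest control at {worst:.2f} m ({verdict} the reach envelope)")
```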

Test your full cockpit user experience

Test the full cockpit user experience by navigating virtually using tactile displays and dynamic content to easily compare several workflow designs for look and feel. Interact with virtual actuators and joysticks, pushing or rotating them to validate your interface concept. Take your virtual prototype out for a ride and test it in dangerous driving or flying situations, with no risk to you or the vehicle.

Enjoy live interaction between the user, virtual actuators and virtual prototypes

Experience live interaction with the HMI through natural finger motions. Simply connect your embedded HMI software and run it in VR mode. In this way, you can assess the usability of your software in context, while gathering user feedback early in the design process to improve the final software.

Minimize the impact of reflections by performing windshield or glass-house reflection studies

Minimize the impact of reflections by performing windshield or glass-house reflection studies based on physically correct reflection simulation. Save hours by comparing several dashboard material layouts, varying colors and trims, and assessing their impact on the driver’s visual comfort. Identify the sources of disturbing reflections and correct them at once. Similarly, you can optimize rear vision thanks to accurate visibility studies, exploring and modifying the positions and shapes of mirrors in real time to assess the visibility features of your prototype.
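
The geometric core of such a glare study is the law of specular reflection. The sketch below reflects the ray from a bright dashboard point off the windshield normal and measures how closely the reflected ray points at the driver's eye; a small angle means a visible, potentially disturbing reflection. The cockpit geometry values are assumed for illustration and are not taken from the tool.

```python
# Minimal sketch of the specular-reflection check behind a windshield glare
# study. All positions and the windshield normal are illustrative assumptions.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def reflect(d, n):
    """Mirror direction d about unit surface normal n (law of reflection)."""
    return d - 2.0 * np.dot(d, n) * n

dashboard_point = np.array([0.40, 0.90, 0.95])   # bright trim location (m)
windshield_point = np.array([0.40, 1.10, 1.20])  # where its image appears (m)
windshield_normal = normalize(np.array([0.0, -0.45, 0.89]))  # raked glass
eye_point = np.array([0.37, 0.70, 1.25])         # driver eye position (m)

incident = normalize(windshield_point - dashboard_point)
reflected = reflect(incident, windshield_normal)
to_eye = normalize(eye_point - windshield_point)

# Angle between the reflected ray and the eye direction: small angle = glare.
angle_deg = np.degrees(np.arccos(np.clip(np.dot(reflected, to_eye), -1.0, 1.0)))
print(f"reflected ray misses the eye by {angle_deg:.1f} degrees")
```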

Test and validate HUD systems

Test and validate head-up display (HUD) systems, specifying and improving both the optical performance and the quality of the displayed content. Observe the pilot’s responses to new intelligent or safety systems and easily evaluate the relevance of the displayed information, in real time, for a safer journey. This can be done, for example, by reacting to sensor data in an augmented reality head-up display.
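
As a simplified illustration of reacting to sensor data in an augmented-reality HUD, the sketch below places a marker over a sensor-detected object by projecting it through the driver's eye point onto the HUD virtual image plane. The eye position, object position and image-plane distance are assumptions chosen for the example, not product parameters.

```python
# Minimal sketch: position an AR HUD marker over a sensor-detected object by
# projecting it onto the HUD virtual image plane. All geometry is assumed.
import numpy as np

eye = np.array([0.0, 1.2, 0.0])            # driver eye, vehicle frame (m)
obstacle = np.array([1.5, 0.8, 30.0])      # sensor-detected object ahead (m)
hud_image_distance = 2.5                   # virtual image plane ahead of the eye (m)

# Ray from the eye to the object, intersected with the plane z = eye_z + distance.
direction = obstacle - eye
t = hud_image_distance / direction[2]
marker = eye + t * direction

print(f"draw HUD marker at x={marker[0]:.2f} m, y={marker[1]:.2f} m "
      f"on the virtual image plane")
```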
