Visual simulations and virtual reality (VR) enable engineers to see through the eyes of the end-user. This is great news for marketers looking to improve the visual appeal of a product. More importantly, the engineering applications of these tools can improve automotive safety.
What we see depends on many variables, including:
- Ambient light
- Time of day
The human observer may also have visual conditions that affect their ability to perceive the world. In short, different people see different details in the same environment.
Instead of relying on their own vision, engineers can use visual simulations to assess how different people would perceive a car's control systems and surroundings.
Think of it: You don't want a warning signal on the car's GPS to be invisible to people living with a form of color-blindness. Visual simulations help engineers see through the eyes of various individuals firsthand, so they can experience how the HMI fails a portion of the population.
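SPEOS models this physiologically, but the core idea can be sketched simply: a viewer's color vision deficiency is often approximated as a linear transform on RGB values. The sketch below is illustrative only, in the spirit of published deuteranopia (red-green color-blindness) simulation matrices; the matrix values and function names are assumptions for demonstration, not part of any ANSYS API.

```python
import numpy as np

# Illustrative 3x3 matrix approximating deuteranopia (after Machado et al., 2009).
# Values are an approximation for demonstration, not calibrated clinical data.
DEUTERANOPIA = np.array([
    [ 0.367322, 0.860646, -0.227968],
    [ 0.280085, 0.672501,  0.047413],
    [-0.011820, 0.042940,  0.968881],
])

def simulate_deuteranopia(rgb):
    """Map a linear-RGB color (components in 0..1) to its deuteranopic appearance."""
    return np.clip(DEUTERANOPIA @ np.asarray(rgb, dtype=float), 0.0, 1.0)

# A red warning lamp and a green status lamp that are easy to tell apart...
red = np.array([1.0, 0.0, 0.0])
green = np.array([0.0, 1.0, 0.0])

# ...become far closer in color for a deuteranopic viewer:
print(np.linalg.norm(red - green))                # ~1.414 apart as designed
print(np.linalg.norm(simulate_deuteranopia(red)
                     - simulate_deuteranopia(green)))  # much smaller separation
```

Running a design's warning colors through a transform like this flags exactly the "invisible signal" failure mode described above before any hardware is built.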
Engineers can even create virtual reality representations of the car's HMI system. These VR simulations enable the engineer to experience their designs in an immersive environment, improving their ability to optimize the HMI design so that it is safe for all potential users.
How Visual Simulations Improve Driving Vision and Automotive Safety
ANSYS SPEOS can be used to model the color-sensitive cones and light-sensitive rods of the human eye. The software transforms the light information that would reach the eye into a virtual representation. This visual simulation can then be shared with the engineering team to optimize the optical safety of designs.
For instance, engineers could simulate a car's dashboard and compare how light from external headlights, readouts, windshield reflections, mirrors and the radio can all affect the driver's vision. These simulations can be tweaked to assess how the driver's age or visual impairments could affect the results.
With age, people tend to become more sensitive to glare. They also tend to perceive certain objects with a yellowish tint. Age often stiffens the eye's lens, causing close images and writing to appear blurry. SPEOS can simulate all of these effects within its human eye model. The model can also be tweaked to test color-blindness and other visual impairments.
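Two of the aging effects above — lens yellowing and glare sensitivity — can be mimicked crudely even outside a full physiological model: yellowing attenuates the blue channel, and intraocular light scatter adds a uniform "veiling luminance" that washes out contrast. The sketch below is a hypothetical illustration; every constant is invented for demonstration and unrelated to SPEOS internals.

```python
import numpy as np

def age_vision_filter(image, age):
    """Crude, illustrative approximation of two age-related vision effects
    on an RGB image (float array, components 0..1). Constants are invented."""
    img = np.asarray(image, dtype=float).copy()
    # Lens yellowing: the aging lens absorbs short (blue) wavelengths,
    # so attenuate the blue channel progressively with age.
    blue_loss = min(0.5, max(0.0, (age - 20) * 0.01))
    img[..., 2] *= 1.0 - blue_loss
    # Glare sensitivity: scattered light inside the eye adds a uniform
    # veiling luminance, blending every pixel toward the scene mean.
    veil = min(0.3, max(0.0, (age - 20) * 0.005))
    img = img * (1.0 - veil) + veil * img.mean()
    return np.clip(img, 0.0, 1.0)

# Stand-in for a rendered dashboard frame:
dashboard = np.random.default_rng(0).random((4, 4, 3))
young = age_vision_filter(dashboard, 25)
old = age_vision_filter(dashboard, 75)
# Contrast (spread of pixel values) drops for the older viewer:
print(young.std(), old.std())
```

Comparing the filtered frames side by side shows why a readout that is crisp for a 25-year-old tester can be washed out for a 75-year-old driver.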
Based on this information, engineers can improve driver safety by iterating the color, shape, brightness, glare or (when applicable) fonts of the:
- Dashboard lights
- Radio lights
Virtual Reality Helps Engineers Experience Their HMI Designs
When engineers design an HMI, whether for an aircraft or a car, it's important to ensure the information will be perceived and understood by the end-user.
As a result, engineers need to fully experience their HMI designs. This experience helps them test and validate the human factors of the HMI to ensure it is optimized for every potential user.
To do this, engineers can use ANSYS VRXPERIENCE’s HMI VR testing capabilities. These VR simulations can provide engineers with haptic feedback, visual simulations and eye/finger tracking data. This information can then be used to optimize the car’s HMI system to reduce eye strain, road distractions and more.
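The eye-tracking data from such a VR session can feed simple attention metrics. As a hypothetical example (the sample format, region names and threshold are assumptions for illustration, not a VRXPERIENCE data format), total eyes-off-road time can be accumulated from timestamped gaze labels; driver-distraction guidelines such as NHTSA's treat glances of roughly 2 seconds or more away from the road as a concern.

```python
# Hypothetical gaze log: (timestamp_seconds, region_looked_at) pairs
# as they might be exported from an eye-tracked VR driving session.
samples = [
    (0.0, "road"), (1.0, "road"), (2.0, "radio"), (3.0, "radio"),
    (4.0, "road"), (5.0, "dashboard"), (6.0, "road"),
]

def eyes_off_road_time(samples, off_regions=("radio", "dashboard")):
    """Sum the duration of intervals whose starting gaze is off the road."""
    total = 0.0
    for (t0, region), (t1, _) in zip(samples, samples[1:]):
        if region in off_regions:
            total += t1 - t0
    return total

print(eyes_off_road_time(samples))  # 3.0 seconds off-road in this log
```

A metric like this, computed per HMI variant, gives engineers a concrete number for comparing how much road distraction each design causes.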
VRXPERIENCE also integrates with ANSYS SCADE to bring the HMI software-in-the-loop within the VR environment. Engineers can then use this environment to run, test and interact with the embedded software.
To learn how ANSYS SPEOS and ANSYS VRXPERIENCE can optimize HMI systems, read the white paper: Making the Cockpit of the Future a Reality via Optimized Human-Machine Interfaces.
To learn more about ANSYS VRXPERIENCE’s HMI capabilities, watch the webinar: ANSYS 2019 R3: VRXPERIENCE HMI Update.