VRXPERIENCE Capabilities

Driving Simulator

VRXPERIENCE offers an immersive driving simulation environment with SCANeR, including scenarios, traffic and vehicle dynamics at run time. You can also create custom virtual road environments and testing scenarios. VRXPERIENCE also interfaces with other vehicle dynamics solvers (CarSim) and with complete driver hardware simulators (SensoDrive) for the most immersive driving experience, and the platform can serve as a development showcase for your customers.
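
As a rough illustration of how such a co-simulation step fits together, here is a minimal, hypothetical Python sketch: a scenario engine supplies driver inputs, a vehicle dynamics model advances the vehicle state each fixed time step, and the visual simulation would then be refreshed. All names and the toy dynamics are invented for this sketch and do not correspond to the VRXPERIENCE, SCANeR or CarSim APIs.

    # Hypothetical fixed-step co-simulation loop: a scenario engine supplies
    # driver inputs, a toy vehicle dynamics model returns the updated vehicle
    # state, and the visual simulation would be refreshed each step. Names and
    # values are illustrative only, not the VRXPERIENCE/SCANeR/CarSim APIs.
    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        position: float = 0.0   # distance travelled along the road, m
        speed: float = 0.0      # m/s

    def driver_inputs(t: float) -> dict:
        """Toy stand-in for the scenario/traffic engine: accelerate, then coast."""
        return {"throttle": 0.8 if t < 5.0 else 0.0, "brake": 0.0}

    def vehicle_dynamics(state: VehicleState, inputs: dict, dt: float) -> VehicleState:
        """Toy stand-in for a dynamics solver such as CarSim."""
        accel = 3.0 * inputs["throttle"] - 8.0 * inputs["brake"] - 0.05 * state.speed
        speed = max(0.0, state.speed + accel * dt)
        return VehicleState(position=state.position + speed * dt, speed=speed)

    def run(duration: float = 10.0, dt: float = 0.01) -> VehicleState:
        state, t = VehicleState(), 0.0
        while t < duration:
            state = vehicle_dynamics(state, driver_inputs(t), dt)
            t += dt
            # Here the rendering engine and motion platform would be updated.
        return state

    print(run())   # final vehicle state after a 10 s run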

VIDEO: Co-simulation VRX & SCANeR

VRX driving simulator

Headlamp

A fully virtual driving lab for testing lighting systems in a controlled environment

Reduce night road tests and virtually assess headlamp performance, with real-time comparison and a connection to logical simulation. Report clearly on night-driving simulations, and virtually analyze the efficiency of your headlamps compared with your previous designs, or even with your competitors' headlamps, simply by switching configurations. Optimize the lamp's light distribution design, develop intelligent headlamps and replace physical test drives. In this way you can validate the control law for higher quality and strongly reduce the risk of finding problems too late in the development process. Easily assess your matrix beam and ADB systems, working with hundreds of ready-to-use light sources from leading suppliers.

  • Real-time LED matrix beam (more than 100 light sources), with up to 500 pixels per side.
  • Real-time dynamic lighting strategies (AFS/ADB); see the sketch below.
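
To make the matrix-beam and ADB strategy more concrete, here is a minimal, purely illustrative Python sketch of a glare-free high-beam rule: pixels of the LED matrix whose angular segment contains a detected vehicle are switched off, while the rest stay lit. The function name, pixel count, field of view and safety margin are all assumptions for this sketch and do not represent the VRXPERIENCE interface.

    # Hypothetical ADB (glare-free high beam) rule: switch off the matrix pixels
    # whose angular segment contains a detected vehicle. The pixel count, field
    # of view and margin are illustrative assumptions, not VRXPERIENCE parameters.
    def adb_pixel_intensities(vehicle_angles_deg, n_pixels=500,
                              fov_deg=(-20.0, 20.0), margin_deg=1.0):
        """Return one intensity (0.0 to 1.0) per matrix pixel, left to right."""
        lo, hi = fov_deg
        width = (hi - lo) / n_pixels
        intensities = []
        for i in range(n_pixels):
            center = lo + (i + 0.5) * width
            # Dim the pixel if any detected vehicle sits inside its segment (+ margin).
            blocked = any(abs(center - angle) <= width / 2 + margin_deg
                          for angle in vehicle_angles_deg)
            intensities.append(0.0 if blocked else 1.0)
        return intensities

    # Example: an oncoming car at -4 deg and a lead car at +2 deg carve two dark
    # corridors out of the beam while the rest of the road stays fully lit.
    beam = adb_pixel_intensities([-4.0, 2.0])
    print(sum(1 for level in beam if level == 0.0), "of", len(beam), "pixels dimmed")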

VIDEO: Night Driving Simulator

VRX headlamp

HMI

With VRXPERIENCE, test and validate the full cockpit design for HMIs, including virtual displays and actuators, through visual simulation, eye and finger tracking, and haptic feedback. VRXPERIENCE provides a full HMI evaluation for next-generation vehicles using virtual reality. This reduces the time and cost of design, since the evaluation is performed mostly on virtual prototypes, reducing the number of expensive physical mock-ups needed to create the product. VRXPERIENCE offers collaborative driving scenarios based on virtual HMIs, taking human-factors analyses and cognitive workload into account. The test driver can interact directly with the virtual interfaces, from touchscreens to switches, thanks to a fine finger-tracking system. As the system records the driver's behavior and displays driving and infotainment information, it identifies and interprets the driver's actions and automatically triggers the appropriate HMI reaction. You can therefore easily evaluate the relevance of the displayed information, in real time, for a safer drive.
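
As an illustration of the interaction loop described above (a tracked fingertip is matched to a virtual HMI element, which triggers its reaction and logs the driver's action), here is a small hypothetical Python sketch; the element layout, names and callbacks are invented and are not the VRXPERIENCE API.

    # Hypothetical sketch of mapping a tracked fingertip to a virtual HMI element,
    # triggering its reaction and logging the driver's action for later
    # human-factors analysis. Layout, names and callbacks are illustrative only.
    from dataclasses import dataclass, field
    from typing import Callable, List, Tuple

    @dataclass
    class HmiElement:
        name: str
        x: float          # screen-space rectangle of the virtual control
        y: float
        w: float
        h: float
        on_touch: Callable[[], str]

        def contains(self, px: float, py: float) -> bool:
            return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    @dataclass
    class VirtualCockpit:
        elements: List[HmiElement]
        log: List[Tuple[float, str, str]] = field(default_factory=list)

        def handle_fingertip(self, px: float, py: float, t: float) -> None:
            for element in self.elements:
                if element.contains(px, py):
                    reaction = element.on_touch()                 # trigger the HMI reaction
                    self.log.append((t, element.name, reaction))  # record the driver's action
                    return

    cockpit = VirtualCockpit(elements=[
        HmiElement("volume_up", 0.70, 0.10, 0.05, 0.05, lambda: "volume +1"),
        HmiElement("nav_zoom",  0.40, 0.30, 0.10, 0.10, lambda: "map zoomed in"),
    ])
    cockpit.handle_fingertip(0.72, 0.12, t=12.4)   # tracked touch lands on the volume key
    print(cockpit.log)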

Optical HMI

Sensor

Take automated driving development off the road: assemble, test and experience optical sensors in a virtual driving environment. VRXPERIENCE readily integrates the simulation of ground-truth sensors as well as camera and lidar sensor models. Powerful graphical visualization capabilities enable you to assess your complex ADAS systems and autonomous vehicles virtually, by connecting optical and functional simulation in a single driving simulator.

Use the ground truth sensor (GTS) simulation for all kinds of sensors:

  • Camera
  • Lidar
  • Ultrasonic
  • Radar

Benefit from powerful ray-tracing capabilities to recreate sensor behavior, and easily retrieve sensor results through a dedicated interface. This solution provides a unique way to collect virtual sensor information during driving and use the information to develop autopilot code.
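
To give a flavor of how collected ground-truth sensor output could feed prototype autopilot code, the following hypothetical Python sketch assumes each simulation frame delivers detections with a range and closing speed, and applies a simple time-to-collision braking rule; the record fields and the threshold are assumptions, not the VRXPERIENCE GTS interface.

    # Hypothetical sketch of consuming ground-truth sensor (GTS) detections in
    # prototype autopilot code: a simple time-to-collision braking rule. The
    # detection fields and the threshold are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        object_id: int
        range_m: float             # distance to the object, m
        closing_speed_mps: float   # positive when the gap is shrinking

    def should_brake(detections, ttc_threshold_s: float = 2.0) -> bool:
        """Brake if any object's time-to-collision falls below the threshold."""
        for d in detections:
            if d.closing_speed_mps > 0.0 and d.range_m / d.closing_speed_mps < ttc_threshold_s:
                return True
        return False

    # Example frame: a lead vehicle 18 m ahead closing at 10 m/s gives a 1.8 s TTC.
    frame = [Detection(1, 18.0, 10.0), Detection(2, 60.0, -2.0)]
    print("brake" if should_brake(frame) else "keep cruising")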

VIDEO: ADAS Simulation

Optical sensor

Sound Dimension

Push the driving simulation experience further by adding the vehicle's sound simulation to the virtual driving experience. VRXPERIENCE enables the creation of realistic sound sources and immersive 3D soundscapes, to achieve the desired engine sound in the car. Record and analyze the baseline sound (the original sound of the vehicle during a run-up). Generate and control sounds for the driving simulator easily, and perform perception studies and 3D sound playback from binaural recordings. Design and test any sound alarm to alert the driver. Intuitively create and validate engine sound enhancement, taking into account the frequency response of the car cabin and the baseline sound. Start, stop and change the sound, comparing different sounds to select the best options.
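
As a purely illustrative example of engine sound enhancement, the sketch below synthesizes a few engine orders from the RPM, weights each by a notional cabin frequency response and mixes them with a baseline signal; every function, parameter and value is an assumption for this sketch and none of it reflects the VRXPERIENCE sound engine.

    # Hypothetical engine sound enhancement: synthesize a few engine orders from
    # the RPM, weight each by a notional cabin frequency response and mix them
    # with the baseline sound. Every value here is an illustrative assumption.
    import math

    def cabin_response(freq_hz: float) -> float:
        """Toy cabin transfer function: progressively attenuate higher frequencies."""
        return 1.0 / (1.0 + (freq_hz / 300.0) ** 2)

    def enhanced_sample(t: float, rpm: float, baseline: float,
                        orders=(2, 4, 6), gain: float = 0.2) -> float:
        """One output sample = baseline + weighted sum of synthesized engine orders."""
        f0 = rpm / 60.0    # rotation-related fundamental frequency, Hz
        enhancement = sum(cabin_response(order * f0) *
                          math.sin(2 * math.pi * order * f0 * t)
                          for order in orders)
        return baseline + gain * enhancement

    # Example: render 1 s of enhancement at 48 kHz over a silent baseline for an
    # engine at 3000 rpm (order 2 -> 100 Hz, order 4 -> 200 Hz, order 6 -> 300 Hz).
    sr = 48_000
    signal = [enhanced_sample(n / sr, rpm=3000.0, baseline=0.0) for n in range(sr)]
    print(f"{len(signal)} samples, peak level {max(abs(s) for s in signal):.3f}")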

Sound dimension

Perceived Quality

VRXPERIENCE lets you visualize, at full scale and in virtual reality, the impact of assembly and shape deviations on the perceived quality of your product, taking manufacturing variations into account. Accurately see and present the influence of tolerances on perceived quality, based on design and manufacturing data including the material, fastening scheme and tolerances. Simulate complex deformation effects such as arching, bending and distortion, and easily identify the root cause of problem areas. Change all parameters freely, so that every possible solution can be tested to achieve the highest possible quality in the final product. From the early concept stage, continuously update and improve the model to accommodate styling changes and development progress. Concentrate build tolerances on the areas visible to the driver and passengers, and compensate in areas that are difficult to see or not seen at all. In short, improve the overall visual appearance, or perceived quality, of the final vehicle interior.
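
To give a flavor of how tolerance effects on a visible gap might be explored numerically, the following hypothetical Python sketch runs a toy Monte Carlo on a single gap dimension and reports how often it exceeds an appearance limit; the nominal value, tolerances and limit are invented for illustration and do not represent VRXPERIENCE's perceived-quality analysis.

    # Hypothetical Monte Carlo sketch of tolerance stack-up on a visible gap:
    # sample part and assembly deviations, compute the resulting gap and report
    # how often it exceeds the appearance limit. Nominal value, tolerances and
    # limit are illustrative assumptions.
    import random

    def simulate_gap(nominal_gap=4.0, part_tol=0.3, fixture_tol=0.2):
        """One simulated build: nominal gap plus normally distributed deviations (mm)."""
        return (nominal_gap
                + random.gauss(0.0, part_tol / 3.0)      # part dimension deviation
                + random.gauss(0.0, fixture_tol / 3.0))  # assembly/fixture deviation

    def appearance_failure_rate(n_builds=100_000, limit=4.2):
        """Fraction of simulated builds whose visible gap exceeds the appearance limit."""
        random.seed(0)
        failures = sum(1 for _ in range(n_builds) if simulate_gap() > limit)
        return failures / n_builds

    print(f"{appearance_failure_rate():.2%} of simulated builds exceed the visible-gap limit")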


Perceived quality