ANSYS 19.2 Release Highlights

How to Train an Autonomous Vehicle’s AI with Virtual Reality

Why are automakers using virtual reality and simulation to train the artificial intelligence (AI) of their autonomous vehicles? It’s because, according to estimates, autonomous vehicles need to drive 8 billion failure-free miles before they will be ready for the consumer market.

Physically driving 8 billion miles would take hundreds of years, far beyond the five- to 10-year target delivery dates of some commercial automakers.
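The "hundreds of years" claim follows from simple arithmetic. The sketch below works it out; the average speed and fleet size are illustrative assumptions, not figures from the article.

```python
# Rough arithmetic behind the "hundreds of years" claim.
# AVG_SPEED_MPH and FLEET_SIZE are illustrative assumptions.
MILES_NEEDED = 8_000_000_000   # failure-free miles cited above
AVG_SPEED_MPH = 40             # assumed mixed city/highway average
FLEET_SIZE = 100               # assumed test fleet driving 24/7

hours = MILES_NEEDED / (AVG_SPEED_MPH * FLEET_SIZE)
years = hours / (24 * 365)
print(f"{years:,.0f} years of round-the-clock driving")  # ≈ 228 years
```

Even with generous assumptions about fleet size, the physical-testing timeline runs to centuries, which is why simulated miles matter.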

No need to put your hands on the wheel when testing out an autonomous car’s artificial intelligence in a virtual reality environment.

"As a result, we are trying to understand how to train a neural network with virtual data," says Nicolas Dalmasso, director of technology at ANSYS. “The goal is to allow customers to use data from a virtual environment instead of real-world ones."

Simulations and digital worlds give automakers a space where their autonomous cars can safely fail, be tested and be iterated on.

These simulations can also run faster than a real-world test, reducing the time needed to rack up those 8 billion miles of driving. After all, a simulated drive from Tucson to Las Vegas doesn’t have to take eight hours. A computer can calculate the results much faster.

How to Test an Autonomous Vehicle’s Artificial Intelligence Using Virtual Reality

Automakers plan a virtual reality route for an autonomous car to try out.

First, simulation technology is used to model the car, its cameras, sensors and driving characteristics.

Much of this can be done using ANSYS VRXPERIENCE and other ANSYS simulation software.

“The software simulates the electronics of the vehicle’s camera and sensors. It also simulates how these sensors convert signals into data,” says Dalmasso. “This means that automakers can simulate the effects of noise and glare that may impact sensors.”
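The idea of injecting noise and glare into a simulated sensor can be sketched in a few lines. This is a minimal illustration of the concept, not the VRXPERIENCE API; the function name and parameters are hypothetical.

```python
import numpy as np

# Minimal sketch of sensor-signal corruption for a 1-D intensity signal.
# noise_std and glare_level are illustrative parameters, not a real API.
def corrupt_signal(signal, noise_std=0.05, glare_level=0.0, rng=None):
    """Add Gaussian sensor noise, then clip saturated (glare) values."""
    rng = rng or np.random.default_rng(0)
    noisy = signal + rng.normal(0.0, noise_std, size=signal.shape)
    # Glare pushes intensities toward saturation; the sensor then clips.
    return np.clip(noisy + glare_level, 0.0, 1.0)

clean = np.linspace(0.0, 1.0, 5)   # idealized camera intensities
print(corrupt_signal(clean, glare_level=0.3))
```

Running the AI stack against corrupted signals like these lets engineers see how perception degrades long before a physical camera is ever mounted.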

The virtual vehicle can then be given life as a virtual autonomous car. To do this, automakers insert the same AI system they are developing for their real-world model.

Automakers then populate a virtual reality to test out the autonomous car. This digital world includes road conditions, weather, time of day, people, buildings, traffic signals and other cars on the road.

Once the digital test is set up, automakers can use it to see how their AI systems handle the virtual road.
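The test process described above, varying weather, time of day and traffic and checking how the AI copes, can be sketched as a scenario sweep. Everything here is hypothetical: the `Scenario` fields mirror the variables listed above, and `run_episode` is a stand-in for the real simulator and AI stack, which the article does not detail.

```python
from dataclasses import dataclass
import random

@dataclass
class Scenario:
    weather: str           # e.g. "clear" or "rain"
    time_of_day: str       # e.g. "day" or "night"
    traffic_density: float # assumed: vehicles per km of road

def run_episode(scenario, seed):
    """Placeholder for simulator + AI: True if the route is completed safely."""
    random.seed(seed)
    # Pretend failures get likelier in rain and dense traffic.
    risk = 0.05
    risk += 0.2 if scenario.weather == "rain" else 0.0
    risk += 0.1 * scenario.traffic_density
    return random.random() > risk

# Sweep every combination of conditions, 100 seeded runs each.
scenarios = [Scenario(w, t, d)
             for w in ("clear", "rain")
             for t in ("day", "night")
             for d in (0.5, 2.0)]

for s in scenarios:
    passes = sum(run_episode(s, seed) for seed in range(100))
    print(s.weather, s.time_of_day, s.traffic_density, f"{passes}/100 safe runs")
```

Because each scenario is just data, thousands of condition combinations can be generated and re-run automatically, which is exactly the iteration speed the next section attributes to virtual prototypes.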

Virtual Prototypes of Autonomous Cars Are Much More Affordable

This setup is faster and more affordable than physically prototyping an autonomous car, for a few reasons.

First, as previously mentioned, these simulations run faster than physically driving a car over the same number of miles. This means you can simulate thousands of miles in the time it takes to build your physical prototype.

Second, changes to the car, road or AI system can be made quickly or even automatically. Traditionally, changing the car or AI in a physical prototype could take hours, days or weeks to set up.

Finally, if there is a catastrophic failure in the digital world, you simply reset the simulation. In the physical world, you’ve lost your expensive prototype or worse.

In summary, simulations will bring autonomous vehicles to the market faster and on a tighter budget than physical prototypes.

To learn more about how simulation and virtual realities can influence autonomous vehicles, read up on the ANSYS VRXPERIENCE.