ANSYS ADVANTAGE MAGAZINE

DATE: 2020

Engineer Perception, Prediction and Planning into ADAS

By Gilles Gallee, Autonomous Vehicle Business Developer, Ansys, La Farlede, France

 


Advanced driver assistance systems (ADAS) — such as forward collision warning (FCW), automatic emergency braking (AEB), lane departure warning (LDW), lane keeping assistance (LKA) and blind spot monitoring (BSM) systems — are estimated to have the potential to prevent more than a third of all passenger-vehicle crashes.[1]

According to a AAA Foundation for Traffic Safety report,[2] such a reduction would in turn prevent 37% of injuries and 29% of deaths in crashes that involve passenger vehicles. To fulfill the potential of ADAS, as well as the even greater potential safety and convenience benefits associated with fully autonomous driving, simulation is needed to ensure that vehicles can perceive the world around them, predict what might happen next and plan accordingly.

There are a lot of literal and figurative miles between the current state of ADAS and Level 5 fully autonomous vehicles. Simulation is critical to getting there.

An oft-cited report from Rand Corp. makes the case that autonomous vehicles would have to be driven hundreds of millions of miles, and sometimes hundreds of billions of miles, to demonstrate their reliability in terms of fatalities and injuries. For example, according to Rand,[3] proving that fully autonomous vehicles get into fewer serious crashes than human drivers would require a fleet of 100 autonomous cars traveling nonstop at 25 mph to cover 125 million miles — the equivalent of six years of driving. To provide the same evidence for fatal crashes, that same fleet would have to travel 8.8 billion miles, which would take about 400 years.

Currently, most ADAS functions fall in the Level 1 or 2 range. Even achieving Level 3 autonomy, in which a vehicle can take full control when certain operating conditions are met, is a challenging technological leap. It requires a combination of physical testing and simulation that includes hardware, software and humans in the loop. Each aspect of the autonomous vehicle technology stack is critical and requires people with different skills and knowledge to be involved.
 

Levels of Autonomy


SAE International (formerly the Society of Automotive Engineers) first published its J3016 autonomy-level guidelines in 2014. They have since been adopted by both the U.S. Department of Transportation and the United Nations.
 

Solve the Perception and Planning Problems With Sensors and AI

Sensors are the eyes and ears of advanced driver assistance systems. Like our own senses, they can be confused and overwhelmed by weather and complicated driving conditions. Automotive suppliers have been challenged to develop sensors and sensing systems that function at a higher level, so that they perform well not only on sunny days in light freeway traffic, but also in blizzards, on busy city streets and under a multitude of “edge cases” — unusual scenarios that don’t happen often but frequently lead to accidents. A dog chasing the car in front of you, construction workers rerouting traffic or a flash flood making a roadway impassable are just a few of many edge cases.

Ansys SCADE Vision powered by Hologram helps identify edge cases and pinpoint the weaknesses of an AI perception system. Armed with edge case information, SCADE Vision can then trigger further AI training actions and new testing scenario conditions. Read "Autonomous Safety in Sight" for more information.

Software developers are interested in generating synthetic data from simulation to more quickly train AI across various operational design domains (ODDs), the term used to describe subsets of driving conditions with particular environmental, geographical, time-of-day, traffic and/or roadway characteristics. Defining and identifying ODDs are challenges for developers because they affect testing, compliance and real-world Level 3 autonomous driving. For a car to take over from a human driver under certain conditions, the sensors must perceive those conditions and the software must interpret those perceptions to determine whether ODD requirements have been met. Simulation helps developers explore those ODD edge cases.
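To make the idea concrete, here is a minimal sketch of how an ODD might be captured as a machine-checkable structure. The class, field names and thresholds are invented for illustration and are not part of any Ansys product API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ODD:
    """Illustrative operational design domain: the subset of
    conditions under which a Level 3 function may take control."""
    road_types: tuple                # e.g., ("divided_highway",)
    max_speed_kph: float             # upper bound on operating speed
    weather: tuple                   # weather states the function tolerates
    lighting: tuple                  # e.g., ("day", "dusk")
    min_lane_marking_quality: float  # 0..1 confidence from perception

def in_odd(odd: ODD, state: dict) -> bool:
    """Return True only if every perceived condition falls inside the ODD;
    otherwise the function must hand control back to the driver."""
    return (
        state["road_type"] in odd.road_types
        and state["speed_kph"] <= odd.max_speed_kph
        and state["weather"] in odd.weather
        and state["lighting"] in odd.lighting
        and state["lane_marking_quality"] >= odd.min_lane_marking_quality
    )

# Example: a highway traffic-jam pilot that only engages in daylight
hw_pilot = ODD(("divided_highway",), 60.0, ("clear", "light_rain"), ("day",), 0.8)
print(in_odd(hw_pilot, {"road_type": "divided_highway", "speed_kph": 45.0,
                        "weather": "clear", "lighting": "day",
                        "lane_marking_quality": 0.9}))  # True
```

Encoding the ODD this way also makes the handover requirement testable: every simulated scenario can be checked against the same predicate the in-vehicle software would use.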

Original equipment manufacturers (OEMs) rely on their supplier tiers to provide sensor sets. However, OEMs are ultimately responsible for the safety of the cars they produce, so they want to be sure suppliers have fully vetted those technologies. Suppliers are using simulation at the component and packaging levels to better understand the strengths and weaknesses of various sensing technologies, such as Ansys Speos for lidar and cameras and Ansys HFSS for radar (see "Take Simulation Underground" and "A New Kind of Eyes on the Road"). The goals are to improve individual sensing technologies and ensure the various technologies can be used together to help create a robust sensor array that can handle whatever edge cases come up.

The Ansys VRXPERIENCE Driving Simulator powered by SCANeR forms the basis of an ADAS development cycle. It provides ADAS development teams with the capability to recreate driving scenarios and enables testing against a variety of objectives and performance requirements. By replicating roads generated from high-definition maps and asset libraries, traffic situations, weather conditions, vehicle dynamics and more, ADAS development teams can validate sensor and AI modules, sensor systems and vehicle models, as well as human–machine interfaces (HMIs).
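As a simple illustration of how such scenario variations can be enumerated for batch simulation, the sketch below builds a parameter grid in plain Python. The parameter names and values are invented; a real campaign would feed them through the simulator's own scenario tooling.

```python
# Enumerate scenario variants as the Cartesian product of a few parameters.
from itertools import product

weathers = ["clear", "rain", "fog"]
traffic = ["light", "dense", "jam"]
approach_speeds_kph = [50, 90, 130]

scenarios = [
    {"weather": w, "traffic": tr, "approach_speed_kph": v}
    for w, tr, v in product(weathers, traffic, approach_speeds_kph)
]
print(len(scenarios), "scenario variants to batch-simulate")  # 27 combinations
```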
 

Simulate ADAS Functions

ADAS functions are driven by software development. Custom vehicle models can be connected to Ansys VRXPERIENCE through the Functional Mock-up Interface (FMI), C/C++, Ansys Twin Builder or MathWorks Simulink. Engineers can put vehicles in an environment with certain conditions — for example, on a highway arriving at a traffic jam at a certain speed — and quickly modify them for the scenarios and validation they’d like to perform. From there, they can simulate the scenario with different levels and types of sensors to assess sensor perception, sensor fusion and systems operations.
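VRXPERIENCE manages this model exchange internally, but as a rough illustration of what an FMI connection involves, the sketch below co-simulates a standalone FMU with the open-source FMPy package. The file vehicle.fmu and its variable names (v0, brake_cmd, speed, stop_distance) are hypothetical stand-ins for an exported vehicle-dynamics model.

```python
# Minimal FMI co-simulation sketch using FMPy (pip install fmpy).
import numpy as np
from fmpy import simulate_fmu

# Drive the model with a braking command that ramps in at t = 2 s.
stimulus = np.array(
    [(0.0, 0.0), (2.0, 0.0), (2.1, 0.8), (6.0, 0.8)],
    dtype=[("time", np.float64), ("brake_cmd", np.float64)],
)

result = simulate_fmu(
    "vehicle.fmu",                      # hypothetical FMU exported from a modeling tool
    start_time=0.0,
    stop_time=6.0,
    start_values={"v0": 25.0},          # assumed model parameter: initial speed, m/s
    input=stimulus,
    output=["speed", "stop_distance"],  # assumed model outputs
)
print(result["speed"][-1], result["stop_distance"][-1])
```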

VRXPERIENCE can speed up edge case exploration and sensor simulation. Take, for example, headlamp development. Many missed-detection edge cases occur at night, so Ansys VRXPERIENCE has specific modules to simulate the physics of light. Intelligent lighting that automates when high beams turn on and off, or that adjusts automatically to minimize glare, may seem like a simple convenience, but lighting is an important piece of ADAS because the car’s camera sensors react to it. Cameras that identify signs, road lanes and oncoming vehicles are sensitive to headlamp design changes, for example. VRXPERIENCE reduces the time and cost of development by enabling a repeatable process for modified sensor inputs, such as lighting changes.
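To hint at what such a lighting function decides, here is a deliberately simplified high-beam arbitration sketch. The signal names and thresholds are invented for illustration; a production system would add hysteresis and camera-based classification of light sources.

```python
def high_beam_command(oncoming_headlamp_lux: float,
                      leading_taillamp_lux: float,
                      ambient_lux: float,
                      ego_speed_kph: float) -> bool:
    """Return True when high beams may stay on (all thresholds illustrative)."""
    dark_enough = ambient_lux < 1.0          # roughly full night
    fast_enough = ego_speed_kph > 40.0       # avoid toggling in town
    nobody_ahead = (oncoming_headlamp_lux < 0.05
                    and leading_taillamp_lux < 0.05)
    return dark_enough and fast_enough and nobody_ahead

print(high_beam_command(0.0, 0.0, 0.2, 90.0))   # True: empty dark highway
print(high_beam_command(0.5, 0.0, 0.2, 90.0))   # False: oncoming traffic
```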
 

Automated Prevention

 

AAA Foundation research evaluated the potential of popular advanced driver assistance technologies to reduce or prevent crashes. The findings, based on U.S. data, show that if installed on all vehicles, ADAS technologies could prevent more than 2.7 million crashes, 1.1 million injuries and nearly 9,500 deaths each year.

Another example is an emergency braking function that is part of an advanced driver assistance system. To develop it, the function is first described as a model, often in MathWorks Simulink or Ansys SCADE Suite. It is tested against its objectives and then refined into a more detailed design model. The coded emergency braking function can then be tested against scenarios in software-in-the-loop (SIL) and hardware-in-the-loop (HIL) configurations.
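As a rough illustration of the kind of logic such a model encapsulates, here is a minimal time-to-collision (TTC) decision function, written in Python rather than as a Simulink or SCADE model. The thresholds are invented for the sketch, not calibrated values.

```python
def aeb_command(range_m: float, closing_speed_mps: float,
                ttc_warn_s: float = 2.6, ttc_brake_s: float = 1.4) -> str:
    """Decide the AEB action from time-to-collision (thresholds illustrative)."""
    if closing_speed_mps <= 0.0:
        return "none"                       # range is opening: no threat
    ttc = range_m / closing_speed_mps       # seconds until impact at current rates
    if ttc < ttc_brake_s:
        return "full_brake"
    if ttc < ttc_warn_s:
        return "warn"
    return "none"

# SIL-style check against a tiny scenario matrix: range (m), closing speed (m/s)
for rng, vrel in [(60.0, 10.0), (20.0, 10.0), (10.0, 10.0)]:
    print(rng, vrel, aeb_command(rng, vrel))  # none, warn, full_brake
```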


The Ansys VRXPERIENCE Driving Simulator powered by SCANeR supports the entire continuum of the ADAS development cycle.
 

With Ansys VRXPERIENCE Driving Simulator powered by SCANeR, customers have a seamless process to test the model, connect it with SCANeR and then keep the same vehicle test environment to connect software and hardware as they simulate different ODD edge cases. The streamlined workflow saves time and makes it easier for geographically dispersed teams and experts from different disciplines to collaborate.
 

Planning for Humans in the Loop

In addition to predicting what other motorists, cyclists and pedestrians will do, ADAS also needs to account for how people will behave inside their cars. According to University of Iowa research,[4] people’s behavior can change based on ADAS features. About 25% of the drivers surveyed who used blind spot monitoring or rear cross traffic alert systems reported feeling comfortable relying solely on the systems and not performing visual checks or looking over their shoulder for oncoming traffic or pedestrians. About 25% of vehicle owners using forward collision warning or lane departure warning systems also reported feeling comfortable engaging in other tasks while driving.[5]

Early adopters of ADAS technology demonstrated that false positives or annoying alert sounds can cause drivers to ignore or disable safety features. As advanced driver assistance systems continue to take on a more prominent safety role, human–machine interaction becomes increasingly important. Here again, simulation can help automakers and suppliers implement ADAS by planning for how safety features will be used or misused.

Ansys VRXPERIENCE Driving Simulator powered by SCANeR integrates with driver hardware simulator interfaces to create an immersive driving experience with virtual reality. The Ansys VRXPERIENCE HMI module can be used to test and validate the full cockpit design for HMIs, including virtual displays and actuators, through visual simulation, eye and finger tracking, and haptic feedback. The virtual test driver can interact directly with the virtual interfaces, from touchscreens to switches, thanks to a fine-grained finger-tracking system. As the system records the driver’s behavior and displays driving and infotainment information, it identifies and interprets the driver’s actions and automatically triggers the appropriate HMI reaction.
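A toy sketch of that record-interpret-react loop is shown below. The event names and reactions are invented for illustration; the real module works directly from eye- and finger-tracking streams inside the simulator.

```python
# Map interpreted driver actions to HMI reactions (all names hypothetical).
REACTIONS = {
    "touch:nav_screen": lambda: "enlarge map tile under finger",
    "gaze:cluster>2s":  lambda: "surface speed and ADAS status in cluster",
    "flick:high_beam":  lambda: "animate high-beam indicator",
}

def on_driver_event(event: str) -> str:
    """Interpret a tracked driver action and trigger the adapted HMI reaction."""
    handler = REACTIONS.get(event)
    return handler() if handler else "log event for later analysis"

print(on_driver_event("gaze:cluster>2s"))
print(on_driver_event("touch:unknown_widget"))
```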


Ansys VRXPERIENCE Driving Simulator powered by SCANeR automates the creation of scenario variations for massive simulation campaigns.

Advanced driver assistance systems have the potential to prevent more than a third of all passenger-vehicle crashes. Such a reduction would prevent 37% of injuries and 29% of deaths in crashes that involve passenger vehicles.

ADAS developers can easily evaluate the relevance of the displayed information, in real time, for a safer drive. Ansys VRXPERIENCE reduces the time and cost of design because the evaluation of the design is mostly performed on virtual prototypes, reducing the number of expensive physical mock-ups necessary to create the product.
 

Create a Distinct Experience

Safety is paramount, but HMIs are also a way for OEMs to differentiate themselves in the market. Much like the sound of an engine or the feel of a car door closing, the way people and automation interact has become a means to build a brand.

Ansys VRXPERIENCE allows OEMs and suppliers to evaluate different driver and passenger experiences as part of the same overall development process. Beyond ADAS, Ansys VRXPERIENCE also allows users to visualize the impact of assembly and shape deviations on the perceived quality of a product, considering manufacturing variations.

Engineers can see and present the influence of design and manufacturing data, such as materials, fasteners and tolerances, on perceived quality. They can simulate complex deformation effects such as arching, bending and distortion to identify the root cause of problem areas.

The Ansys VRXPERIENCE SOUND module provides an intuitive graphic display of sounds and a one-click magnification control feature to help create a sound signature. Users can also set up psychoacoustic tests based on a listener panel to obtain statistics about the real perception of sounds. Sound perception can be evaluated using tools based on time–frequency representations.
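Those time-frequency representations are essentially spectrograms. As a generic illustration (using SciPy, not any Ansys API), the sketch below computes one for a synthetic door-closing thump and reads off its dominant frequency band.

```python
# Spectrogram of a synthetic low-frequency "thump" over a noise floor.
import numpy as np
from scipy import signal

fs = 44100                                            # sample rate, Hz
t = np.arange(0, 0.5, 1 / fs)
thump = np.exp(-t * 18) * np.sin(2 * np.pi * 90 * t)  # decaying 90 Hz burst
rattle = 0.05 * np.random.randn(t.size)               # broadband noise floor

f, seg_t, Sxx = signal.spectrogram(thump + rattle, fs=fs, nperseg=1024)
peak_band = f[Sxx.mean(axis=1).argmax()]              # strongest band on average
print(f"dominant band near {peak_band:.0f} Hz")       # expect the bin nearest 90 Hz
```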
 

Collaborate to Combine Technological and Cultural Shifts

As with any innovation that has the potential to disrupt the status quo, technology is only part of the story with advanced driver assistance systems. How that technology is used ultimately determines its success or failure in the market. All aspects of autonomous systems — the sensors that perceive the environment, the AI that interprets the sensor data and predicts what will happen next, and the planning for how humans will interact with automation — must be considered.

The Ansys VRXPERIENCE Driving Simulator powered by SCANeR brings it all together in a workflow that encourages the collaboration necessary to vet new technology and how drivers and passengers will react to that technology.

The development of fully autonomous vehicles is one of the greatest engineering challenges of our day. It’s a long road ahead, with significant milestones along the way. To meet the challenge, startups and established players are using autonomous vehicle simulation to get from here to there safely and efficiently.
 

SOURCES

  1. "A Long Road to Safety," RAND Review, July–August 2016.
  2. "Potential Reduction in Crashes, Injuries and Deaths from Large-Scale Deployment of Advanced Driver Assistance Systems," A. Benson, B.C. Tefft, A.M. Svancara and W.J. Horrey, AAA Foundation for Traffic Safety, 2018.
  3. "Driving to Safety: How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?" Nidhi Kalra and Susan M. Paddock, Rand Corp., 2016.
  4. "Vehicle Owners' Experiences with and Reactions to Advanced Driver Assistance Systems," A. McDonald, C. Carney and D.V. McGehee, University of Iowa, 2018.
  5. "Understanding the Impact of Technology: Do Advanced Driver Assistance and Semi-Automated Vehicle Systems Lead to Improper Driving Behavior?" N. Dunn, T.A. Dingus and S. Soccolich, AAA Foundation for Traffic Safety, 2019.
