February 22, 2023
Connecting information via high-speed wireless network infrastructure so data can be analyzed and shared could very well drive the future of autonomous vehicles. Several factors must be considered to reach high levels of autonomy, including advanced sensor technology, precise determination of vehicle location, up-to-date mapping information, local perception of other vehicles and pedestrians, and planning and decision making. To give you an idea of the magnitude of these interactions, in 2019 there were 31 million vehicles operating with some level of automation. By 2025, almost 60% of global new vehicle sales are expected to function at level two autonomy.1
At this level, we’ve made only limited inroads toward full autonomy, a huge computational task that will no doubt involve numerous in-vehicle applications requiring near real-time response. All of this activity will be coordinated by artificial intelligence (AI) and a high level of connectivity, supported by simulation at every turn, from safety validation to real-world sensor and antenna performance verification. But one thing is for sure: Advanced simulation of vehicle connectivity, sensor perception, information sharing, and AI decision-making training will have a huge impact on the ultimate delivery of autonomous vehicles.
Many conversations need to happen between an autonomous vehicle (AV) and other elements within a self-driving ecosystem, enabled by vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), vehicle-to-people (V2P), and vehicle-to-everything (V2X) smart technologies to ensure safety. All these smart technologies rely on consistent low latency connectivity to expand perception beyond what is directly in front of the vehicle. This is especially true in dense 5G-and-beyond telecommunication environments of the future, where a lot of information can be exchanged.
“Fully autonomous environments of the future will all be dictated by a larger communication grid coordinating vehicle movement,” says Christophe Bianchi, Chief Technologist at Ansys. “The vehicle in the city talks to the city, talks to the other vehicles, and all of that involves a mission going from one place to another safely. So, how do you simulate all the events, everything that's happening in the environment, plus all the communication required, including the quality of signals between all these different parameters? That’s a mission for digital mission engineering software.”
Of course, we’re beginning to see some of these smart interactions play out now. Autotalks, a small apparatus that connects to the handlebar of your bicycle to reduce the risk of collision, is just one example of V2X connectivity in action. This system connects to vehicles and related vehicle infrastructure as part of a broader mobility ecosystem, so that all vehicles, including bikes, will be connected and talking to the infrastructure and each other, monitoring traffic and sharing critical information in real time.
Reliable service and coverage are the most important metrics by which the success of Autotalks, and other connected technologies, will be measured. Simulation enables 5G designers to meet these objectives by helping them accurately model the real-world performance of millimeter-wave and beamforming antennas and optimize the power, performance, and cost of mixed-signal systems-on-chip (SoCs) and application processors. It also increases product reliability through chip-package-system, electro-thermal, and thermo-mechanical analysis.
“Ansys brings reliability and performance to the entire 5G ecosystem from devices to networks to data centers,” says Dr. Larry Williams, Distinguished Engineer at Ansys. “Fortune 500 high-tech companies, including the leading 5G players, are using our semiconductor and multiphysics simulation tools and workflows to deliver 5G connectivity on a large scale.”
The adoption of driverless technology will also cause a paradigm shift in the industry, from vehicle ownership to usership. With this realization, original equipment manufacturers (OEMs) and mobility suppliers are looking to create recurring revenue streams through more detailed service models. The only way to do this is to maintain a connection with the user, in this case, the customer/driver. Constant location and schedule sharing of the customer/driver can lead to personalized service recommendations from vendors, consequently unlocking additional revenue streams and opportunities.
So, in addition to being a technology facilitator for autonomous driving safety, connectivity is also a new business model enabler. The massive amount of information coming out of these digital interactions, such as real-time insights into vehicle health or driving preferences, will be monetized by OEMs in the form of subservices or subscriptions. Simulation will be a key enabler in optimizing these new, connected experiences, not only for autonomy, but also for any critical vehicle subsystem like battery management systems.
“In the future, for example, our ability to build hybrid digital twins that combine simulation models and real-time sensor information running in the cloud or on the edge will give OEMs a very accurate picture of what’s happening inside the battery,” says Bianchi. “This enables them to see and understand what’s happening throughout the battery pack and anticipate or detect a failure that can lead to poor charging — all of which can be instantaneously communicated to the user via service alert.”
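At its core, the hybrid digital twin Bianchi describes comes down to a residual check: a physics-based model predicts what the pack should be doing, live telemetry is compared against that prediction, and a sustained mismatch becomes a service alert. The sketch below is purely illustrative; the model, constants, and function names are assumptions for demonstration, not an Ansys API.

```python
# Hypothetical sketch of a hybrid digital-twin check. All names, constants,
# and thresholds are illustrative assumptions, not an Ansys API.

def predicted_temp_c(ambient_c, current_a,
                     internal_resistance_ohm=0.05, thermal_resistance_k_per_w=2.0):
    """Crude steady-state estimate: ambient temperature plus I^2*R
    self-heating scaled by an assumed thermal resistance."""
    heat_w = current_a ** 2 * internal_resistance_ohm
    return ambient_c + heat_w * thermal_resistance_k_per_w

def cells_needing_alert(measured_temps_c, ambient_c, current_a, tolerance_c=5.0):
    """Compare live sensor readings against the model and return the indices
    of cells whose deviation exceeds the tolerance."""
    expected = predicted_temp_c(ambient_c, current_a)
    return [i for i, t in enumerate(measured_temps_c)
            if abs(t - expected) > tolerance_c]

# Pack with one overheating cell (index 2); the model predicts about 26.4 C here.
print(cells_needing_alert([27.0, 26.5, 38.0, 27.2], ambient_c=20.0, current_a=8.0))
```

A production twin would replace the toy thermal model with a calibrated simulation running in the cloud or on the edge, but the predict-compare-alert loop is the same.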
Artificial intelligence relies on data, which is good for automotive because data-driven insights, coupled with machine learning (ML), have accelerated self-driving technology. One big challenge for advanced driver assistance systems (ADAS) and autonomous applications is that AI/ML requires a massive dataset, one that covers even rare corner cases and works in cohesion with rule-based systems to achieve better performance.
Ultimately, it is AI/ML-driven decision-making systems working closely with rule-based systems that allow AVs to manage the complexity of operating safely within a real driving environment. In some designs, the AI/ML system calculates a risk score for each candidate driving maneuver, giving the vehicle the information it needs to choose the safest action.
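The interplay between a learned risk score and hard rule-based vetoes can be sketched in a few lines. Everything here is a toy stand-in under stated assumptions: the risk function substitutes for a trained model, and the 10-meter lane-change rule is an invented example of a safety constraint.

```python
# Illustrative sketch only: a fabricated risk model plus a hard rule-based
# veto, standing in for real AV decision-making components.

def risk_score(maneuver, gap_ahead_m, closing_speed_mps):
    """Toy stand-in for a learned model: higher score means riskier."""
    if maneuver == "keep_lane":
        # Staying put gets riskier as we close on the vehicle ahead.
        return closing_speed_mps / max(gap_ahead_m, 1.0)
    if maneuver == "change_lane":
        return 0.25  # flat illustrative cost of the lane change itself
    raise ValueError(maneuver)

def rule_check(maneuver, adjacent_gap_m):
    """Hard rule-based veto: never change lanes into a gap under 10 m."""
    return maneuver != "change_lane" or adjacent_gap_m >= 10.0

def choose_maneuver(gap_ahead_m, closing_speed_mps, adjacent_gap_m):
    """Filter candidates through the rules, then pick the lowest risk."""
    candidates = ["keep_lane", "change_lane"]
    allowed = [m for m in candidates if rule_check(m, adjacent_gap_m)]
    return min(allowed, key=lambda m: risk_score(m, gap_ahead_m, closing_speed_mps))

# Closing fast on the car ahead with a safe adjacent gap: change lanes.
print(choose_maneuver(gap_ahead_m=20.0, closing_speed_mps=10.0, adjacent_gap_m=30.0))
# Same situation, but the adjacent gap is vetoed by the rule: keep lane.
print(choose_maneuver(gap_ahead_m=20.0, closing_speed_mps=10.0, adjacent_gap_m=5.0))
```

The design point is that the rules act as a safety envelope: the learned score only ranks maneuvers the rules have already permitted.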
“In automotive AI/ML, one of the critical challenges is that the discovery of those rules becomes a hard problem when the data is not covering the entire space correctly,” says Jay Pathak, Senior Director of Research and Development at Ansys. “On the road, multiple factors are in play, and the dataset coming out of any on-road interactions is a reflection of all these environmental factors happening together; whereas, to discover all of them independent of each other, we simply don’t have the data. This presents a big challenge in the automotive machine learning community.”
It’s important to acknowledge the potential of AI/ML to solve these problems. Despite progress made with OEMs such as Mercedes-Benz in support of ADAS validation, we have not yet reached a level of maturity that yields the data necessary for full autonomy. Getting there will involve a shift away from big data toward useful data, as well as unsupervised learning methods augmented by simulation to create challenging dynamics in datasets that would otherwise be difficult to find in real situations.
To meet these challenges head on, you need a smart way to introduce machine learning into systems operating amid many on-road disruptions, with many rules still to discover. OEMs are relying heavily on simulation and nonlinear solvers to guide the data that informs this level of perception. Each of the independent challenges associated with AV functionality must be broken down into subtasks that simulation can address, with a simulator engaged in each of these dimensions and fed by data from lidar, radar, and other sensors to inform perception.
Another area where AI/ML is making a profound impact is the annotation of real-world driving maps and ground-truth sensor data. AI/ML engines are used not only to accurately annotate the maps, but also to train and stress-test perception algorithms using detailed, categorized, color-coded mapping data, in turn leading to robust perception outcomes and enhanced safety.
And, as we aspire to full autonomy, vehicles of the future will become unrecognizable by today’s standards. Certain structural changes will affect the positioning of the data-gathering systems that inform vehicle perception, so future vehicles must be designed with sensor integration in mind. For example, a radar integrated behind the front fascia must allow mmWave signals to propagate through that fascia without being disrupted by paint and other materials. These are situations that can be analyzed quickly and efficiently in virtual simulation environments.
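The radar-behind-fascia problem has a well-known first-order model: at normal incidence, a lossless dielectric slab transmits best when its thickness is a half wavelength inside the material. The sketch below uses the standard Fabry-Perot transmittance formula with an assumed permittivity; a real fascia analysis would add material losses, paint layers, curvature, and oblique incidence, which is exactly where full-wave simulation comes in.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def slab_transmittance(freq_hz, thickness_m, eps_r):
    """Power transmittance of a lossless, non-magnetic dielectric slab in air
    at normal incidence (standard Fabry-Perot result)."""
    n = math.sqrt(eps_r)
    phase = n * (2.0 * math.pi * freq_hz / C) * thickness_m
    return 1.0 / (1.0 + ((n**2 - 1.0) ** 2 / (4.0 * n**2)) * math.sin(phase) ** 2)

f = 77e9      # common automotive radar frequency
eps_r = 2.7   # illustrative permittivity for a plastic fascia (assumption)

# Transmission peaks when the slab is a half wavelength thick in the material.
half_wave = C / (2.0 * math.sqrt(eps_r) * f)

for d in (half_wave, 2.0e-3):
    t = slab_transmittance(f, d, eps_r)
    print(f"thickness {d * 1e3:.2f} mm -> transmittance {t:.3f}")
```

Running this shows near-total transmission at the tuned thickness and a noticeable drop at an arbitrary 2 mm, which is why fascia thickness and material are co-designed with the radar.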
Behind every on-road communication and every cloud-based data exchange is a corresponding Ansys solution, all part of our broader portfolio for advancing transportation and mobility innovation.
We came across some interesting research last year pointing to the top 10 trends in automotive. We decided to write a series of blogs based on four of them: electrification, autonomous vehicles (AVs), vehicle connectivity, and artificial intelligence, and how simulation is positioned to best support these trends.
Rounding out our third and final installment are vehicle connectivity and artificial intelligence. Insights presented in this blog come from thought leaders here at Ansys, including Christophe Bianchi, Larry Williams, and Jay Pathak.