Original equipment manufacturers (OEMs) and suppliers are tirelessly working behind the scenes to advance self-driving technology. Gaining the traction needed to move entirely beyond level 2 to level 3 (eyes off) functionality and even further depends on the evolution of the operational design domain (ODD), or the set of operating conditions in which an autonomous vehicle will function safely.
To get to the point where a person sits behind the wheel but can take their eyes off the road, the industry must demonstrate system safety against safety of the intended functionality (SOTIF) and safety first for automated driving (SAFAD) requirements. Both SOTIF (ISO 21448) and SAFAD (ISO/TR 4804) emphasize validation-based safety for automated vehicles: behavior is first validated through hardware-in-the-loop (HiL) testing and then, for selected use cases, through actual on-road testing.
As the ODD continues to evolve and expand, it will become even more critical to test perception performance across millions of scenarios and difficult-to-record edge cases. To address these new perception challenges, the market is moving toward high-resolution cameras (8 megapixels and above), high-resolution imaging radars, and high-resolution lidars. Additionally, extended ODD validation coverage places a strong emphasis on simulation. These sensors produce large amounts of data that must be streamed and processed in both simulation and the final vehicle.
Ansys AVxcelerate Sensors 2024 R1 autonomous vehicle sensor simulation software includes an important feature enhancement for perception software testing in a hardware-in-the-loop environment that combines simulation with real-world systems. This physics-based solution is now compatible with RoCE-based (Remote Direct Memory Access over Converged Ethernet) HiL infrastructure provided by NI to support the immense, real-time, low-latency data exchange needed to validate higher-level perception. Because Ansys camera simulation is already used in production environments with industry-grade perception software on chips, RDMA compatibility adds to the usefulness of the solution.
As production nears, testing on the HiL bench is unavoidable for testing and validating complex software systems. An HiL test bench brings together synthetic simulation data inputs from various sensors — such as cameras, lidars, and radars — to more aptly assess autonomous vehicle (AV) systems.
With HiL testing, it’s possible to address the most challenging aspect of system validation: evaluating sensor perception based on deep neural networks that have been trained on data. It’s an invaluable tool during development as it’s the closest precursor step to mass production. Any issues coming out of HiL testing can be addressed with the electronic control unit (ECU) or embedded system responsible for controlling a specific function during this stage.
Due to the complexity of perception algorithms, they cannot be validated with traditional techniques that are theoretical in nature or that show their limitations, such as over-the-air (OTA) camera-and-monitor HiL setups. Instead, perception must be tested against large amounts of synthetic simulation data. This is where direct data injection from simulation comes in: it preserves the quality of the injected data during transmission while eliminating the need for camera-to-monitor calibration.
Today’s high-resolution sensors create technological constraints for HiL benches. The challenge stems from the traditional method of transferring synthetic video data to HiL benches via HDMI or DisplayPort. The Full HD resolution and data transfer limits of these ports are inadequate for the real-time response rates that a self-driving environment demands.
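The gap can be illustrated with back-of-the-envelope arithmetic. The resolutions, bit depth, and frame rate below are illustrative assumptions, not figures from a specific bench:

```python
def data_rate_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# A Full HD, 24-bit stream at 30 fps -- roughly what display-port-style
# injection was designed around:
full_hd = data_rate_gbps(1920, 1080, 24, 30)   # ~1.49 Gbit/s

# An ~8-megapixel camera producing 24-bit raw frames at 30 fps:
high_res = data_rate_gbps(3840, 2160, 24, 30)  # ~5.97 Gbit/s

print(f"Full HD: {full_hd:.2f} Gbit/s, 8 MP raw: {high_res:.2f} Gbit/s")
```

With several such sensors streaming concurrently, aggregate rates quickly reach tens of gigabits per second, which is why a low-latency, high-bandwidth transport becomes a prerequisite for real-time injection.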
To test perception on HiL benches, direct data injection based on a raw signal coming from simulation presents a viable solution. Traditional HiL techniques for testing camera chips, such as over-the-air (OTA) setups, do not allow engineers to preserve or simulate the high dynamic range of a real camera image, particularly, but not exclusively, in nighttime driving scenarios. Because of this lack of high dynamic range, the camera’s image signal processor (ISP) cannot be stimulated appropriately, so the back-channel data coming from the chip cannot be integrated into the simulation loop.
Using direct data injection, the environment and the hardware/optical aspects of the sensor are fully simulated, and only the raw signal is injected into the final part of the processing hardware and software, enabling more accurate HiL bench testing and validation of AV perception. With raw data injection, the chip and its image signal processor (ISP) are stimulated with relevant data. As a result, the overall behavior is coherent with reality, and the back-channel data can be taken into account in the simulation loop. This means the ECU can be tested with its original series firmware, without needing to be set in a special “HiL mode” before testing.
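To make “raw signal” concrete, the sketch below mosaics a simulated RGB scene into an RGGB Bayer pattern, the kind of pre-ISP frame a real imager would hand to its processing pipeline. The pattern choice and helper name are illustrative assumptions, not AVxcelerate’s internal format:

```python
def to_bayer_rggb(rgb):
    """Mosaic an RGB image (rows of (r, g, b) tuples) into a single-channel
    RGGB Bayer raw frame -- the kind of pre-ISP signal a real imager emits."""
    raw = []
    for y, row in enumerate(rgb):
        raw_row = []
        for x, (r, g, b) in enumerate(row):
            if y % 2 == 0:
                raw_row.append(r if x % 2 == 0 else g)  # even rows: R G R G
            else:
                raw_row.append(g if x % 2 == 0 else b)  # odd rows:  G B G B
        raw.append(raw_row)
    return raw

# A tiny 2x2 simulated scene; the ISP on the device under test would
# de-mosaic this raw frame, exercising its real processing path.
scene = [[(10, 20, 30), (40, 50, 60)],
         [(70, 80, 90), (100, 110, 120)]]
print(to_bayer_rggb(scene))  # [[10, 50], [80, 120]]
```

Injecting a frame at this stage, rather than displaying a rendered image to a physical camera, is what lets the chip and ISP run exactly as they would in the vehicle.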
For high-resolution cameras or multi-sensor simulation, CPU-intensive data transfer is often a challenge for the hard real-time requirements of HiL benches. In 2024 R1, AVxcelerate software includes NI RDMA transfer capabilities to address this challenge and facilitate the seamless flow of data.
Ansys and NI (now part of Emerson) are partnering to offer real-time, physically accurate, high-resolution camera synthetic data for HiL validation to address these testing constraints. To do this, they developed a closed-loop simulation, enabled by NI RDMA and Ansys AVxcelerate Sensors software, that lets customers inject simulation data via NI real-time hardware camera interface boards directly into the input port of the device under test (DUT). To evaluate the relevant behavior of the ECU under test, accurate synthetic data must be injected; this is the prime reason physically accurate simulations are required. AVxcelerate software's high-fidelity, physics-based simulation helps preserve full scene information across the full dynamic range of 24-bit raw images. As a result, imager spectral range adaptation, HDR imager/DSP simulation, and multi-exposure perception strategies can be applied.
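One way to picture why the full 24-bit range matters is a multi-exposure merge, where a long and a short exposure of the same scene combine into one high-dynamic-range raw value. The exposure ratio and merge rule below are illustrative assumptions, not the algorithm of any particular imager:

```python
def merge_exposures(long_exp, short_exp, ratio=16, sat=255):
    """Merge two 8-bit exposures into one HDR raw value.
    Where the long exposure saturates, fall back to the short exposure
    scaled by the exposure-time ratio."""
    if long_exp < sat:
        return long_exp          # long exposure still has detail
    return short_exp * ratio     # recover highlight detail beyond 8 bits

# Dark region: the long exposure is trusted as-is.
print(merge_exposures(120, 8))    # 120
# Bright region (long exposure clipped): the merged value exceeds
# anything an 8-bit container could hold.
print(merge_exposures(255, 200))  # 3200
```

Chaining several exposures this way produces values spanning far more than 8 bits, which is why a simulation that clips to display-range images cannot stimulate an HDR imager and its DSP realistically.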
Within the AVxcelerate Sensors application, it is then possible to generate a subset of images in real time for fast, verifiable results. Ansys software enables validated camera computer vision (CV) in a fraction of the time compared with using traditional simulation techniques.
NI RDMA is the element of this closed-loop system that enables the transfer of large amounts of synthetic data with low latency and high bandwidth, hosting high-resolution camera feeds in real time. Essentially, the NI RDMA driver software enables two or more systems to exchange data over converged Ethernet using RDMA technology (RoCE). It abstracts the low-level details of programming an RDMA-compatible interface and offers a simple, efficient application programming interface (API) for transferring data. NI has further extended these capabilities with a software development kit (SDK) that enables easy, fast, and vendor-agnostic connectivity to simulation environments that follow the same approach to openness and system compatibility.
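The producer/consumer pattern such a driver abstracts can be sketched with ordinary sockets. The snippet below streams a length-prefixed frame over localhost purely to illustrate the simulator-to-bench data flow; the real system uses RoCE through NI's RDMA driver and API, whose actual calls are not shown here:

```python
import socket
import struct
import threading

def send_frame(conn, frame: bytes):
    # Length-prefixed framing: 4-byte big-endian size, then the payload.
    conn.sendall(struct.pack(">I", len(frame)) + frame)

def recv_exact(conn, n):
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def recv_frame(conn) -> bytes:
    size = struct.unpack(">I", recv_exact(conn, 4))[0]
    return recv_exact(conn, size)

# Demo over localhost: a "simulator" thread sends one synthetic frame;
# the "HiL bench" side receives it intact.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

frame_out = bytes(range(256)) * 16  # stand-in for raw image data

def simulator():
    conn = socket.create_connection(("127.0.0.1", port))
    send_frame(conn, frame_out)
    conn.close()

t = threading.Thread(target=simulator)
t.start()
conn, _ = server.accept()
frame_in = recv_frame(conn)
t.join()
conn.close()
server.close()
print(frame_in == frame_out)  # True
```

RDMA replaces this kernel-mediated copy loop with direct memory-to-memory transfers between hosts, which is what removes the CPU bottleneck and keeps latency deterministic at high-resolution camera data rates.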
Ansys AVxcelerate Sensors simulation software, NI hardware, and the RDMA over Converged Ethernet link create a closed loop with the physical world that enables OEMs to place sensors on actual vehicles for further testing and validation. Through this closed-loop solution, customers can visualize, analyze, and verify how their virtual prototypes interact with hardware and real-time systems running perception algorithms. It extends the testing environment and interface capabilities to include the full operating system, application stack, and hardware. It can also be expanded to replay tests that compare simulation data with recorded real-world data, testing perception algorithms and fully stimulating an advanced driver-assistance system (ADAS) ECU.
The Ansys-NI partnership is helping to bring vehicle perception into sharper focus, giving automotive companies a significant advantage by helping them get up to speed quickly and safely during development.
In addition to cameras, Ansys and NI are developing similar closed-loop solutions for other sensors in the ADAS and AV space, including radar and lidar technologies. You can read more about the partnership here.