December 3, 2021
When the world’s first “motorwagen” was introduced in 1885, the notion that a car would one day drive itself was laughable. Today, assisted and autonomous vehicles are the reality of an age where digital sensors can outperform human ability to perceive motion, distance, and speed.
When used together, sensor technologies including cameras, lidar, radar, and ultrasonic sensors give vehicles a complete understanding of the world, allowing them to navigate safely with little or no human intervention.
But as engineers and designers, identifying the right combination of these sensors to satisfy the end user's needs — including safety, functional performance, and price — requires thoughtful consideration of each sensor type's roles, capabilities, and limitations.
Each of these sensor types, and its role in the vehicle, is described below.
High-resolution digital cameras help a vehicle "see" its environment and interpret the world around it. When multiple cameras are installed around the vehicle, a 360° view allows the vehicle to detect objects in its proximity, like other cars, pedestrians, road markings, and traffic signs.
There are several types of cameras to consider for meeting different design needs, including NIR cameras, VIS cameras, thermal cameras, and time-of-flight cameras. As with most sensors, cameras work best when used to complement one another.
Cameras are ideal for situations such as maneuvering and parking, lane departure, and recognizing driver distraction.
Lidar stands for "light detection and ranging," and is a remote sensing technology that uses light pulses to scan an environment and produce a three-dimensional map of it. It works on the same principle as sonar, except lidar uses light instead of sound waves. In autonomous vehicles, lidar scans the surroundings in real time, allowing cars to avoid collisions.
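The ranging principle can be sketched in a few lines: time the pulse's round trip and convert it to distance using the speed of light. This is a hypothetical illustration of the physics, not code from any Ansys product.

```python
# Minimal sketch of lidar time-of-flight ranging (illustrative only).
C = 299_792_458.0  # speed of light in a vacuum, m/s

def lidar_distance(round_trip_s: float) -> float:
    """Distance to the target, given the pulse's round-trip time in seconds.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return C * round_trip_s / 2.0

# A pulse that returns after 200 nanoseconds hit a target about 30 m away.
print(round(lidar_distance(200e-9), 1))  # → 30.0
```

A real lidar repeats this measurement millions of times per second across many beam angles, which is what builds up the three-dimensional point cloud.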
Lidar is very accurate at depth perception and at determining the presence of an object. It can see at long distances and through poor environmental conditions, such as nighttime, rain, and fog. Because it recognizes and categorizes what it sees, it can tell the difference between objects like a squirrel and a stone and predict behavior accordingly.
Radar stands for “radio detection and ranging.” This sensor emits short pulses in the form of electromagnetic waves to detect objects in the environment. As soon as the waves hit an object, they are reflected and bounce back to the sensor. In autonomous vehicles, radar is used to identify other vehicles and large obstacles.
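Beyond detecting an object's presence, radar can also measure its relative speed from the Doppler shift of the reflected wave. The sketch below illustrates that relationship for an assumed 77 GHz automotive radar; the carrier frequency and function names are hypothetical choices for illustration, not details from the article.

```python
# Illustrative sketch: estimating radial speed from a radar Doppler shift.
# v = f_doppler * c / (2 * f_carrier)
C = 3.0e8          # speed of light, m/s (approximate)
F_CARRIER = 77e9   # assumed automotive radar carrier frequency, Hz

def radial_speed(f_doppler_hz: float) -> float:
    """Relative (radial) speed of the target in m/s.

    The factor of 2 appears because the wave is Doppler-shifted twice:
    once on the way to the moving target and once on reflection.
    """
    return f_doppler_hz * C / (2 * F_CARRIER)

# A Doppler shift of about 5,133 Hz corresponds to roughly
# 10 m/s (36 km/h) of closing speed at 77 GHz.
print(round(radial_speed(5133.0), 1))
```

This direct speed measurement is one reason radar is the workhorse sensor behind adaptive cruise control.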
Because it does not rely on light, radar performs well regardless of weather conditions and is most commonly used to enable cruise control and collision avoidance systems.
While radar uses radio waves and lidar uses light pulses, ultrasonic sensors evaluate the objects in an environment by sending out short, ultrasonic impulses that are reflected back to the sensor. They are very cost effective, excellent at detecting solid hazards, and are typically used on car bumpers to alert drivers of obstacles while parking. For best results in assisted driving applications, ultrasonic sensors are commonly combined with cameras.
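Ultrasonic ranging works just like lidar ranging, but with sound instead of light: time the echo and convert it to distance using the speed of sound in air. This is a hypothetical sketch for illustration, assuming room-temperature air.

```python
# Illustrative sketch of ultrasonic echo ranging, as used in parking sensors.
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C (assumed)

def echo_distance(round_trip_s: float) -> float:
    """Distance to the obstacle, given the echo's round-trip time in seconds."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# An echo arriving after 5.8 milliseconds puts the obstacle
# about 1 m from the bumper.
print(round(echo_distance(5.8e-3), 2))  # → 0.99
```

Because sound travels almost a million times slower than light, the same electronics can time these echoes cheaply, which is a large part of why ultrasonic sensors are so cost-effective at short range.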
Interesting fact: Many of the best ultrasonic sensors are found in nature. Bats, dolphins, and narwhals all use ultrasonic waves to identify objects (echolocation).
| Sensor | Strengths | Weaknesses |
| --- | --- | --- |
| Camera | Hi-res color imagery, highly detailed and realistic | Poor performance in bad weather and low-light conditions |
| Lidar | 3D information, compact, long range | No color, price |
| Radar | Long range, unaffected by weather | Low resolution, no color |
| Ultrasonic | Compact, no moving parts, unaffected by light or weather | Short range |
While individual sensors each have their strengths, it is the interaction of sensor information that makes assisted driving possible. And as vehicles move toward full autonomy, choosing the right combination of sensors becomes even more critical to achieving the safety standards required.
For the highest level of safety and performance, sensor fusion between camera, radar, lidar, and ultrasonic sensors will maximize each sensor type's strengths while compensating for the others' weaknesses. For example, lidar alone provides poor results for lane tracking, but the combination of lidar and camera is very effective at this function.
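One simple way to see how fusion compensates for sensor weaknesses is inverse-variance weighting: combine two noisy distance estimates so that the more precise sensor dominates the result. This is a minimal sketch of one common fusion idea, not a description of Ansys's method; the numbers are hypothetical.

```python
# Illustrative sketch: fusing two noisy range estimates by weighting each
# inversely to its variance, so the more precise sensor counts for more.
def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Inverse-variance weighted average of two independent estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

# Hypothetical readings: lidar says 25.0 m (variance 0.01 m^2),
# radar says 25.6 m (variance 0.09 m^2). The fused estimate stays
# close to the more precise lidar reading.
print(round(fuse(25.0, 0.01, 25.6, 0.09), 2))  # → 25.06
```

Production systems use far richer techniques (Kalman filters, occupancy grids, learned fusion networks), but the underlying idea is the same: each estimate is trusted in proportion to its reliability.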
The right combination depends on several factors, including safety requirements, functional performance, and price.
To ensure prototype vehicles are real-world ready, sensors must be designed with an immense variety of test cases.
By providing a realistic, physics-based sensor response in real time for camera, lidar, radar, and ultrasonic sensors, the Ansys simulation platform gives engineers all the information needed to validate the safety of their autonomous driving system designs.
Use Ansys simulations at the beginning of your design exploration to accurately see how each combination of sensors will perform in the real world. Then, based on your goals, you can evaluate the right sensor combination for your project.
To learn more about how to use physics-based sensor simulation to make autonomous vehicles safer, view our on-demand webinar: Making Autonomous Vehicles Safer Using Physics-Based Sensor Simulation at Scale.