ANSYS BLOG

June 15, 2020

How to Improve Product Sounds Using Acoustic Simulations

Engineers tend to spend a lot of time tweaking and simulating a design to optimize its look, feel and performance. However, they should not neglect the sounds their products make by addressing them only late in the development cycle.

Engineers can design the best dishwasher in the world. But if it's too loud, or too quiet, people may not think it's working properly.

The perceived quality that sound gives a product can be positive as well. We all wait in anticipation for the pop of a toaster, the snap of a seat belt or the click of a connector. In fact, these sounds are so satisfying that it’s common to hear them in commercials.

On the other hand, a bad sound could be disastrous to a product. Imagine a ceiling fan that cooled a room in seconds but sounded like a helicopter — no one would buy it.

Since sounds can impact product sales, it is risky for engineering teams to follow the traditional method of waiting until physical prototyping to hear their design for the first time. At that stage of the development cycle, fixing the problem can be too costly.

What if you could listen to your product's sound during the design stage of development? You can, by using acoustic simulations. In fact, companies have started to create whole departments that focus on optimizing product sounds.

Without the satisfying snap, people would wonder whether their seat belt was working properly. With acoustic simulations, engineers can ensure the product sounds right.

The Science Behind Acoustic Simulations

Before we dig into how to listen to a simulation, let’s discuss where these sounds come from.

Products that contain vibrating parts, impacting components, fluid flows and electromagnetic fields can all produce sound by vibrating the air, or medium, around them. So, to simulate the noises that products make, you must first simulate their functions.

For instance, to simulate the sound inside a car, you must model the wind that brushes past it and the vibroacoustic behavior of the powertrain. Even the sounds of the air conditioning system are important for quieter electric vehicles (EVs).

Once the product operations are modeled, all of these noises can be identified and studied for their sound level and quality. The levels can be calculated in decibels, a logarithmic unit used to express sound pressure, power and intensity.
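As a back-of-the-envelope illustration (plain Python, not part of any Ansys workflow), the sound pressure level in decibels follows from the RMS acoustic pressure and the standard reference of 20 µPa in air:

```python
import math

def spl_db(p_rms, p_ref=20e-6):
    """Sound pressure level in dB, relative to 20 micropascals
    (the standard reference pressure for sound in air)."""
    return 20.0 * math.log10(p_rms / p_ref)

# An RMS pressure of 1 Pa corresponds to roughly 94 dB SPL.
print(round(spl_db(1.0)))  # 94
```

Sound power and intensity levels use 10·log10 against their own references (1 pW and 1 pW/m², respectively), which is why the decibel can express all three quantities.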

Sound quality, however, is harder to assess and inherently subjective. Psychoacoustic indicators such as loudness, sharpness, roughness, tonality and fluctuation strength can estimate how a sound is perceived. To obtain an accurate rating, though, engineers need to conduct listening tests with groups of people and analyze the results statistically.
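A minimal sketch of that statistical side, assuming a hypothetical jury test where each listener scores a sound on a 1–10 scale (the function and data are illustrative, not an Ansys API):

```python
import statistics

def rating_summary(scores):
    """Mean jury rating with an approximate 95% confidence interval
    (normal approximation to the standard error of the mean)."""
    n = len(scores)
    mean = statistics.fmean(scores)
    sem = statistics.stdev(scores) / n ** 0.5
    return mean, (mean - 1.96 * sem, mean + 1.96 * sem)

# Hypothetical ratings from a 12-person listening panel:
scores = [7, 8, 6, 7, 9, 7, 8, 6, 7, 8, 7, 7]
mean, (low, high) = rating_summary(scores)
```

A wide interval signals that the panel disagrees, which is itself useful: it tells engineers the sound is polarizing rather than uniformly good or bad.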

That is why acoustic simulations are so important: They help engineers design the functional aspects of a product and its sound as well.

How to Listen to an Acoustic Simulation

Let's say you want to assess the noise of an electric vehicle. The major components of the powertrain that contribute to the product's sound are the motor and the gearbox. So, to assess these sounds, you first use multiphysics workflows that simulate those systems.
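When contributors such as the motor and gearbox are quantified separately, their levels combine logarithmically rather than arithmetically. A standard acoustics identity for incoherent sources, sketched here in plain Python independent of any simulation tool:

```python
import math

def combine_spl(levels_db):
    """Total level of incoherent sources: sum the squared pressures,
    i.e. 10 * log10 of the sum of 10**(L/10) over all sources."""
    return 10.0 * math.log10(sum(10 ** (level / 10.0) for level in levels_db))

# Two equal 70 dB sources add up to about 73 dB (+3 dB), not 140 dB.
print(round(combine_spl([70.0, 70.0]), 1))  # 73.0
```

This is also why silencing the single loudest component pays off most: a source 10 dB below the dominant one adds well under half a decibel to the total.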

The acoustic simulation workflow to model the sounds from an electric vehicle powertrain.

For instance, engineers could simulate a car's dashboard and compare how light from external headlights, readouts, windshield reflections, mirrors and the radio can all affect the driver's vision. These simulations can be tweaked to assess how the driver's age or visual impairments could affect the results.

With age, people tend to become more sensitive to glare. They also tend to perceive certain objects with a yellowy tint. Age often stiffens the eye's lens, causing close images and writing to appear blurry. Ansys SPEOS can simulate all of these effects within its human eye model. The human eye model can also be tweaked to test color blindness and other visual impairments.

Based on this information, engineers can improve driver safety by iterating the color, shape, brightness, glare or (when applicable) fonts of the:

  • Dashboard lights
  • Radio lights
  • Readouts
  • Windshield


Virtual Reality Helps Engineers Experience Their HMI Designs

When engineers design an HMI — for an aircraft or a car — it's important to ensure the information will be perceived and understood by the end user.

As a result, engineers need to fully experience their HMI designs. This experience can help them test and validate the human factors of the HMI to ensure they are optimized for every potential user.

To do this, engineers can use Ansys VRXPERIENCE’s HMI VR testing capabilities. These VR simulations can provide engineers with haptic feedback, visual simulations and eye/finger tracking data. This information can then be used to optimize the car’s HMI system to reduce eye strain, road distractions and more.

VRXPERIENCE also integrates with Ansys SCADE to bring the HMI software-in-the-loop into the VR environment. Engineers can then use this environment to run, test and interact with the embedded software.

To learn how Ansys SPEOS and Ansys VRXPERIENCE can optimize HMI systems, read the white paper: Making the Cockpit of the Future a Reality via Optimized Human-Machine Interfaces.

Ansys VRXPERIENCE can transform an HMI design into a virtual reality experience.
