August 18, 2022
At the height of the pandemic, isolation due to quarantine inspired entertainers to find more creative ways to reach their audiences. In 2020, rapper Travis Scott appeared to video game players as a character within Epic Games' "Fortnite." The singer performed in this virtual world for 15 minutes for 12.3 million attendees.1
Since then, the metaverse has continued to expand at a rapid pace, embracing virtual worlds where you can socialize, buy things, and even take a seat at the table of the virtual home of your choosing. Augmented reality (AR) and virtual reality (VR) are the fundamental technologies enabling the push to expand the metaverse based on interactions in our physical world. Moving forward, the expectation is that the optical systems supporting AR/VR head-mounted displays and smart devices will progress right along with them.
“I see a trend in cell phone cameras and other devices where manufacturers are trying to achieve higher and higher image quality in AR/VR,” says Michael Cheng, Lead Application Engineer in the Ansys Optical group. “Today they are really reaching the limit of performance possible with traditional lenses within an available size (the best smartphones have seven or eight lenses in their camera modules). Many companies are looking to novel solutions such as metalenses (quartz plates composed of microscopic titanium structures) to replace glass lenses in traditional optics for better performance in a more compact size.”
This discussion centers on the Ansys solutions aimed at the innovation required to improve the optics behind AR/VR systems embedded in consumer electronics devices, including smartphones. What's driving consumer demand for these applications? And how can simulation help manufacturers meet it?
AR and VR manufacturers face specific design challenges. The technology is not new, but it is evolving rapidly, with designers and companies increasingly turning to nontraditional solutions. The challenge from an optics perspective — whether you're designing goggles or smartphones — is how to meet rapidly increasing performance expectations within an ever-shrinking package design.
Let’s start with AR, which uses computer-generated information to enhance real-world environments in real time. In these applications, perceptual information is essentially layered on top of a natural, 3D environment to enhance what already exists around you. It’s used heavily in marketing products, goods, and services. Retailers often use it to create more engaging shopping experiences for consumers — for example, to help them visualize furniture in a specific room of their home.
VR is a computer simulation of an artificial world with the power to immerse you in an entirely new, artificially created environment. VR is used not just for marketing or entertainment purposes, but also for its predictive power. Using a headset, VR enables the user to enter a computer-generated scene to learn important information experientially, whether it's soldiers preparing for combat or doctors training for surgeries.
In AR, technological change is primarily driven by the need for these systems to be compact and light enough for consumer-based electronics applications, so they are both comfortable and stylish to wear. Smaller module designs and better imaging performance are the main drivers for AR systems. The images viewed by the wearer of an AR headset, for example, should be bright enough to be visible in sunny conditions. Making a lightweight, compact, high-brightness display suitable for a variety of settings is definitely a challenge.
Lighter weight matters for VR as well, though slightly less so, because consumers expect these systems to be a little bulkier. With VR, the challenge is achieving high enough resolution and depth of field — essentially the range of distances over which an object appears in focus.
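To make the depth-of-field concept concrete, the standard thin-lens approximation from photographic optics gives the near and far limits of acceptable focus. This is an illustrative sketch only — the function name and parameter values are my own, not anything from Ansys tooling:

```python
def depth_of_field(f_mm, f_number, coc_mm, subject_mm):
    """Near/far limits of acceptable focus (thin-lens approximation).

    f_mm       -- focal length in mm
    f_number   -- aperture f-number
    coc_mm     -- acceptable circle of confusion in mm
    subject_mm -- focus distance in mm
    """
    # Hyperfocal distance: focusing here puts everything beyond h/2 in focus
    h = f_mm**2 / (f_number * coc_mm) + f_mm
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    if subject_mm >= h:
        return near, float("inf")  # far limit extends to infinity
    far = subject_mm * (h - f_mm) / (h - subject_mm)
    return near, far

# Example: 50 mm lens at f/8, 0.03 mm circle of confusion, focused at 5 m
near, far = depth_of_field(50.0, 8.0, 0.03, 5000.0)
```

The narrow depth of field of a fixed-focus display is exactly why a single image position per eye (discussed below) limits realism: objects rendered far from that focal plane conflict with the eye's accommodation.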
"Currently, most stereoscopic VR products (which provide a more realistic, immersive experience for the user without the bulk) on the market only support a single image position for each eye," says Cheng. "Many companies and labs have, in fact, developed solutions with multiple image positions for a better depth-of-field range, but they are not yet mature enough for mass production."
Ansys provides a comprehensive solution for the design and simulation of optical systems suitable for AR/VR applications. The ray tracing capabilities in Ansys Zemax OpticStudio can be used to simulate the paths light takes through these complex systems, from the picture generation unit (PGU) — the module that produces the raw image — through lenses and waveguides and into the wearer's eye. Within OpticStudio, a full analysis of the optical performance and brightness of the system can be performed to determine a number of variables related to imaging performance, including resolution and depth perception. Recent software enhancements also enable direct integration of Ansys Lumerical photonics simulations into OpticStudio, creating a powerful workflow for companies to improve their AR/VR systems.
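At each surface a ray tracer applies Snell's law of refraction; when the refracted angle has no real solution, the ray undergoes total internal reflection — the effect that keeps light trapped inside an AR waveguide. A minimal sketch of that single step (illustrative only; this is not OpticStudio's API):

```python
import math

def refract(theta_i_deg, n1, n2):
    """Apply Snell's law n1*sin(t1) = n2*sin(t2) at one interface.

    Returns the refracted angle in degrees, or None when the ray is
    totally internally reflected (no real refracted angle exists).
    """
    s = n1 * math.sin(math.radians(theta_i_deg)) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection
    return math.degrees(math.asin(s))

# A ray entering glass (n = 1.5) from air at 30 degrees bends toward the normal
theta_glass = refract(30.0, 1.0, 1.5)

# The same ray hitting a glass-air interface at 45 degrees is trapped:
# the critical angle for n = 1.5 is about 41.8 degrees
trapped = refract(45.0, 1.5, 1.0)  # None -> total internal reflection
```

A full sequential ray trace repeats this step surface by surface, tracking each ray's position and direction, which is conceptually what happens between the PGU and the eye.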
"Adding Lumerical enables full simulations of novel diffractive optical elements (DOEs) used in waveguides to better understand overall system performance," says Cheng. "The unique capabilities of these elements to shape light waves supporting AR/VR applications are not achievable with traditional refractive lenses. However, DOEs are generally more costly to manufacture, so simulating them before fabrication is essential to avoid expensive trial and error."
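The basic behavior of a DOE such as a waveguide grating is governed by the grating equation, which relates the grating period and wavelength to the angles of the diffracted orders. A simple sketch under textbook assumptions (my own function, not a Lumerical call):

```python
import math

def diffraction_angle(wavelength_nm, period_nm, order, incidence_deg=0.0):
    """Grating equation: sin(theta_m) = m * lambda / d + sin(theta_i).

    Returns the angle of diffracted order m in degrees, or None if that
    order is evanescent (does not propagate into free space).
    """
    s = order * wavelength_nm / period_nm + math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        return None  # evanescent order
    return math.degrees(math.asin(s))

# Green light (532 nm) on a 1000 nm period grating at normal incidence:
# the first diffracted order leaves at roughly 32 degrees
angle = diffraction_angle(532.0, 1000.0, 1)

# With a 400 nm period, the first order cannot propagate at all
gone = diffraction_angle(532.0, 400.0, 1)  # None
```

Note the strong wavelength dependence: each color diffracts at a different angle, which is one reason full-wave tools like Lumerical are needed to engineer gratings that couple a full-color image into and out of a waveguide efficiently.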
The consumer devices supporting AR/VR applications are getting smaller and faster all the time, requiring more energy-efficient designs. In combination, OpticStudio and Lumerical achieve these benchmarks by helping designers develop more efficient optical systems where less light is lost on the journey from the PGU to the wearer’s eye. Not only does this result in higher-brightness images — it can also help designers reduce PGU requirements for brightness, power, and cooling in small electronics; reduce system size; and realize greater efficiencies overall.
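The compounding effect of per-element losses is easy to see numerically: overall throughput is the product of the efficiencies of every element in the chain, so small gains at each surface multiply. A toy illustration with made-up efficiency values (not figures from Ansys or any real device):

```python
def system_throughput(efficiencies):
    """Fraction of PGU light reaching the eye, as a product of
    per-element efficiencies along the optical path."""
    total = 1.0
    for e in efficiencies:
        total *= e
    return total

# Hypothetical chain: in-coupler, waveguide propagation, out-coupler
baseline = system_throughput([0.30, 0.80, 0.30])   # 7.2% of light survives
improved = system_throughput([0.40, 0.85, 0.40])   # 13.6% -- nearly double
```

Nearly doubling the light at the eye means the PGU can run at roughly half the brightness for the same perceived image, which is where the power, cooling, and size savings come from.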
What happens when you add Ansys Speos optical system design and validation software into the mix? Working together, OpticStudio, Lumerical, and Speos provide a full optical solution to help build smaller consumer-based electronic AR/VR system prototypes faster, reduce time to market, and help manufacturers maintain their competitive edge technologically.
First, Lumerical simulates the optical effects on a nano scale. The resulting data is transferred into OpticStudio, which has tools to design components and systems based on this information on the scale of millimeters to meters. Speos can then integrate the optics into larger system models and incorporate the properties of human vision, which are important to evaluate the quality and experience of any AR/VR consumer solution.
"Ansys offers a unique set of seamlessly linked tools covering all aspects of AR/VR device design," says Cheng. "You can use Lumerical to handle all the sub-wavelength structures, such as gratings, polarizers, thin films, or quarter waveplates. In OpticStudio, you can design, optimize, and analyze tolerances for the whole system with data from Lumerical, then verify the system using Speos for human eye perception, environmental effects, and stray light."
If you’re still curious about Ansys solutions and how they can help you solve specific optical challenges in AR/VR, sign up for a free trial of Ansys Zemax OpticStudio, Ansys Lumerical, and Ansys Speos.
1. "'Fortnite' virtual rap concert draws record 12.3M attendees," Marketing Dive, April 27, 2020.