To ensure the final product matches the design, engineers need to simulate how a human observer will perceive the product's lit and unlit real-world appearance within a virtual environment, such as an aircraft cockpit or vehicle interior.
This webinar spotlights Ansys SPEOS, which offers a refined human eye sensor model and provides access to the human vision algorithm for luminance and immersive 360-degree observer results. SPEOS' Human Vision simulation feature is designed to overcome the limited luminance dynamic range of conventional displays. It intelligently adapts the displayed luminance and contrast so that key details visible in the real world remain visible on screen, and it emulates physiological properties of the human eye such as glare, depth of field, acuity and temporal adaptation. The simulation thus matches the appearance of product designs by combining spectral luminance results with algorithms for the dynamic adaptation and glare aspects of human vision, while accounting for the limitations of display capabilities. This enables users to evaluate and improve their designs to meet quality targets for homogeneity, color differences and stray light.
Attend this webinar to explore best practices for SPEOS' Human Vision simulation:
Understand monitor display capabilities, how to maximize their use and how to increase reliability across different monitors through calibration.
Learn how to configure and optimize SPEOS Lab viewer preferences to enable 1:1 luminance matching.
Determine the best simulation and viewing-environment settings to ensure a reliable displayed result that replicates physical prototypes.