ANSYS BLOG
October 17, 2019
Visual simulations and virtual reality (VR) enable engineers to see through the eyes of the end-user. This is great news for marketers looking to improve the visual appeal of a product. More importantly, the engineering applications of these tools can improve automotive safety.
What we see depends on many variables, and not all of them are in the scene itself. The human observer may have conditions that affect their ability to perceive the world. In short, different people see different details in the same environment.
Instead of relying on their personal vision, engineers can use visual simulations to assess how a wide range of people would perceive a car's control systems and surroundings.
Think of it: You don't want a warning signal on the car's GPS to be invisible to people living with a form of color blindness. Visual simulations let engineers see through the eyes of various individuals firsthand, so they can experience where the HMI fails a portion of the population.
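To make that concern concrete, here is a minimal Python sketch (not Speos code, just an illustration) that approximates how a deuteranope would perceive a hypothetical red warning icon against a gray dashboard. The transform matrices and the two colors are textbook-style approximations chosen for the example.

```python
# Illustrative sketch only -- this is not how Speos models vision internally.
# It approximates deuteranopia (a common red-green color deficiency) with a
# linear-RGB -> LMS -> deutan-LMS -> linear-RGB pipeline, then checks how
# distinct a warning color remains from its background.
import numpy as np

# Approximate linear-RGB to LMS cone-space matrix (normalized HPE-style values)
RGB_TO_LMS = np.array([[0.3139, 0.6395, 0.0466],
                       [0.1554, 0.7579, 0.0867],
                       [0.0178, 0.1094, 0.8728]])
LMS_TO_RGB = np.linalg.inv(RGB_TO_LMS)

# Deuteranopia: the missing M-cone response is rebuilt from the L and S cones
# (coefficients in the spirit of Vienot et al., treated here as approximations)
DEUTAN = np.array([[1.0000, 0.0, 0.0000],
                   [0.4942, 0.0, 1.2483],
                   [0.0000, 0.0, 1.0000]])

def simulate_deuteranopia(rgb_linear):
    """Approximate how a linear-RGB color appears to a deuteranope."""
    lms = RGB_TO_LMS @ rgb_linear
    return np.clip(LMS_TO_RGB @ (DEUTAN @ lms), 0.0, 1.0)

warning_red = np.array([0.80, 0.05, 0.05])  # hypothetical warning icon color
dash_gray   = np.array([0.25, 0.25, 0.25])  # hypothetical dashboard background

perceived_icon = simulate_deuteranopia(warning_red)
perceived_dash = simulate_deuteranopia(dash_gray)
print("icon as perceived:         ", perceived_icon)
print("separation from background:", np.linalg.norm(perceived_icon - perceived_dash))
```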
Engineers can even create virtual reality representations of the car's HMI system. These VR simulations enable engineers to experience their designs in an immersive environment, improving their ability to optimize the HMI so that it is safe for all potential users.
Visual simulation compares how a scene would look to people living with various visual impairments.
Ansys Speos can be used to model the color-sensitive cones and light-sensitive rods of the human eye. The software transforms the light information that would reach the eye into a virtual representation. This visual simulation can then be shared with the engineering team to optimize the optical safety of designs.
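For intuition about what "modeling cones and rods" means, here is a rough Python sketch, emphatically not the Speos implementation: it integrates a spectral power distribution against simplified cone-sensitivity curves. The Gaussian curves and the amber spectrum are stand-ins invented for this example.

```python
# Conceptual sketch: estimate L, M and S cone responses to light reaching the eye.
# The Gaussian sensitivity curves are crude stand-ins for real cone fundamentals,
# used only to show the shape of the calculation.
import numpy as np

wavelengths = np.arange(380, 781, 5)  # visible range in nm, 5 nm bins

def gaussian(peak_nm, width_nm):
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

cone_sensitivity = {
    "L": gaussian(565, 50),  # long-wavelength cones
    "M": gaussian(540, 45),  # medium-wavelength cones
    "S": gaussian(445, 30),  # short-wavelength cones
}

def cone_responses(spectral_radiance):
    """Integrate a spectral distribution against each cone curve (Riemann sum)."""
    return {name: float((spectral_radiance * s).sum() * 5.0)
            for name, s in cone_sensitivity.items()}

# Hypothetical amber warning light with energy concentrated near 590 nm
amber_spd = gaussian(590, 20)
print(cone_responses(amber_spd))
```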
The visual simulation shows how glare affects the vision of drivers of different ages. Engineers can use this information to improve the car's windshield and human-machine interface (HMI).
For instance, engineers could simulate a car's dashboard and compare how light from oncoming headlights, readouts, windshield reflections, mirrors and the radio affects the driver's vision. These simulations can be tweaked to assess how the driver's age or visual impairments could affect the results.
With age, people tend to become more sensitive to glare. They also tend to perceive certain objects with a yellowish tint. Age often stiffens the eye's lens, causing close images and text to appear blurry. Speos can simulate all of these effects within its human eye model, which can also be tweaked to test color blindness and other visual impairments.
Based on this information, engineers can improve driver safety by iterating on the color, shape, brightness, glare and (when applicable) fonts of the car's HMI components.
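To get a feel for why age matters, the toy Python calculation below combines a Stiles-Holladay-style veiling-luminance estimate with a crude age factor and a simple "yellowing lens" attenuation of blue light. The constants are rough illustrative values, not numbers taken from the Speos eye model.

```python
# Rough illustration of two age effects mentioned above: increased glare
# sensitivity and a yellowing lens. All constants are illustrative.

def veiling_luminance(glare_illuminance_lux, glare_angle_deg, age_years):
    """Stiles-Holladay-style veiling luminance (cd/m^2) with a crude age factor."""
    age_factor = 1.0 + (age_years / 62.5) ** 4  # older lenses scatter more light
    return 10.0 * glare_illuminance_lux / glare_angle_deg ** 2 * age_factor

def yellowed_lens(rgb, age_years):
    """Attenuate blue light to mimic lens yellowing (purely illustrative)."""
    blue_transmission = max(0.4, 1.0 - 0.005 * max(0, age_years - 20))
    r, g, b = rgb
    return (r, g, round(b * blue_transmission, 3))

# Oncoming headlight 5 degrees off the line of sight, producing 20 lux at the eye
for age in (25, 55, 75):
    print(age, round(veiling_luminance(20.0, 5.0, age), 1),
          yellowed_lens((0.2, 0.4, 0.9), age))
```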
When engineers design an HMI — for an aircraft or a car — it's important to ensure the information will be perceived and understood by the end-user.
As a result, engineers need to fully experience their HMI designs. This experience helps them test and validate the human factors of the HMI and ensure it is optimized for every potential user.
To do this, engineers can use Ansys VRXPERIENCE’s HMI VR testing capabilities. These VR simulations can provide engineers with haptic feedback, visual simulations and eye/finger tracking data. This information can then be used to optimize the car’s HMI system to reduce eye strain, road distractions and more.
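As an example of what that tracking data can feed into, the Python sketch below scans a sequence of gaze samples for off-road glances longer than two seconds, a threshold often cited in driver-distraction guidance. The data format and field names are assumptions made for this illustration; they do not describe a VRXPERIENCE export.

```python
# Sketch of post-processing gaze samples from a VR HMI test. The GazeSample
# format and the 2 s threshold are assumptions for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    t: float       # seconds since the start of the drive
    on_road: bool  # True if gaze is on the road scene, False if on the HMI

def long_glances(samples: List[GazeSample], max_glance_s: float = 2.0) -> List[float]:
    """Return durations of off-road glances that exceed the threshold."""
    glances, start = [], None
    for s in samples:
        if not s.on_road and start is None:
            start = s.t                     # glance toward the HMI begins
        elif s.on_road and start is not None:
            glances.append(s.t - start)     # gaze returns to the road
            start = None
    if start is not None:                   # trial ended while looking at the HMI
        glances.append(samples[-1].t - start)
    return [d for d in glances if d > max_glance_s]

# Toy session: gaze leaves the road at t = 1.0 s and returns at t = 3.7 s
samples = [GazeSample(t / 10, not 10 <= t <= 36) for t in range(60)]
print(long_glances(samples))  # one off-road glance of about 2.7 s
```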
VRXPERIENCE also integrates with Ansys SCADE to bring the HMI's embedded software into the loop within the VR environment. Engineers can then run, test and interact with that embedded software directly in the simulation.
Ansys VRXPERIENCE can transform an HMI design into a virtual reality experience.