ANSYS BLOG
October 18, 2022
Augmented reality (AR) and virtual reality (VR) have come a long way since their brief introduction in the 1990s. Today, both have re-emerged in the gaming industry and other settings: as a source of family entertainment, to conduct virtual tours, or in training programs for civil servants, doctors, pilots, and more. Revenues from AR/VR are projected to reach $72.8 billion in 2024 (up from $12 billion in 2020).1 About 80% of this revenue comes directly from gaming.2
But consumers haven’t always been so excited about AR/VR technology. Early enthusiasm for AR/VR quickly faded. Many saw it as a combination of bad timing and bad luck, because the technology needed to support a good user experience just wasn’t there yet. Fast forward to today, and that missing piece has arrived in the form of eye-tracking technology, a chief enabler of AR/VR.
Eye tracking is the measurement of eye activity: it traces where the eyes are gazing and how pupil size changes. Activities such as blinking, following objects, and responding to stimuli are all considered valuable inputs to AR/VR user experiences.
It’s not difficult to understand why platforms like Google, Facebook, Apple, and Microsoft are pursuing AR/VR and eye tracking. Eye tracking helps their developers analyze users’ interests and predict their next move, enabling platforms to decrease response times. It also enhances the user experience in digital environments generated by AR/VR. Today, several tools are available for adoption, and many AR/VR devices, including the Oculus Rift, HoloLens 2, Magic Leap, and HTC Vive, already contain eye-tracking systems to enhance system optics even further.
Eye tracking data is driving AR/VR system development in numerous ways. The human eye only has high resolution around the fovea, the area of the eye responsible for sharp central vision, and much lower resolution across its wide-angle peripheral view. Knowing where the user is looking therefore lets a system render full detail only in that small foveal region and display everything else at lower resolution, saving computational resources.
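This idea, often called foveated rendering, is simple enough to sketch. The short Python example below is purely illustrative (the function name, angular thresholds, and resolution factors are assumptions, not taken from Ansys or any headset SDK): it maps a screen tile’s angular distance from the gaze direction to a relative shading resolution.

```python
import numpy as np

def shading_rate(gaze_dir, tile_dir, fovea_deg=5.0, mid_deg=15.0):
    """Choose a relative shading resolution for a screen tile based on its
    angular distance (eccentricity) from the current gaze direction.
    Thresholds and rates are illustrative only."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    tile_dir = tile_dir / np.linalg.norm(tile_dir)
    ecc = np.degrees(np.arccos(np.clip(np.dot(gaze_dir, tile_dir), -1.0, 1.0)))
    if ecc <= fovea_deg:   # foveal region: full resolution
        return 1.0
    if ecc <= mid_deg:     # near periphery: half resolution
        return 0.5
    return 0.25            # far periphery: quarter resolution

# A tile 20 degrees away from the gaze point gets quarter resolution.
gaze = np.array([0.0, 0.0, 1.0])
tile = np.array([np.sin(np.radians(20.0)), 0.0, np.cos(np.radians(20.0))])
print(shading_rate(gaze, tile))  # 0.25
```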
In turn, lower computational resource requirements mean less power consumption and heat generation, which is quite significant in terms of overall user comfort. Heat dissipation is one cause of bulkiness and heaviness in AR/VR devices. In this regard, eye tracking supports a more natural experience, leading to more comfortable AR/VR headgear designs and a better overall user experience.
A common problem for AR/VR systems is the vergence-accommodation conflict (VAC). It occurs when the distance at which your eyes converge on a virtual object and the distance at which they must focus don’t match, sending conflicting visual cues to the brain. In this scenario, looking at stereoscopic imagery can lead to focusing problems, eye strain, and visual fatigue.
Using the focus distance to tune the virtual image relieves VAC and helps AR/VR users have a better experience. To do this, you need eye tracking to tell the optical system exactly where the user is looking so it can set the correct focus. With this information, the system can place the virtual image at the focal plane matching the eyes’ point of convergence, canceling out these effects.
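As a rough sketch of that step (hypothetical code, not an Ansys or headset API), the point of convergence can be estimated as the 3D point closest, in a least-squares sense, to both eyes’ gaze rays; its distance from the eyes then gives the focus distance to target.

```python
import numpy as np

def convergence_point(p_left, d_left, p_right, d_right):
    """Estimate the point the user is looking at as the 3D point closest,
    in a least-squares sense, to both gaze rays (origin p, direction d)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in ((p_left, d_left), (p_right, d_right)):
        d = d / np.linalg.norm(d)
        proj = np.eye(3) - np.outer(d, d)  # projector onto the plane normal to d
        A += proj
        b += proj @ p
    return np.linalg.solve(A, b)

# Eyes 64 mm apart, both gaze rays aimed at a point 0.5 m straight ahead.
ipd = 0.064
target = np.array([0.0, 0.0, 0.5])
p_l, p_r = np.array([-ipd / 2, 0.0, 0.0]), np.array([ipd / 2, 0.0, 0.0])
x = convergence_point(p_l, target - p_l, p_r, target - p_r)
focus_distance = np.linalg.norm(x - (p_l + p_r) / 2)
print(x, focus_distance)  # ~[0, 0, 0.5] and ~0.5 m
```

In a varifocal or multifocal display, a distance estimated this way could then drive the tunable optics or select the focal plane nearest the convergence point.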
Interestingly, the human eye can be used to control an AR/VR device. In the Microsoft HoloLens 2, for example, a user’s eye-gaze input can rapidly and effortlessly deliver a contextual input signal that can influence and shape their own holographic experience.
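As a purely illustrative sketch (not the HoloLens 2 API), a gaze-driven interface can be as simple as a dwell-time rule: an item is selected once the gaze rests on it for long enough.

```python
import time

class DwellSelector:
    """Minimal dwell-based gaze selection: an item counts as selected once
    the gaze has stayed on it continuously for dwell_s seconds."""

    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s
        self.current = None   # item currently under the gaze
        self.since = 0.0      # time the gaze arrived on that item

    def update(self, gazed_item, now=None):
        """Call once per frame with the item the gaze ray currently hits."""
        now = time.monotonic() if now is None else now
        if gazed_item != self.current:          # gaze moved to a new item
            self.current, self.since = gazed_item, now
            return None
        if gazed_item is not None and now - self.since >= self.dwell_s:
            self.since = now                    # avoid re-firing every frame
            return gazed_item                   # selection event
        return None

selector = DwellSelector(dwell_s=0.8)
print(selector.update("menu_button", now=0.0))  # None: gaze just arrived
print(selector.update("menu_button", now=0.9))  # "menu_button": dwell reached
```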
As AR/VR headsets get smaller, the space available for eye-tracking systems shrinks. For certain AR/VR design types, the distance between the eye-tracking camera and the human eye is squeezed while the required field of view grows, all without any compromise in image quality.
How does Ansys connect the disciplines involved in AR/VR into one efficient workflow to address optical challenges like these? By offering a comprehensive software solution across the entire AR/VR optical module:
“An eye tracking system usually contains three parts: light source, lens, and sensor. Introducing Ansys Zemax Optical Design Software into a design workflow presents clear advantages in optimization and system tolerance for the user,” says Michael Cheng, Lead Application Engineer in the Ansys Optical group. “The software can provide a realistic simulation for eye tracking systems to help accurately predict optical system performance in the manufacturing stage to avoid costly production errors.”
For more information about Ansys software solutions for AR/VR, visit the Ansys Optics page.
Or watch the webinar: Design & Optimize Optical Components with Ansys Zemax