Virtual reality (VR) uses technology to create a completely simulated environment that a user can experience and interact with [1]. The hardware for virtual reality typically includes a computer capable of real-time scene simulation; wearable devices (e.g., haptic gloves) that sense and respond to the user's motions; a display for visual output; devices for audio feedback; and trackers for the body, head, and eyes.
Virtual reality optics include the cameras that capture raw data for scene simulation; fiber optics used in gloves and clothing to send and receive data; head-mounted displays (HMDs) that generate 3D perception; immersive and semi-immersive projection displays; and sensors that track the motion of the user and their eyes. Currently, the VR optics of most interest are HMDs, also known as near-eye displays.
For virtual reality to work, the head-mounted display must contain an optical system that projects the image from a display placed in front of the user's eyes. In this optical system, the HMD comprises light sources (the display), receivers (the eyes), and optical elements (lenses).
Figure 1. Source: [3]
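To make this concrete, the eyepiece of a simple magnifier-type HMD can be sketched with the thin-lens equation: the display sits just inside the focal length of the lens, which forms a magnified, upright virtual image at a comfortable viewing distance. The snippet below is a minimal sketch with assumed focal-length and display-distance values, not parameters of any particular headset.

```python
def virtual_image(focal_length_mm: float, display_dist_mm: float):
    """Thin-lens sketch of an HMD eyepiece (illustrative values only).

    The display sits just inside the focal length of the lens, so the lens
    forms a magnified, upright virtual image at a comfortable viewing
    distance. Uses 1/s_o + 1/s_i = 1/f (real-is-positive convention), where
    a negative image distance indicates a virtual image.
    """
    s_o = display_dist_mm
    s_i = 1.0 / (1.0 / focal_length_mm - 1.0 / s_o)  # negative -> virtual image
    magnification = -s_i / s_o                        # > 1: magnified, upright
    return abs(s_i), magnification

# Hypothetical numbers: a 40 mm focal-length eyepiece with the display 38 mm away
image_dist_mm, mag = virtual_image(focal_length_mm=40.0, display_dist_mm=38.0)
print(f"virtual image ~{image_dist_mm / 1000:.2f} m from the lens, ~{mag:.0f}x magnification")
```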
One of the most important requirements for HMDs is good ergonomic design, i.e., the headset must be comfortable to wear and to view through during prolonged use. To be comfortable to wear, the headset should be compact and lightweight; ideally, its weight and size should be no more than those of a pair of eyeglasses. To be comfortable to view, the headset should provide appropriate viewpoints based on the user's head position and gaze point. The headset should also have adequate eye clearance, a large-enough pupil (eye-box) size to allow for natural eye movement, an appropriate interpupillary distance (IPD), and low divergence and dipvergence.
The key optical design constraints for HMDs are pupil (eye-box) size, eye clearance, divergence, dipvergence, and IPD (See Figure 2).
Figure 2: Schematic diagram of biocular parallax.
(a) No biocular parallax; (b) convergence; (c) divergence; and (d) dipvergence. [4]
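As a rough illustration of how the pupil (eye-box) size and IPD interact, the sketch below checks whether a user's eye pupils stay inside a fixed-size eye box when the headset has no mechanical IPD adjustment. All numbers (eye-box width, pupil diameter, lens separation, IPD range) are assumed for illustration and are not taken from the article.

```python
def eyes_inside_eye_box(eye_box_mm: float, eye_pupil_mm: float,
                        lens_separation_mm: float, user_ipd_mm: float) -> bool:
    """Check whether both eye pupils stay inside the exit pupil (eye box).

    With no mechanical IPD adjustment, each eye sits laterally offset from
    its lens axis by half of the IPD mismatch; that offset must leave the
    eye pupil inside the eye box for the full image to remain visible.
    All parameter values below are illustrative assumptions.
    """
    offset_mm = abs(user_ipd_mm - lens_separation_mm) / 2.0
    margin_mm = (eye_box_mm - eye_pupil_mm) / 2.0
    return offset_mm <= margin_mm

# Sweep a typical adult IPD range (~54-74 mm) against a fixed 63 mm lens spacing
for ipd in (54.0, 63.0, 74.0):
    ok = eyes_inside_eye_box(eye_box_mm=12.0, eye_pupil_mm=4.0,
                             lens_separation_mm=63.0, user_ipd_mm=ipd)
    print(f"IPD {ipd:.0f} mm: {'OK' if ok else 'outside eye box'}")
```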
An important design goal for VR HMDs is to match the image characteristics of the human visual system. The field of view (FOV) of the human eye is roughly 120 degrees vertically and, when eye rotation and head movements are considered, almost 360 degrees horizontally. The binocular FOV within which an object is visible to both eyes is about 114 degrees [5].
Source: [6]
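For a simple magnifier-type eyepiece, the achievable FOV can be estimated from the display size and the eyepiece focal length as FOV ≈ 2·atan(w / 2f). The back-of-the-envelope sketch below uses assumed panel width and focal length, and shows why wide-FOV headsets need short focal lengths or large panels.

```python
import math

def hmd_fov_deg(display_width_mm: float, focal_length_mm: float) -> float:
    """Approximate horizontal FOV of a simple magnifier-type HMD eyepiece.

    Assumes the display sits near the focal plane, so a point at lateral
    position x maps to a field angle of atan(x / f); the full FOV is then
    2 * atan((display_width / 2) / focal_length). Distortion and pupil
    effects are ignored in this rough estimate.
    """
    return 2.0 * math.degrees(math.atan((display_width_mm / 2.0) / focal_length_mm))

# Hypothetical numbers: a 90 mm wide panel behind a 40 mm focal-length lens
print(f"{hmd_fov_deg(display_width_mm=90.0, focal_length_mm=40.0):.0f} degrees")  # ~97 degrees
```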
The effects of aberrations on image quality in HMDs are similar to those in other optical systems. Aberrations such as axial chromatic aberration, spherical aberration, coma, astigmatism, and field curvature introduce blur, while aberrations such as distortion, coma, and lateral chromatic aberration induce warping. Aberration control is therefore important in the design of VR HMD optics.
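Because residual distortion and lateral chromatic aberration warp the perceived image, VR rendering pipelines commonly pre-warp the rendered frame so that the view through the eyepiece appears rectilinear. The sketch below applies a simple two-coefficient radial (Brown-type) model to normalized image coordinates; the coefficients are illustrative only, and real headsets calibrate their own correction, often per color channel.

```python
def predistort(x: float, y: float, k1: float, k2: float) -> tuple[float, float]:
    """Radially rescale a normalized image coordinate (two-term Brown-type model).

    Rendering pipelines typically apply the inverse of the eyepiece's measured
    distortion when resampling the rendered frame, so the image seen through
    the lens appears rectilinear; the coefficients here are illustrative only.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Lateral chromatic aberration can be approximated the same way by using
# slightly different coefficients for the red, green, and blue channels.
print(predistort(0.5, 0.5, k1=0.22, k2=0.24))
```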
Advances in HMD design take advantage of aspheric surfaces, diffractive optical elements (DOEs), holographic optical elements (HOEs), tunable lenses, and plastic optics.
Optical design software is an important tool for designing VR optics, and there are multiple software needs in the design of a VR optical system. The optical engineer needs software to create and optimize the imaging system, analyze stray light in the optical path, and design diffractive optical elements. The mechanical engineer needs a CAD package to draw the system layout and perform thermal and structural analysis. The VR system may also require an electrical engineer to implement eye tracking and send the resulting signals to the optical system. Synopsys provides a complete set of tools to simulate AR/VR devices.
In a typical workflow, once the gratings have been built, the Bidirectional Scattering Distribution Function (BSDF) information and layout files can be exported directly to LightTools to define a surface property. All diffractive properties are included in the RSoft BSDF files, which contain information about how a surface (thin film, patterns, etc.) scatters light.
In VR, the display only needs to output the simulated environment. In augmented reality (AR), the display is often see-through to combine the simulated environment with the real environment.
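One simple way to see this difference: an optical see-through AR combiner mixes real-world and display light, so the luminance reaching the eye is roughly a weighted sum of the two, whereas in VR only the display contributes. The sketch below uses assumed transmission and reflectance values purely for illustration.

```python
def combiner_luminance(world_nits: float, display_nits: float,
                       transmission: float, reflectance: float) -> float:
    """Luminance reaching the eye through an optical see-through combiner.

    The combiner transmits a fraction of the real-world light and reflects a
    fraction of the display light toward the eye; in a VR headset only the
    display term is present. Values below are illustrative assumptions.
    """
    return transmission * world_nits + reflectance * display_nits

# Hypothetical 70/30 combiner viewing a 2000-nit outdoor scene with a 500-nit display
print(combiner_luminance(world_nits=2000.0, display_nits=500.0,
                         transmission=0.7, reflectance=0.3))  # 1550.0 nits
```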
There are a few differences between VR and AR optics.
References:
[1] Steuer, Jonathan. "Defining virtual reality: Dimensions determining telepresence." Journal of communication 42.4 (1992): 73-93.
[2] Chang, Chenliang, et al. "Toward the next-generation VR/AR optics: a review of holographic near-eye displays from a human-centric perspective." Optica 7.11 (2020): 1563-1578.
[3] "" Used with permission.
[4] Zhao, Z.; Cheng, D.; et al. "Design and evaluation of a biocular system," Applied Optics, Vol. 58, Issue 28, pp. 7851-7857 (2019), https://doi.org/10.1364/AO.58.007851. Used with permission. © The Optical Society.
[5] Rolland, Jannick P., and Hong Hua. "Head-mounted display systems." Encyclopedia of Optical Engineering 2 (2005).
[6] Kress, B. "." Used with permission.