Advances in Augmented- and Virtual-Reality Technologies and Applications
The first wave of AR and VR devices has reached the marketplace, but much work remains to be done before they can provide immersive and life-like experiences to mainstream users.
by Achintya K. Bhowmik
In recent years, virtual-reality (VR) and augmented-reality (AR) technologies have moved from the realms of science fiction and imagination to advanced research in academic laboratories, to product development in industry, and, finally, into the hands of consumers in the real world. A number of marquee VR devices have been launched, along with compelling immersive applications. A few AR devices and developer kits have been released as well. The pace of progress in both VR and AR technologies has been rapid.
So, in line with this fast-emerging trend in the ecosystem, the Society for Information Display (SID) decided to create a special track on AR and VR for Display Week 2016. The rich lineup at Display Week in San Francisco included a short course, a seminar, a number of invited and contributed presentations in the technical symposium, and demonstrations on the exhibit floor.
It is clear that the display industry is on the verge of another exciting phase of rejuvenation. Displays are the face of some of the most used electronic devices in our daily lives, such as the smartphone, tablet, laptop, monitor, and TV. As such, the health of the display industry rises and falls with the growth and saturation of these devices. Take the exciting phase of innovation in LCD-TV technologies as an example. Screen sizes went from 24 to 32 in., to 40 in., to 55 in., to 80 in. and above. Pixel resolution went from 720p to full HD and then to 4K, and frame rates went from 60 to 120 frames per second (fps). There were many more advances in contrast, brightness, color, etc. However, at some point further advances in display technology provide only small incremental benefits to the consumer. This often leads to reduced demand for new features and a slowdown in development.
Let us now turn to virtual reality, where the story at the moment is completely different. The displays on the best state-of-the-art VR devices today fall well short of the specifications required for truly immersive and responsive experiences, despite the dizzying pace of development. Pixel density needs to increase significantly and latencies must be reduced drastically, along with many other improvements: a wider field of view, a diminished screen-door effect through reduced non-emitting space between active pixels, lower pixel persistence, higher frame rates, etc. Beyond the display, these systems also require the integration of accurate, real-time sensing and tracking technologies as well as greater computation and processing power. AR devices impose additional requirements relating to see-through head-worn display technologies.
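To make the resolution gap concrete, consider a rough back-of-the-envelope estimate; the headset figures below are illustrative assumptions rather than measurements of any particular product. Human visual acuity of roughly one arcminute corresponds to about 60 pixels per degree, and matching that across a wide field of view demands far more pixels than current panels supply:

```python
# Back-of-the-envelope estimate of VR display resolution requirements.
# The headset numbers below are illustrative assumptions, not the
# specifications of any particular product.

ACUITY_PPD = 60    # ~1-arcminute visual acuity -> ~60 pixels per degree
fov_deg = 100      # assumed horizontal field of view per eye, in degrees
panel_px = 1080    # assumed horizontal pixels per eye (typical circa 2016)

required_px = ACUITY_PPD * fov_deg   # pixels needed to match visual acuity
actual_ppd = panel_px / fov_deg      # what the assumed panel delivers

print(f"Pixels per degree delivered: {actual_ppd:.1f} (target: {ACUITY_PPD})")
print(f"Horizontal pixels needed per eye: {required_px}")  # -> 6000
```

By this estimate, a panel delivers only about one-sixth of the acuity-matching pixel density, suggesting that pixel counts must grow several-fold in each direction before the display itself stops being the limiting factor.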
So, all this is exciting for the researchers and engineers in the industry. We are back to solving some difficult challenges, with a potential for big returns. Judging by the excellent quality of the papers, presentations, and exhibits at Display Week, it is obvious the display ecosystem is all geared up.
In this article, we present a summary of some of the advanced developments reported in the fields of VR and AR at Display Week 2016. For the sake of brevity, this is not comprehensive coverage of all the work presented at the conference; readers are encouraged to consult the symposium digest for all the relevant papers.
An Immersive Technical Program
The Sunday short course “Fundamentals of Head-Mounted Displays for Augmented and Virtual Reality”2 and the Monday seminar “Augmented and Virtual Reality: Towards Life-Like Immersive and Interactive Experiences”1 provided a comprehensive tutorial on the system-level requirements of VR and AR devices. Topics included fundamental human-factors considerations; a review of advances in sensing, computing, and display technologies; and a description of the need for an end-to-end system architecture for seamless immersive and interactive experiences. Besides these comprehensive reviews of the technologies and systems, the sessions also covered historical perspectives and definitions.
A VR device places the user in a virtual environment, generating visual, vestibular, auditory, haptic, and other sensory stimuli that provide the sensation of presence and immersion. An AR device places virtual objects in the real world while providing sensory cues to the user that are consistent between the physical and augmented elements. A new and emerging class of merged- and mixed-reality devices blends real-world elements into the virtual environment with consistent perceptual cues. The various presentations that followed reported developments in display technologies, sensing modules, systems, and application-level innovations.
A key challenge for the display subsystem in VR and AR devices is to provide the user with an immersive and life-like 3D visual experience. While currently available commercial devices provide stereopsis cues by presenting a pair of stereoscopic images to the left and right eyes, they cannot provide a number of other important 3D cues that are salient to how we perceive the real world. In Fig. 1 (from a symposium paper titled “Light Fields and Computational Optics for Near-to-Eye Displays”), Gordon Wetzstein depicted a number of depth cues that we use in 3D visual perception, including the oculomotor and visual cues.
Fig. 1: Humans use both oculomotor cues (vergence and accommodation) and visual cues (binocular disparity and retinal blur).3
Further, in his paper “Why Focus Cues Matter,” Martin Banks described the vergence–accommodation conflict, illustrated in Fig. 2, that causes visual discomfort in stereoscopic 3D displays. Using an experimental display, Banks et al. investigated how incorrect focus cues affect visual perception and comfort. The results show that the ability to perceive correct depth ordering improves significantly when focus cues are correct, that stimuli are easier to fuse binocularly when the vergence–accommodation conflict is minimized, and that visual comfort increases significantly when the conflict is eliminated.
Fig. 2: The vergence–accommodation conflict in traditional stereoscopic 3D displays arises from the difference between the distance at which the two eyes converge and the distance to which the eyes focus, or accommodate.4
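As a rough illustration of how the conflict can be quantified (the notation here is ours, not taken from the cited paper), the vergence and accommodation demands can be expressed in diopters, the reciprocal of distance in meters:

```latex
% Vergence demand V, accommodation demand A, and their conflict C,
% all in diopters (illustrative notation, not from the cited paper).
\[
  V = \frac{1}{d_{\mathrm{vergence}}}, \qquad
  A = \frac{1}{d_{\mathrm{screen}}}, \qquad
  C = \lvert V - A \rvert
\]
```

On a conventional stereoscopic display, the screen (or its virtual image through the headset optics) sits at a fixed distance, so C grows whenever rendered content departs from that distance; for example, content rendered at 0.5 m viewed through optics focused at 2 m produces a conflict of |2.0 − 0.5| = 1.5 D.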
After establishing the importance of producing correct focus cues, Philip Bos (in the paper “A Simple Method to Reduce Accommodation Fatigue in Virtual Reality and Augmented Reality Displays”), Banks, and Wetzstein presented various methods to address this issue. Bos et al. reported a simple approach that allows the eyes to accommodate at the distance to which they are converging, using eye tracking and a variable lens built with liquid crystals; a minimal sketch of such a control loop follows Fig. 4. Figure 3 shows the variable-focus capabilities of the liquid-crystal lens. Wetzstein presented a prototype display that employs two LCD panels separated by a small distance (Fig. 4), which is able to present a distinct light field to each eye, thereby creating a parallax effect spanning the eye-box areas.
Fig. 3: Results from an adjustable liquid-crystal lens include a lens image taken with voltage applied to provide a 400-mm focal length to focus the image (top left), an LC lens image with 0 V applied (right), and a glass lens image with a 400-mm focal length for reference (bottom).5
Fig. 4: A prototype light-field stereoscope comprising two stacked LCD panels.3
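The following is a minimal sketch of an eye-tracked varifocal control loop, in the spirit of the approach Bos et al. describe, with the base optics of the headset ignored for simplicity. The function names read_gaze_vergence_m and set_lens_power are hypothetical placeholders for the device's eye-tracker and LC-lens drivers, not any real API:

```python
# Hypothetical eye-tracked varifocal loop: drive a variable lens so that
# the accommodation distance matches the eyes' vergence distance.

def update_focus(read_gaze_vergence_m, set_lens_power):
    """Match the lens's focal demand to the eye-tracked vergence distance."""
    d = read_gaze_vergence_m()      # distance (m) at which the eyes converge
    d = max(d, 0.1)                 # clamp to avoid extreme/zero distances
    power_diopters = 1.0 / d        # accommodative demand for distance d
    set_lens_power(power_diopters)  # drive the LC lens to remove the conflict

# Example with stubbed hardware: eyes converged at 0.5 m -> lens set to 2.0 D.
update_focus(lambda: 0.5, lambda p: print(f"LC lens power: {p:.1f} D"))
```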
In some of the other developments, Shuxin Liu et al. presented a multi-plane volumetric optical see-through head-mounted 3D display using a stack of fast-switching polymer-stabilized liquid-crystal scattering shutters and implemented a proof-of-concept two-plane prototype.6 Jian Han et al. presented a prototype near-to-eye waveguide display with volume holograms and an optical collimator.7 Mitsuru Sugawara et al. described retinal-imaging laser eyewear that incorporates a miniature laser projector and delivers digital image information through the user's pupil, using the retina as the screen,8 as shown in Fig. 5. The authors described the benefits of focus-free image presentation independent of the wearer's visual acuity and point of focus, as well as laser-safety analyses based on applicable guidelines and standards.
Fig. 5: This retinal-imaging laser eyewear incorporates an asymmetric free-surface mirror.8
Beyond the advances in display technologies for VR and AR applications, presentations at Display Week also covered system-level innovations toward immersive and interactive usages. As an example, Philip Greenhalgh et al. presented the DAQRI smart helmet with AR capabilities (Fig. 6), designed for deployment in industrial environments. Greenhalgh's study showed that using AR technology to overlay contextually filtered data onto relevant physical spaces yielded significant benefits. The authors also demonstrated how embedded Intel RealSense camera technology can further refine the context of the data and enable advanced applications such as decluttering the background for object recognition, real-time sizing and measurement, hands-free human–machine interface controls, content access, etc.
Fig. 6: The DAQRI augmented-reality smart helmet9 is designed for industrial applications.
As another example, at Display Week the author of this article described Intel's project to integrate RealSense technology into merged-reality headsets. As shown in Fig. 7, this system adds real-time 3D-sensing capability to VR and AR devices with RGB-D imaging and visual-inertial motion-tracking technologies. As a result, the devices are able to blend real-world elements into the virtual world and vice versa, enabling new applications such as natural interactions, multi-room-scale mobility with integrated six-degrees-of-freedom positional tracking, real-time 3D scanning, and visual understanding. Additionally, Fig. 8 shows an application in which virtually rendered 3D objects are seamlessly embedded into the real-world view with correct physical effects such as collision, occlusion, and shadows.
Fig. 7: The “merged-reality” capability of a device with integrated Intel RealSense technology is shown here.10 The 3D images of the user's hands as well as of a person standing in front of the user are brought into the virtual world. This capability is also used to enable multi-room-scale mobility with integrated six-degrees-of-freedom positional tracking, helping the user avoid colliding with physical objects.
Fig. 8: The real physical world is augmented with virtually rendered 3D objects using a device with an embedded Intel RealSense module.10 Here, a digitally rendered car is shown racing on a real table, with realistic physical effects such as collisions with real objects, correct occlusion and shadows, etc.
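As a rough illustration of how occlusion effects like those in Fig. 8 can be computed, the sketch below composites a rendered object over an RGB-D camera view using a per-pixel depth comparison. The arrays and values are synthetic stand-ins; this is not the RealSense API, just the general depth-test idea:

```python
import numpy as np

def composite(real_rgb, real_depth, virt_rgb, virt_depth):
    """Show virtual pixels only where they are nearer than the real scene."""
    visible = virt_depth < real_depth   # per-pixel occlusion test
    out = real_rgb.copy()
    out[visible] = virt_rgb[visible]    # virtual object wins where it is nearer
    return out

# Synthetic 4x4 example: a virtual object at 1.0 m over a real wall at 2.0 m,
# partially hidden by a real object at 0.5 m in the top-left corner.
real_rgb = np.zeros((4, 4, 3), dtype=np.uint8)       # dark real-world view
virt_rgb = np.full((4, 4, 3), 255, dtype=np.uint8)   # bright virtual object
real_depth = np.full((4, 4), 2.0)
real_depth[:2, :2] = 0.5                             # nearer real object occludes
virt_depth = np.full((4, 4), 1.0)

print(composite(real_rgb, real_depth, virt_rgb, virt_depth)[..., 0])
```

The same per-pixel comparison generalizes to live depth streams, which is what lets a rendered car disappear behind a real object on the table rather than drawing over it.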
In summary, the special track on AR and VR at Display Week 2016 prominently featured some of the significant advances in these fields. While the rapid pace of development in recent years has brought the first wave of devices to market to an enthusiastic reception, much work remains to improve the technologies to the point where they can provide immersive and life-like experiences to mainstream users. As the papers, presentations, and discussions at Display Week indicate, researchers and developers in both industry and academia are diligently continuing to enhance the key technologies, including sensors, processors, displays, and system integration and applications.
The future will blur the border between the real and the virtual worlds, and we are nearing that future!
References
1. A. K. Bhowmik, “Augmented and Virtual Reality: Towards Life-Like Immersive and Interactive Experiences,” Seminar M-9, SID Display Week 2016.
2. H. Hua, “Fundamentals of Head-Mounted Displays for Augmented and Virtual Reality,” Short Course S-4, SID Display Week 2016.
3. G. Wetzstein, “Light Fields and Computational Optics for Near-to-Eye Displays,” Paper 28.3, SID Display Week 2016.
4. M. Banks, “Why Focus Cues Matter,” Paper 28.1, SID Display Week 2016.
5. P. J. Bos, L. Li, D. Bryant, A. Jamali, and A. K. Bhowmik, “A Simple Method to Reduce Accommodation Fatigue in Virtual Reality and Augmented Reality Displays,” Paper 28.2, SID Display Week 2016.
6. S. Liu, Y. Li, X. Li, P. Zhou, N. Rong, Y. Yuan, S. Huang, W. Lu, and Y. Su, “A Multi-Plane Volumetric Optical See-Through Head-Mounted 3D Display,” Paper 3.1, SID Display Week 2016.
7. J. Han, J. Liu, and Y. Wang, “Near-Eye Waveguide Display Based on Holograms,” Paper 3.2, SID Display Week 2016.
8. M. Sugawara, M. Suzuki, and N. Miyauchi, “Retinal Imaging Laser Eyewear with Focus-Free and Augmented Reality,” Paper 14.5L, SID Display Week 2016.
9. P. Greenhalgh, B. Mullins, A. Grunnet-Jepsen, and A. K. Bhowmik, “Industrial Deployment of a Full-Featured Head-Mounted Augmented-Reality System and the Incorporation of a 3D-Sensing Platform,” Paper 35.3, SID Display Week 2016.
10. A. K. Bhowmik, “Real-Time 3D-Sensing Technologies and Applications in Interactive and Immersive Devices,” Paper 35.1, SID Display Week 2016.
Achintya K. Bhowmik is the Vice-President and General Manager of the Perceptual Computing Group at Intel Corp. He can be reached at achintya.k.bhowmik@intel.com.