1. Zeng H, Zhao R. Perceptually-guided Dual-mode Virtual Reality System For Motion-adaptive Display. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2023; PP:2249-2257. [PMID: 37027616] [DOI: 10.1109/tvcg.2023.3247097]
Abstract
An ideal virtual reality (VR) device should simultaneously provide retina-level resolution, a wide field of view (FOV), and a high refresh rate, thereby bringing users into a deeply immersive virtual world. However, directly providing such a high-quality display poses great challenges for display panel fabrication, real-time rendering, and data transfer. To address this issue, we introduce a dual-mode virtual reality system based on the spatio-temporal perception characteristics of human vision. The proposed VR system has a novel optical architecture: it switches display modes according to the user's perceptual requirements in different display scenes, adaptively adjusting the spatial and temporal resolution under a given display budget and thus providing users with the optimal visual perception quality. In this work, a complete design pipeline for the dual-mode VR optical system is proposed, and a bench-top prototype is built with only off-the-shelf hardware and components to verify its capability. Compared to a conventional VR system, the proposed scheme uses the display budget more efficiently and flexibly, and this work is expected to facilitate the development of VR devices built around the human visual system.
2. Qiu Y, Zhao Z, Yang J, Cheng Y, Liu Y, Yang BR, Qin Z. Light field displays with computational vision correction for astigmatism and high-order aberrations with real-time implementation. OPTICS EXPRESS 2023; 31:6262-6280. [PMID: 36823887] [DOI: 10.1364/oe.485547] [Received: 01/12/2023] [Accepted: 01/29/2023]
Abstract
Vision-correcting near-eye displays are needed, given the large population with refractive errors. However, varifocal optics cannot effectively address astigmatism (AST) and high-order aberrations (HOAs), and freeform optics offers little prescription flexibility. Thus, a computational solution is desired that corrects AST and HOAs with high prescription flexibility and no increase in volume or hardware complexity. In addition, the computational complexity should support real-time rendering. We propose that the light field display can achieve such computational vision correction by manipulating sampling rays so that the rays forming a voxel are re-focused on the retina. The ray manipulation merely requires updating the elemental image array (EIA), making it a fully computational solution. The correction is first calculated from the eye's wavefront map and then refined by a simulator performing iterative optimization with a schematic eye model. Using examples of HOAs and AST, we demonstrate that corrected EIAs keep sampling rays distributed within ±1 arcmin on the retina. Correspondingly, the synthesized image is recovered to nearly as clear as normal vision. We also propose a new voxel-based EIA generation method that accounts for computational complexity. All voxel positions and the mapping between voxels and their homogeneous pixels are acquired in advance and stored as a lookup table, yielding an ultra-fast rendering speed of 10 ms per frame at no cost in computing hardware or rendering accuracy. Finally, experimental verification is carried out by introducing HOAs and AST with customized lenses in front of a camera. As a result, significantly recovered images are reported.
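The lookup-table rendering idea summarized above can be sketched roughly as follows (a minimal Python illustration; the function names, array shapes, and the random stand-in mapping are assumptions, not the authors' implementation): the voxel-to-pixel mapping is computed once offline, and each frame only scatters voxel colors through the table, with no per-frame geometry work.

```python
import numpy as np

def build_voxel_lut(num_voxels, pixels_per_voxel, eia_shape, rng=None):
    """Precompute a lookup table mapping each voxel to the EIA pixel
    indices that sample it. The random mapping here is a stand-in for
    the real optical mapping, which depends on lenslet geometry."""
    rng = rng if rng is not None else np.random.default_rng(0)
    flat = eia_shape[0] * eia_shape[1]
    return rng.integers(0, flat, size=(num_voxels, pixels_per_voxel))

def render_eia(voxel_colors, lut, eia_shape):
    """Per-frame rendering: scatter each voxel's color to its
    precomputed homogeneous pixels by indexing the table."""
    eia = np.zeros(eia_shape[0] * eia_shape[1])
    for v, pixels in enumerate(lut):
        eia[pixels] = voxel_colors[v]
    return eia.reshape(eia_shape)

lut = build_voxel_lut(num_voxels=100, pixels_per_voxel=9, eia_shape=(64, 64))
frame = render_eia(np.linspace(0.0, 1.0, 100), lut, (64, 64))
print(frame.shape)  # (64, 64)
```

The design point is that only `render_eia` runs per frame; everything depending on the wavefront correction and lenslet geometry is folded into the table ahead of time.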
3. Han W, Han J, Ju YG, Jang J, Park JH. Super multi-view near-eye display with a lightguide combiner. OPTICS EXPRESS 2022; 30:46383-46403. [PMID: 36558594] [DOI: 10.1364/oe.477517] [Received: 10/06/2022] [Accepted: 11/25/2022]
Abstract
We propose a lightguide-type super multi-view near-eye display that uses a digital micromirror device and an LED array. The proposed method presents three-dimensional images with a natural monocular depth cue using compact combiner optics consisting of a thin lightguide and holographic optical elements (HOEs). The feasibility of the proposed method is verified by optical experiments that demonstrate monocular three-dimensional image presentation over a wide depth range. We also analyze the degradation of image quality stemming from the spectral spread of the HOEs and show its reduction through a pre-compensation that exploits an adaptive moment estimation (Adam) optimizer.
4. Chakravarthula P, Zhang Z, Tursun O, Didyk P, Sun Q, Fuchs H. Gaze-Contingent Retinal Speckle Suppression for Perceptually-Matched Foveated Holographic Displays. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2021; 27:4194-4203. [PMID: 34449368] [DOI: 10.1109/tvcg.2021.3106433]
Abstract
Computer-generated holographic (CGH) displays show great potential and are emerging as next-generation displays for augmented and virtual reality and automotive head-up displays. One of the critical problems hindering the wide adoption of such displays is the speckle noise inherent to holography, which compromises quality by introducing perceptible artifacts. Although speckle suppression has been an active research area, previous work has not considered the perceptual characteristics of the Human Visual System (HVS), which receives the final displayed imagery. It is well established, however, that the sensitivity of the HVS is not uniform across the visual field, which has led to gaze-contingent rendering schemes for maximizing perceptual quality in various computer-generated imagery. Inspired by this, we present the first method that reduces the "perceived speckle noise" by integrating foveal and peripheral vision characteristics of the HVS, along with the retinal point spread function, into the phase hologram computation. Specifically, we introduce the anatomical and statistical retinal receptor distribution into our computational hologram optimization, which prioritizes reducing perceived foveal speckle noise while remaining adaptable to any individual's optical aberrations on the retina. Our method demonstrates superior perceptual quality on our emulated holographic display. Our evaluations with objective measurements and subjective studies demonstrate a significant reduction in human-perceived noise.
5. Kimura S, Iwai D, Punpongsanon P, Sato K. Multifocal Stereoscopic Projection Mapping. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2021; 27:4256-4266. [PMID: 34449374] [DOI: 10.1109/tvcg.2021.3106486]
Abstract
Stereoscopic projection mapping (PM) allows a user to see a three-dimensional (3D) computer-generated (CG) object floating over physical surfaces of arbitrary shapes using projected imagery. However, current stereoscopic PM technology satisfies only binocular cues and cannot provide correct focus cues, which causes a vergence-accommodation conflict (VAC). We therefore propose a multifocal approach to mitigate the VAC in stereoscopic PM. Our primary technical contribution is attaching electrically focus-tunable lenses (ETLs) to active shutter glasses to control both vergence and accommodation. Specifically, we apply fast, periodic focal sweeps to the ETLs, which causes the "virtual image" (in the optical sense) of a scene observed through the ETLs to move back and forth during each sweep period. A 3D CG object is projected from a synchronized high-speed projector only when the virtual image of the projected imagery is located at the desired distance, providing the observer with the required focus cues. In this study, we solve three technical issues unique to stereoscopic PM: (1) displaying the 3D CG object on non-planar and even moving surfaces; (2) showing the physical surfaces without the focus modulation; (3) synchronizing the shutter glasses with the ETLs and the projector. We also develop a novel compensation technique for the "lens breathing" artifact, in which focal-length modulation varies the retinal size of the virtual image. Further, using a proof-of-concept prototype, we demonstrate that our technique can present the virtual image of a target 3D CG object at the correct depth. Finally, we validate the advantage of our technique over conventional stereoscopic PM through a user study on a depth-matching task.
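The projector-ETL synchronization described above can be sketched as follows (an illustrative model with assumed names, a triangular diopter sweep, and arbitrary numbers — not the authors' control code): the ETL sweeps focal power periodically, and the projector fires only in the narrow slice of each period where the swept virtual-image depth matches the target.

```python
def sweep_diopters(t, period=1/60, d_min=0.5, d_max=3.0):
    """Triangular focal sweep of the ETL, in diopters, over one period."""
    phase = (t % period) / period
    tri = 2 * phase if phase < 0.5 else 2 * (1 - phase)
    return d_min + (d_max - d_min) * tri

def should_project(t, target_diopters, tolerance=0.05, period=1/60):
    """Fire the synchronized high-speed projector only when the swept
    virtual-image power is within tolerance of the target depth."""
    return abs(sweep_diopters(t, period) - target_diopters) < tolerance
```

Scanning `t` over one period shows two short firing windows per sweep (one on the up-ramp, one on the down-ramp), which is why a high-speed projector is needed: each depth is addressable only for a small fraction of the 60 Hz cycle.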
6. Gao C, Peng Y, Wang R, Zhang Z, Li H, Liu X. Foveated light-field display and real-time rendering for virtual reality. APPLIED OPTICS 2021; 60:8634-8643. [PMID: 34613088] [DOI: 10.1364/ao.432911] [Received: 06/01/2021] [Accepted: 08/02/2021]
Abstract
Glasses-free light field displays have progressed significantly due to advances in high-resolution microdisplays and high-end graphics processing units (GPUs). However, for near-eye light-field displays requiring portability, the fundamental trade-off in achievable spatial resolution remains: either retinal blur quality is degraded, or computational cost increases, which has prevented high-quality light fields from being synthesized quickly. By integrating off-the-shelf gaze-tracking modules into near-eye light-field displays, we present wearable virtual reality prototypes supporting focus cues oriented to the human visual system. An optimized, foveated light field is delivered to each eye according to the gaze point, providing more natural visual experiences than state-of-the-art solutions. Importantly, the factorization runtime can be reduced immensely, since the image resolution is high only within the gaze cone. In addition, we demonstrate significant improvements in computation and retinal blur quality over counterpart near-eye displays.
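The gaze-cone resolution allocation mentioned above can be illustrated with a simple per-pixel weight map (the names, linear falloff shape, and parameters are assumptions for illustration, not the paper's method): full rendering resolution inside the fovea around the tracked gaze point, decaying with eccentricity toward the periphery.

```python
import numpy as np

def foveation_mask(height, width, gaze_xy, fovea_radius, falloff):
    """Per-pixel rendering-resolution weight: 1.0 inside the gaze cone,
    decaying linearly with eccentricity outside it, clipped to [0, 1]."""
    ys, xs = np.mgrid[0:height, 0:width]
    r = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])  # distance from gaze point
    return np.clip(1.0 - (r - fovea_radius) / falloff, 0.0, 1.0)

mask = foveation_mask(128, 128, gaze_xy=(64, 64), fovea_radius=16, falloff=48)
print(mask[64, 64], mask[0, 0])  # 1.0 0.0
```

A factorization or rendering loop would then spend its per-frame budget proportionally to this weight, which is where the runtime savings come from.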
7. Ueda T, Iwai D, Sato K. IlluminatedZoom: spatially varying magnified vision using periodically zooming eyeglasses and a high-speed projector. OPTICS EXPRESS 2021; 29:16377-16395. [PMID: 34154202] [DOI: 10.1364/oe.427616] [Received: 04/15/2021] [Accepted: 05/07/2021]
Abstract
Spatial zooming and magnification, which control the size of only a portion of a scene while maintaining its context, is an essential interaction technique in augmented reality (AR) systems. It has been applied in various AR applications including surgical navigation, visual search support, and human behavior control. However, spatial zooming has been implemented only on video see-through displays and has not been supported by optical see-through displays; achieving spatial zooming of an observed real scene with near-eye optics is not trivial. This paper presents the first optical see-through spatial zooming glasses, which enable interactive control of the perceived sizes of real-world appearances in a spatially varying manner. The key to our technique is the combination of periodically fast-zooming eyeglasses and a synchronized high-speed projector. We stack two electrically focus-tunable lenses (ETLs) for each eyeglass and sweep their focal lengths to modulate the magnification periodically from one (unmagnified) to a higher value (magnified) at 60 Hz, in a manner that prevents the user from perceiving the modulation. We use a 1,000 fps high-speed projector to provide high-resolution spatial illumination for the real scene around the user. A portion of the scene that is to appear magnified is illuminated by the projector when the magnification is greater than one, while the rest is illuminated when the magnification equals one. Through experiments, we demonstrate spatial zooming results of up to 30% magnification using a prototype system. Our technique has the potential to expand the application field of spatial zooming interaction in optical see-through AR.
8. Zhang Q, Song W, Hu X, Hu K, Weng D, Liu Y, Wang Y. Design of a near-eye display measurement system using an anthropomorphic vision imaging method. OPTICS EXPRESS 2021; 29:13204-13218. [PMID: 33985060] [DOI: 10.1364/oe.421920] [Received: 02/04/2021] [Accepted: 04/05/2021]
Abstract
We developed a new near-eye display measurement system using anthropomorphic vision imaging to measure the key parameters of near-eye displays, including field-of-view (FOV), angular resolution, eye box, and virtual image depth. The characteristics of the human eye, such as pupil position, pupil size variation, accommodation function, and the high resolution of the fovea, are imitated by the proposed measurement system. A FOV scanning structure, together with a non-vignetting image-telecentric lens system, captures the virtual image from the near-eye display by imitating human eye function. As a proof-of-concept, a prototype device was used to obtain large-range, high-resolution measurements for key parameters of near-eye displays.
9. Song W, Li X, Zheng Y, Liu Y, Wang Y. Full-color retinal-projection near-eye display using a multiplexing-encoding holographic method. OPTICS EXPRESS 2021; 29:8098-8107. [PMID: 33820262] [DOI: 10.1364/oe.421439] [Received: 02/03/2021] [Accepted: 02/21/2021]
Abstract
We propose a novel method to construct an optical see-through retinal-projection near-eye display using the Maxwellian view and a holographic method. To provide a dynamic full-color virtual image, a single phase-only spatial light modulator (SLM) is employed in conjunction with a multiplexing-encoding holographic method. Holographic virtual images can be projected directly onto the retina through an optical see-through eyepiece. The virtual image remains sufficiently clear as the crystalline lens focuses at different depths, so the presented method can resolve the convergence-accommodation conflict in near-eye displays. To verify the proposed method, a proof-of-concept prototype was developed that provides vivid virtual images alongside real-world ones.
10. Aydındoğan G, Kavaklı K, Şahin A, Artal P, Ürey H. Applications of augmented reality in ophthalmology [Invited]. BIOMEDICAL OPTICS EXPRESS 2021; 12:511-538. [PMID: 33659087] [PMCID: PMC7899512] [DOI: 10.1364/boe.405026] [Received: 08/11/2020] [Revised: 12/08/2020] [Accepted: 12/10/2020]
Abstract
Throughout the last decade, augmented reality (AR) head-mounted displays (HMDs) have gradually become a substantial part of modern life, with increasing applications ranging from gaming and driver assistance to medical training. Owing to the tremendous progress in miniaturized displays, cameras, and sensors, HMDs are now used for the diagnosis, treatment, and follow-up of several eye diseases. In this review, we discuss the current state-of-the-art as well as potential uses of AR in ophthalmology. This review includes the following topics: (i) underlying optical technologies, displays and trackers, holography, and adaptive optics; (ii) accommodation, 3D vision, and related problems such as presbyopia, amblyopia, strabismus, and refractive errors; (iii) AR technologies in lens and corneal disorders, in particular cataract and keratoconus; (iv) AR technologies in retinal disorders including age-related macular degeneration (AMD), glaucoma, color blindness, and vision simulators developed for other types of low-vision patients.
Affiliation(s)
- Güneş Aydındoğan: Koç University, Department of Electrical Engineering and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
- Koray Kavaklı: Koç University, Department of Electrical Engineering and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
- Afsun Şahin: Koç University, School of Medicine and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
- Pablo Artal: Laboratorio de Óptica, Instituto Universitario de Investigación en Óptica y Nanofísica, Universidad de Murcia, Campus de Espinardo, E-30100 Murcia, Spain
- Hakan Ürey: Koç University, Department of Electrical Engineering and Translational Medicine Research Center (KUTTAM), Istanbul 34450, Turkey
11. Sun X, Zhang Y, Huang PC, Acharjee N, Dagenais M, Peckerar M, Varshney A. Correcting the Proximity Effect in Nanophotonic Phased Arrays. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS 2020; 26:3503-3513. [PMID: 32941146] [DOI: 10.1109/tvcg.2020.3023601]
Abstract
Thermally modulated Nanophotonic Phased Arrays (NPAs) can be used as phase-only holographic displays. Compared to holographic displays based on Liquid Crystal on Silicon Spatial Light Modulators (LCoS SLMs), NPAs have the advantages of an integrated light source and a high refresh rate. However, forming the desired wavefront requires accurate modulation of the phase, which is distorted by the thermal proximity effect. This problem has been largely overlooked, and existing approaches to similar problems are either slow or perform poorly in the setting of NPAs. We propose two new algorithms, based on the iterative phase retrieval algorithm and the proximal algorithm, to address this challenge. We carried out computational simulations to compare and contrast the algorithms in terms of image quality and computational efficiency. This work will benefit research on NPAs and enable the use of large-scale NPAs as holographic displays.
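For background, the iterative phase retrieval family that the above work builds on can be sketched with a minimal Gerchberg-Saxton-style loop (FFT propagation model, illustrative names, and no thermal proximity term — a textbook baseline, not the proposed algorithms):

```python
import numpy as np

def phase_retrieval(target_amplitude, iterations=50, seed=0):
    """Iteratively solve for a phase-only hologram whose far field
    approximates the target amplitude. Propagation is modeled as a
    plain FFT; no proximity-effect distortion is included."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        far = np.fft.fft2(np.exp(1j * phase))               # propagate to far field
        far = target_amplitude * np.exp(1j * np.angle(far))  # impose target amplitude
        near = np.fft.ifft2(far)                             # back-propagate
        phase = np.angle(near)                               # keep phase only
    return phase

target = np.zeros((32, 32))
target[12:20, 12:20] = 1.0  # bright square as the desired far-field image
hologram_phase = phase_retrieval(target)
```

The proximity-effect problem described in the abstract would enter this loop as a blur on the realized phase (neighboring emitters heating each other), which is why the plain loop above is insufficient for NPAs.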
12. Zhang Z, Liu J, Duan X, Wang Y. Enlarging field of view by a two-step method in a near-eye 3D holographic display. OPTICS EXPRESS 2020; 28:32709-32720. [PMID: 33114950] [DOI: 10.1364/oe.403538] [Received: 07/23/2020] [Accepted: 09/25/2020]
Abstract
The narrow field of view (FOV) has always been one of the main limitations hindering the development of holographic three-dimensional (3D) near-eye displays (NEDs). The complex amplitude modulation (CAM) technique is one way to realize real-time holographic 3D display with the advantage of high image quality. Previously, we applied the CAM technique to the design and integration of a compact, colorful 3D-NED system. In this paper, a viewing-angle-enlarged CAM-based 3D-NED system using an Abbe-Porter scheme and a curved reflective structure is proposed. The viewing angle is increased in two steps. An Abbe-Porter filter system, composed of a lens and a grating, enlarges the FOV in the first step while simultaneously realizing complex amplitude modulation. A curved reflective structure enlarges the FOV in the second step. In addition, the system retains the ability to present colorful 3D images with high image quality. Optical experiments show that the system presents a 45.2° diagonal viewing angle and supports dynamic display as well. A compact prototype was fabricated and integrated for a wearable, lightweight design.