1. Zhang W, Sang X, Gao X, Yu X, Gao C, Yan B, Yu C. A flipping-free 3D integral imaging display using a twice-imaging lens array. Opt Express 2019;27:32810-32822. PMID: 31684486. DOI: 10.1364/oe.27.032810.
Abstract
Integral imaging is a promising 3D visualization technique for reconstructing 3D medical scenes to enhance medical analysis and diagnosis. However, the use of lens arrays inevitably introduces flipped images beyond the field of view, which cannot reproduce the correct parallax relation. To avoid this flipping effect in optical reconstruction, an integral display based on a twice-imaging lens array is presented. The proposed lens arrangement, which consists of a light-controlling lens array, a field lens array and an imaging lens array, allows the light rays from each elemental image to pass only through its corresponding lens unit. The lens arrangement is optimized with a geometrical-optics method, and the proposed display system is experimentally demonstrated. A full-parallax 3D medical scene showing continuous viewpoint information without flipping is reconstructed within a 45° field of view.
2. Li Y, Sang X, Xing S, Guan Y, Yang S, Chen D, Yang L, Yan B. Real-time optical 3D reconstruction based on Monte Carlo integration and recurrent CNNs denoising with the 3D light field display. Opt Express 2019;27:22198-22208. PMID: 31510515. DOI: 10.1364/oe.27.022198.
Abstract
A general integral imaging generation method based on path-traced Monte Carlo (MC) integration and recurrent convolutional neural network denoising is presented. According to the optical layer structure of the three-dimensional (3D) light field display, screen pixels are encoded to specific viewpoints, and directional rays are cast from the viewpoints to the screen pixels to perform the path integral. During the integration, advanced illumination is used to generate a high-quality elemental image array (EIA). Recurrent convolutional neural networks are applied as a post-processing step on the EIA to eliminate the MC integration noise in the 3D image. The experiment realizes path tracing at 4K (3840 × 2160) resolution with 2 samples per pixel. Experimental results demonstrate that, within 10 frames, the structural similarity (SSIM) between the reconstructed and target 3D images exceeds 90% and the peak signal-to-noise ratio (PSNR) gain exceeds 10 dB. In addition, the real-time frame rate is above 30 fps, demonstrating high efficiency and quality in optical 3D reconstruction.
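The low-sample-count path integral this abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' code; the function name, signature, and the stand-in `radiance_fn` are assumptions.

```python
import random

def mc_pixel_estimate(radiance_fn, n_samples=2, rng=random.random):
    """Average n_samples jittered path samples for one encoded pixel.

    radiance_fn(u, v) stands in for a full path-traced radiance query
    along the directional ray cast for this pixel.
    """
    total = 0.0
    for _ in range(n_samples):
        # Jitter the sub-pixel sample position. The variance of the
        # estimate falls only as 1/n_samples, so a 2-spp render stays
        # noisy -- hence the recurrent-CNN denoising pass afterwards.
        total += radiance_fn(rng(), rng())
    return total / n_samples
```

With a constant radiance function the estimate is exact regardless of the sample count, which makes the averaging easy to check.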
3. Chen G, Ma C, Fan Z, Cui X, Liao H. Real-time lens based rendering algorithm for super-multiview integral photography without image resampling. IEEE Trans Vis Comput Graph 2018;24:2600-2609. PMID: 28961116. DOI: 10.1109/tvcg.2017.2756634.
Abstract
We propose a computer-generated integral photography (CGIP) method that employs a lens-based rendering (LBR) algorithm for super-multiview displays to achieve higher frame rates and better image quality without pixel resampling or view interpolation. The algorithm can utilize both fixed and programmable graphics pipelines to accelerate CGIP rendering and inter-perspective antialiasing. Two hardware prototypes were fabricated with two high-resolution liquid crystal displays and micro-lens arrays (MLA). Qualitative and quantitative experiments were performed to evaluate the feasibility of the proposed algorithm. To the best of our knowledge, the proposed LBR method outperforms state-of-the-art CGIP algorithms in rendering speed and image quality under our super-multiview hardware configurations. A demonstration experiment was also conducted to show the interactivity of a super-multiview display using the proposed algorithm.
4. Wen J, Yan X, Jiang X, Yan Z, Wang Y, Wang J. Nonlinear mapping method for the generation of an elemental image array in a photorealistic pseudoscopic free 3D display. Appl Opt 2018;57:6375-6382. PMID: 30117866. DOI: 10.1364/ao.57.006375.
Abstract
Limited available elemental image array resources may be the most severe bottleneck for the promotion and application of integral-imaging-based 3D display. We propose a nonlinear mapping method for generating an elemental image array that yields a photorealistic, pseudoscopic-free 3D display, based on the parallel light field reconstruction nature of the integral imaging system. All light rays emitted from the display panel are classified into parallel light fields according to their directions, and each parallel light field is captured as an orthogonal projection of the scene; the orthogonal projections are then synthesized into the final elemental image array using the nonlinear mapping method. Preliminary optical experiments as well as ray-optical analysis are conducted to prove the feasibility and validity of the proposed method. The proposed method can exploit most current 3D platforms and is an effective and efficient way to generate an elemental image array.
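The direction-to-pixel mapping described above can be sketched as follows: each orthogonal projection supplies the pixel with the matching direction index inside every elemental image, and flipping the direction index is one simple way to obtain an orthoscopic (pseudoscopic-free) arrangement. The array layout and the flip convention are illustrative assumptions, not the paper's actual mapping.

```python
import numpy as np

def synthesize_eia(ortho_views):
    """Assemble an elemental image array from orthogonal projections.

    ortho_views[u, v] is the orthogonal projection captured for the
    parallel light field of direction (u, v), one sample per lens unit.
    """
    n_u, n_v, n_y, n_x = ortho_views.shape
    eia = np.empty((n_y * n_u, n_x * n_v), dtype=ortho_views.dtype)
    for u in range(n_u):
        for v in range(n_v):
            # Scatter this projection to pixel offset (u, v) of every
            # elemental image; the index flip reverses the ray directions
            # so the reconstruction is orthoscopic rather than pseudoscopic.
            eia[n_u - 1 - u::n_u, n_v - 1 - v::n_v] = ortho_views[u, v]
    return eia
```

On a 2 × 2 direction grid with one lens, the four projection samples simply land in the elemental image in reversed order.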
5. Erdenebat MU, Kim BJ, Piao YL, Park SY, Kwon KC, Piao ML, Yoo KH, Kim N. Three-dimensional image acquisition and reconstruction system on a mobile device based on computer-generated integral imaging. Appl Opt 2017;56:7796-7802. PMID: 29047770. DOI: 10.1364/ao.56.007796.
Abstract
A mobile three-dimensional image acquisition and reconstruction system using a computer-generated integral imaging technique is proposed. A depth camera connected to the mobile device acquires the color and depth data of a real object simultaneously, and an elemental image array is generated from the object's three-dimensional information using the lens array specifications input into the mobile device. The three-dimensional visualization of the real object is reconstructed on the mobile display through optical or digital reconstruction methods. The proposed system is implemented successfully, and the experimental results confirm that it is an effective way to display real three-dimensional content on a mobile device.
6. Xing S, Sang X, Yu X, Duo C, Pang B, Gao X, Yang S, Guan Y, Yan B, Yuan J, Wang K. High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction. Opt Express 2017;25:330-338. PMID: 28085827. DOI: 10.1364/oe.25.000330.
Abstract
A highly efficient computer-generated integral imaging (CGII) method based on the backward ray-tracing technique is presented. In traditional CGII methods, the total rendering time is long because a large number of virtual cameras must be set up in the virtual world. With the backward ray-tracing technique, the ray origin and ray direction for every pixel in the elemental image array are calculated directly, so the total rendering time is noticeably reduced. The method is suitable for creating high-quality integral images without the pseudoscopic problem. Real-time and non-real-time CGII rendering and optical reconstruction are demonstrated, and the effectiveness is verified with different types of 3D object models. Real-time optical reconstruction with 90 × 90 viewpoints and a frame rate above 40 fps is realized for the CGII 3D display without the pseudoscopic problem.
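The per-pixel ray setup is the core of the backward approach: instead of rendering one camera per lens, each elemental-image pixel directly yields the origin and direction of the single ray to trace. A minimal pinhole-lens sketch under assumed parameter names (not the authors' implementation):

```python
import math

def backward_ray(px, py, lens_pitch_px, pitch_mm, gap_mm):
    """Origin and unit direction of the ray traced backward from one
    elemental-image pixel through the centre of its covering lens.

    lens_pitch_px: pixels per lens unit; pitch_mm: physical lens pitch;
    gap_mm: distance from display panel (z = 0) to lens array plane.
    """
    lx, ly = px // lens_pitch_px, py // lens_pitch_px
    pixel_mm = pitch_mm / lens_pitch_px
    # Pixel centre on the display plane.
    ox, oy = (px + 0.5) * pixel_mm, (py + 0.5) * pixel_mm
    # Centre of the lens covering this pixel, on the lens plane.
    cx, cy = (lx + 0.5) * pitch_mm, (ly + 0.5) * pitch_mm
    dx, dy, dz = cx - ox, cy - oy, gap_mm
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (ox, oy, 0.0), (dx / norm, dy / norm, dz / norm)
```

A pixel sitting exactly under its lens centre produces a ray pointing straight along the optical axis, which is an easy sanity check.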
7. Song MH, Jeong JS, Erdenebat MU, Kwon KC, Kim N, Yoo KH. Integral imaging system using an adaptive lens array. Appl Opt 2016;55:6399-6403. PMID: 27534485. DOI: 10.1364/ao.55.006399.
Abstract
We produced an adaptive lens array, composed of multiple flat lens arrays arranged in a curved shape with an adjustable radius of curvature, to overcome the hardware limitations of conventional flat or curved lens array-based systems. The manufactured adaptive lens array is applied to an integral imaging system. The gap mismatch that occurs when using a curved lens array is resolved by computing the exact display mapping position of the elemental images through each lens. The experimental results demonstrate that the adaptive lens array-based integral imaging system successfully generated elemental images according to the curvature of the adaptive lens array, and that these were reconstructed as 3D images.
8. Kwon KC, Jeong JS, Erdenebat MU, Piao YL, Yoo KH, Kim N. Resolution-enhancement for an orthographic-view image display in an integral imaging microscope system. Biomed Opt Express 2015;6:736-746. PMID: 25798299. PMCID: PMC4361429. DOI: 10.1364/boe.6.000736.
Abstract
Due to the limitations of micro-lens arrays and camera sensors, images shown on display devices through integral imaging microscope systems suffer from low resolution. In this paper, a resolution-enhanced orthographic-view image display method for integral imaging microscopy is proposed and demonstrated. Iterative intermediate-view reconstructions are performed with bilinear interpolation using neighboring elemental image information, and a graphics processing unit parallel processing algorithm is applied for fast image processing. The proposed method is verified experimentally, and the results are presented in this paper.
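The bilinear intermediate-view step above amounts to blending the four nearest elemental images at a fractional grid position. A sketch under assumed names, not the paper's GPU implementation:

```python
import numpy as np

def intermediate_view(ei00, ei10, ei01, ei11, tx, ty):
    """Bilinearly blend four neighbouring elemental images.

    ei00..ei11: the four neighbouring elemental images (2D arrays) at
    the corners of one grid cell; (tx, ty) in [0, 1]^2 is the fractional
    position of the synthesized intermediate view within that cell.
    """
    # Interpolate horizontally along the top and bottom edges,
    # then vertically between the two results.
    top = (1.0 - tx) * ei00 + tx * ei10
    bottom = (1.0 - tx) * ei01 + tx * ei11
    return (1.0 - ty) * top + ty * bottom
```

Iterating this over every cell of the elemental image array densifies the view grid, which is what enables the resolution-enhanced orthographic views.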
Affiliation(s)
- Ki-Chul Kwon, Munkh-Uchral Erdenebat, Yan-Ling Piao, Nam Kim: School of Information and Communication Engineering, Chungbuk National University, 1 Chungdae-ro, Seowon-gu, Cheongju, Chungbuk 362-763, South Korea
- Ji-Seong Jeong, Kwan-Hee Yoo: Department of Digital Informatics and Convergence, Chungbuk National University, 1 Chungdae-ro, Seowon-gu, Cheongju, Chungbuk 362-763, South Korea
9. Erdenebat MU, Kwon KC, Dashdavaa E, Piao YL, Yoo KH, Baasantseren G, Kim Y, Kim N. Advanced 360-degree integral-floating display using a hidden point removal operator and a hexagonal lens array. J Opt Soc Korea 2014. DOI: 10.3807/josk.2014.18.6.706.
10. Jung JH, Aloni D, Yitzhaky Y, Peli E. Active confocal imaging for visual prostheses. Vision Res 2014;111:182-196. PMID: 25448710. DOI: 10.1016/j.visres.2014.10.023.
Abstract
There are encouraging advances in prosthetic vision for the blind, including retinal and cortical implants and other "sensory substitution devices" that use tactile or electrical stimulation. However, they all have low resolution, a limited visual field, and can display only a few gray levels (limited dynamic range), severely restricting their utility. To overcome these limitations, image processing or the imaging system could emphasize objects of interest and suppress background clutter. We propose an active confocal imaging system based on light-field technology that enables a blind user of any visual prosthesis to efficiently scan, focus on, and "see" only an object of interest while suppressing interference from background clutter. The system captures three-dimensional scene information using a light-field sensor and displays only the in-focus plane and the objects in it. After a confocal image is captured, a de-cluttering process removes the clutter based on blur difference. In preliminary experiments, we verified the positive impact of confocal background-clutter removal on the recognition of objects in simulated phosphene images with low resolution and limited dynamic range. Using a custom-made multiple-camera system based on light-field imaging, we confirmed that the concept of a confocal de-cluttered image can be realized effectively.
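The blur-difference de-cluttering idea can be sketched as a simple sharpness mask on the refocused confocal image: blurred out-of-plane regions have low local gradient energy and are suppressed. The sharpness measure and threshold here are illustrative assumptions, not the paper's method:

```python
import numpy as np

def declutter(confocal, blur_threshold):
    """Zero pixels whose local gradient energy falls below a threshold.

    confocal: 2D refocused (in-focus plane) image; pixels belonging to
    blurred out-of-plane clutter have weak gradients and are removed.
    """
    gy, gx = np.gradient(confocal.astype(float))
    sharpness = np.hypot(gx, gy)
    # Keep only sharp (in-focus) pixels; zero the blurred clutter.
    return np.where(sharpness > blur_threshold, confocal, 0)
```

On a synthetic image with one sharp edge, only pixels near the edge survive the mask, mimicking how in-focus object boundaries are preserved while smooth clutter is suppressed.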
Affiliation(s)
- Jae-Hyun Jung, Eli Peli: Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, Boston, MA, USA
- Doron Aloni, Yitzhak Yitzhaky: Department of Electro-Optics Engineering, Ben-Gurion University of the Negev, Beer Sheva, Israel
11. Li SL, Wang QH, Xiong ZL, Deng H, Ji CC. Multiple orthographic frustum combing for real-time computer-generated integral imaging system. J Disp Technol 2014. DOI: 10.1109/jdt.2014.2315665.
12. Kwon KC, Jeong JS, Erdenebat MU, Lim YT, Yoo KH, Kim N. Real-time interactive display for integral imaging microscopy. Appl Opt 2014;53:4450-4459. PMID: 25090064. DOI: 10.1364/ao.53.004450.
Abstract
A real-time interactive orthographic-view image display for integral imaging (II) microscopy that includes the generation of intermediate-view elemental images (IVEIs) for resolution enhancement is proposed. Unlike conventional II microscopes, parallel processing on a graphics processing unit is used for the real-time display, generating the IVEIs and interactive orthographic-view images at high speed according to the user's interactive input. A real-time directional-view display of a specimen whose 3D information is acquired through II microscopy is successfully demonstrated using resolution-enhanced elemental image arrays, and the user-interactive requirement is also satisfied by the proposed display.