1. Kim J, Moon S, Jeong Y, Jang C, Kim Y, Lee B. Dual-dimensional microscopy: real-time in vivo three-dimensional observation method using high-resolution light-field microscopy and light-field display. Journal of Biomedical Optics 2018; 23:1-11. PMID: 29931838. DOI: 10.1117/1.jbo.23.6.066502.
Abstract
Here, we present dual-dimensional microscopy, which simultaneously captures both a two-dimensional (2-D) image and a light-field image of an in vivo sample, synthesizes an upsampled light-field image, and visualizes it with a computational light-field display system in real time. Compared with conventional light-field microscopy, the additional 2-D image greatly enhances the lateral resolution at the native object plane, up to the diffraction limit, and compensates for the image degradation there. The whole process from capture to display runs in real time with a parallel computation algorithm, which enables observation of the sample's three-dimensional (3-D) movement and direct interaction with the in vivo sample. We demonstrate a real-time 3-D interactive experiment with Caenorhabditis elegans.
Affiliation(s)
- Jonghyun Kim
- Seoul National University, School of Electrical and Computer Engineering, Seoul, Republic of Korea
- Seokil Moon
- Seoul National University, School of Electrical and Computer Engineering, Seoul, Republic of Korea
- Youngmo Jeong
- Seoul National University, School of Electrical and Computer Engineering, Seoul, Republic of Korea
- Changwon Jang
- Seoul National University, School of Electrical and Computer Engineering, Seoul, Republic of Korea
- Youngmin Kim
- Korea Electronics Technology Institute, VR/AR Research Center, Seoul, Republic of Korea
- Byoungho Lee
- Seoul National University, School of Electrical and Computer Engineering, Seoul, Republic of Korea
2. Soomro SR, Urey H. Integrated 3D display and imaging using dual purpose passive screen and head-mounted projectors and camera. Optics Express 2018; 26:1161-1173. PMID: 29401993. DOI: 10.1364/oe.26.001161.
Abstract
We propose an integrated 3D display and imaging system using a head-mounted device and a special dual-purpose passive screen that simultaneously facilitates 3D display and imaging. The screen is composed of two optical layers. The first is a projection surface of finely patterned retro-reflective microspheres, which provide high optical gain when illuminated by the head-mounted projectors. The second is an imaging surface made up of an array of curved mirrors, which forms the perspective views of the scene captured by a head-mounted camera. The display and imaging operations are separated by polarization multiplexing. The demonstrated prototype consists of a head-worn unit with a pair of 15-lumen pico-projectors and a 24 MP camera, and an in-house designed and fabricated 30 cm × 24 cm screen. The screen provides a bright display using 25%-filled retro-reflective microspheres and 20 different perspective views of the user/scene using a 5 × 4 array of convex mirrors. Real-time operation is demonstrated by displaying stereo-3D content with high brightness (up to 240 cd/m²) and low crosstalk (<4%), while 3D image capture is demonstrated by computational reconstruction of the discrete free-viewpoint stereo pair displayed on a desktop or virtual reality display. Furthermore, the capture quality is determined by measuring the imaging MTF of the captured views, and the capture light efficiency is calculated by considering the loss in transmitted light at each interface. Further developments in microfabrication and computational optics could make the proposed system a unique mobile platform for immersive human-computer interaction in the future.
3. Matsushima K, Sonobe N. Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects. Applied Optics 2018; 57:A150-A156. PMID: 29328140. DOI: 10.1364/ao.57.00a150.
Abstract
Digitized holography techniques are used to reconstruct three-dimensional (3D) images of physical objects using large-scale computer-generated holograms (CGHs). The object field is captured at three wavelengths over a wide area at high densities. Synthetic aperture techniques using single sensors are used for image capture in phase-shifting digital holography. The captured object field is incorporated into a virtual 3D scene that includes nonphysical objects, e.g., polygon-meshed CG models. The synthetic object field is optically reconstructed as a large-scale full-color CGH using red-green-blue color filters. The CGH has a wide full-parallax viewing zone and reconstructs a deep 3D scene with natural motion parallax.
4. Yi F, Jeoung Y, Moon I. Three-dimensional image authentication scheme using sparse phase information in double random phase encoded integral imaging. Applied Optics 2017; 56:4381-4387. PMID: 29047866. DOI: 10.1364/ao.56.004381.
Abstract
In recent years, many studies have focused on the authentication of two-dimensional (2D) images using double random phase encryption techniques. However, there has been little research on three-dimensional (3D) imaging systems, such as integral imaging, for 3D image authentication. We propose a 3D image authentication scheme based on a double random phase integral imaging method. All of the 2D elemental images captured through integral imaging are encrypted with a double random phase encoding algorithm, and only partial phase information is retained; all amplitude and remaining phase information in the encrypted elemental images is discarded. Nevertheless, we demonstrate that 3D images from integral imaging can be authenticated at different depths using a nonlinear correlation method. The proposed 3D image authentication algorithm provides enhanced information security because the 2D elemental images decrypted from the sparse phase cannot be easily observed by the naked eye. Additionally, using sparse phase images without any amplitude information greatly reduces data storage costs and aids image compression and data transmission.
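The scheme in this abstract combines three standard ingredients: double random phase encoding (DRPE), sparse-phase retention, and k-th-law nonlinear correlation. A minimal numerical sketch follows; it is an illustration under assumptions, not the authors' implementation — the function names, the FFT-based model of the two phase masks, the retained-pixel fraction (`keep`), and the nonlinearity exponent (`k`) are all choices made here.

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, m1, m2):
    # Double random phase encoding: one random phase mask applied in
    # the spatial domain, a second in the Fourier domain.
    return np.fft.ifft2(np.fft.fft2(img * np.exp(2j * np.pi * m1))
                        * np.exp(2j * np.pi * m2))

def drpe_decrypt(cipher, m1, m2):
    # Inverse transform with the conjugate phase keys.
    return (np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * m2))
            * np.exp(-2j * np.pi * m1))

def sparse_phase(cipher, keep=0.25):
    # Discard all amplitude information and keep the phase at only a
    # random fraction of the pixels (assumed fraction here).
    mask = rng.random(cipher.shape) < keep
    return np.exp(1j * np.angle(cipher)) * mask

def nonlinear_correlation(a, b, k=0.3):
    # k-th-law nonlinear correlation; a sharp, isolated peak
    # indicates a match between the two inputs.
    S = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    S = np.abs(S) ** k * np.exp(1j * np.angle(S))
    return np.abs(np.fft.ifft2(S)) ** 2
```

In this sketch, with the full complex ciphertext decryption is exact, while an image decrypted from only the sparse phase is visually unrecognizable; authentication would correlate that degraded decryption against a reference and look for a correlation peak.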
5. Dorado A, Martinez-Corral M, Saavedra G, Hong S. Computation and Display of 3D Movie From a Single Integral Photography. Journal of Display Technology 2016. DOI: 10.1109/jdt.2016.2522510.
6. Jang C, Lee CK, Jeong J, Li G, Lee S, Yeom J, Hong K, Lee B. Recent progress in see-through three-dimensional displays using holographic optical elements [Invited]. Applied Optics 2016; 55:A71-A85. PMID: 26835960. DOI: 10.1364/ao.55.000a71.
Abstract
The principles and characteristics of see-through 3D displays are presented. We focus in particular on the integral-imaging display system using a holographic optical element (IDHOE), which displays autostereoscopic 3D images while preserving the see-through property, combining high transparency with autostereoscopic 3D display capability. We analyze the optical properties of the IDHOE at both the recording and display stages. Furthermore, various studies of new applications and system improvements for the IDHOE are introduced. Thanks to the characteristics of the holographic volume grating, it is possible to implement a full-color lens-array holographic optical element, conjugated reconstruction, and a 2D/3D convertible IDHOE. Studies on improving viewing characteristics, including viewing angle, fill factor, and resolution, are also presented. Lastly, essential issues and their possible solutions are discussed as future work.
7. Jeong Y, Kim J, Yeom J, Lee CK, Lee B. Real-time depth controllable integral imaging pickup and reconstruction method with a light field camera. Applied Optics 2015; 54:10333-10341. PMID: 26836855. DOI: 10.1364/ao.54.010333.
Abstract
In this paper, we develop a real-time depth-controllable integral imaging system. With a high-frame-rate camera and a focus-controllable lens, light fields from various depth ranges can be captured. Depending on the image plane of the light field camera, objects in virtual and real space are recorded simultaneously. The captured light field information is converted to elemental images in real time without pseudoscopic problems. In addition, we derive the characteristics and limitations of the light field camera as a 3D broadcasting capture device using precise geometrical optics. This analysis shows that the implemented system provides more accurate light fields, without depth distortion, than existing devices. We adopt an f-number matching method at the capture and display stages to record a more exact light field and to solve the depth distortion, respectively. The algorithm allows users to adjust the pixel mapping structure of the reconstructed 3D image in real time. The proposed method suggests the possibility of a handheld real-time 3D broadcasting system that is cheaper and more widely applicable than previous methods.
8. Kim J, Hong JY, Hong K, Yang HK, Han SB, Hwang JM, Lee B. Glasses-free randot stereotest. Journal of Biomedical Optics 2015; 20:065004. PMID: 26057031. DOI: 10.1117/1.jbo.20.6.065004.
Abstract
We proposed a glasses-free randot stereotest using a multiview display system. We designed a four-view parallax barrier system and proposed the use of a random-dot multigram as a set of view images for the glasses-free randot stereotest. The glasses-free randot stereotest can be used to verify the effect of glasses on the stereopsis experience. Furthermore, the proposed system is convertible between two-view and four-view structures, so that the motion parallax effect can be verified within the system. We discussed the design principles and the method used to generate images in detail and implemented a glasses-free randot stereotest system with a liquid crystal display panel and a customized parallax barrier. We also developed graphical user interfaces and a calibration method for practical usage. We performed experiments with five adult subjects with normal vision. The experimental results show that the proposed system provides a stereopsis experience to the subjects and is consistent with the glasses-type randot stereotest and the Frisby–Davis test. The implemented system is free from monocular cues and provides binocular disparity only. The crosstalk of the system is about 6.42% for four-view and 4.17% for two-view, the time required for one measurement is less than 20 s, and the minimum angular disparity that the system can provide is about 23 arc sec.
Affiliation(s)
- Jonghyun Kim
- Seoul National University, School of Electrical Engineering, Gwanak-Gu Gwanakro 1, Seoul 151-744, Republic of Korea
- Jong-Young Hong
- Seoul National University, School of Electrical Engineering, Gwanak-Gu Gwanakro 1, Seoul 151-744, Republic of Korea
- Keehoon Hong
- Electronics and Telecommunications Research Institute (ETRI), Broadcasting & Telecommunications Media Research Laboratory, 218 Gajeong-ro, Yuseong-gu, Daejeon 305-700, Republic of Korea
- Hee Kyung Yang
- Seoul National University Bundang Hospital, Department of Ophthalmology, 300, Gumi-dong, Bundang-gu, Seongnam, Gyeonggi-do 463-707, Republic of Korea
- Sang Beom Han
- Kangwon National University Hospital, Department of Ophthalmology, 156 Baengnyeong-ro, Chuncheon, Kangwon 200-722, Republic of Korea
- Jeong-Min Hwang
- Seoul National University Bundang Hospital, Department of Ophthalmology, 300, Gumi-dong, Bundang-gu, Seongnam, Gyeonggi-do 463-707, Republic of Korea
- Byoungho Lee
- Seoul National University, School of Electrical Engineering, Gwanak-Gu Gwanakro 1, Seoul 151-744, Republic of Korea
9. Kim J, Lee CK, Jeong Y, Jang C, Hong JY, Lee W, Shin YC, Yoon JH, Lee B. Crosstalk-Reduced Dual-Mode Mobile 3D Display. Journal of Display Technology 2015. DOI: 10.1109/jdt.2014.2362798.
10. Jung JH, Aloni D, Yitzhaky Y, Peli E. Active confocal imaging for visual prostheses. Vision Research 2014; 111:182-196. PMID: 25448710. DOI: 10.1016/j.visres.2014.10.023.
Abstract
There are encouraging advances in prosthetic vision for the blind, including retinal and cortical implants, and other "sensory substitution devices" that use tactile or electrical stimulation. However, they all have low resolution, a limited visual field, and can display only a few gray levels (limited dynamic range), severely restricting their utility. To overcome these limitations, image processing or the imaging system could emphasize objects of interest and suppress background clutter. We propose an active confocal imaging system based on light-field technology that enables a blind user of any visual prosthesis to efficiently scan, focus on, and "see" only an object of interest while suppressing interference from background clutter. The system captures three-dimensional scene information using a light-field sensor and displays only the in-focus plane and the objects in it. After capturing a confocal image, a de-cluttering process removes the clutter based on blur differences. In preliminary experiments, we verified the positive impact of confocal-based background clutter removal on the recognition of objects in low-resolution, limited-dynamic-range simulated phosphene images. Using a custom-made multiple-camera system based on light-field imaging, we confirmed that the concept of a confocal de-cluttered image can be realized effectively.
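The de-cluttering step based on blur differences can be illustrated with a simple local focus measure. This is a sketch under assumptions, not the authors' algorithm: the Laplacian-energy sharpness measure, the window size, and the threshold are all choices made here (periodic boundaries via np.roll keep the sketch dependency-free).

```python
import numpy as np

def sharpness_map(img, win=7):
    # Squared Laplacian, box-filtered over a local window: a simple
    # local focus measure (high where detail is sharp, low where the
    # image is defocused or featureless).
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    e = lap ** 2
    out = np.zeros_like(e)
    r = win // 2
    for dy in range(-r, r + 1):          # box filter via shifted sums
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(e, dy, 0), dx, 1)
    return out / win ** 2

def declutter(img, thresh):
    # Keep only pixels whose local sharpness exceeds the threshold;
    # blurred background clutter is suppressed to black.
    return np.where(sharpness_map(img) > thresh, img, 0.0)
```

Applied to a confocal (in-focus-plane) image, pixels belonging to the focused object of interest survive while defocused clutter is zeroed out before the image is reduced to the prosthesis's resolution and dynamic range.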
Affiliation(s)
- Jae-Hyun Jung
- Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, Boston, MA, USA
- Doron Aloni
- Department of Electro-Optics Engineering, Ben-Gurion University of the Negev, Beer Sheva, Israel
- Yitzhak Yitzhaky
- Department of Electro-Optics Engineering, Ben-Gurion University of the Negev, Beer Sheva, Israel
- Eli Peli
- Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, Boston, MA, USA
11. Deng H, Wang QH, Luo CG, Liu CL, Li C. Accommodation and convergence in integral imaging 3D display. Journal of the Society for Information Display 2014. DOI: 10.1002/jsid.230.
Affiliation(s)
- Huan Deng
- School of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China
- Qiong-Hua Wang
- School of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China
- Cheng-Gao Luo
- School of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China
- Chun-Ling Liu
- Department of Ophthalmology, West China Hospital, Sichuan University, Chengdu 610041, China
- Chen Li
- Department of Ophthalmology, West China Hospital, Sichuan University, Chengdu 610041, China
12. Li SL, Wang QH, Xiong ZL, Deng H, Ji CC. Multiple Orthographic Frustum Combing for Real-Time Computer-Generated Integral Imaging System. Journal of Display Technology 2014. DOI: 10.1109/jdt.2014.2315665.
13. Kwon KC, Jeong JS, Erdenebat MU, Lim YT, Yoo KH, Kim N. Real-time interactive display for integral imaging microscopy. Applied Optics 2014; 53:4450-4459. PMID: 25090064. DOI: 10.1364/ao.53.004450.
Abstract
A real-time interactive orthographic-view image display for integral imaging (II) microscopy that includes the generation of intermediate-view elemental images (IVEIs) for resolution enhancement is proposed. Unlike conventional II microscopes, the system relies on parallel processing on a graphics processing unit for real-time display, generating the IVEIs and interactive orthographic-view images at high speed according to the user's interactive input. A real-time directional-view display of a specimen whose 3D information is acquired through II microscopy is successfully demonstrated using resolution-enhanced elemental image arrays. The proposed real-time interactive display for II microscopy thus also satisfies the requirement for user interactivity.
14. Yeom J, Hong K, Jeong Y, Jang C, Lee B. Solution for pseudoscopic problem in integral imaging using phase-conjugated reconstruction of lens-array holographic optical elements. Optics Express 2014; 22:13659-13670. PMID: 24921560. DOI: 10.1364/oe.22.013659.
Abstract
We propose an optical pseudoscopic-to-orthoscopic conversion method for integral imaging using a lens-array holographic optical element (LAHOE), which solves the pseudoscopic problem. The LAHOE reconstructs an array of diverging spherical waves when illuminated by a probe wave under the phase-conjugated condition, whereas an array of converging spherical waves is reconstructed under ordinary reconstruction. For given pseudoscopic elemental images, the array of diverging spherical waves integrates orthoscopic three-dimensional images without distortion. The principle of the proposed method is verified by experiments displaying integral imaging on the LAHOE using computer-generated and optically acquired elemental images.
15. Kim J, Jung JH, Jeong Y, Hong K, Lee B. Real-time integral imaging system for light field microscopy. Optics Express 2014; 22:10210-10220. PMID: 24921724. DOI: 10.1364/oe.22.010210.
Abstract
We propose a real-time integral imaging system for light field microscopy. To implement a live in-vivo 3D experimental environment for multiple experimentalists, we generate elemental images for an integral imaging display from the light field captured with a light field microscope in real time. We apply the f-number matching method to generate elemental images that reconstruct an undistorted 3D image. Our implemented system produces real and orthoscopic 3D images of micro-objects at 16 frames per second. We verify the proposed system via experiments using Caenorhabditis elegans.
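At its core, converting a captured light field into an elemental image array for an integral imaging display is a pixel remapping in which the angular and spatial indices swap roles. A minimal sketch follows; the array layout and function name are assumptions, and the f-number matching resampling that the abstract describes (which rescales each elemental image to the display lens geometry) is deliberately omitted.

```python
import numpy as np

def lightfield_to_elemental(lf):
    """Map a light field of shape (U, V, S, T) -- a U x V grid of
    angular views, each S x T pixels -- to one elemental image array.

    The elemental image behind the lenslet at spatial position (s, t)
    collects pixel (s, t) from every view, arranged by view index, so
    the angular and spatial indices simply exchange places."""
    U, V, S, T = lf.shape
    ei = lf.transpose(2, 3, 0, 1)            # (S, T, U, V)
    # Tile the S x T grid of U x V elemental images into one 2-D array.
    return ei.transpose(0, 2, 1, 3).reshape(S * U, T * V)
```

Reversing the view order before the remap (e.g., `lf[::-1, ::-1]`) is one common way to obtain an orthoscopic rather than pseudoscopic reconstruction; the paper's f-number matching additionally handles the capture/display geometry.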