1. Fadzli FE, Ismail AW, Abd Karim Ishigaki S. A systematic literature review: Real-time 3D reconstruction method for telepresence system. PLoS One 2023;18:e0287155. PMID: 37967080; PMCID: PMC10651044; DOI: 10.1371/journal.pone.0287155.
Abstract
Real-time three-dimensional (3D) reconstruction of real-world environments has many significant applications in various fields, including telepresence technology. As depth sensors such as Microsoft's Kinect series have become widely available, a new generation of telepresence systems can be developed by combining real-time 3D reconstruction methods with these devices. This combination enables users to engage with a remote person while remaining in their local area, and to control remote devices while viewing their 3D virtual representation. A telepresence experience could benefit numerous applications, including remote collaboration, entertainment, education, advertising, and rehabilitation. The purpose of this systematic literature review is to analyze recent advances in 3D reconstruction methods for telepresence systems and the significant related work in this field. We identify the input data used in the 3D reconstruction process and the technological devices employed to acquire them. From the included studies, we extract and assess the 3D reconstruction methods implemented in each telepresence system as well as how each system was evaluated. By analyzing and summarizing these dimensions, we discuss the input data used for the 3D reconstruction methods, the real-time 3D reconstruction methods implemented in the telepresence systems, and how the systems are evaluated. We conclude that real-time 3D reconstruction methods for telepresence systems have progressively improved over the years alongside advances in machines and devices such as Red Green Blue-Depth (RGB-D) cameras and Graphics Processing Units (GPUs).
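The abstract above describes RGB-D sensor data as the raw input to real-time 3D reconstruction. As a minimal illustration of that first step (not the review's own method), the sketch below back-projects a depth image to a 3D point cloud using a pinhole camera model; the intrinsics (`FX`, `FY`, `CX`, `CY`) are hypothetical values typical of a Kinect-style sensor.

```python
import numpy as np

# Hypothetical pinhole intrinsics for a Kinect-style RGB-D sensor:
# fx, fy are focal lengths in pixels; cx, cy is the principal point.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Back-project a depth image (in metres) to an N x 3 point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - CX) * z / FX                            # pinhole back-projection
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                        # drop invalid (zero-depth) pixels

# Example: a flat 4 x 4 depth image at 1 m everywhere
cloud = depth_to_point_cloud(np.ones((4, 4)))
```

Fusing such per-frame point clouds into a single surface model (e.g. via a truncated signed distance volume) is where the GPU-accelerated methods surveyed in the review come in.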
Affiliation(s)
- Fazliaty Edora Fadzli: Department of Emergent Computing, Faculty of Computing, Universiti Teknologi Malaysia (UTM), Johor Bahru, Johore, Malaysia; Mixed and Virtual Environment Research Lab (mivielab), ViCubeLab, Universiti Teknologi Malaysia (UTM), Johor Bahru, Johore, Malaysia
- Ajune Wanis Ismail: Department of Emergent Computing, Faculty of Computing, Universiti Teknologi Malaysia (UTM), Johor Bahru, Johore, Malaysia; Mixed and Virtual Environment Research Lab (mivielab), ViCubeLab, Universiti Teknologi Malaysia (UTM), Johor Bahru, Johore, Malaysia
- Shafina Abd Karim Ishigaki: Department of Emergent Computing, Faculty of Computing, Universiti Teknologi Malaysia (UTM), Johor Bahru, Johore, Malaysia; Mixed and Virtual Environment Research Lab (mivielab), ViCubeLab, Universiti Teknologi Malaysia (UTM), Johor Bahru, Johore, Malaysia
2. Ban S, Lee YJ, Kim KR, Kim JH, Yeo WH. Advances in Materials, Sensors, and Integrated Systems for Monitoring Eye Movements. Biosensors 2022;12:1039. PMID: 36421157; PMCID: PMC9688058; DOI: 10.3390/bios12111039.
Abstract
Eye movements are primary responses that reflect humans' voluntary intention and conscious selection. Because visual perception is one of the brain's fundamental sensory interactions, eye movements carry critical information regarding physical and psychological health, perception, intention, and preference. With the advancement of wearable device technologies, the performance of eye-movement monitoring has improved significantly, which has led to myriad applications for assisting and augmenting human activities. Among these, electrooculograms, measured by skin-mounted electrodes, have been widely used to track eye motions accurately. In addition, eye trackers that detect reflected optical signals offer an alternative that requires no wearable sensors. This paper provides a systematic summary of the latest research on materials, sensors, and integrated systems for monitoring eye movements and enabling human-machine interfaces. Specifically, we summarize recent developments in soft materials, biocompatible materials, manufacturing methods, sensor functions, system performance, and their applications in eye tracking. Finally, we discuss the remaining challenges and suggest research directions for future studies.
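The electrooculogram mentioned above exploits the corneo-retinal potential: the voltage across a horizontal electrode pair varies approximately linearly with gaze angle over a limited range. As a hedged sketch of that calibration principle (not any specific system from the review), the following fits a per-user linear model from fixations on known targets and inverts it to estimate gaze angle; the voltages are made-up example readings.

```python
import numpy as np

# EOG amplitude is roughly linear in horizontal gaze angle within a
# limited range; the slope (uV per degree) is user-specific and is found
# by having the subject fixate targets at known angles.

def calibrate(angles_deg, voltages_uv):
    """Least-squares fit of voltage = slope * angle + offset."""
    slope, offset = np.polyfit(angles_deg, voltages_uv, 1)
    return slope, offset

def gaze_angle(voltage_uv, slope, offset):
    """Invert the linear model to recover horizontal gaze angle (degrees)."""
    return (voltage_uv - offset) / slope

# Calibration fixations at known angles; voltages are hypothetical readings
slope, offset = calibrate([-30, -15, 0, 15, 30], [-420, -210, 5, 220, 430])
angle = gaze_angle(215.0, slope, offset)  # uV from a horizontal electrode pair
```

In practice the raw EOG also needs filtering for drift and blink artifacts, which is one of the signal-quality challenges the review discusses.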
Affiliation(s)
- Seunghyeb Ban: School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA; IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Yoon Jae Lee: IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA; School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Ka Ram Kim: IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA; George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Jong-Hoon Kim: School of Engineering and Computer Science, Washington State University, Vancouver, WA 98686, USA; Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
- Woon-Hong Yeo: IEN Center for Human-Centric Interfaces and Engineering, Institute for Electronics and Nanotechnology, Georgia Institute of Technology, Atlanta, GA 30332, USA; George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332, USA; Wallace H. Coulter Department of Biomedical Engineering, Georgia Tech and Emory University School of Medicine, Atlanta, GA 30332, USA; Neural Engineering Center, Institute for Materials, Institute for Robotics and Intelligent Machines, Georgia Institute of Technology, Atlanta, GA 30332, USA
3. Fujishiro T, Aoyama T, Hano K, Takasu M, Takeuchi M, Hasegawa Y. Microinjection System to Enable Real-Time 3D Image Presentation Through Focal Position Adjustment. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3067857.
4. Implementation and Optimization of a Dual-confocal Autofocusing System. Sensors 2020;20:3479. PMID: 32575631; PMCID: PMC7349031; DOI: 10.3390/s20123479.
Abstract
This paper describes the implementation and optimization of a dual-confocal autofocusing system that determines position in real time by measuring the response signal (i.e., intensity) at the system's front and rear focal points. The design strategy is new and systematic: by retrieving the characteristic curves experimentally, the same approach can be applied to other applications, and it has a good chance of becoming the standard for optimizing dual-confocal configurations. We adopt two indexes to predict system performance and find that the rear focal position and the physical design are the major factors. A laboratory-built prototype was constructed to validate the optimization. The experimental results showed that a total optical difference of 150 to 400 mm significantly affected the effective volume of the designed autofocusing system, and that the system's sensitivity is affected more by the position of the rear focal point than by that of the front focal point. In the final optimized setup, the rear and front focal lengths were set to 200 and 100 mm, respectively. In addition, the characteristic curve relating the focus error signal to position could be described by a sixth-order polynomial, allowing the exact position to be determined and the system to be applied straightforwardly to an accurate micro-optical autofocusing system.
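The sixth-order polynomial calibration described above can be sketched numerically: fit a degree-6 polynomial mapping the focus error signal back to stage position, then read position off the fitted curve. The S-shaped response below is synthetic, stand-in data, not the paper's measured characteristic curve.

```python
import numpy as np

# Synthetic calibration data: focus error signal s sampled at known
# stage positions z (um). The paper reports that a sixth-order
# polynomial z = p(s) captures the characteristic curve; here we fit
# such a model to a made-up monotonic S-curve.
z = np.linspace(-50, 50, 21)                 # stage positions, um
s = np.tanh(z / 30.0) + 0.02 * z / 50.0      # hypothetical S-curve response

coeffs = np.polyfit(s, z, 6)                 # sixth-order: position from signal
z_hat = np.polyval(coeffs, s)                # positions recovered from signal
rms_um = np.sqrt(np.mean((z_hat - z) ** 2))  # residual of the fit
```

Once `coeffs` is stored, a single intensity reading can be converted to a position estimate with `np.polyval`, which is what makes the real-time autofocusing loop possible.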