1
Ji Y, Park SM, Kwon S, Leem JW, Nair VV, Tong Y, Kim YL. mHealth hyperspectral learning for instantaneous spatiospectral imaging of hemodynamics. PNAS Nexus 2023; 2:pgad111. PMID: 37113981; PMCID: PMC10129064; DOI: 10.1093/pnasnexus/pgad111. [Received: 11/15/2022; Accepted: 03/23/2023]
Abstract
Hyperspectral imaging acquires data in both the spatial and frequency domains to offer abundant physical or biological information. However, conventional hyperspectral imaging has intrinsic limitations of bulky instruments, slow data acquisition rates, and a spatiospectral trade-off. Here we introduce hyperspectral learning for snapshot hyperspectral imaging, in which sampled hyperspectral data in a small subarea are incorporated into a learning algorithm to recover the hypercube. Hyperspectral learning exploits the idea that a photograph is more than merely a picture and contains detailed spectral information. A small sampling of hyperspectral data enables spectrally informed learning to recover a hypercube from a red-green-blue (RGB) image without complete hyperspectral measurements. Hyperspectral learning is capable of recovering full spectroscopic resolution in the hypercube, comparable to the high spectral resolution of scientific spectrometers. Hyperspectral learning also enables ultrafast dynamic imaging, leveraging ultraslow-motion video recording in an off-the-shelf smartphone, given that a video comprises a time series of multiple RGB images. To demonstrate its versatility, an experimental model of vascular development is used to extract hemodynamic parameters via statistical and deep learning approaches. Subsequently, the hemodynamics of peripheral microcirculation is assessed at an ultrafast temporal resolution of up to a millisecond, using a conventional smartphone camera. This spectrally informed learning method is analogous to compressed sensing; however, it further allows for reliable hypercube recovery and key feature extraction with a transparent learning algorithm. This learning-powered snapshot hyperspectral imaging method yields high spectral and temporal resolutions and eliminates the spatiospectral trade-off, offering simple hardware requirements and potential applications of various machine learning techniques.
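The core idea above, learning an RGB-to-spectrum mapping from a small sampled subarea and applying it across the whole image, can be illustrated with a minimal sketch. Everything here is synthetic and illustrative: the 31-band hypercube dimension, the random camera response matrix, and the ridge-regularized linear solver are assumptions for demonstration, not the authors' actual learning algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: RGB (3 channels) -> 31 spectral bands.
n_bands, n_train, n_pixels = 31, 200, 1000

# Synthetic stand-in for hyperspectral data sampled in a small subarea,
# and the RGB values those pixels would produce under an assumed
# (here random) camera spectral response.
spectra_train = rng.random((n_train, n_bands))
response = rng.random((n_bands, 3))      # assumed camera response matrix
rgb_train = spectra_train @ response     # (n_train, 3)

# Ridge-regularized least squares: learn a linear RGB -> spectrum mapping
# from the sampled subarea only.
lam = 1e-3
A = rgb_train
W = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ spectra_train)  # (3, n_bands)

# Apply the learned mapping to every pixel of the full RGB image
# to recover a hypercube without full hyperspectral measurements.
rgb_full = rng.random((n_pixels, 3))
hypercube = rgb_full @ W                 # (n_pixels, n_bands)
```

The same mapping could be applied frame by frame to a smartphone video, which is how a time series of RGB images yields dynamic spectral information.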
Affiliation(s)
- Yuhyun Ji
- Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907, USA
- Sang Mok Park
- Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907, USA
- Semin Kwon
- Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907, USA
- Jung Woo Leem
- Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907, USA
- Yunjie Tong
- Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907, USA
- Young L Kim
- Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN 47907, USA
- Purdue Institute for Cancer Research, Purdue University, West Lafayette, IN 47906, USA
- Regenstrief Center for Healthcare Engineering, Purdue University, West Lafayette, IN 47907, USA
- Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, IN 47907, USA
2
Bai X, Zhu Z, Schwing A, Forsyth D, Gruev V. Angle of polarization calibration for omnidirectional polarization cameras. Opt Express 2023; 31:6759-6769. PMID: 36823926; DOI: 10.1364/oe.483337. [Received: 12/12/2022; Accepted: 01/24/2023]
Abstract
Polarization cameras quantify one of the fundamental properties of light and capture intrinsic properties of the imaged environment that are otherwise omitted by color sensors. Many polarization applications, such as underwater geolocalization and sky-based polarization compasses, require simultaneous imaging of the entire radial optical field with omnidirectional lenses. However, the reconstructed angle of polarization captured with omnidirectional lenses has a radial offset due to redirection of the light rays within these lenses. In this paper, we describe a calibration method for correcting angle of polarization images captured with omnidirectional lenses. Our calibration method reduces the variance of the reconstructed angle of polarization from 76.2° to 4.1°. Example images collected both on an optical bench and in nature demonstrate the improved accuracy of the reconstructed angle of polarization with our calibration method. The improved accuracy in the angle of polarization images will aid the development of polarization-based applications with omnidirectional lenses.
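The radial offset described above can be sketched with a minimal example. The standard reconstruction AoP = ½·arctan2(S2, S1) from the Stokes parameters is textbook; the correction function below, which subtracts each pixel's azimuth about an assumed optical center, is a simplified stand-in for the paper's per-lens calibration, with the center coordinates as hypothetical inputs.

```python
import numpy as np

def angle_of_polarization(s1, s2):
    """Reconstruct the angle of polarization (radians, in [-pi/2, pi/2))
    from Stokes parameter images S1 and S2."""
    return 0.5 * np.arctan2(s2, s1)

def correct_radial_offset(aop, cx, cy):
    """Subtract each pixel's azimuth about the optical center (cx, cy).
    Simplified stand-in for the per-lens radial-offset calibration."""
    h, w = aop.shape
    yy, xx = np.mgrid[0:h, 0:w]
    azimuth = np.arctan2(yy - cy, xx - cx)
    corrected = aop - azimuth
    # Wrap back into the AoP range [-pi/2, pi/2); AoP is pi-periodic.
    return (corrected + np.pi / 2) % np.pi - np.pi / 2
```

A real calibration would fit the offset per lens model rather than assume a pure azimuthal term, but the structure (reconstruct AoP, then remove a radially varying angular offset) matches the method described.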