1. Wang N, Zhu L, Yuan Q, Ge X, Gao Z, Wang S, Yang P. Performance of the neural network-based prediction model in closed-loop adaptive optics. OPTICS LETTERS 2024; 49:2926-2929. PMID: 38824294. DOI: 10.1364/ol.527429.
Abstract
Adaptive optics (AO) is an effective means of compensating for atmospheric turbulence, but the inherent delay error of an AO system causes the compensation phase of the deformable mirror (DM) to lag behind the actual distortion, limiting the correction performance of AO. Feed-forward prediction of atmospheric turbulence therefore has important research value and practical significance for offsetting the inherent time delay and improving the correction bandwidth of the AO system. However, most prediction algorithms have been limited to open-loop systems; deployment in actual AO systems is rarely reported, so their correction performance improvements have not been verified in practice. We report, to our knowledge, the first successful test of a deep learning-based spatiotemporal prediction model in an actual 3 km laser atmospheric transport AO system and compare it with traditional closed-loop control methods, demonstrating that the AO system with the prediction model achieves higher correction performance.
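The servo-lag problem described in this abstract can be illustrated with a toy one-dimensional sketch (our own illustration with made-up numbers, not the paper's model): a non-predictive loop applies the last measured phase, while a simple linear extrapolator predicts one frame ahead and leaves a smaller residual on smoothly evolving turbulence.

```python
import numpy as np

# Toy turbulence: a smoothly evolving phase value sampled at the loop rate.
t = np.arange(2000)
phi = np.sin(2 * np.pi * t / 400) + 0.3 * np.sin(2 * np.pi * t / 130)

# Non-predictive correction: apply the phase measured one frame ago.
resid_lag = phi[2:] - phi[1:-1]

# Predictive correction: linear extrapolation from the last two frames.
pred = 2 * phi[1:-1] - phi[:-2]
resid_pred = phi[2:] - pred

rms = lambda x: np.sqrt(np.mean(x**2))
print(rms(resid_lag), rms(resid_pred))  # the predictive residual is smaller
```

A learned spatiotemporal predictor, as in the paper, replaces the linear extrapolation with a network that also exploits spatial structure in the wavefront.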
2. Hyde MW, McCrae JE, Kalensky M, Spencer MF. "Hidden phase" in two-wavelength adaptive optics. APPLIED OPTICS 2024; 63:E1-E9. PMID: 38856586. DOI: 10.1364/ao.516039.
Abstract
Two-wavelength adaptive optics (AO), where sensing and correcting (from a beacon) are performed at one wavelength λ_B and compensation and observation (after transmission through the atmosphere) are performed at another λ_T, has historically been analyzed and practiced assuming negligible irradiance fluctuations (i.e., weak scintillation). Under these conditions, the phase corrections measured at λ_B are robust over a relatively large range of wavelengths, resulting in a negligible decrease in AO performance. In weak-to-moderate scintillation conditions, which result from distributed-volume atmospheric aberrations, the pupil-phase function becomes discontinuous, producing what Fried called the "hidden phase" because it is not sensed by traditional least-squares phase reconstructors or unwrappers. Neglecting the hidden phase significantly degrades AO performance even with perfect least-squares phase compensation. To the authors' knowledge, the hidden phase has not been studied in the context of two-wavelength AO. In particular, how does the hidden phase sensed at λ_B relate to the compensation (or observation) wavelength λ_T? If the hidden phase is highly correlated across λ_B and λ_T, like the least-squares phase, it is worth sensing and correcting; otherwise, it is not. Through a series of wave-optics simulations, we find an approximate expression for the hidden-phase correlation coefficient as a function of λ_B, λ_T, and the scintillation strength. In contrast to the least-squares phase, we determine that the hidden phase (when present) is correlated over a small band of wavelengths centered on λ_T. Over the range λ_B, λ_T ∈ [1, 3] µm and in weak-to-moderate scintillation conditions (spherical-wave log-amplitude variance σ_χ² ∈ [0.1, 0.5]), we find the average hidden-phase correlation linewidth to be approximately 0.35 µm. Consequently, for |λ_B − λ_T| greater than this linewidth, including the hidden phase does not significantly improve AO performance over least-squares phase compensation.
3. Dayton DC, Spencer MF. Scaled-laboratory demonstrations of deep-turbulence conditions. APPLIED OPTICS 2024; 63:E54-E63. PMID: 38856592. DOI: 10.1364/ao.520208.
Abstract
This paper uses five spatially distributed reflective liquid-crystal phase modulators (LcPMs) to accurately simulate deep-turbulence conditions in a scaled-laboratory environment. In practice, we match the Fresnel numbers for long-range, horizontal-path scenarios using optical trombones and relays placed between the reflective LcPMs. Similar to computational wave-optic simulations, we also command repeatable high-resolution phase screens to the reflective LcPMs with the proper path-integrated spatial and temporal Kolmogorov statistics.
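Phase screens with path-integrated Kolmogorov statistics, as commanded to the LcPMs above, are commonly synthesized with an FFT-based spectral method. A minimal sketch follows; the grid size, sample spacing, r0, and the normalization convention are our own illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def kolmogorov_screen(n=128, dx=0.01, r0=0.1, seed=0):
    """FFT-based phase screen with a Kolmogorov spectrum ~ k^(-11/3).

    Constant and scaling follow one common discrete convention; absolute
    calibration depends on the units chosen for k and r0.
    """
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, dx)
    kx, ky = np.meshgrid(fx, fx)
    k = np.hypot(kx, ky)
    k[0, 0] = 1.0                       # avoid division by zero at DC
    psd = 0.023 * r0 ** (-5.0 / 3.0) * k ** (-11.0 / 3.0)
    psd[0, 0] = 0.0                     # remove piston
    cn = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    dk = 1.0 / (n * dx)                 # frequency-grid spacing
    # Each Fourier mode gets amplitude sqrt(PSD * dk^2); sum the modes.
    screen = np.real(np.fft.ifft2(cn * np.sqrt(psd) * dk)) * n * n
    return screen

screen = kolmogorov_screen()
```

Temporal evolution with the proper statistics is often obtained by translating such screens across the aperture (frozen-flow hypothesis).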
4. Zhuang Y, Wang D, Deng X, Lin S, Zheng Y, Guo L, Zhang Y, Huang L. High robustness single-shot wavefront sensing method using a near-field profile image and fully-connected retrieval neural network for a high power laser facility. OPTICS EXPRESS 2023; 31:26990-27005. PMID: 37710547. DOI: 10.1364/oe.496020.
Abstract
This paper proposes a single-shot, high-robustness wavefront sensing method based on deep learning for wavefront distortion measurement in high-power lasers. The method achieves fast and robust wavefront retrieval from a single-shot near-field profile image and a trained network. The deep-learning network uses fully-skip cross connections to extract and integrate multi-scale feature maps from various layers and stages, which improves wavefront retrieval speed and enhances robustness. Numerical simulation shows that the method can directly predict the wavefront distortion of high-power lasers with high accuracy. Experimentally, the residual RMS between the method and a Shack-Hartmann wavefront sensor is less than 0.01 µm. The simulation and experimental results show that the method accurately predicts the incident wavefront distortion in high-power lasers, exhibiting high speed and good robustness in wavefront retrieval.
5. Chen H, Zhang H, He Y, Wei L, Yang J, Li X, Huang L, Wei K. Direct wavefront sensing with a plenoptic sensor based on deep learning. OPTICS EXPRESS 2023; 31:10320-10332. PMID: 37157581. DOI: 10.1364/oe.481433.
Abstract
Traditional plenoptic wavefront sensors (PWS) suffer from an abrupt step change in the slope response, which degrades phase retrieval. In this paper, a neural network model combining the transformer architecture with the U-Net model is used to restore the wavefront directly from the plenoptic image of the PWS. Simulation results show that the averaged root mean square error (RMSE) of the residual wavefront is less than λ/14 (Marechal criterion), demonstrating that the proposed method overcomes the nonlinearity inherent in PWS wavefront sensing. In addition, our model outperforms recently developed deep-learning models and the traditional modal approach. The robustness of our model to turbulence strength and signal level is also tested, demonstrating good generalizability. To the best of our knowledge, this is the first direct wavefront detection with a deep-learning-based method in PWS-based applications, achieving state-of-the-art performance.
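The λ/14 Marechal criterion quoted above corresponds, via the extended Marechal approximation, to a Strehl ratio of about 0.8, the conventional diffraction-limited threshold. A quick check of that correspondence:

```python
import math

def strehl_from_rms(rms_waves):
    """Extended Marechal approximation: S ~ exp(-(2*pi*sigma/lambda)^2),
    with the residual RMS expressed in waves."""
    return math.exp(-(2 * math.pi * rms_waves) ** 2)

s = strehl_from_rms(1 / 14)  # residual RMS of lambda/14
print(round(s, 3))  # about 0.82, i.e. above the 0.8 "diffraction-limited" threshold
```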
6. Hamilton RJ, Hart M. Development and verification of the signal to noise ratio for a layer of turbulence in a multi-layer atmosphere. JOURNAL OF THE OPTICAL SOCIETY OF AMERICA. A, OPTICS, IMAGE SCIENCE, AND VISION 2023; 40:573-582. PMID: 37133040. DOI: 10.1364/josaa.484162.
Abstract
Wide-field image correction in systems that look through the atmosphere generally requires a tomographic reconstruction of the turbulence volume to compensate for anisoplanatism. The reconstruction is conditioned by estimating the turbulence volume as a profile of thin homogeneous layers. We present the signal to noise ratio (SNR) of a layer, which quantifies how difficult a single layer of homogeneous turbulence is to detect with wavefront slope measurements. The signal is the sum of wavefront tip and tilt variances at the signal layer, and the noise is the sum of wavefront tip and tilt auto-correlations given the aperture shape and projected aperture separations at all non-signal layers. An analytic expression for layer SNR is found for Kolmogorov and von Kármán turbulence models, then verified with a Monte Carlo simulation. We show that the Kolmogorov layer SNR is a function of only layer Fried length, the spatio-angular sampling of the system, and normalized aperture separation at the layer. In addition to these parameters, the von Kármán layer SNR also depends on aperture size, and layer inner and outer scales. Due to the infinite outer scale, layers of Kolmogorov turbulence tend to have lower SNR than von Kármán layers. We conclude that the layer SNR is a statistically valid performance metric to be used when designing, simulating, operating, and quantifying the performance of any system that measures properties of layers of turbulence in the atmosphere from slope data.
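The layer SNR defined in words above can be written compactly. This is our paraphrase of the abstract's definition with our own symbols, not the paper's exact notation:

```latex
\mathrm{SNR}_{\ell} =
\frac{\sigma^{2}_{\mathrm{tip},\ell} + \sigma^{2}_{\mathrm{tilt},\ell}}
     {\sum_{k \neq \ell} \left( C_{\mathrm{tip},k} + C_{\mathrm{tilt},k} \right)}
```

where the numerator sums the wavefront tip and tilt variances at the signal layer ℓ, and each C term in the denominator is a tip or tilt auto-correlation at a non-signal layer k, evaluated for the given aperture shape and projected aperture separations.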
7. Chen H, Wei L, He Y, Yang J, Li X, Li L, Huang L, Wei K. Deep learning assisted plenoptic wavefront sensor for direct wavefront detection. OPTICS EXPRESS 2023; 31:2989-3004. PMID: 36785300. DOI: 10.1364/oe.478239.
Abstract
Traditional plenoptic wavefront sensors (PWFS) suffer from an abrupt step change in the slope response, leading to poor wavefront detection performance. To solve this problem, a deep learning model is proposed to restore phase maps directly from slope measurements of the PWFS. Numerical simulations demonstrate our approach: the statistical residual wavefront root mean square error (RMSE) of our method is 0.0810 ± 0.0258λ, substantially better than the modal algorithm (0.2511 ± 0.0587λ) and the zonal approach (0.3584 ± 0.0487λ). The internal driving force of PWFS-ResUnet is investigated, and the slope response differences between sub-apertures and directions are likely a key factor enabling our model to accurately restore the phase map. Additionally, the robustness of our model to turbulence strength and signal-to-noise ratio (SNR) level is tested. The proposed method provides a new direction for solving the nonlinear problem of traditional PWFS.
8. Strycker BD. Zernike-like Laguerre-Gaussian orthonormal polynomials for optical field reconstruction. OPTICS LETTERS 2022; 47:6137-6140. PMID: 37219191. DOI: 10.1364/ol.475979.
Abstract
Analytic closed form expressions for orthonormal polynomials exhibiting both rotational and Gaussian symmetries are derived for both circular and elliptical geometries. They exhibit a close correspondence to the Zernike polynomials but are of Gaussian shape and orthogonal over the (x,y) plane. Consequently, they may be expressed in terms of Laguerre polynomials. Formulas for calculating the centroid of a real function are also presented and, along with the analytic expressions for the polynomials, may prove to be of especial use in reconstruction of the intensity distribution incident on a Shack-Hartmann wavefront sensor.
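Numerically, the centroid formulas mentioned above reduce to intensity-weighted means over the sampling grid. A minimal sketch (the grid and beam parameters are illustrative, not from the paper):

```python
import numpy as np

def centroid(intensity, x, y):
    """Intensity-weighted centroid of a sampled real function."""
    total = intensity.sum()
    xg, yg = np.meshgrid(x, y)
    return (xg * intensity).sum() / total, (yg * intensity).sum() / total

x = np.linspace(-1, 1, 201)
y = np.linspace(-1, 1, 201)
xg, yg = np.meshgrid(x, y)
# A Gaussian spot displaced to (0.2, -0.1), as on one Shack-Hartmann subaperture
spot = np.exp(-((xg - 0.2) ** 2 + (yg + 0.1) ** 2) / 0.02)
cx, cy = centroid(spot, x, y)
print(cx, cy)  # close to (0.2, -0.1)
```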
9.
Abstract
Phase retrieval with supervised-learning neural networks is restricted by the difficulty of obtaining labels. To address this, we propose a self-supervised physical deep-learning phase retrieval model combined with a complete physical model of the image-formation process. The model has two parts: MobileNet V1, which maps the input samples to Zernike coefficients, and an optical imaging system model, which produces the point spread function used to train the network. The loss function is computed from the similarity between the input and the output, realizing self-supervised learning. In simulation, the root-mean-square (RMS) wave-front error (WFE) between the input and the reconstruction is 0.1274 waves for D/r0 = 20, compared with 0.1069 waves when labels are used to train the model. This method retrieves wave-front errors in real time in the presence of simulated detector noise without relying on label values; it is well suited to practical applications and is more robust than supervised learning. We believe this technique has significant applications in free-space optical communication.
10. Zuo C, Qian J, Feng S, Yin W, Li Y, Fan P, Han J, Qian K, Chen Q. Deep learning in optical metrology: a review. LIGHT, SCIENCE & APPLICATIONS 2022; 11:39. PMID: 35197457. PMCID: PMC8866517. DOI: 10.1038/s41377-022-00714-x.
Abstract
With advances in scientific foundations and technological implementations, optical metrology has become a versatile problem-solving backbone in manufacturing, fundamental research, and engineering applications such as quality control, nondestructive testing, experimental mechanics, and biomedicine. In recent years, deep learning, a subfield of machine learning, has emerged as a powerful tool for addressing problems by learning from data, largely driven by the availability of massive datasets, enhanced computational power, fast data storage, and novel training algorithms for deep neural networks. It is attracting growing interest in the field of optical metrology. Unlike the traditional "physics-based" approach, deep-learning-enabled optical metrology is a "data-driven" approach that has already provided alternative solutions to many challenging problems in this field with improved performance. In this review, we present an overview of the current status and latest progress of deep-learning technologies in optical metrology. We first briefly introduce both traditional image-processing algorithms in optical metrology and the basic concepts of deep learning, followed by a comprehensive review of its applications in various optical metrology tasks, such as fringe denoising, phase retrieval, phase unwrapping, subset correlation, and error compensation. The open challenges faced by the current deep-learning approach in optical metrology are then discussed. Finally, directions for future research are outlined.
Collapse
Grants
- 61722506, 61705105, 62075096 National Natural Science Foundation of China (National Science Foundation of China)
- National Key R&D Program of China (2017YFF0106403)
- Leading Technology of Jiangsu Basic Research Plan (BK20192003)
- National Defense Science and Technology Foundation of China (2019-JCJQ-JJ-381)
- "333 Engineering" Research Project of Jiangsu Province (BRA2016407)
- Fundamental Research Funds for the Central Universities (30920032101, 30919011222)
- Open Research Fund of Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense (3091801410411)
Affiliation(s)
- Chao Zuo: Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
- Jiaming Qian: Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
- Shijie Feng: Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
- Wei Yin: Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
- Yixuan Li: Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
- Pengfei Fan: Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China; School of Engineering and Materials Science, Queen Mary University of London, London, E1 4NS, UK
- Jing Han: Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
- Kemao Qian: School of Computer Science and Engineering, Nanyang Technological University, Singapore, 639798, Singapore
- Qian Chen: Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, 210094, Nanjing, Jiangsu Province, China
11. Dou J, Wang D, Yu Q, Kong M, Liu L, Xu X, Liang R. Deep-learning-based deflectometry for freeform surface measurement. OPTICS LETTERS 2022; 47:78-81. PMID: 34951885. DOI: 10.1364/ol.447006.
Abstract
We propose a deep-learning based deflectometric method for freeform surface measurement, in which a deep neural network is devised for freeform surface reconstruction. Full-scale skip connections are adopted in the network architecture to extract and incorporate multi-scale feature maps from different layers, enabling the accuracy and robustness of the testing system to be greatly enhanced. The feasibility of the proposed method is numerically and experimentally validated, and its excellent performance in terms of accuracy and robustness is also demonstrated. The proposed method provides a feasible way to achieve the general measurement of freeform surfaces while minimizing the measurement errors due to noise and system geometry calibration.
12. Chen JG, Shah V, Liu L. Performance of a U-Net-based neural network for predictive adaptive optics. OPTICS LETTERS 2021; 46:2513-2516. PMID: 33988623. DOI: 10.1364/ol.422656.
Abstract
We apply a U-Net-based convolutional neural network (NN) architecture to the problem of predictive adaptive optics (AO) for tracking and imaging fast-moving targets, such as satellites in low Earth orbit (LEO). We show that the fine-tuned NN is able to achieve an approximately 50% reduction in mean-squared wavefront error over non-predictive approaches while predicting up to eight frames into the future. These results were obtained when the NN, trained mostly on simulated data, tested its performance on 1 kHz Shack-Hartmann wavefront sensor data collected in open-loop at the Advanced Electro-Optical System facility at Haleakala Observatory while the telescope tracked a naturally illuminated piece of LEO space debris. We report, to our knowledge, the first successful test of a NN for the predictive AO application using on-sky data, as well as the first time such a network has been developed for the more stressing space tracking application.
13. Kim JJ, Fernandez B, Agrawal B. Iterative wavefront reconstruction for strong turbulence using Shack-Hartmann wavefront sensor measurements. JOURNAL OF THE OPTICAL SOCIETY OF AMERICA. A, OPTICS, IMAGE SCIENCE, AND VISION 2021; 38:456-464. PMID: 33690478. DOI: 10.1364/josaa.413934.
Abstract
An iterative wavefront reconstruction method using Shack-Hartmann wavefront sensor (SHWFS) measurements is presented. A new cost function for the wavefront reconstruction problem is derived, and the solution is obtained iteratively by gradient descent. The proposed method is designed to handle scintillated SHWFS measurements effectively and to provide a simpler and more accurate route to branch-point-tolerant wavefront reconstruction, suitable for adaptive optics compensation of strong turbulence. Simulated reconstruction results show the effectiveness of the proposed method, and a laboratory optical testbed demonstrates its experimental implementation.
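The gradient-descent idea can be sketched in one dimension (our own toy illustration of slope-based reconstruction with a plain least-squares cost, not the authors' branch-point-tolerant cost function):

```python
import numpy as np

# True phase on a 1-D aperture and its slope measurements
n = 16
phi = np.sin(2 * np.pi * np.arange(n) / n)
slopes = np.diff(phi)  # forward-difference "wavefront sensor"

# Minimize J(p) = ||diff(p) - slopes||^2 by gradient descent
p = np.zeros(n)
eta = 0.4  # step size; stable since the largest eigenvalue of D^T D is <= 4
for _ in range(5000):
    r = np.diff(p) - slopes          # slope residual
    g = np.empty(n)                  # g = D^T r (the factor 2 is folded into eta)
    g[0], g[-1] = -r[0], r[-1]
    g[1:-1] = r[:-1] - r[1:]
    p -= eta * g

# The phase is recovered only up to piston; remove the mean before comparing
err = (p - p.mean()) - (phi - phi.mean())
print(np.abs(err).max())  # small
```

In two dimensions the same descent runs on both slope components; the paper's contribution is a cost function that stays well behaved when scintillation nulls corrupt the slopes.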
14. Crepp JR, Letchev SO, Potier SJ, Follansbee JH, Tusay NT. Measuring phase errors in the presence of scintillation. OPTICS EXPRESS 2020; 28:37721-37733. PMID: 33379601. DOI: 10.1364/oe.408825.
Abstract
Strong turbulence conditions create amplitude aberrations through the effects of near-field diffraction. When integrated over long optical path lengths, amplitude aberrations (seen as scintillation) can nullify local areas in the recorded image of a coherent beam, complicating the wavefront reconstruction process. To estimate phase aberrations experienced by a telescope beam control system in the presence of strong turbulence, the wavefront sensor (WFS) of an adaptive optics system must be robust to scintillation. We have designed and built a WFS, which we refer to as a "Fresnel sensor," that uses near-field diffraction to measure phase errors under moderate to strong turbulence conditions. Systematic studies of its sensitivity were performed in laboratory experiments using a point source beacon, and the results were compared to a Shack-Hartmann WFS (SHWFS). When the SHWFS experiences irradiance fade under moderate turbulence, the Fresnel WFS continues to routinely extract phase information. For a scintillation index of S = 0.55, we show that the Fresnel WFS offers a factor-of-9 gain in sensitivity over the SHWFS. We find that the Fresnel WFS is capable of operating with extremely low light levels, corresponding to a signal-to-noise ratio of only SNR ≈ 2-3 per pixel. Such a device is well suited for coherent beam propagation, laser communications, remote sensing, and applications involving long optical path lengths, sight-lines along the horizon, and faint signals.
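The scintillation index quoted above (S = 0.55) is the normalized irradiance variance, S = ⟨I²⟩/⟨I⟩² − 1. For log-normal irradiance it relates to the log-amplitude variance as S = exp(4σ_χ²) − 1; a quick Monte Carlo check of that relation (the distributional assumption is ours, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)
var_chi = 0.1                          # log-amplitude variance sigma_chi^2
chi = rng.normal(0.0, np.sqrt(var_chi), 1_000_000)
I = np.exp(2 * chi)                    # log-normal irradiance, I = exp(2*chi)

S_mc = np.mean(I**2) / np.mean(I)**2 - 1.0
S_theory = np.exp(4 * var_chi) - 1.0
print(S_mc, S_theory)  # both near 0.49 for sigma_chi^2 = 0.1
```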
15. A Single Far-Field Deep Learning Adaptive Optics System Based on Four-Quadrant Discrete Phase Modulation. SENSORS 2020; 20:s20185106. PMID: 32911666. PMCID: PMC7570715. DOI: 10.3390/s20185106.
Abstract
In adaptive optics (AO), multiple different incident wavefronts correspond to the same far-field intensity distribution, which leads to a many-to-one mapping. To solve this problem, a single far-field deep learning adaptive optics system based on four-quadrant discrete phase modulation (FQDPM) is proposed. Our method performs FQDPM on the incident wavefront to overcome this many-to-one mapping; a convolutional neural network (CNN) is then used to predict the wavefront directly. Numerical simulations indicate that the proposed method can achieve precise high-speed wavefront correction from a single far-field intensity distribution: wavefront correction takes nearly 0.6 ms, the mean root mean square (RMS) of residual wavefronts is 6.3% of that of the incident wavefronts, and the Strehl ratio of the far-field intensity distribution increases by a factor of 5.7 after correction. In addition, experimental results show that the mean RMS of residual wavefronts is 6.5% of that of the incident wavefronts and that wavefront reconstruction takes nearly 0.5 ms, which verifies the correctness of the proposed method.
16. Hu L, Hu S, Gong W, Si K. Deep learning assisted Shack-Hartmann wavefront sensor for direct wavefront detection. OPTICS LETTERS 2020; 45:3741-3744. PMID: 32630943. DOI: 10.1364/ol.395579.
Abstract
The conventional Shack-Hartmann wavefront sensor (SHWS) requires wavefront slope measurements from every micro-lens for wavefront reconstruction. In this Letter, we applied deep learning to the SHWS to directly predict wavefront distributions without wavefront slope measurements. The results show that our method provides a lower root mean square wavefront error at high detection speed. The performance of the proposed method is also evaluated on challenging wavefronts where conventional approaches perform poorly. This Letter provides a new approach, to the best of our knowledge, to perform direct wavefront detection in SHWS-based applications.