1. Hsu CW, Lin CY, Hu YY, Chen SJ. Dual-resonant scanning multiphoton microscope with ultrasound lens and resonant mirror for rapid volumetric imaging. Sci Rep 2023; 13:161. [PMID: 36599927] [DOI: 10.1038/s41598-022-27370-w]
Abstract
A dual-resonant scanning multiphoton (DRSM) microscope incorporating a tunable acoustic gradient index of refraction lens with a resonant mirror is developed for high-speed volumetric imaging. In the proposed microscope, the pulse train signal of a femtosecond laser is used to trigger an embedded field programmable gate array to sample the multiphoton excited fluorescence signal at the rate of one pixel per laser pulse. A frame rate of around 8000 Hz is obtained in the x-z plane for an image region of 256 × 80 pixels. Moreover, a volumetric imaging rate of over 30 Hz is obtained for a large image volume of 343 × 343 × 120 μm³ with an image size of 256 × 256 × 80 voxels. The rapid volumetric imaging rate eliminates the aliasing effect for observed temporal frequencies lower than 15 Hz. The practical feasibility of the DRSM microscope is demonstrated by observing the mushroom bodies of a Drosophila brain and performing 3D dynamic observations of moving 10-μm fluorescent beads.
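A quick consistency check on the figures quoted in this abstract (the arithmetic below is ours, not the paper's): at one pixel per laser pulse, the stated frame and volume rates imply a pixel throughput on the order of 10⁸ pulses per second, and a 30 Hz volume rate is indeed alias-free up to its 15 Hz Nyquist frequency.

```python
# Frame mode: 256 x 80 pixels per x-z frame at ~8000 frames/s
frame_px = 256 * 80
frame_rate_hz = 8000
px_per_s = frame_px * frame_rate_hz      # pixels (laser pulses) per second
# -> 163_840_000

# Volume mode: 256 x 256 x 80 voxels at >30 volumes/s
vol_vox = 256 * 256 * 80
vol_rate_hz = 30
vox_per_s = vol_vox * vol_rate_hz
# -> 157_286_400

# Sampling at 30 volumes/s resolves temporal frequencies up to Nyquist,
# matching the 15 Hz alias-free limit stated in the abstract
nyquist_hz = vol_rate_hz / 2             # -> 15.0
print(px_per_s, vox_per_s, nyquist_hz)
```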
Affiliation(s)
- Chia-Wei Hsu
- College of Photonics, National Yang Ming Chiao Tung University, Tainan, 71150, Taiwan
- Chun-Yu Lin
- College of Photonics, National Yang Ming Chiao Tung University, Tainan, 71150, Taiwan
- Yvonne Yuling Hu
- Department of Photonics, National Cheng Kung University, Tainan, 70101, Taiwan
- Shean-Jen Chen
- College of Photonics, National Yang Ming Chiao Tung University, Tainan, 71150, Taiwan
- Taiwan Instrument Research Institute, National Applied Research Laboratories, Hsinchu, 300, Taiwan
2. Capturing the start point of the virus-cell interaction with high-speed 3D single-virus tracking. Nat Methods 2022; 19:1642-1652. [PMID: 36357694] [PMCID: PMC10154077] [DOI: 10.1038/s41592-022-01672-3]
Abstract
The early stages of the virus-cell interaction have long evaded observation by existing microscopy methods due to the rapid diffusion of virions in the extracellular space and the large three-dimensional cellular structures involved. Here we present an active-feedback single-particle tracking method with simultaneous volumetric imaging of the live cell environment called 3D-TrIm to address this knowledge gap. 3D-TrIm captures the extracellular phase of the infectious cycle in what we believe is unprecedented detail. We report what are, to our knowledge, previously unobserved phenomena in the early stages of the virus-cell interaction, including skimming contact events at the millisecond timescale, orders of magnitude change in diffusion coefficient upon binding and cylindrical and linear diffusion modes along cellular protrusions. Finally, we demonstrate how this method can move single-particle tracking from simple monolayer culture toward more tissue-like conditions by tracking single virions in tightly packed epithelial cells. This multiresolution method presents opportunities for capturing fast, three-dimensional processes in biological systems.
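The "orders of magnitude change in diffusion coefficient upon binding" reported here is the kind of quantity extracted from single-particle tracks. A generic estimator (not the authors' pipeline) uses the 3D mean-squared-displacement relation MSD = 6DΔt; the sketch below simulates a Brownian track with an invented diffusion coefficient and recovers it:

```python
import numpy as np

rng = np.random.default_rng(0)
D_true = 1.0      # um^2/s, hypothetical free-diffusion coefficient
dt = 1e-3         # s, hypothetical frame interval
n_steps = 20000

# Simulate a 3D Brownian trajectory: each axis step ~ N(0, 2*D*dt)
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, 3))
track = np.cumsum(steps, axis=0)

# Estimate D from the one-lag mean squared displacement (MSD = 6*D*dt in 3D)
msd = np.mean(np.sum(np.diff(track, axis=0) ** 2, axis=1))
D_est = msd / (6 * dt)
print(D_est)   # close to 1.0
```

A virion that binds a receptor would show D_est dropping by orders of magnitude between the free and bound segments of its track.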
3. van Heerden B, Vickers NA, Krüger TPJ, Andersson SB. Real-time feedback-driven single-particle tracking: a survey and perspective. Small 2022; 18:e2107024. [PMID: 35758534] [PMCID: PMC9308725] [DOI: 10.1002/smll.202107024]
Abstract
Real-time feedback-driven single-particle tracking (RT-FD-SPT) is a class of techniques in the field of single-particle tracking that uses feedback control to keep a particle of interest in a detection volume. These methods provide high spatiotemporal resolution on particle dynamics and allow for concurrent spectroscopic measurements. This review begins with a survey of existing techniques and of applications where RT-FD-SPT has played an important role. Each of the core components of RT-FD-SPT is then discussed systematically, in order to develop an understanding of the trade-offs that must be made in algorithm design and to create a clear picture of the important differences, advantages, and drawbacks of existing approaches. These components are: feedback tracking and control, ranging from simple proportional-integral-derivative (PID) control to advanced nonlinear techniques; estimation to determine particle location from the measured data, including both online and offline algorithms; and techniques for calibrating and characterizing different RT-FD-SPT methods. A collection of metrics for RT-FD-SPT is then introduced to help guide experimentalists in selecting a method for their particular application and to reveal gaps in existing techniques that represent opportunities for further development. Finally, the review concludes with a discussion of future perspectives in the field.
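The simplest controller the survey mentions can be caricatured in a few lines. In this toy 1D sketch (gains, time step, and drift rate are all invented for illustration), a proportional-integral loop recenters a stage on a particle drifting at constant velocity; because the stage position itself integrates the control command, the integral term removes the steady-state lag that a pure proportional controller would leave against a ramp:

```python
# Hypothetical 1D PI tracking loop (not from any specific RT-FD-SPT system)
kp, ki = 0.5, 0.05    # invented proportional and integral gains
dt = 1e-3             # s per control cycle (assumed)
drift = 0.2           # um/s constant particle drift (assumed)

stage, integral, particle = 0.0, 0.0, 0.0
for _ in range(5000):
    particle += drift * dt
    error = particle - stage      # offset estimate (from photon counts in practice)
    integral += error * dt
    stage += kp * error + ki * integral

print(abs(particle - stage) < 1e-3)   # stage tracks the drifting particle
```

Real systems replace the `error` line with an online position estimator driven by photon arrivals, which is exactly the estimation component the survey treats separately.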
Affiliation(s)
- Bertus van Heerden
- Department of Physics, University of Pretoria, Pretoria, 0002, South Africa
- Forestry and Agricultural Biotechnology Institute (FABI), University of Pretoria, Pretoria, 0002, South Africa
- Nicholas A Vickers
- Department of Mechanical Engineering, Boston University, Boston, MA, 02215, USA
- Tjaart P J Krüger
- Department of Physics, University of Pretoria, Pretoria, 0002, South Africa
- Forestry and Agricultural Biotechnology Institute (FABI), University of Pretoria, Pretoria, 0002, South Africa
- Sean B Andersson
- Department of Mechanical Engineering, Boston University, Boston, MA, 02215, USA
- Division of Systems Engineering, Boston University, Boston, MA, 02215, USA
4. Huang L, Chen H, Luo Y, Rivenson Y, Ozcan A. Recurrent neural network-based volumetric fluorescence microscopy. Light Sci Appl 2021; 10:62. [PMID: 33753716] [PMCID: PMC7985192] [DOI: 10.1038/s41377-021-00506-9]
Abstract
Volumetric imaging of samples using fluorescence microscopy plays an important role in various fields, including the physical, medical, and life sciences. Here we report a deep learning-based volumetric image inference framework that uses 2D images sparsely captured by a standard wide-field fluorescence microscope at arbitrary axial positions within the sample volume. Through a recurrent convolutional neural network, which we term Recurrent-MZ, 2D fluorescence information from a few axial planes within the sample is explicitly incorporated to digitally reconstruct the sample volume over an extended depth-of-field. Using experiments on C. elegans and nanobead samples, Recurrent-MZ is demonstrated to significantly increase the depth-of-field of a 63×/1.4NA objective lens, while also providing a 30-fold reduction in the number of axial scans required to image the same sample volume. We further illustrate the generalization of this recurrent network for 3D imaging by showing its resilience to varying imaging conditions, including different sequences of input images covering various axial permutations and unknown axial positioning errors. We also demonstrate wide-field to confocal cross-modality image transformations using the Recurrent-MZ framework, performing 3D image reconstruction of a sample from a few wide-field 2D fluorescence images as input that matches confocal microscopy images of the same sample volume. Recurrent-MZ represents the first application of recurrent neural networks to microscopic image reconstruction and provides a flexible and rapid volumetric imaging framework, overcoming the limitations of current 3D scanning microscopy tools.
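The recurrent-aggregation idea behind this approach can be caricatured as follows. This is a conceptual sketch with made-up scalar weights, not the Recurrent-MZ architecture itself (which uses convolutional recurrent blocks followed by a decoder): each new 2D plane updates a hidden map that accumulates information across the axial sequence, so a variable number of input planes, in any order, yields one fused state.

```python
import numpy as np

rng = np.random.default_rng(1)
h = w = 16
# A few sparsely sampled 2D planes from arbitrary depths (random stand-ins)
slices = [rng.random((h, w)) for _ in range(3)]

# Toy per-pixel recurrent update; weights a, b are invented for the sketch
a, b = 0.7, 0.4
hidden = np.zeros((h, w))
for plane in slices:
    hidden = np.tanh(a * plane + b * hidden)   # fuse each plane into the state

# In Recurrent-MZ, the recurrent features would feed a decoder that
# outputs the full 3D volume over an extended depth-of-field
print(hidden.shape)
```

The key property the sketch shares with the real network is that the same update is reused for every input plane, which is what lets the method accept arbitrary numbers and orderings of axial scans.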
Affiliation(s)
- Luzhe Huang
- Electrical and Computer Engineering Department, University of California, Los Angeles, CA, 90095, USA
- Bioengineering Department, University of California, Los Angeles, CA, 90095, USA
- California Nano Systems Institute (CNSI), University of California, Los Angeles, CA, 90095, USA
- Hanlong Chen
- Electrical and Computer Engineering Department, University of California, Los Angeles, CA, 90095, USA
- Yilin Luo
- Electrical and Computer Engineering Department, University of California, Los Angeles, CA, 90095, USA
- Yair Rivenson
- Electrical and Computer Engineering Department, University of California, Los Angeles, CA, 90095, USA
- Bioengineering Department, University of California, Los Angeles, CA, 90095, USA
- California Nano Systems Institute (CNSI), University of California, Los Angeles, CA, 90095, USA
- Aydogan Ozcan
- Electrical and Computer Engineering Department, University of California, Los Angeles, CA, 90095, USA
- Bioengineering Department, University of California, Los Angeles, CA, 90095, USA
- California Nano Systems Institute (CNSI), University of California, Los Angeles, CA, 90095, USA
- David Geffen School of Medicine, University of California, Los Angeles, CA, 90095, USA