1
Lin Y, Mos P, Ardelean A, Bruschini C, Charbon E. Coupling a recurrent neural network to SPAD TCSPC systems for real-time fluorescence lifetime imaging. Sci Rep 2024; 14:3286. [PMID: 38331957] [PMCID: PMC10853568] [DOI: 10.1038/s41598-024-52966-9]
Abstract
Fluorescence lifetime imaging (FLI) has been receiving increased attention in recent years as a powerful diagnostic technique in biological and medical research. However, existing FLI systems often suffer from a tradeoff between processing speed, accuracy, and robustness. Inspired by the concept of Edge Artificial Intelligence (Edge AI), we propose a robust approach that enables fast FLI with no degradation of accuracy. This approach couples a recurrent neural network (RNN), trained to estimate the fluorescence lifetime directly from raw timestamps without building histograms, to SPAD TCSPC systems, thereby drastically reducing transfer data volumes and hardware resource utilization and enabling real-time FLI acquisition. We train two variants of the RNN on a synthetic dataset and compare the results to those obtained using the center-of-mass method (CMM) and least-squares (LS) fitting. Results demonstrate that the two RNN variants, gated recurrent unit (GRU) and long short-term memory (LSTM), are comparable to CMM and LS fitting in terms of accuracy, while outperforming them by a large margin in the presence of background noise. To explore the ultimate limits of the approach, we derive the Cramér-Rao lower bound of the measurement, showing that the RNN yields lifetime estimates with near-optimal precision. To demonstrate real-time operation, we build a FLI microscope based on an existing SPAD TCSPC system comprising a 32 × 32 SPAD sensor named Piccolo. Four quantized GRU cores, capable of processing up to 4 million photons per second, are deployed on the Xilinx Kintex-7 FPGA that controls the Piccolo. Powered by the GRU, the FLI setup can retrieve real-time fluorescence lifetime images at up to 10 frames per second. The proposed FLI system is ideally suited for biomedical applications, including biological imaging, biomedical diagnostics, and fluorescence-assisted surgery.
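For context on the CMM baseline this abstract compares against, a minimal sketch of histogram-free lifetime estimation from raw timestamps, assuming an ideal mono-exponential decay with no instrument response, background, or finite-window bias (the paper's full treatment accounts for these):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mono-exponential decay: 2.5 ns lifetime, 100k detected photons.
true_tau = 2.5
timestamps = rng.exponential(true_tau, size=100_000)

# Center-of-mass method (CMM): for an ideal mono-exponential decay, the mean
# photon arrival time equals the lifetime.  Like the paper's RNN approach,
# this runs directly on raw timestamps -- no histogram is ever built.
tau_cmm = timestamps.mean()
print(f"CMM estimate: {tau_cmm:.3f} ns")
```

In practice CMM degrades quickly under background noise, which is the regime where the abstract reports the RNN variants winning by a large margin.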
Affiliation(s)
- Yang Lin
- Advanced Quantum Architecture Laboratory, École polytechnique fédérale de Lausanne, Neuchâtel, 2002, Switzerland
- Paul Mos
- Advanced Quantum Architecture Laboratory, École polytechnique fédérale de Lausanne, Neuchâtel, 2002, Switzerland
- Andrei Ardelean
- Advanced Quantum Architecture Laboratory, École polytechnique fédérale de Lausanne, Neuchâtel, 2002, Switzerland
- Claudio Bruschini
- Advanced Quantum Architecture Laboratory, École polytechnique fédérale de Lausanne, Neuchâtel, 2002, Switzerland
- Edoardo Charbon
- Advanced Quantum Architecture Laboratory, École polytechnique fédérale de Lausanne, Neuchâtel, 2002, Switzerland
2
Houston JP, Valentino S, Bitton A. Fluorescence Lifetime Measurements and Analyses: Protocols Using Flow Cytometry and High-Throughput Microscopy. Methods Mol Biol 2024; 2779:323-351. [PMID: 38526793] [DOI: 10.1007/978-1-0716-3738-8_15]
Abstract
This chapter focuses on applications and protocols that involve the measurement of the fluorescence lifetime as an informative cytometric parameter. The timing of fluorescence decay has been well studied for cell counting, sorting, and imaging. Provided herein is an overview of the techniques used, how they enhance cytometry protocols, and the modern techniques for lifetime analysis. The background and theory behind fluorescence decay kinetic measurements in cells are discussed first, followed by the history of the development of time-resolved flow cytometry. These sections are followed by a review of applications that benefit from the quantitative nature of fluorescence lifetimes as a photophysical trait. Lastly, perspectives are provided on the modern ways in which the fluorescence lifetime is scanned at high throughput, including high-speed microscopy and machine learning.
Affiliation(s)
- Jessica P Houston
- Department of Chemical & Materials Engineering, New Mexico State University, Las Cruces, NM, USA.
- Samantha Valentino
- Department of Chemical & Materials Engineering, New Mexico State University, Las Cruces, NM, USA
3
Zang Z, Xiao D, Wang Q, Jiao Z, Chen Y, Li DDU. Compact and robust deep learning architecture for fluorescence lifetime imaging and FPGA implementation. Methods Appl Fluoresc 2023; 11. [PMID: 36863024] [DOI: 10.1088/2050-6120/acc0d9]
Abstract
This paper reports a bespoke adder-based deep learning network for time-domain fluorescence lifetime imaging (FLIM). By leveraging the l1-norm extraction method, we propose a 1D Fluorescence Lifetime AdderNet (FLAN) without multiplication-based convolutions to reduce the computational complexity. Further, we compressed fluorescence decays in the temporal dimension using a log-scale merging technique to discard redundant temporal information, deriving a log-scaling FLAN (FLAN+LS). FLAN+LS achieves 0.11 and 0.23 compression ratios compared with FLAN and a conventional 1D convolutional neural network (1D CNN) while maintaining high accuracy in retrieving lifetimes. We extensively evaluated FLAN and FLAN+LS using synthetic and real data. A traditional fitting method and other non-fitting, high-accuracy algorithms were compared with our networks on synthetic data. Our networks attained minor reconstruction errors across different photon-count scenarios. For real data, we used fluorescent-bead data acquired by a confocal microscope to validate the effectiveness on real fluorophores, and our networks can differentiate beads with different lifetimes. Additionally, we implemented the network architecture on a field-programmable gate array (FPGA) with a post-quantization technique to reduce the bit-width, thereby improving computing efficiency. FLAN+LS on hardware achieves the highest computing efficiency compared to the 1D CNN and FLAN. We also discuss the applicability of our network and hardware architecture to other time-resolved biomedical applications using photon-efficient, time-resolved sensors.
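To illustrate the log-scale merging idea described above, a simplified sketch that compresses a linear TCSPC histogram into geometrically growing bins, keeping fine resolution near the decay peak and coarser bins in the tail (the paper's exact merging scheme may differ; bin counts here are illustrative):

```python
import numpy as np

def log_scale_merge(hist, n_out=32):
    """Merge a linear TCSPC histogram into at most n_out bins whose widths
    grow roughly geometrically, discarding redundant temporal detail in the
    slowly varying decay tail.  Illustrative sketch only."""
    n_in = len(hist)
    # Geometrically spaced bin edges over [0, n_in); duplicates collapse.
    edges = np.unique(np.geomspace(1, n_in, n_out + 1).astype(int))
    edges = np.concatenate(([0], edges))
    # Sum the counts inside each [edges[i], edges[i+1]) segment.
    return np.add.reduceat(hist, edges[:-1])

hist = np.exp(-np.arange(256) / 40.0)     # synthetic mono-exponential decay
compressed = log_scale_merge(hist, n_out=32)
print(len(hist), "->", len(compressed))   # temporal dimension shrinks
```

Total photon counts are preserved since the merged bins partition the original window; only the temporal sampling becomes non-uniform.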
Affiliation(s)
- Zhenya Zang
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G1 1XQ, United Kingdom
- Dong Xiao
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G1 1XQ, United Kingdom
- Quan Wang
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G1 1XQ, United Kingdom
- Ziao Jiao
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G1 1XQ, United Kingdom
- Yu Chen
- Department of Physics, University of Strathclyde, Glasgow G4 0NG, United Kingdom
- David Day Uei Li
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G1 1XQ, United Kingdom
4
Xiao D, Zang Z, Wang Q, Jiao Z, Rocca FMD, Chen Y, Li DDU. Smart Wide-field Fluorescence Lifetime Imaging System with CMOS Single-photon Avalanche Diode Arrays. Annu Int Conf IEEE Eng Med Biol Soc 2022; 2022:1887-1890. [PMID: 36086288] [DOI: 10.1109/embc48229.2022.9870996]
Abstract
Wide-field fluorescence lifetime imaging (FLIM) is a promising technique for biomedical and clinical applications. Integration with CMOS single-photon avalanche diode (SPAD) sensor arrays can lead to cheaper and portable real-time FLIM systems. However, the FLIM data obtained by such sensor systems often have sophisticated noise features, and fast tools that can efficiently recover lifetime parameters from highly noise-corrupted fluorescence signals are still lacking. This paper proposes a smart wide-field FLIM system containing a 192×128 CMOS SPAD sensor and a field-programmable gate array (FPGA) embedded deep learning (DL) FLIM processor. The processor adopts a hardware-friendly, lightweight neural network for fluorescence lifetime analysis, offering high accuracy against noise, fast speed, and low power consumption. Experimental results demonstrate the proposed system's superior and robust performance, promising for many FLIM applications such as FLIM-guided clinical surgeries, cancer diagnosis, and biomedical imaging.
5
Zang Z, Xiao D, Wang Q, Jiao Z, Li Z, Chen Y, Li DDU. Hardware Inspired Neural Network for Efficient Time-Resolved Biomedical Imaging. Annu Int Conf IEEE Eng Med Biol Soc 2022; 2022:1883-1886. [PMID: 36085638] [DOI: 10.1109/embc48229.2022.9871214]
Abstract
Convolutional neural networks (CNNs) have shown exceptional performance for fluorescence lifetime imaging (FLIM). However, redundant parameters and complicated topologies make it challenging to implement such networks on embedded hardware for real-time processing. We report a lightweight, quantized neural architecture that offers fast FLIM. Forward propagation is significantly simplified by replacing the matrix multiplications in each convolution layer with additions and by quantizing data to a low bit-width. We first used synthetic 3-D lifetime data with given lifetime ranges and photon counts to verify that correct average lifetimes can be obtained. Afterwards, human prostatic cancer cells incubated with gold nanoprobes were used to validate the feasibility of the network on real-world data. The quantized network yielded a 37.8% compression ratio without performance degradation. Clinical relevance - This neural network can be applied to early, non-invasive cancer diagnosis based on the fluorescence lifetime. The approach brings high accuracy and accelerates diagnostic processes for clinicians who are not experts in biomedical signal processing.
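The multiplication-free layer idea recurs in entries 3 and 5. A minimal sketch of a 1-D "adder" layer in the spirit of AdderNet, where each output is a negative L1 distance between an input patch and a kernel, so only additions and subtractions are needed (illustrative, not the papers' exact architecture):

```python
import numpy as np

def adder_conv1d(x, kernels):
    """1-D adder layer: replaces each convolution dot product with the
    negative L1 distance between the input patch and the kernel, so the
    layer uses only additions/subtractions -- cheap to map onto an FPGA."""
    k = kernels.shape[1]
    # All sliding windows of the input, shape (n_out, k).
    patches = np.lib.stride_tricks.sliding_window_view(x, k)
    # (n_out, n_kernels): -sum|patch - kernel| stands in for the dot product.
    return -np.abs(patches[:, None, :] - kernels[None, :, :]).sum(axis=-1)

x = np.exp(-np.arange(64) / 10.0)                 # synthetic decay trace
kernels = np.random.default_rng(1).normal(size=(4, 5))
features = adder_conv1d(x, kernels)
print(features.shape)                             # (60, 4)
```

Outputs are always non-positive (similarity peaks at zero), which is why AdderNet-style layers pair with batch normalization before the activation in the published designs.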
6
Zang Z, Xiao D, Wang Q, Li Z, Xie W, Chen Y, Li DDU. Fast Analysis of Time-Domain Fluorescence Lifetime Imaging via Extreme Learning Machine. Sensors 2022; 22:3758. [PMID: 35632167] [PMCID: PMC9146214] [DOI: 10.3390/s22103758]
Abstract
We present a fast and accurate analytical method for fluorescence lifetime imaging microscopy (FLIM) using the extreme learning machine (ELM). We used extensive metrics to evaluate ELM and existing algorithms. First, we compared these algorithms using synthetic datasets. The results indicate that ELM can obtain higher fidelity, even in low-photon conditions. Afterwards, we used ELM to retrieve lifetime components from human prostate cancer cells loaded with gold nanosensors, showing that ELM also outperforms iterative fitting and non-fitting algorithms. Compared with a computationally efficient neural network, ELM achieves comparable accuracy with less training and inference time. As there is no back-propagation process in ELM training, the training speed is much higher than that of existing neural network approaches. The proposed strategy is promising for edge computing with online training.
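The training-speed claim follows from how an ELM is fit: hidden weights are random and fixed, and only the output layer is solved, in closed form, by least squares. A toy regression sketch (problem and sizes are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression target: learn y = sin(x) on [-3, 3].
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0])

# Extreme learning machine: random, FIXED hidden layer...
n_hidden = 100
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                        # random hidden features

# ...and a closed-form least-squares solve for the output weights.
# No back-propagation, hence the training-speed advantage noted above.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = np.tanh(X @ W + b) @ beta
print("train MSE:", np.mean((pred - y) ** 2))
```

Because training is a single linear solve, retraining on new data is cheap enough for the online-training edge scenario the abstract mentions.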
Affiliation(s)
- Zhenya Zang
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G4 0RE, UK
- Dong Xiao
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G4 0RE, UK
- Quan Wang
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G4 0RE, UK
- Zinuo Li
- Department of Physics, University of Strathclyde, Glasgow G4 0NG, UK
- Wujun Xie
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G4 0RE, UK
- Yu Chen
- Department of Physics, University of Strathclyde, Glasgow G4 0NG, UK
- David Day Uei Li (corresponding author)
- Department of Biomedical Engineering, University of Strathclyde, Glasgow G4 0RE, UK
7
Xiao D, Zang Z, Xie W, Sapermsap N, Chen Y, Li DDU. Spatial resolution improved fluorescence lifetime imaging via deep learning. Opt Express 2022; 30:11479-11494. [PMID: 35473091] [DOI: 10.1364/oe.451215]
Abstract
We present a deep learning approach to obtain high-resolution (HR) fluorescence lifetime images from low-resolution (LR) images acquired with fluorescence lifetime imaging (FLIM) systems. We first propose a method for generating massive semi-synthetic FLIM training data with various cellular morphologies, a sizeable dynamic lifetime range, and complex decay components. We then develop a degrading model to obtain LR-HR pairs and create a hybrid neural network, the spatial-resolution-improved FLIM net (SRI-FLIMnet), to simultaneously estimate fluorescence lifetimes and realize the nonlinear transformation from LR to HR images. The evaluation results demonstrate SRI-FLIMnet's superior performance in reconstructing spatial information from limited pixel resolution. We also verified SRI-FLIMnet using experimental images of bacteria-infected mouse RAW macrophage cells. Results show that the proposed data generation method and SRI-FLIMnet efficiently achieve superior spatial resolution for FLIM applications. Our study provides a solution for rapidly obtaining HR FLIM images.
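The LR-HR pair construction described above can be illustrated with a toy degrading model: block-average the HR image to produce its LR counterpart (a simplified stand-in; the paper's actual degrading model is more elaborate):

```python
import numpy as np

def degrade(hr, factor=2):
    """Toy degradation: downsample an HR lifetime map by block-averaging,
    which acts as a box blur plus decimation in one step.  Each (hr, lr)
    pair then serves as a supervised training example."""
    h, w = hr.shape
    hr = hr[: h - h % factor, : w - w % factor]   # crop to a multiple of factor
    return hr.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

hr = np.random.default_rng(2).random((64, 64))    # synthetic HR lifetime map
lr = degrade(hr, factor=2)
print(hr.shape, "->", lr.shape)                   # (64, 64) -> (32, 32)
```

A super-resolution network is then trained to invert this mapping, which is what lets SRI-FLIMnet recover spatial detail from limited pixel resolution.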