1.
Ritter C, Lee JY, Pham MT, Pabba MK, Cardoso MC, Bartenschlager R, Rohr K. Multi-detector fusion and Bayesian smoothing for tracking viral and chromatin structures. Med Image Anal 2024;97:103227. PMID: 38897031. DOI: 10.1016/j.media.2024.103227.
Abstract
Automatic tracking of viral and intracellular structures, displayed as spots of varying size in fluorescence microscopy images, is an important task for quantifying cellular processes. We propose a novel probabilistic approach for multiple particle tracking based on multi-detector and multi-scale data fusion as well as Bayesian smoothing. The approach integrates results from multiple detectors using a novel intensity-based covariance intersection method that takes into account information about image intensities, positions, and uncertainties. The method ensures a consistent estimate of multiple fused particle detections and does not require an optimization step. Our probabilistic tracking approach fuses detections from classical and deep learning methods and exploits both single-scale and multi-scale detections. In addition, we use Bayesian smoothing to fuse information from predictions at both past and future time points. We evaluated our approach on image data of the Particle Tracking Challenge, where it matched or outperformed previous state-of-the-art methods. It was also assessed on challenging live-cell fluorescence microscopy image data of viral and cellular proteins expressed in hepatitis C virus-infected cells and of chromatin structures in non-infected cells, acquired at different spatiotemporal resolutions. We found that the proposed approach outperforms existing methods.
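The intensity-based covariance intersection method is specific to this paper; as a rough illustration of the underlying idea only, here is a minimal sketch of standard covariance intersection with a fixed weight omega (the paper's intensity weighting and multi-detection consistency handling are not reproduced):

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega=0.5):
    """Fuse two estimates with unknown cross-correlation.

    Standard covariance intersection: the fused inverse covariance is
    a convex combination of the input inverse covariances, weighted by
    omega in [0, 1]; the fused mean is weighted accordingly.
    """
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P = np.linalg.inv(omega * P1_inv + (1.0 - omega) * P2_inv)
    x = P @ (omega * P1_inv @ x1 + (1.0 - omega) * P2_inv @ x2)
    return x, P

# Two detectors report the same spot with different uncertainties:
# the fused position is pulled toward the more certain detector.
x1, P1 = np.array([10.0, 5.0]), np.diag([1.0, 1.0])
x2, P2 = np.array([10.4, 5.2]), np.diag([4.0, 4.0])
x, P = covariance_intersection(x1, P1, x2, P2, omega=0.5)
```

The convex combination guarantees a consistent (non-overconfident) fused covariance even when the two detectors' errors are correlated, which is why no cross-covariance estimation or optimization step is needed.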
Affiliation(s)
- C Ritter
- Biomedical Computer Vision Group, BioQuant, IPMB, Heidelberg University, Im Neuenheimer Feld 267, Heidelberg, Germany
- J-Y Lee
- Department of Infectious Diseases, Molecular Virology, Heidelberg University, Im Neuenheimer Feld 344, Heidelberg, Germany; German Center for Infection Research (DZIF), Heidelberg Partner Site, Germany
- M-T Pham
- Department of Infectious Diseases, Molecular Virology, Heidelberg University, Im Neuenheimer Feld 344, Heidelberg, Germany; German Center for Infection Research (DZIF), Heidelberg Partner Site, Germany
- M K Pabba
- Department of Biology, Cell Biology and Epigenetics, Technical University of Darmstadt, Schnittspahnstraße 10, Darmstadt, Germany
- M C Cardoso
- Department of Biology, Cell Biology and Epigenetics, Technical University of Darmstadt, Schnittspahnstraße 10, Darmstadt, Germany
- R Bartenschlager
- Department of Infectious Diseases, Molecular Virology, Heidelberg University, Im Neuenheimer Feld 344, Heidelberg, Germany; German Center for Infection Research (DZIF), Heidelberg Partner Site, Germany
- K Rohr
- Biomedical Computer Vision Group, BioQuant, IPMB, Heidelberg University, Im Neuenheimer Feld 267, Heidelberg, Germany
2.
Roudot P, Legant WR, Zou Q, Dean KM, Isogai T, Welf ES, David AF, Gerlich DW, Fiolka R, Betzig E, Danuser G. u-track3D: Measuring, navigating, and validating dense particle trajectories in three dimensions. Cell Rep Methods 2023;3:100655. PMID: 38042149. PMCID: PMC10783629. DOI: 10.1016/j.crmeth.2023.100655.
Abstract
We describe u-track3D, a software package that extends the versatile u-track framework established in 2D to address the specific challenges of 3D particle tracking. First, we present the performance of the new package in quantifying a variety of intracellular dynamics imaged by multiple 3D microscopy platforms and on the standard 3D test dataset of the particle tracking challenge. These analyses indicate that u-track3D provides a tracking solution competitive with both conventional and deep-learning-based approaches. Second, we present the concept of the dynamic region of interest (dynROI), which allows an experimenter to interact with dynamic 3D processes in 2D views amenable to visual inspection. Third, we present an estimator of trackability that automatically assigns a score to every trajectory, thereby overcoming the challenges of trajectory validation by visual inspection. With these combined strategies, u-track3D provides a complete framework for unbiased studies of molecular processes in complex volumetric sequences.
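The u-track family formulates frame-to-frame particle linking as a linear assignment problem; a minimal sketch of that core step (omitting u-track3D's motion models, gap closing, and birth/death costs; `max_dist` is an illustrative gating parameter, not the package's API):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_frames(pts_a, pts_b, max_dist=5.0):
    """Link detections in two consecutive frames by solving a linear
    assignment problem over pairwise Euclidean distances.

    pts_a, pts_b: (N, D) arrays of coordinates (D = 2 or 3).
    Returns a list of (i, j) index pairs whose linking distance
    does not exceed max_dist.
    """
    # Pairwise distance matrix between the two detection sets.
    cost = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)
    # Gate out assignments that are too far apart to be real links.
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_dist]
```

In 3D the same formulation applies unchanged; the package's added value lies in the 3D-specific cost terms, navigation, and validation tooling described in the abstract.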
Affiliation(s)
- Philippe Roudot
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA; Aix Marseille University, CNRS, Centrale Marseille, I2M, Turing Centre for Living Systems, Marseille, France
- Wesley R Legant
- Joint Department of Biomedical Engineering, University of North Carolina at Chapel Hill, North Carolina State University, Chapel Hill, NC, USA; Department of Pharmacology, University of North Carolina, Chapel Hill, NC, USA
- Qiongjing Zou
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
- Kevin M Dean
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
- Tadamoto Isogai
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
- Erik S Welf
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
- Ana F David
- Institute of Molecular Biotechnology of the Austrian Academy of Sciences, Vienna BioCenter, Vienna, Austria
- Daniel W Gerlich
- Institute of Molecular Biotechnology of the Austrian Academy of Sciences, Vienna BioCenter, Vienna, Austria
- Reto Fiolka
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
- Eric Betzig
- Department of Molecular & Cell Biology, University of California, Berkeley, Berkeley, CA, USA
- Gaudenz Danuser
- Lyda Hill Department of Bioinformatics, UT Southwestern Medical Center, Dallas, TX, USA
3.
Chai B, Efstathiou C, Yue H, Draviam VM. Opportunities and challenges for deep learning in cell dynamics research. Trends Cell Biol 2023:S0962-8924(23)00228-3. PMID: 38030542. DOI: 10.1016/j.tcb.2023.10.010.
Abstract
The growth of artificial intelligence (AI) has led to an increase in the adoption of computer vision and deep learning (DL) techniques for the evaluation of microscopy images and movies. This adoption has not only addressed hurdles in quantitative analysis of dynamic cell biological processes but has also started to support advances in drug development, precision medicine, and genome-phenome mapping. We survey existing AI-based techniques and tools, as well as open-source datasets, with a specific focus on the computational tasks of segmentation, classification, and tracking of cellular and subcellular structures and dynamics. We summarise long-standing challenges in microscopy video analysis from a computational perspective and review emerging research frontiers and innovative applications for DL-guided automation in cell dynamics research.
Affiliation(s)
- Binghao Chai
- School of Biological and Behavioural Sciences, Queen Mary University of London (QMUL), London E1 4NS, UK
- Christoforos Efstathiou
- School of Biological and Behavioural Sciences, Queen Mary University of London (QMUL), London E1 4NS, UK
- Haoran Yue
- School of Biological and Behavioural Sciences, Queen Mary University of London (QMUL), London E1 4NS, UK
- Viji M Draviam
- School of Biological and Behavioural Sciences, Queen Mary University of London (QMUL), London E1 4NS, UK; The Alan Turing Institute, London NW1 2DB, UK
4.
Okamoto K, Fujita H, Okada Y, Shinkai S, Onami S, Abe K, Fujimoto K, Sasaki K, Shioi G, Watanabe TM. Single-molecule tracking of Nanog and Oct4 in living mouse embryonic stem cells uncovers a feedback mechanism of pluripotency maintenance. EMBO J 2023;42:e112305. PMID: 37609947. PMCID: PMC10505915. DOI: 10.15252/embj.2022112305.
Abstract
Nanog and Oct4 are core transcription factors that form part of a gene regulatory network regulating hundreds of target genes for pluripotency maintenance in mouse embryonic stem cells (ESCs). To understand their function in pluripotency maintenance, we visualised and quantified the dynamics of single molecules of Nanog and Oct4 in mouse ESCs during pluripotency loss. Interestingly, Nanog interacted longer with its target loci upon reduced expression or at the onset of differentiation, suggesting a feedback mechanism to maintain the pluripotent state. The expression level and interaction time of Nanog and Oct4 correlate with their fluctuation and interaction frequency, respectively, which in turn depend on the ESC differentiation status. The DNA viscoelasticity near the Oct4 target locus remained flexible during differentiation, supporting its role in either chromatin opening or preferential binding to uncondensed chromatin regions. Based on these results, we propose a new negative feedback mechanism for pluripotency maintenance via the DNA condensation state-dependent interplay of Nanog and Oct4.
Affiliation(s)
- Kazuko Okamoto
- Laboratory for Comprehensive Bioimaging, RIKEN Center for Biosystems Dynamics Research (BDR), Kobe, Japan
- Amphibian Research Center, Hiroshima University, Hiroshima, Japan
- Hideaki Fujita
- Department of Stem Cell Biology, Research Institute for Radiation Biology and Medicine, Hiroshima University, Higashi-Hiroshima, Japan
- Yasushi Okada
- Laboratory for Cell Polarity Regulation, RIKEN Center for Biosystems Dynamics Research (BDR), Osaka, Japan
- Department of Cell Biology, Graduate School of Medicine, The University of Tokyo, Tokyo, Japan
- Department of Physics, Universal Biology Institute (UBI), Graduate School of Science, The University of Tokyo, Tokyo, Japan
- International Research Center for Neurointelligence (WPI-IRCN), Institutes for Advanced Study, The University of Tokyo, Tokyo, Japan
- Soya Shinkai
- Laboratory for Developmental Dynamics, RIKEN Center for Biosystems Dynamics Research (BDR), Kobe, Japan
- Research Center for the Mathematics on Chromatin Live Dynamics (RcMcD), Hiroshima University, Hiroshima, Japan
- Shuichi Onami
- Laboratory for Developmental Dynamics, RIKEN Center for Biosystems Dynamics Research (BDR), Kobe, Japan
- Kuniya Abe
- Technology and Development Team for Mammalian Genome Dynamics, RIKEN BioResource Research Center (BRC), Tsukuba, Japan
- Kenta Fujimoto
- Department of Stem Cell Biology, Research Institute for Radiation Biology and Medicine, Hiroshima University, Higashi-Hiroshima, Japan
- Kensuke Sasaki
- Laboratory for Comprehensive Bioimaging, RIKEN Center for Biosystems Dynamics Research (BDR), Kobe, Japan
- Go Shioi
- Laboratory for Comprehensive Bioimaging, RIKEN Center for Biosystems Dynamics Research (BDR), Kobe, Japan
- Tomonobu M Watanabe
- Laboratory for Comprehensive Bioimaging, RIKEN Center for Biosystems Dynamics Research (BDR), Kobe, Japan
- Department of Stem Cell Biology, Research Institute for Radiation Biology and Medicine, Hiroshima University, Higashi-Hiroshima, Japan
5.
Geometric deep learning reveals the spatiotemporal features of microscopic motion. Nat Mach Intell 2023. DOI: 10.1038/s42256-022-00595-0.
Abstract
The characterization of dynamical processes in living systems provides important clues for their mechanistic interpretation and link to biological functions. Owing to recent advances in microscopy techniques, it is now possible to routinely record the motion of cells, organelles and individual molecules at multiple spatiotemporal scales in physiological conditions. However, the automated analysis of dynamics occurring in crowded and complex environments still lags behind the acquisition of microscopic image sequences. Here we present a framework based on geometric deep learning that achieves the accurate estimation of dynamical properties in various biologically relevant scenarios. This deep-learning approach relies on a graph neural network enhanced by attention-based components. By processing object features with geometric priors, the network is capable of performing multiple tasks, from linking coordinates into trajectories to inferring local and global dynamic properties. We demonstrate the flexibility and reliability of this approach by applying it to real and simulated data corresponding to a broad range of biological experiments.
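Graph-based trackers of this kind typically first build a graph whose nodes are detections and whose edges are candidate links between nearby detections in time, which the network then classifies into true and spurious links. A minimal sketch of the graph-construction step only (the attention-enhanced network itself is not reproduced; `max_dist` and `max_dt` are illustrative gating parameters, not part of the published framework's API):

```python
import numpy as np

def build_detection_graph(frames, max_dist=3.0, max_dt=2):
    """Build a candidate-link graph from per-frame detections.

    frames: list of (N_t, 2) coordinate arrays, one per time point.
    Nodes are (time, position) tuples; a directed edge (a, b) is added
    when node b lies at most max_dt frames after node a and within
    max_dist pixels of it.
    """
    nodes = []
    for t, pts in enumerate(frames):
        for p in pts:
            nodes.append((t, tuple(p)))
    edges = []
    for a, (ta, pa) in enumerate(nodes):
        for b, (tb, pb) in enumerate(nodes):
            dt = tb - ta
            if 0 < dt <= max_dt and np.linalg.norm(np.subtract(pa, pb)) <= max_dist:
                edges.append((a, b))
    return nodes, edges
```

Restricting edges by spatial and temporal proximity is what keeps the graph sparse enough for message passing in crowded scenes; the geometric priors mentioned in the abstract enter through the node and edge features attached to this graph.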
6.
Hradecka L, Wiesner D, Sumbal J, Koledova ZS, Maska M. Segmentation and Tracking of Mammary Epithelial Organoids in Brightfield Microscopy. IEEE Trans Med Imaging 2023;42:281-290. PMID: 36170389. DOI: 10.1109/tmi.2022.3210714.
Abstract
We present an automated, deep-learning-based workflow to quantitatively analyze the spatiotemporal development of mammary epithelial organoids in two-dimensional time-lapse (2D+t) sequences acquired at high resolution using a brightfield microscope. It involves a convolutional neural network (U-Net), purposely trained on computer-generated bioimage data created by a conditional generative adversarial network (pix2pixHD), to infer semantic segmentation; adaptive morphological filtering to identify organoid instances; and a shape-similarity-constrained, instance-segmentation-correcting tracking procedure to reliably follow the organoid instances of interest over time. By validating it on real 2D+t sequences of mouse mammary epithelial organoids of morphologically different phenotypes, we demonstrate that the workflow achieves reliable segmentation and tracking performance, providing a reproducible and labor-free alternative to manual analysis of the acquired bioimage data.
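The shape-similarity-constrained tracking procedure is specific to the paper; as a simplified stand-in for that step, instance identity can be propagated between consecutive frames by best-overlap matching of segmentation masks:

```python
import numpy as np

def iou(mask_a, mask_b):
    """Intersection over union of two boolean masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 0.0

def match_instances(masks_t, masks_t1, min_iou=0.5):
    """Map each instance mask at time t to its best-overlapping
    instance at time t+1, keeping only matches above min_iou.

    A crude proxy for shape-similarity-constrained tracking: slow-moving
    organoids overlap heavily between frames, so IoU doubles as a
    combined position-and-shape similarity score.
    """
    links = {}
    for i, ma in enumerate(masks_t):
        scores = [iou(ma, mb) for mb in masks_t1]
        j = int(np.argmax(scores)) if scores else -1
        if j >= 0 and scores[j] >= min_iou:
            links[i] = j
    return links
```

The paper's procedure additionally corrects instance-segmentation errors during tracking, which a plain overlap matcher like this does not attempt.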