1
Seifert M, Roberts PA, Kafetzis G, Osorio D, Baden T. Birds multiplex spectral and temporal visual information via retinal On- and Off-channels. Nat Commun 2023; 14:5308. PMID: 37652912; PMCID: PMC10471707; DOI: 10.1038/s41467-023-41032-z.
Abstract
In vertebrate vision, early retinal circuits divide incoming visual information into functionally opposite elementary signals: On and Off, transient and sustained, chromatic and achromatic. Together these signals can yield an efficient representation of the scene for transmission to the brain via the optic nerve. However, this long-standing interpretation of retinal function is based on mammals, and it is unclear whether this functional arrangement is common to all vertebrates. Here we show that male poultry chicks use a fundamentally different strategy to communicate information from the eye to the brain. Rather than using functionally opposite pairs of retinal output channels, chicks encode the polarity, timing, and spectral composition of visual stimuli in a highly correlated manner: fast achromatic information is encoded by Off-circuits, and slow chromatic information overwhelmingly by On-circuits. Moreover, most retinal output channels combine On- and Off-circuits to simultaneously encode, or multiplex, both achromatic and chromatic information. Our results from birds conform to evidence from fish, amphibians, and reptiles, which retain the full ancestral complement of four spectral types of cone photoreceptors.
Affiliation(s)
- Marvin Seifert: School of Life Sciences, University of Sussex, Brighton, UK.
- Paul A Roberts: School of Life Sciences, University of Sussex, Brighton, UK.
- Daniel Osorio: School of Life Sciences, University of Sussex, Brighton, UK.
- Tom Baden: School of Life Sciences, University of Sussex, Brighton, UK; Institute of Ophthalmic Research, University of Tübingen, Tübingen, Germany.
2
Tran MH, Fei B. Compact and ultracompact spectral imagers: technology and applications in biomedical imaging. J Biomed Opt 2023; 28:040901. PMID: 37035031; PMCID: PMC10075274; DOI: 10.1117/1.jbo.28.4.040901.
Abstract
Significance: Spectral imaging, which includes hyperspectral and multispectral imaging, can provide images in numerous wavelength bands within and beyond the visible light spectrum. Emerging technologies that enable compact, portable spectral imaging cameras can facilitate new applications in biomedical imaging. Aim: This review aims to help researchers (1) understand the technological trends behind upcoming spectral cameras, (2) understand the new applications that portable spectral imaging has unlocked, and (3) select appropriate spectral imaging systems for their specific applications. Approach: We performed a comprehensive literature review in three databases (Scopus, PubMed, and Web of Science). We included only fully realized systems with definable dimensions. To accommodate the many different definitions of "compact," we included a table of dimensions and weights for systems that met our definition. Results: There is a wide variety of contributions from industry, academia, and hobbyists. A variety of engineering approaches, such as Fabry-Perot interferometers, spectrally resolved detector arrays (mosaic arrays), microelectromechanical systems, 3D printing, light-emitting diodes, and smartphones, were used in the construction of compact spectral imaging cameras. In bioimaging applications, these compact devices were used for in vivo and ex vivo diagnosis and in surgical settings. Conclusions: Compact and ultracompact spectral imagers are the future of spectral imaging systems. Researchers in the bioimaging fields are building systems that are low-cost, fast in acquisition, and mobile enough to be handheld.
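To make the mosaic-array approach mentioned in this abstract concrete, here is a minimal Python sketch of how a snapshot mosaic sensor's raw frame can be split into per-band images by subsampling. The 4x4 filter tile, the function name, and the use of plain subsampling (rather than interpolation-based demosaicking) are illustrative assumptions, not details taken from the review.

```python
import numpy as np

def demosaic_mosaic_sensor(frame: np.ndarray, pattern: int = 4) -> np.ndarray:
    """Split a snapshot mosaic-sensor frame into per-band images.

    A spectrally resolved detector array places a small tile of spectral
    filters (here a hypothetical `pattern` x `pattern` tile, i.e. 16 bands)
    over the pixel grid. Each band is recovered by subsampling the raw
    frame at the corresponding offset within the tile.
    """
    h, w = frame.shape
    h -= h % pattern  # crop to a whole number of tiles
    w -= w % pattern
    bands = []
    for dy in range(pattern):
        for dx in range(pattern):
            bands.append(frame[dy:h:pattern, dx:w:pattern])
    # Result shape: (pattern**2, h // pattern, w // pattern)
    return np.stack(bands, axis=0)

# Toy example: a simulated 256 x 256 raw frame yields a 16-band cube of 64 x 64 images.
raw = np.random.rand(256, 256)
cube = demosaic_mosaic_sensor(raw)
print(cube.shape)  # (16, 64, 64)
```

Real mosaic cameras typically interpolate the missing band values at each pixel; the subsampled cube above simply trades spatial resolution for spectral bands.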
Affiliation(s)
- Minh H. Tran: University of Texas at Dallas, Department of Bioengineering, Richardson, Texas, United States.
- Baowei Fei (corresponding author): University of Texas at Dallas, Department of Bioengineering, Richardson, Texas, United States; University of Texas Southwestern Medical Center, Department of Radiology, Dallas, Texas, United States; University of Texas at Dallas, Center for Imaging and Surgical Innovation, Richardson, Texas, United States.
3
Stöckl AL, Foster JJ. Night skies through animals' eyes - Quantifying night-time visual scenes and light pollution as viewed by animals. Front Cell Neurosci 2022; 16:984282. PMID: 36274987; PMCID: PMC9582234; DOI: 10.3389/fncel.2022.984282.
Abstract
A large proportion of animal species enjoy the benefits of being active at night and have evolved the corresponding optical and neural adaptations to cope with the challenges of low light intensities. However, over the past century electric lighting has introduced direct and indirect light pollution into the full range of terrestrial habitats, changing nocturnal animals' visual worlds dramatically. To understand how these changes affect nocturnal behavior, we propose here an animal-centered analysis method based on environmental imaging. This approach incorporates the sensitivity and acuity limits of individual species, arriving at predictions of the photon catch relative to noise thresholds, the contrast distributions, and the orientation cues that nocturnal species can extract from visual scenes. The analysis relies on only a limited number of visual system parameters known for each species. By accounting for light adaptation in our analysis, we can make more realistic predictions of the information animals can extract from nocturnal visual scenes under different levels of light pollution. With this analysis method, we aim to provide context for the interpretation of behavioral findings and to allow researchers to generate specific hypotheses for the behavior of nocturnal animals in observed light-polluted scenes.
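As a rough illustration of the photon-catch-versus-noise reasoning described above, the following Python sketch integrates a radiance spectrum against a photoreceptor sensitivity curve and compares the resulting count with a shot-noise criterion. All spectra, optical parameters, function names, and the noise test are made-up placeholders; the authors' method uses species-specific measured parameters and calibrated environmental images.

```python
import numpy as np

def photon_catch(wavelengths_nm, radiance_photons, sensitivity,
                 pupil_area_mm2, integration_time_s, acceptance_angle_sr,
                 optical_transmittance=0.8):
    """Estimate photons caught by one photoreceptor during one integration time.

    radiance_photons: photons / (s * mm^2 * sr * nm) at each wavelength
    sensitivity: dimensionless spectral sensitivity (0..1) of the photoreceptor
    """
    dlam = np.gradient(wavelengths_nm)              # wavelength bin widths
    spectral_catch = radiance_photons * sensitivity * dlam
    return (spectral_catch.sum() * pupil_area_mm2 * acceptance_angle_sr
            * integration_time_s * optical_transmittance)

def detectable(n_photons, dark_events=0.01, reliability=1.96):
    """Crude signal-to-noise test: Poisson shot noise plus dark noise."""
    noise = np.sqrt(n_photons + dark_events)
    return n_photons > reliability * noise

# Toy example with invented numbers for a dim night-sky patch and a rod-like receptor.
wl = np.arange(350, 701, 5.0)
radiance = 1e3 * np.exp(-((wl - 550) / 80.0) ** 2)   # hypothetical sky spectrum
sens = np.exp(-((wl - 500) / 40.0) ** 2)             # hypothetical spectral sensitivity
n = photon_catch(wl, radiance, sens, pupil_area_mm2=3.0,
                 integration_time_s=0.1, acceptance_angle_sr=1e-4)
print(n, detectable(n))
```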
Affiliation(s)
- Anna Lisa Stöckl: Department of Biology, University of Konstanz, Konstanz, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Konstanz, Germany; Zukunftskolleg, Universität Konstanz, Konstanz, Germany.
- James Jonathan Foster: Department of Biology, University of Konstanz, Konstanz, Germany; Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Konstanz, Germany.
4
Yoshimatsu T, Bartel P, Schröder C, Janiak FK, St-Pierre F, Berens P, Baden T. Ancestral circuits for vertebrate color vision emerge at the first retinal synapse. Sci Adv 2021; 7:eabj6815. PMID: 34644120; PMCID: PMC8514090; DOI: 10.1126/sciadv.abj6815.
Abstract
For color vision, retinal circuits separate information about intensity and wavelength. In vertebrates that use the full complement of four “ancestral” cone types, the nature and implementation of this computation remain poorly understood. Here, we establish the complete circuit architecture of the outer retinal circuits underlying color processing in larval zebrafish. We find that the synaptic outputs of red and green cones efficiently rotate the encoding of natural daylight in a principal components analysis–like manner to yield primary achromatic and spectrally opponent axes, respectively. Blue cones are tuned to capture most of the remaining variance when opposed to green cones, while UV cones present a UV achromatic axis for prey capture. We note that fruit flies use essentially the same strategy. Rotating color space into primary achromatic and chromatic axes at the eye’s first synapse may therefore be a fundamental principle of color vision when using more than two spectrally well-separated photoreceptor types.
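The PCA-like rotation described in this abstract can be illustrated with a small toy computation: generate smooth "natural" spectra, compute captures for four hypothetical cone types, and inspect the principal axes of the capture covariance. The spectra and cone sensitivities below are invented for illustration and are not the zebrafish data or opsin templates used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.arange(350, 701, 5.0)

def gauss(center, width):
    return np.exp(-((wl - center) / width) ** 2)

# Hypothetical cone spectral sensitivities (red, green, blue, UV)
cones = np.stack([gauss(c, 45.0) for c in (570, 480, 415, 365)])

# Toy "natural" spectra: smooth random mixtures of broad basis functions
basis = np.stack([gauss(c, 120.0) for c in (400, 500, 600)])
spectra = np.clip(rng.random((500, 3)) @ basis, 0, None)

# Cone captures for each spectrum, mean-centred
captures = spectra @ cones.T
captures -= captures.mean(axis=0)

# PCA via the covariance matrix of the four cone signals
eigvals, eigvecs = np.linalg.eigh(np.cov(captures.T))
order = np.argsort(eigvals)[::-1]
pcs = eigvecs[:, order].T  # rows: principal axes in (R, G, B, UV) cone space

print("PC1 (roughly same-sign weights -> achromatic):", np.round(pcs[0], 2))
print("PC2 (opposite-sign weights -> spectrally opponent):", np.round(pcs[1], 2))
```

With broadly correlated inputs, the first axis typically weights all cones with the same sign (achromatic) while the second opposes long against short wavelengths (chromatic), which is the rotation the abstract refers to.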
Affiliation(s)
- Philipp Bartel: School of Life Sciences, University of Sussex, Brighton, UK.
- Cornelius Schröder: Institute of Ophthalmic Research, University of Tübingen, Tübingen, Germany; Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany.
- François St-Pierre: Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA; Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA; Systems, Synthetic, and Physical Biology Program, Rice University, Houston, TX, USA.
- Philipp Berens: Institute of Ophthalmic Research, University of Tübingen, Tübingen, Germany; Centre for Integrative Neuroscience, University of Tübingen, Tübingen, Germany; Institute for Bioinformatics and Medical Informatics, University of Tübingen, Tübingen, Germany.
- Tom Baden (corresponding author): School of Life Sciences, University of Sussex, Brighton, UK; Institute of Ophthalmic Research, University of Tübingen, Tübingen, Germany.
5
Alafeef M, Moitra P, Dighe K, Pan D. Hyperspectral mapping for the detection of SARS-CoV-2 using nanomolecular probes with yoctomole sensitivity. ACS Nano 2021; 15:13742-13758. PMID: 34279093; PMCID: PMC8315249; DOI: 10.1021/acsnano.1c05226.
Abstract
Efficient monitoring of the SARS-CoV-2 outbreak requires a sensitive and rapid diagnostic test. Although SARS-CoV-2 RNA can be detected by RT-qPCR, molecular-level quantification of the viral load is still challenging, time-consuming, and labor-intensive. Here, we report an ultrasensitive hyperspectral sensor (HyperSENSE) based on hafnium nanoparticles (HfNPs) for specific detection of the COVID-19 causative virus, SARS-CoV-2. Density functional theory calculations reveal that HfNPs exhibit larger changes in their absorption wavelength and light scattering when bound to their target SARS-CoV-2 RNA sequence than gold nanoparticles do. The assay has a turnaround time of a few seconds and a limit of detection in the yoctomolar range, which is 1,000,000-fold more sensitive than currently available COVID-19 tests. We demonstrated in ∼100 COVID-19 clinical samples that the assay is highly sensitive and has a specificity of 100%. We also show that HyperSENSE can rapidly detect other viruses, such as influenza A H1N1. This outstanding sensitivity indicates the potential of the biosensor for detecting presymptomatic and asymptomatic COVID-19 cases. Thus, integrating hyperspectral imaging with nanomaterials establishes a diagnostic platform for ultrasensitive detection of COVID-19 that can potentially be applied to any emerging infectious pathogen.
Affiliation(s)
- Maha Alafeef: Bioengineering Department, The University of Illinois at Urbana-Champaign, Urbana, Illinois 61801, United States; Departments of Diagnostic Radiology and Nuclear Medicine and Pediatrics, Center for Blood Oxygen Transport and Hemostasis, University of Maryland Baltimore School of Medicine, Health Sciences Research Facility III, 670 W. Baltimore Street, Baltimore, Maryland 21201, United States; Biomedical Engineering Department, Jordan University of Science and Technology, Irbid 22110, Jordan; Department of Chemical, Biochemical and Environmental Engineering, University of Maryland Baltimore County, Interdisciplinary Health Sciences Facility, 1000 Hilltop Circle, Baltimore, Maryland 21250, United States.
- Parikshit Moitra: Departments of Diagnostic Radiology and Nuclear Medicine and Pediatrics, Center for Blood Oxygen Transport and Hemostasis, University of Maryland Baltimore School of Medicine, Health Sciences Research Facility III, 670 W. Baltimore Street, Baltimore, Maryland 21201, United States.
- Ketan Dighe: Departments of Diagnostic Radiology and Nuclear Medicine and Pediatrics, Center for Blood Oxygen Transport and Hemostasis, University of Maryland Baltimore School of Medicine, Health Sciences Research Facility III, 670 W. Baltimore Street, Baltimore, Maryland 21201, United States; Department of Chemical, Biochemical and Environmental Engineering, University of Maryland Baltimore County, Interdisciplinary Health Sciences Facility, 1000 Hilltop Circle, Baltimore, Maryland 21250, United States.
- Dipanjan Pan: Bioengineering Department, The University of Illinois at Urbana-Champaign, Urbana, Illinois 61801, United States; Departments of Diagnostic Radiology and Nuclear Medicine and Pediatrics, Center for Blood Oxygen Transport and Hemostasis, University of Maryland Baltimore School of Medicine, Health Sciences Research Facility III, 670 W. Baltimore Street, Baltimore, Maryland 21201, United States; Department of Chemical, Biochemical and Environmental Engineering, University of Maryland Baltimore County, Interdisciplinary Health Sciences Facility, 1000 Hilltop Circle, Baltimore, Maryland 21250, United States.
6
Qiu Y, Zhao Z, Klindt D, Kautzky M, Szatko KP, Schaeffel F, Rifai K, Franke K, Busse L, Euler T. Natural environment statistics in the upper and lower visual field are reflected in mouse retinal specializations. Curr Biol 2021; 31:3233-3247.e6. PMID: 34107304; DOI: 10.1016/j.cub.2021.05.017.
Abstract
Pressures for survival adapt sensory circuits to a species' natural habitat and its behavioral challenges. Thus, to advance our understanding of the visual system, it is essential to consider an animal's specific visual environment by capturing natural scenes, characterizing their statistical regularities, and using them to probe visual computations. Mice, a prominent visual system model, have salient visual specializations, being dichromatic with enhanced sensitivity to green and UV in the dorsal and ventral retina, respectively. However, the characteristics of their visual environment that likely drove these adaptations are rarely considered. Here, we built a UV-green-sensitive camera to record footage from mouse habitats. This footage is publicly available as a resource for mouse vision research. We found that chromatic contrast diverged greatly in the upper, but not the lower, visual field. Moreover, training a convolutional autoencoder on upper, but not lower, visual field scenes was sufficient for color-opponent filters to emerge, suggesting that this environmental difference may have driven the superior chromatic opponency in the ventral mouse retina, which supports color discrimination in the upper visual field. Furthermore, the upper visual field was biased toward dark UV contrasts, paralleled by a greater number of light-offset-sensitive ganglion cells in the ventral retina. Finally, footage recorded at twilight suggests that UV promotes aerial predator detection. Our findings support the idea that natural scene statistics shaped early visual processing in evolution.
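A stripped-down sketch of the autoencoder experiment summarized above: train a single-layer convolutional autoencoder on two-channel (UV/green) patches and test whether learned filters weight the two channels with opposite sign. The synthetic data, single-layer architecture, and opponency criterion are simplifications assumed here; the study trained on its recorded mouse-habitat footage with its own architecture and analysis.

```python
import torch
import torch.nn as nn

# Hypothetical two-channel (UV, green) patches standing in for upper-visual-field
# footage; some UV-vs-green anticorrelation is injected so opponency can emerge.
patches = torch.rand(1024, 2, 32, 32)
patches[:, 1] = 1.0 - patches[:, 0] + 0.1 * torch.rand(1024, 32, 32)

class ConvAutoencoder(nn.Module):
    def __init__(self, n_filters: int = 16):
        super().__init__()
        self.encoder = nn.Conv2d(2, n_filters, kernel_size=9, padding=4)
        self.decoder = nn.ConvTranspose2d(n_filters, 2, kernel_size=9, padding=4)

    def forward(self, x):
        return self.decoder(torch.relu(self.encoder(x)))

model = ConvAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(patches), patches)
    loss.backward()
    opt.step()

# Call a filter "color-opponent" if its UV and green channels have opposite net sign.
w = model.encoder.weight.detach()          # shape (n_filters, 2, 9, 9)
channel_means = w.mean(dim=(2, 3))         # shape (n_filters, 2)
opponent = (channel_means[:, 0] * channel_means[:, 1]) < 0
print(f"{int(opponent.sum())} of {len(opponent)} filters are UV/green opponent")
```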
Affiliation(s)
- Yongrong Qiu: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre of Neuroscience (GTC), International Max Planck Research School, University of Tübingen, 72076 Tübingen, Germany.
- Zhijian Zhao: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany.
- David Klindt: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre of Neuroscience (GTC), International Max Planck Research School, University of Tübingen, 72076 Tübingen, Germany.
- Magdalena Kautzky: Division of Neurobiology, Faculty of Biology, LMU Munich, 82152 Planegg-Martinsried, Germany; Graduate School of Systemic Neurosciences (GSN), LMU Munich, 82152 Planegg-Martinsried, Germany.
- Klaudia P Szatko: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Graduate Training Centre of Neuroscience (GTC), International Max Planck Research School, University of Tübingen, 72076 Tübingen, Germany; Bernstein Centre for Computational Neuroscience, 72076 Tübingen, Germany.
- Frank Schaeffel: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany.
- Katharina Rifai: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Carl Zeiss Vision International GmbH, 73430 Aalen, Germany.
- Katrin Franke: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Bernstein Centre for Computational Neuroscience, 72076 Tübingen, Germany.
- Laura Busse: Division of Neurobiology, Faculty of Biology, LMU Munich, 82152 Planegg-Martinsried, Germany; Bernstein Centre for Computational Neuroscience, 82152 Planegg-Martinsried, Germany.
- Thomas Euler: Institute for Ophthalmic Research, University of Tübingen, 72076 Tübingen, Germany; Centre for Integrative Neuroscience (CIN), University of Tübingen, 72076 Tübingen, Germany; Bernstein Centre for Computational Neuroscience, 72076 Tübingen, Germany.
7
Straub D, Rothkopf CA. Looking for image statistics: active vision with avatars in a naturalistic virtual environment. Front Psychol 2021; 12:641471. PMID: 33692732; PMCID: PMC7937646; DOI: 10.3389/fpsyg.2021.641471.
Abstract
The efficient coding hypothesis posits that sensory systems are tuned to the regularities of their natural input. The statistics of natural image databases have been the topic of many studies, which have revealed biases in the distribution of orientations that are related to neural representations as well as to behavior in psychophysical tasks. However, commonly used natural image databases contain images taken with a camera with a planar image sensor and a limited field of view. Thus, these images do not incorporate the physical properties of the visual system or its active use through body and eye movements. Here, we investigate quantitatively whether the active use of the visual system influences image statistics across the visual field by simulating visual behaviors of an avatar in a naturalistic virtual environment. Images with a field of view of 120° were generated during exploration of a virtual forest environment for both a human and a cat avatar. The physical properties of the visual system were taken into account by projecting the images onto idealized retinas according to models of the eyes' geometrical optics. Crucially, different active gaze behaviors were simulated to obtain image ensembles that allow investigating the consequences of active visual behaviors on the statistics of the input to the visual system. In the central visual field, the statistics of the virtual images matched those of photographic images with respect to their power spectra and a bias of edge orientations toward cardinal directions. At larger eccentricities, the cardinal bias was superimposed on a gradually increasing radial bias. The strength of this effect depends on the active visual behavior and the physical properties of the eye. There were also significant differences between the upper and lower visual field, which became stronger depending on how the environment was actively sampled. Taken together, the results show that quantitatively relating natural image statistics to neural representations and psychophysical behavior requires taking into account not only the structure of the environment but also the physical properties of the visual system and its active use in behavior.
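Two of the statistics analyzed in this study, the orientation distribution of edges and the orientation-averaged power spectrum, can be computed with a short sketch like the one below. This is only the measurement step on a flat image; it omits the retinal projection, avatar gaze simulation, and eccentricity-dependent analysis that the paper describes, and the function names are placeholders.

```python
import numpy as np

def orientation_histogram(img: np.ndarray, n_bins: int = 18):
    """Gradient-orientation histogram (0-180 deg), weighted by gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    magnitude = np.hypot(gx, gy)
    # Edge orientation is perpendicular to the gradient direction.
    orientation = (np.degrees(np.arctan2(gy, gx)) + 90.0) % 180.0
    hist, edges = np.histogram(orientation, bins=n_bins, range=(0, 180),
                               weights=magnitude)
    return hist / hist.sum(), edges

def radial_power_spectrum(img: np.ndarray, n_bins: int = 50):
    """Orientation-averaged power spectrum as a function of spatial frequency."""
    img = img.astype(float) - img.mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h / 2, xx - w / 2)
    bins = np.linspace(0, r.max(), n_bins + 1)
    which = np.digitize(r.ravel(), bins) - 1
    radial = np.bincount(which, weights=power.ravel(), minlength=n_bins)[:n_bins]
    counts = np.bincount(which, minlength=n_bins)[:n_bins]
    return radial / np.maximum(counts, 1)

# Toy example on a random "image"; real use would loop over rendered frames.
img = np.random.rand(128, 128)
hist, _ = orientation_histogram(img)
spectrum = radial_power_spectrum(img)
print(hist.round(3), spectrum[:5])
```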
Affiliation(s)
- Dominik Straub: Institute of Psychology, Technical University of Darmstadt, Darmstadt, Germany; Centre for Cognitive Science, Technical University of Darmstadt, Darmstadt, Germany.
- Constantin A. Rothkopf: Institute of Psychology, Technical University of Darmstadt, Darmstadt, Germany; Centre for Cognitive Science, Technical University of Darmstadt, Darmstadt, Germany.
8
Underwater hyperspectral imaging technology and its applications for detecting and mapping the seafloor: a review. Sensors 2020; 20:4962. PMID: 32887344; PMCID: PMC7506868; DOI: 10.3390/s20174962.
Abstract
Ocean remote sensing and seafloor surveying are commonly carried out with airborne and spaceborne hyperspectral imagers. However, the water column hinders the propagation of sunlight to deeper areas, thus limiting the scope of observation. As an emerging technology, underwater hyperspectral imaging (UHI) extends in-air hyperspectral imaging into shallow and deep-sea environments and is undergoing rapid development. It is a close-range, high-resolution approach for detecting and mapping the seafloor. In this paper, we focus on the concepts of UHI technology, covering imaging systems and correction methods for eliminating the water column's influence. Current applications of UHI, such as deep-sea mineral exploration, benthic habitat mapping, and underwater archaeology, are highlighted to show the potential of this technology. This review provides an introduction and overview for those working in the field and offers a reference for those searching for literature on UHI technology.
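As a cartoon of the water-column correction problem mentioned above, the sketch below applies a single-path, Beer-Lambert style gain per band. The attenuation curve, path-length model, and function name are assumptions for illustration only; the correction methods surveyed in the review account for lighting geometry, scattering, and calibrated attenuation coefficients.

```python
import numpy as np

def correct_water_column(measured_radiance: np.ndarray,
                         attenuation_per_m: np.ndarray,
                         path_length_m: float) -> np.ndarray:
    """Undo exponential attenuation along the light path for each band.

    measured_radiance: (bands, height, width) radiance cube at the sensor
    attenuation_per_m: per-band attenuation coefficients (1/m)
    path_length_m: lamp-to-target-to-sensor path length in metres
    """
    gain = np.exp(attenuation_per_m * path_length_m)   # shape (bands,)
    return measured_radiance * gain[:, None, None]

# Toy example: 30 bands, 64 x 64 pixels, 2 m of water.
bands = np.linspace(400, 700, 30)
# Crude, made-up attenuation curve: water absorbs red far more than blue-green.
k = 0.05 + 0.6 * ((bands - 400) / 300) ** 3
cube = np.random.rand(30, 64, 64)
corrected = correct_water_column(cube, k, path_length_m=2.0)
print(corrected.shape, (corrected / cube)[:, 0, 0].round(2))  # per-band gains applied
```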
9
Zhou M, Bear J, Roberts PA, Janiak FK, Semmelhack J, Yoshimatsu T, Baden T. Zebrafish retinal ganglion cells asymmetrically encode spectral and temporal information across visual space. Curr Biol 2020; 30:2927-2942.e7. PMID: 32531283; PMCID: PMC7416113; DOI: 10.1016/j.cub.2020.05.055.
Abstract
Among vertebrate vision models, the tetrachromatic larval zebrafish permits non-invasive monitoring and manipulation of neural activity across the nervous system in vivo during ongoing behavior. However, despite a perhaps unparalleled understanding of the links between zebrafish brain circuits and visual behaviors, comparatively little is known about what their eyes send to the brain via retinal ganglion cells (RGCs). Major gaps in knowledge include spectral coding and potentially critical variations in RGC properties across the retinal surface, which correspond to asymmetries in the statistics of natural visual space and in behavioral demands. Here, we use in vivo two-photon imaging during hyperspectral visual stimulation, as well as photolabeling of RGCs, to provide a functional and anatomical census of RGCs in larval zebrafish. We find that the functional and structural properties of RGCs differ across the eye and include a notable population of UV-responsive On-sustained RGCs that are found only in the acute zone, likely to support visual prey capture of UV-bright zooplankton. Next, approximately half of the RGCs display diverse forms of color opponency, including many that are driven by a pervasive and slow blue-Off system, far in excess of what would be required to satisfy traditional models of color vision. In addition, most information on spectral contrast was intermixed with temporal information. Taken together, our results suggest that zebrafish RGCs send a diverse and highly regionalized time-color code to the brain.
Affiliation(s)
- Mingyi Zhou: School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK.
- John Bear: School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK; Hong Kong University of Science and Technology, Hong Kong.
- Paul A Roberts: School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK.
- Filip K Janiak: School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK.
- Tom Baden: School of Life Sciences, University of Sussex, Brighton BN1 9QG, UK; Institute for Ophthalmic Research, University of Tübingen, Tübingen 72076, Germany.