1. Jörges B, Bansal A, Harris LR. Precision and temporal dynamics in heading perception assessed by continuous psychophysics. PLoS One 2024; 19:e0311992. PMID: 39392815. PMCID: PMC11469512. DOI: 10.1371/journal.pone.0311992. Received 04/18/2024; accepted 09/27/2024. Open access.
Abstract
It is a well-established finding that more informative optic flow (e.g., faster, denser, or presented over a larger portion of the visual field) yields decreased variability in heading judgements. Current models of heading perception further predict faster processing under such circumstances, a prediction that has so far lacked empirical support. In this study, we validate a novel continuous psychophysics paradigm by replicating the effect of the speed and density of optic flow on variability in performance, and we investigate how these manipulations affect the temporal dynamics. To this end, we tested 30 participants in a continuous psychophysics paradigm administered in Virtual Reality. We immersed them in a simple virtual environment where they experienced four 90-second blocks of optic flow in which their linear heading direction (no simulated rotation) at any given moment was determined by a random walk. We asked them to continuously indicate with a joystick the direction in which they perceived themselves to be moving. In each of the four blocks they experienced a different combination of simulated self-motion speeds (SLOW and FAST) and densities of optic flow (SPARSE and DENSE). Using a cross-correlogram analysis, we determined that participants reacted faster and displayed lower variability in their performance in the FAST and DENSE conditions than in the SLOW and SPARSE conditions, respectively. Using a Kalman filter-based analysis approach, we found a similar pattern, where the fitted perceptual noise parameters were higher for SLOW and SPARSE. Beyond replicating previous results on variability, we show that more informative optic flow can speed up heading judgements, while at the same time validating continuous psychophysics as an efficient method for studying heading perception.
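The cross-correlogram (CCG) logic used here — correlate the velocity of the stimulus random walk with the velocity of the joystick response and read the visuomotor delay off the peak lag — can be sketched in a few lines. This is a minimal simulation, not the authors' code; the 60 Hz sample rate, noise levels, and 400 ms lag are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 60.0                      # sample rate in Hz (assumed)
n = int(90 * fs)               # one 90-second block

# Stimulus: heading direction follows a random walk
heading = np.cumsum(rng.normal(0, 0.5, n))

# Response: delayed, noisy copy of the stimulus (the delay is what we recover)
delay = int(0.4 * fs)          # 400 ms visuomotor lag, for illustration
response = np.roll(heading, delay) + rng.normal(0, 2.0, n)
response[:delay] = 0.0

# Cross-correlogram of the z-scored velocities (first differences)
sv = np.diff(heading)
rv = np.diff(response)
sv = (sv - sv.mean()) / sv.std()
rv = (rv - rv.mean()) / rv.std()
ccg = np.correlate(rv, sv, mode="full") / len(sv)
lags = np.arange(-len(sv) + 1, len(sv)) / fs

peak_lag = lags[np.argmax(ccg)]   # estimated reaction time in seconds (~0.4 here)
```

The height of the CCG peak falls as response noise grows, which is how such analyses expose condition differences in variability as well as in lag.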
Affiliation(s)
- Björn Jörges
- Center for Vision Research, York University, North York, Canada
- Ambika Bansal
- Center for Vision Research, York University, North York, Canada
2. Meisner OC, Shi W, Fagan NA, Greenwood J, Shi W, Jadi MP, Nandy AS, Chang SWC. Development of a Marmoset Apparatus for Automated Pulling (MarmoAAP) to Study Cooperative Behaviors. bioRxiv [Preprint] 2024:2024.02.16.579531. PMID: 38405744. PMCID: PMC10889019. DOI: 10.1101/2024.02.16.579531.
Abstract
In recent years, the field of neuroscience has increasingly recognized the importance of studying animal behaviors in naturalistic environments to gain deeper insights into ethologically relevant behavioral processes and neural mechanisms. The common marmoset (Callithrix jacchus), due to its small size, prosocial nature, and genetic proximity to humans, has emerged as a pivotal model toward this effort. However, traditional research methodologies often fail to fully capture the nuances of marmoset social interactions and cooperative behaviors. To address this critical gap, we developed the Marmoset Apparatus for Automated Pulling (MarmoAAP), a novel behavioral apparatus designed for studying cooperative behaviors in common marmosets. MarmoAAP addresses the limitations of traditional behavioral research methods by enabling high-throughput, detailed behavior outputs that can be integrated with video and audio recordings, allowing for more nuanced and comprehensive analyses even in a naturalistic setting. We also highlight the flexibility of MarmoAAP in task parameter manipulation which accommodates a wide range of behaviors and individual animal capabilities. Furthermore, MarmoAAP provides a platform to perform investigations of neural activity underlying naturalistic social behaviors. MarmoAAP is a versatile and robust tool for advancing our understanding of primate behavior and related cognitive processes. This new apparatus bridges the gap between ethologically relevant animal behavior studies and neural investigations, paving the way for future research in cognitive and social neuroscience using marmosets as a model organism.
Affiliation(s)
- Olivia C. Meisner
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06510, USA
- Department of Psychology, Yale University, New Haven, CT 06520, USA
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Weikang Shi
- Department of Psychology, Yale University, New Haven, CT 06520, USA
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06510, USA
- Joel Greenwood
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Weikang Shi
- Department of Psychology, Yale University, New Haven, CT 06520, USA
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Department of Psychiatry, Yale University, New Haven, CT 06520, USA
- Monika P. Jadi
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06510, USA
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06510, USA
- Department of Psychiatry, Yale University, New Haven, CT 06520, USA
- Anirvan S. Nandy
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06510, USA
- Department of Psychology, Yale University, New Haven, CT 06520, USA
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06510, USA
- Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Steve W. C. Chang
- Interdepartmental Neuroscience Program, Yale University, New Haven, CT 06510, USA
- Department of Psychology, Yale University, New Haven, CT 06520, USA
- Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
- Wu Tsai Institute, Yale University, New Haven, CT 06510, USA
- Kavli Institute for Neuroscience, Yale University School of Medicine, New Haven, CT 06510, USA
3. Oesch LT, Ryan MB, Churchland AK. From innate to instructed: A new look at perceptual decision-making. Curr Opin Neurobiol 2024; 86:102871. PMID: 38569230. PMCID: PMC11162954. DOI: 10.1016/j.conb.2024.102871. Received 09/11/2023; revised 03/07/2024; accepted 03/08/2024.
Abstract
Understanding how subjects perceive sensory stimuli in their environment and use this information to guide appropriate actions is a major challenge in neuroscience. To study perceptual decision-making in animals, researchers use tasks that either probe spontaneous responses to stimuli (often described as "naturalistic") or train animals to associate stimuli with experimenter-defined responses. Spontaneous decisions rely on animals' pre-existing knowledge, while trained tasks offer greater versatility, albeit often at the cost of extensive training. Here, we review emerging approaches to investigate perceptual decision-making using both spontaneous and trained behaviors, highlighting their strengths and limitations. Additionally, we propose how trained decision-making tasks could be improved to achieve faster learning and a more generalizable understanding of task rules.
Affiliation(s)
- Lukas T Oesch
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States
- Michael B Ryan
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States. https://twitter.com/NeuroMikeRyan
- Anne K Churchland
- Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, United States.
4. Burge J, Cormack LK. Continuous psychophysics shows millisecond-scale visual processing delays are faithfully preserved in movement dynamics. J Vis 2024; 24(5):4. PMID: 38722274. PMCID: PMC11094763. DOI: 10.1167/jov.24.5.4. Received 08/24/2023; accepted 02/22/2024. Open access.
Abstract
Image differences between the eyes can cause interocular discrepancies in the speed of visual processing. Millisecond-scale differences in visual processing speed can cause dramatic misperceptions of the depth and three-dimensional direction of moving objects. Here, we develop a monocular and binocular continuous target-tracking psychophysics paradigm that can quantify such tiny differences in visual processing speed. Human observers continuously tracked a target undergoing Brownian motion with a range of luminance levels in each eye. Suitable analyses recover the time course of the visuomotor response in each condition, the dependence of visual processing speed on luminance level, and the temporal evolution of processing differences between the eyes. Importantly, using a direct within-observer comparison, we show that continuous target-tracking and traditional forced-choice psychophysical methods provide estimates of interocular delays that agree on average to within a fraction of a millisecond. Thus, visual processing delays are preserved in the movement dynamics of the hand. Finally, we show analytically, and partially confirm experimentally, that differences between the temporal impulse response functions in the two eyes predict how lateral target motion causes misperceptions of motion in depth and associated tracking responses. Because continuous target tracking can accurately recover millisecond-scale differences in visual processing speed and has multiple advantages over traditional psychophysics, it should facilitate the study of temporal processing in the future.
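Recovering delay differences much finer than the sample period, as reported here, typically relies on interpolating the cross-correlogram around its peak. The sketch below is illustrative, not the authors' pipeline: the 240 Hz sample rate, noise levels, and the simulated 4 ms interocular delay are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 240.0                              # sample rate in Hz (assumed)
n = int(20 * fs)

# A Brownian target and two "eyes" that see it with slightly different delays
target = np.cumsum(rng.normal(0, 1, n))

def delayed(sig, d):
    # fractional-sample delay via linear interpolation
    idx = np.arange(sig.size)
    return np.interp(idx - d * fs, idx, sig)

resp_left = delayed(target, 0.150) + rng.normal(0, 0.5, n)
resp_right = delayed(target, 0.154) + rng.normal(0, 0.5, n)   # 4 ms slower

def peak_lag(a, b):
    # cross-correlate, then refine the peak with parabolic interpolation
    c = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    k = int(np.argmax(c))
    y0, y1, y2 = c[k - 1], c[k], c[k + 1]
    frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    return (k - (b.size - 1) + frac) / fs

# Correlate velocities (first differences): their flat spectrum sharpens the peak
delta_ms = peak_lag(np.diff(resp_right), np.diff(resp_left)) * 1000.0
```

With these settings the recovered `delta_ms` lands near the simulated 4 ms, i.e., well below the ~4.2 ms sample period.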
Affiliation(s)
- Johannes Burge
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- Neuroscience Graduate Group, University of Pennsylvania, Philadelphia, PA, USA
- Bioengineering Graduate Group, University of Pennsylvania, Philadelphia, PA, USA
- Lawrence K Cormack
- Department of Psychology, University of Texas at Austin, Austin, TX, USA
- Center for Perceptual Systems, University of Texas at Austin, Austin, TX, USA
- Institute for Neuroscience, University of Texas at Austin, Austin, TX, USA
5. Ruesseler M, Weber LA, Marshall TR, O'Reilly J, Hunt LT. Quantifying decision-making in dynamic, continuously evolving environments. eLife 2023; 12:e82823. PMID: 37883173. PMCID: PMC10602589. DOI: 10.7554/eLife.82823. Received 08/18/2022; accepted 10/13/2023. Open access.
Abstract
During perceptual decision-making tasks, centroparietal electroencephalographic (EEG) potentials report an evidence accumulation-to-bound process that is time locked to trial onset. However, decisions in real-world environments are rarely confined to discrete trials; they instead unfold continuously, with accumulation of time-varying evidence being recency-weighted towards its immediate past. The neural mechanisms supporting recency-weighted continuous decision-making remain unclear. Here, we use a novel continuous task design to study how the centroparietal positivity (CPP) adapts to different environments that place different constraints on evidence accumulation. We show that adaptations in evidence weighting to these different environments are reflected in changes in the CPP. The CPP becomes more sensitive to fluctuations in sensory evidence when large shifts in evidence are less frequent, and the potential is primarily sensitive to fluctuations in decision-relevant (not decision-irrelevant) sensory input. A complementary triphasic component over occipito-parietal cortex encodes the sum of recently accumulated sensory evidence, and its magnitude covaries with parameters describing how different individuals integrate sensory evidence over time. A computational model based on leaky evidence accumulation suggests that these findings can be accounted for by a shift in decision threshold between different environments, which is also reflected in the magnitude of pre-decision EEG activity. Our findings reveal how adaptations in EEG responses reflect flexibility in evidence accumulation to the statistics of dynamic sensory environments.
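The leaky accumulator invoked in the final sentences reduces to a one-line update: the leak makes integration recency-weighted with time constant tau. A minimal discrete-time sketch follows; all parameter values are illustrative, not the paper's fits.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, tau, bound = 0.01, 0.5, 0.3   # step (s), leak time constant (s), decision threshold

# A continuous stream of noisy momentary evidence with a weak positive drift
evidence = rng.normal(0.8, 2.0, 2000)

x, decision_time = 0.0, None
for i, e in enumerate(evidence):
    x += dt * (-x / tau + e)       # leaky (recency-weighted) accumulation
    if decision_time is None and abs(x) >= bound:
        decision_time = i * dt     # time of first bound crossing
```

Raising `bound` trades speed for accuracy; the threshold-shift account in the abstract corresponds to changing this single parameter between environments.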
Affiliation(s)
- Maria Ruesseler
- Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford Centre for Human Brain Activity (OHBA), University Department of Psychiatry, Warneford Hospital, Oxford, United Kingdom
- Lilian Aline Weber
- Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford Centre for Human Brain Activity (OHBA), University Department of Psychiatry, Warneford Hospital, Oxford, United Kingdom
- Department of Experimental Psychology, University of Oxford, Anna Watts Building, Radcliffe Observatory Quarter, Oxford, United Kingdom
- Tom Rhys Marshall
- Department of Experimental Psychology, University of Oxford, Anna Watts Building, Radcliffe Observatory Quarter, Oxford, United Kingdom
- Centre for Human Brain Health, University of Birmingham, Birmingham, United Kingdom
- Jill O'Reilly
- Department of Experimental Psychology, University of Oxford, Anna Watts Building, Radcliffe Observatory Quarter, Oxford, United Kingdom
- Laurence Tudor Hunt
- Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford Centre for Human Brain Activity (OHBA), University Department of Psychiatry, Warneford Hospital, Oxford, United Kingdom
- Department of Experimental Psychology, University of Oxford, Anna Watts Building, Radcliffe Observatory Quarter, Oxford, United Kingdom
6. Yip HMK, Allison-Walker TJ, Cloherty SL, Hagan MA, Price NSC. Ocular following responses of the marmoset monkey are dependent on postsaccadic delay, spatiotemporal frequency, and saccade direction. J Neurophysiol 2023; 130:189-198. PMID: 37377195. PMCID: PMC10435071. DOI: 10.1152/jn.00126.2023. Received 03/27/2023; revised 06/15/2023; accepted 06/15/2023. Open access.
Abstract
Ocular following is a short-latency, reflexive eye movement that tracks wide-field visual motion. It has been studied extensively in humans and macaques and is an appealing behavior for studying sensory-motor transformations in the brain because of its rapidity and rigidity. We explored ocular following in the marmoset, an emerging model in neuroscience because their lissencephalic brain allows direct access to most cortical areas for imaging and electrophysiological recordings. In three experiments, we tested ocular following responses in three adult marmosets. First, we varied the delay between saccade end and stimulus motion onset, from 10 to 300 ms. As in other species, tracking had shorter onset latencies and higher eye speeds with shorter postsaccadic delays. Second, using sine-wave grating stimuli, we explored the dependence of eye speed on spatiotemporal frequency. The highest eye speed was evoked at ∼16 Hz and ∼0.16 cycles per degree (cpd); however, the highest gain was elicited at ∼1.6 Hz and ∼1.2 cpd. The highest eye speed for each spatial frequency was observed at a different temporal frequency, but this interdependence was not consistent with complete speed tuning of the ocular following response. Finally, we found the highest eye speeds when saccade and stimulus motion directions were identical, although latencies were unaffected by direction difference. Our results showed qualitatively similar ocular following in marmosets, humans, and macaques, despite over an order of magnitude variation in body and eye size across species. This characterization will help future studies examining the neural basis of sensory-motor transformations. NEW & NOTEWORTHY: Previous ocular following studies focused on humans and macaques. We examined the properties of ocular following responses in marmosets in three experiments, in which postsaccadic delay, spatial-temporal frequency of stimuli, and congruence of saccade and motion directions were manipulated. We have demonstrated short-latency ocular following in marmosets and discuss the similarities across three species that vary markedly in eye and head size. Our findings will help future studies examining the neural mechanism of sensory-motor transformations.
Affiliation(s)
- Hoi Ming Ken Yip
- Department of Physiology and Neuroscience Program, Biomedicine Discovery Institute, Monash University, Clayton, Victoria, Australia
- Timothy John Allison-Walker
- Department of Physiology and Neuroscience Program, Biomedicine Discovery Institute, Monash University, Clayton, Victoria, Australia
- School of Engineering, RMIT University, Melbourne, Victoria, Australia
- Shaun Liam Cloherty
- Department of Physiology and Neuroscience Program, Biomedicine Discovery Institute, Monash University, Clayton, Victoria, Australia
- School of Engineering, RMIT University, Melbourne, Victoria, Australia
- Maureen Ann Hagan
- Department of Physiology and Neuroscience Program, Biomedicine Discovery Institute, Monash University, Clayton, Victoria, Australia
- Nicholas Seow Chiang Price
- Department of Physiology and Neuroscience Program, Biomedicine Discovery Institute, Monash University, Clayton, Victoria, Australia
7. Kay K, Bonnen K, Denison RN, Arcaro MJ, Barack DL. Tasks and their role in visual neuroscience. Neuron 2023; 111:1697-1713. PMID: 37040765. DOI: 10.1016/j.neuron.2023.03.022. Received 01/26/2023; revised 03/13/2023; accepted 03/15/2023.
Abstract
Vision is widely used as a model system to gain insights into how sensory inputs are processed and interpreted by the brain. Historically, careful quantification and control of visual stimuli have served as the backbone of visual neuroscience. There has been less emphasis, however, on how an observer's task influences the processing of sensory inputs. Motivated by diverse observations of task-dependent activity in the visual system, we propose a framework for thinking about tasks, their role in sensory processing, and how we might formally incorporate tasks into our models of vision.
Affiliation(s)
- Kendrick Kay
- Center for Magnetic Resonance Research, Department of Radiology, University of Minnesota, Minneapolis, MN 55455, USA.
- Kathryn Bonnen
- School of Optometry, Indiana University, Bloomington, IN 47405, USA
- Rachel N Denison
- Department of Psychological and Brain Sciences, Boston University, Boston, MA 02215, USA
- Mike J Arcaro
- Department of Psychology, University of Pennsylvania, Philadelphia, PA 19146, USA
- David L Barack
- Departments of Neuroscience and Philosophy, University of Pennsylvania, Philadelphia, PA 19146, USA
8. Chow HM, Spering M. Eye movements during optic flow perception. Vision Res 2023; 204:108164. PMID: 36566560. DOI: 10.1016/j.visres.2022.108164. Received 09/01/2022; revised 11/22/2022; accepted 12/07/2022.
Abstract
Optic flow is an important visual cue for human perception and locomotion and naturally triggers eye movements. Here we investigate whether the perception of optic flow direction is limited or enhanced by eye movements. In Exp. 1, 23 human observers localized the focus of expansion (FOE) of an optic flow pattern; in Exp. 2, 18 observers had to detect brief visual changes at the FOE. Both tasks were completed during free viewing and fixation conditions while eye movements were recorded. Task difficulty was varied by manipulating the coherence of radial motion from the FOE (4 %-90 %). During free viewing, observers tracked the optic flow pattern with a combination of saccades and smooth eye movements. During fixation, observers nevertheless made small-scale eye movements. Despite differences in spatial scale, eye movements during free viewing and fixation were similarly directed toward the FOE (saccades) and away from the FOE (smooth tracking). Whereas FOE localization sensitivity was not affected by eye movement instructions (Exp. 1), observers' sensitivity to detect brief changes at the FOE was 27 % higher (p <.001) during free-viewing compared to fixation (Exp. 2). This performance benefit was linked to reduced saccade endpoint errors, indicating the direct beneficial impact of foveating eye movements on performance in a fine-grain perceptual task, but not during coarse perceptual localization.
Affiliation(s)
- Hiu Mei Chow
- Dept. of Psychology, St. Thomas University, Fredericton, Canada; Dept. of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada.
- Miriam Spering
- Dept. of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, Canada; Djavad Mowafaghian Center for Brain Health, University of British Columbia, Vancouver, Canada; Institute for Computing, Information and Cognitive Systems, University of British Columbia, Vancouver, Canada
9. Chin BM, Burge J. Perceptual consequences of interocular differences in the duration of temporal integration. J Vis 2022; 22(12):12. PMID: 36355360. PMCID: PMC9652723. DOI: 10.1167/jov.22.12.12. Received 02/16/2022; accepted 07/25/2022. Open access.
Abstract
Temporal differences in visual information processing between the eyes can cause dramatic misperceptions of motion and depth. Processing delays between the eyes cause the Pulfrich effect: oscillating targets in the frontal plane are misperceived as moving along near-elliptical motion trajectories in depth (Pulfrich, 1922). Here, we explain a previously reported but poorly understood variant: the anomalous Pulfrich effect. When this variant is perceived, the illusory motion trajectory appears oriented left- or right-side back in depth, rather than aligned with the true direction of motion. Our data indicate that this perceived misalignment is due to interocular differences in neural temporal integration periods, as opposed to interocular differences in delay. For oscillating motion, differences in the duration of temporal integration dampen the effective motion amplitude in one eye relative to the other. In a dynamic analog of the Geometric effect in stereo-surface-orientation perception (Ogle, 1950), the different motion amplitudes cause the perceived misorientation of the motion trajectories. Forced-choice psychophysical experiments, conducted with both different spatial frequencies and different onscreen motion damping in the two eyes show that the perceived misorientation in depth is associated with the eye having greater motion damping. A target-tracking experiment provided more direct evidence that the anomalous Pulfrich effect is caused by interocular differences in temporal integration and delay. These findings highlight the computational hurdles posed to the visual system by temporal differences in sensory processing. Future work will explore how the visual system overcomes these challenges to achieve accurate perception.
Affiliation(s)
- Benjamin M Chin
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- Johannes Burge
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- Neuroscience Graduate Group, University of Pennsylvania, Philadelphia, PA, USA
- Bioengineering Graduate Group, University of Pennsylvania, Philadelphia, PA, USA
10. Straub D, Rothkopf CA. Putting perception into action with inverse optimal control for continuous psychophysics. eLife 2022; 11:e76635. PMID: 36173094. PMCID: PMC9522207. DOI: 10.7554/eLife.76635. Received 12/23/2021; accepted 08/08/2022. Open access.
Abstract
Psychophysical methods are a cornerstone of psychology, cognitive science, and neuroscience where they have been used to quantify behavior and its neural correlates for a vast range of mental phenomena. Their power derives from the combination of controlled experiments and rigorous analysis through signal detection theory. Unfortunately, they require many tedious trials and preferably highly trained participants. A recently developed approach, continuous psychophysics, promises to transform the field by abandoning the rigid trial structure involving binary responses and replacing it with continuous behavioral adjustments to dynamic stimuli. However, what has precluded wide adoption of this approach is that current analysis methods do not account for the additional variability introduced by the motor component of the task and therefore recover perceptual thresholds that are larger than those obtained in equivalent traditional psychophysical experiments. Here, we introduce a computational analysis framework for continuous psychophysics based on Bayesian inverse optimal control. We show via simulations and previously published data that this not only recovers the perceptual thresholds but additionally estimates subjects' action variability, internal behavioral costs, and subjective beliefs about the experimental stimulus dynamics. Taken together, we provide further evidence for the importance of including acting uncertainties, subjective beliefs, and, crucially, the intrinsic costs of behavior, even in experiments seemingly only investigating perception.
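Observer models for continuous psychophysics are typically built around a Kalman filter tracking the stimulus; the inverse optimal control framework described here additionally models motor costs and action variability. A stripped-down scalar Kalman filter is sketched below with arbitrary noise variances and no motor model, so it is a stand-in for the idea rather than the authors' method:

```python
import numpy as np

rng = np.random.default_rng(3)
q, r, n = 0.05, 4.0, 5000          # process/observation noise variances (assumed), samples

# Random-walk target and the observer's noisy sensory measurements of it
target = np.cumsum(rng.normal(0, np.sqrt(q), n))
obs = target + rng.normal(0, np.sqrt(r), n)

# Scalar Kalman filter: the ideal observer for this tracking task
x, p = 0.0, 1.0
est = np.empty(n)
for i in range(n):
    p += q                          # predict under random-walk dynamics
    k = p / (p + r)                 # Kalman gain
    x += k * (obs[i] - x)           # update toward the new observation
    p *= 1 - k
    est[i] = x

rmse_kalman = np.sqrt(np.mean((est - target) ** 2))
rmse_raw = np.sqrt(np.mean((obs - target) ** 2))   # tracking raw observations is worse
```

Fitting the noise variances (or, equivalently, the gain) to a subject's tracking trace is what lets such analyses read out perceptual noise; the paper's point is that ignoring behavioral costs and action variability in this step inflates the recovered thresholds.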
Affiliation(s)
- Dominik Straub
- Centre for Cognitive Science, Technical University of Darmstadt, Darmstadt, Germany
- Institute of Psychology, Technical University of Darmstadt, Darmstadt, Germany
- Constantin A Rothkopf
- Centre for Cognitive Science, Technical University of Darmstadt, Darmstadt, Germany
- Institute of Psychology, Technical University of Darmstadt, Darmstadt, Germany
- Frankfurt Institute for Advanced Studies, Goethe University Frankfurt, Frankfurt, Germany
11. Developmental changes in gaze patterns in response to radial optic flow in toddlerhood and childhood. Sci Rep 2022; 12:11566. PMID: 35799054. PMCID: PMC9262903. DOI: 10.1038/s41598-022-15730-5. Received 05/19/2021; accepted 06/28/2022. Open access.
Abstract
A large field visual motion pattern (optic flow) with a radial pattern provides a compelling perception of self-motion; a radially expanding/contracting optic flow generates the perception of forward/backward locomotion. Moreover, the focus of a radial optic flow, particularly an expansive flow, is an important visual cue to perceive and control the heading direction during human locomotion. Previous research has shown that human gaze patterns have an "expansion bias": a tendency to be more attracted to the focus of expansive flow than to the focus of contractive flow. We investigated the development of the expansion bias in children (N = 240, 1–12 years) and adults (N = 20). Most children aged ≥5 years and adults showed a significant tendency to shift their gaze to the focus of an expansive flow, whereas the youngest group (1-year-old children) showed a significant but opposing tendency; their gaze was more attracted to the focus of contractive flow than to the focus of expansive flow. The relationship between the developmental change from the "contraction bias" in early toddlerhood to the expansion bias in the later developmental stages and possible factors (e.g., global visual motion processing abilities and locomotor experiences) is discussed.
12. Décima AP, Barraza JF, López-Moliner J. The perceptual dynamics of the contrast induced speed bias. Vision Res 2021; 191:107966. PMID: 34808549. DOI: 10.1016/j.visres.2021.107966. Received 06/10/2021; revised 09/15/2021; accepted 10/17/2021.
Abstract
In this article we present a temporal extension of the slow motion prior model to generate predictions regarding the temporal evolution of the contrast induced speed bias. We further tested these predictions using a novel experimental paradigm that allows us to measure the dynamic perceptual difference between stimuli through a series of manual pursuit open loop tasks. Results show good agreement with our model's predictions. The main findings reveal that hand speed dynamics are affected by stimulus contrast in a way that is consistent with a dynamic model of motion perception that assumes a slow motion prior. The proposed model also confirms observations made in previous studies that suggest that motion bias persisted even at high contrast as a consequence of the dynamics of the slow motion prior.
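The static core of a slow-motion-prior model can be sketched with Gaussians, in the spirit of standard Bayesian speed-estimation accounts: low contrast widens the likelihood, so a zero-centred prior pulls the speed estimate down more. The parameter values and the contrast-to-noise mapping below are illustrative assumptions; the paper's contribution is the temporal dynamics layered on top of this.

```python
# Gaussian prior over speed centred on zero ("slow motion prior") combined
# with a Gaussian likelihood whose width grows as contrast falls.
V_TRUE = 4.0          # stimulus speed in deg/s (hypothetical)
SIGMA_PRIOR = 3.0     # prior standard deviation (assumed)

def perceived_speed(contrast, v=V_TRUE):
    sigma_like = 1.0 / contrast                          # assumed noise mapping
    w = SIGMA_PRIOR**2 / (SIGMA_PRIOR**2 + sigma_like**2)
    return w * v                                         # posterior mean (prior mean 0)

low, high = perceived_speed(0.1), perceived_speed(0.9)
# Low contrast is perceived as slower, and some bias remains even at high contrast
```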
Affiliation(s)
| | - José Fernando Barraza
- Dpto. Luminotecnia, Luz y Visión "Herberto C. Bühler" (DLLyV), FACET, UNT, Argentina; Instituto de Investigación en Luz, Ambiente y Visión (ILAV), CONICET-UNT, Argentina
- Joan López-Moliner
- Vision and Control of Action (VISCA) Group, Department of Cognition, Development and Psychology of Education, Institut de Neurociències, Universitat de Barcelona, Passeig de la Vall d'Hebron 171, 08035 Barcelona, Catalonia, Spain
13. Wang JZ, Kowler E. Micropursuit and the control of attention and eye movements in dynamic environments. J Vis 2021; 21(8):6. PMID: 34347019. PMCID: PMC8340658. DOI: 10.1167/jov.21.8.6. Open access.
Abstract
It is more challenging to plan eye movements during perceptual tasks performed in dynamic displays than in static displays. Decisions about the timing of saccades become more critical, and decisions must also involve smooth eye movements, as well as saccades. The present study examined eye movements when judging which of two moving discs would arrive first, or collide, at a common meeting point. Perceptual discrimination after training was precise (Weber fractions < 6%). Strategies reflected a combined contribution of saccades and smooth eye movements. When strategies were freely chosen, the preferred strategy was to look near the meeting point. When strategies were assigned, looking near the meeting point produced better performance than switching between the discs. Smooth eye movements were engaged in two ways: (a) low-velocity smooth eye movements correlated with the motion of each disc (micropursuit) were found while the line of sight remained between the discs; and (b) spontaneous smooth pursuit of the pair of discs occurred after the perceptual report, when the discs moved as a pair along a common path. The results show clear preferences and advantages for those eye movement strategies during dynamic perceptual tasks that require minimal management or effort. In addition, smooth eye movements, whose involvement during perceptual tasks within dynamic displays may have previously escaped notice, provide useful indicators of the strategies used to select information and distribute attention during the performance of dynamic perceptual tasks.
Affiliation(s)
- Jie Z Wang
- Department of Psychology, Rutgers University, Piscataway, NJ, USA. http://orcid.org/0000-0002-8553-6706
- Eileen Kowler
- Department of Psychology, Rutgers University, Piscataway, NJ, USA. http://orcid.org/0000-0001-7079-0376; https://ruccs.rutgers.edu/kowler
14
Visual Neuroscience Methods for Marmosets: Efficient Receptive Field Mapping and Head-Free Eye Tracking. eNeuro 2021; 8:ENEURO.0489-20.2021. [PMID: 33863782] [PMCID: PMC8143020] [DOI: 10.1523/eneuro.0489-20.2021]
Abstract
The marmoset has emerged as a promising primate model system, in particular for visual neuroscience. Many common experimental paradigms rely on head fixation and an extended period of eye fixation during the presentation of salient visual stimuli. Both of these behavioral requirements can be challenging for marmosets. Here, we present two methodological developments, each addressing one of these difficulties. First, we show that it is possible to use a standard eye-tracking system without head fixation to assess visual behavior in the marmoset. Eye-tracking quality from head-free animals is sufficient to obtain precise psychometric functions from a visual acuity task. Second, we introduce a novel method for efficient receptive field (RF) mapping that does not rely on moving stimuli but uses fast flashing annuli and wedges. We present data recorded during head fixation in areas V1 and V6 and show that RF locations are readily obtained within a short period of recording time. Thus, the methodological advancements presented in this work will contribute to establishing the marmoset as a valuable model in neuroscience.
15
Chow HM, Knöll J, Madsen M, Spering M. Look where you go: Characterizing eye movements toward optic flow. J Vis 2021; 21:19. [PMID: 33735378] [PMCID: PMC7991960] [DOI: 10.1167/jov.21.3.19]
Abstract
When we move through our environment, objects in the visual scene create optic flow patterns on the retina. Even though optic flow is ubiquitous in everyday life, it is not well understood how our eyes naturally respond to it. In small groups of human and non-human primates, optic flow triggers intuitive, uninstructed eye movements to the focus of expansion of the pattern (Knöll, Pillow, & Huk, 2018). Here, we investigate whether such intuitive oculomotor responses to optic flow are generalizable to a larger group of human observers and how eye movements are affected by motion signal strength and task instructions. Observers (N = 43) viewed expanding or contracting optic flow constructed by a cloud of moving dots radiating from or converging toward a focus of expansion that could randomly shift. Results show that 84% of observers tracked the focus of expansion with their eyes without being explicitly instructed to track. Intuitive tracking was tuned to motion signal strength: Saccades landed closer to the focus of expansion, and smooth tracking was more accurate when dot contrast, motion coherence, and translational speed were high. Under explicit tracking instruction, the eyes aligned with the focus of expansion more closely than without instruction. Our results highlight the sensitivity of intuitive eye movements as indicators of visual motion processing in dynamic contexts.
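The focus of expansion that observers track in this study is well defined geometrically: in a purely radial flow field, every flow vector lies on a line through it, so it can be recovered by least squares. A minimal sketch of that computation (illustrative only, not the study's analysis; `focus_of_expansion` and the synthetic dot cloud are hypothetical):

```python
import numpy as np

def focus_of_expansion(points, flows):
    """Least-squares estimate of the focus of expansion (FOE) of a radial
    flow field. Each dot's flow vector v at position p gives one linear
    constraint v_y*x0 - v_x*y0 = v_y*p_x - v_x*p_y on the FOE (x0, y0)."""
    p = np.asarray(points, float)
    v = np.asarray(flows, float)
    A = np.stack([v[:, 1], -v[:, 0]], axis=1)
    b = v[:, 1] * p[:, 0] - v[:, 0] * p[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic expanding dot cloud radiating from (2, -1):
rng = np.random.default_rng(0)
pts = rng.uniform(-5, 5, size=(50, 2))
vel = 0.3 * (pts - np.array([2.0, -1.0]))   # pure expansion, no noise
est = focus_of_expansion(pts, vel)
```

With noiseless expansion the estimate recovers the simulated FOE exactly; with noisy or low-coherence flow, as in the weaker stimulus conditions above, the least-squares estimate degrades gracefully.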
Affiliation(s)
- Hiu Mei Chow
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Jonas Knöll
- Institute of Animal Welfare and Animal Husbandry, Friedrich-Loeffler-Institut, Celle, Germany
- Matthew Madsen
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Miriam Spering
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Djavad Mowafaghian Center for Brain Health, University of British Columbia, Vancouver, British Columbia, Canada
- Institute for Computing, Information and Cognitive Systems, University of British Columbia, Vancouver, British Columbia, Canada
16
Chen CY, Matrov D, Veale R, Onoe H, Yoshida M, Miura K, Isa T. Properties of visually guided saccadic behavior and bottom-up attention in marmoset, macaque, and human. J Neurophysiol 2020; 125:437-457. [PMID: 33356912] [DOI: 10.1152/jn.00312.2020]
Abstract
Saccades are stereotypic behaviors whose investigation improves our understanding of how primate brains implement precise motor control. Furthermore, saccades offer an important window into the cognitive and attentional state of the brain. Historically, saccade studies have largely relied on macaques. However, the cortical network giving rise to the saccadic command is difficult to study in macaques because relevant cortical areas lie in deep sulci and are difficult to access. Recently, a New World monkey, the marmoset, has garnered attention as an alternative to macaques because of advantages including its smooth cortical surface. However, adoption of the marmoset for oculomotor research has been limited due to a lack of in-depth descriptions of marmoset saccade kinematics and their ability to perform psychophysical tasks. Here, we directly compare free-viewing and visually guided behavior of marmosets, macaques, and humans engaged in identical tasks under similar conditions. In the video free-viewing task, all species exhibited qualitatively similar saccade kinematics up to 25° in amplitude, although with different parameters. Furthermore, the conventional bottom-up saliency model predicted gaze targets at similar rates for all species. We further verified their visually guided behavior by training them with step and gap saccade tasks. In the step paradigm, marmosets did not show shorter saccade reaction times for upward saccades, whereas macaques and humans did. In the gap paradigm, all species showed a similar gap effect and express saccades. Our results suggest that the marmoset can serve as a model for oculomotor, attentional, and cognitive research, although we need to be aware of its differences from macaques and humans.
NEW & NOTEWORTHY We directly compared the results of a video free-viewing task and visually guided saccade tasks (step and gap) among three different species: marmoset, macaque, and human. We found that all species exhibit qualitatively similar saccadic kinematics and saliency-driven saccadic behavior albeit with different parameters. Our results suggest that the marmoset possesses similar neural mechanisms to macaque and human for saccadic control, and it is an appropriate model to study neural mechanisms for active vision and attention.
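"Similar kinematics with different parameters" refers to the saccadic main sequence: peak velocity rises with amplitude and saturates, with species differing in the fitted constants rather than the functional form. A minimal sketch of one common parameterization (illustrative; `v_max` and `a0` are hypothetical values, not fits from this study):

```python
import math

def main_sequence_peak_velocity(amplitude_deg, v_max=500.0, a0=5.0):
    """Saccadic 'main sequence': peak velocity (deg/s) grows with
    amplitude (deg) and saturates at v_max. Species-specific differences
    show up as different fitted v_max and a0, not a different curve shape."""
    return v_max * (1.0 - math.exp(-amplitude_deg / a0))

small, large = main_sequence_peak_velocity(2.0), main_sequence_peak_velocity(20.0)
```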
Affiliation(s)
- Chih-Yang Chen
- Department of Neuroscience, Graduate School of Medicine, Kyoto University, Kyoto, Japan; Institute for the Advanced Study of Human Biology, Kyoto University, Kyoto, Japan
- Denis Matrov
- Department of Neuroscience, Graduate School of Medicine, Kyoto University, Kyoto, Japan; Division of Neuropsychopharmacology, Department of Psychology, University of Tartu, Tartu, Estonia
- Richard Veale
- Department of Neuroscience, Graduate School of Medicine, Kyoto University, Kyoto, Japan
- Hirotaka Onoe
- Human Brain Research Center, Graduate School of Medicine, Kyoto University, Kyoto, Japan
- Masatoshi Yoshida
- Center for Human Nature, Artificial Intelligence, and Neuroscience, Hokkaido University, Sapporo, Japan
- Kenichiro Miura
- Department of Integrative Brain Science, Graduate School of Medicine, Kyoto University, Kyoto, Japan; Department of Pathology of Mental Diseases, National Institute of Mental Health, National Center of Neurology and Psychiatry, Tokyo, Japan
- Tadashi Isa
- Department of Neuroscience, Graduate School of Medicine, Kyoto University, Kyoto, Japan; Institute for the Advanced Study of Human Biology, Kyoto University, Kyoto, Japan; Human Brain Research Center, Graduate School of Medicine, Kyoto University, Kyoto, Japan
17
Lakshminarasimhan KJ, Avila E, Neyhart E, DeAngelis GC, Pitkow X, Angelaki DE. Tracking the Mind's Eye: Primate Gaze Behavior during Virtual Visuomotor Navigation Reflects Belief Dynamics. Neuron 2020; 106:662-674.e5. [PMID: 32171388] [PMCID: PMC7323886] [DOI: 10.1016/j.neuron.2020.02.023]
Abstract
To take the best actions, we often need to maintain and update beliefs about variables that cannot be directly observed. To understand the principles underlying such belief updates, we need tools to uncover subjects' belief dynamics from natural behavior. We tested whether eye movements could be used to infer subjects' beliefs about latent variables using a naturalistic navigation task. Humans and monkeys navigated to a remembered goal location in a virtual environment that provided optic flow but lacked explicit position cues. We observed eye movements that appeared to continuously track the goal location even when no visible target was present there. Accurate goal tracking was associated with improved task performance, and inhibiting eye movements in humans impaired navigation precision. These results suggest that gaze dynamics play a key role in action selection during challenging visuomotor behaviors and may possibly serve as a window into the subject's dynamically evolving internal beliefs.
Affiliation(s)
- Kaushik J Lakshminarasimhan
- Center for Neural Science, New York University, New York, NY, USA; Center for Theoretical Neuroscience, Columbia University, New York, NY, USA.
- Eric Avila
- Center for Neural Science, New York University, New York, NY, USA
- Erin Neyhart
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA
- Xaq Pitkow
- Department of Neuroscience, Baylor College of Medicine, Houston, TX, USA; Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA
- Dora E Angelaki
- Center for Neural Science, New York University, New York, NY, USA; Tandon School of Engineering, New York University, New York, NY, USA
18
Rapid assessment of natural visual motion integration across primate species. Proc Natl Acad Sci U S A 2018; 115:11112-11114. [PMID: 30348805] [DOI: 10.1073/pnas.1816083115]