1
Gehmacher Q, Schubert J, Schmidt F, Hartmann T, Reisinger P, Rösch S, Schwarz K, Popov T, Chait M, Weisz N. Eye movements track prioritized auditory features in selective attention to natural speech. Nat Commun 2024; 15:3692. PMID: 38693186; PMCID: PMC11063150; DOI: 10.1038/s41467-024-48126-2.
Abstract
Over the last decades, cognitive neuroscience has identified a distributed set of brain regions that are critical for attention. Strong anatomical overlap with brain regions critical for oculomotor processes suggests a joint network for attention and eye movements. However, the role of this shared network in complex, naturalistic environments remains understudied. Here, we investigated eye movements in relation to (un)attended sentences of natural speech. Combining simultaneously recorded eye tracking and magnetoencephalographic data with temporal response functions, we show that gaze tracks attended speech, a phenomenon we termed ocular speech tracking. Ocular speech tracking even differentiates a target from a distractor in a multi-speaker context and is further related to intelligibility. Moreover, we provide evidence for its contribution to neural differences in speech processing, emphasizing the necessity to consider oculomotor activity in future research and in the interpretation of neural differences in auditory cognition.
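The method summarized above rests on temporal response functions, i.e., regularized lagged regression relating a continuous stimulus feature (such as the speech envelope) to a continuous response (such as gaze or MEG activity). The sketch below is a minimal illustration of that idea, not the authors' pipeline; the lag range, ridge penalty, sampling rate, and simulated signals are assumptions.

```python
# Minimal sketch of a temporal response function (TRF): ridge-regularized
# lagged regression predicting a gaze signal from a speech envelope.
# Illustrative only -- lag range, ridge penalty, and signals are assumed.
import numpy as np

def lagged_design(stimulus, lags):
    """Build a design matrix whose columns are time-shifted copies of the stimulus."""
    n = len(stimulus)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stimulus[:n - lag]
        else:
            X[:n + lag, j] = stimulus[-lag:]
    return X

def fit_trf(stimulus, response, lags, ridge=1.0):
    """Ridge regression of the response on lagged copies of the stimulus."""
    X = lagged_design(stimulus, lags)
    XtX = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ response)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs = 100                                  # sampling rate (Hz), assumed
    envelope = rng.random(6000) - 0.5         # stand-in for a (centered) speech envelope
    true_kernel = np.exp(-0.5 * ((np.arange(40) - 10) / 4.0) ** 2)  # toy kernel peaking at 100 ms
    gaze = np.convolve(envelope, true_kernel)[:6000] + rng.normal(0, 0.5, 6000)
    lags = np.arange(0, 40)                   # 0-400 ms of lags at 100 Hz
    trf = fit_trf(envelope, gaze, lags, ridge=10.0)
    print("estimated peak lag (ms):", lags[np.argmax(trf)] * 1000 / fs)
```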
Affiliation(s)
- Quirin Gehmacher
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria.
- Juliane Schubert
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Fabian Schmidt
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Thomas Hartmann
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Patrick Reisinger
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Sebastian Rösch
- Department of Otorhinolaryngology, Head and Neck Surgery, Paracelsus Medical University Salzburg, 5020 Salzburg, Austria
- Tzvetan Popov
- Methods of Plasticity Research, Department of Psychology, University of Zurich, CH-8050 Zurich, Switzerland
- Department of Psychology, University of Konstanz, DE-78464 Konstanz, Germany
- Maria Chait
- Ear Institute, University College London, London, UK
- Nathan Weisz
- Paris-Lodron-University of Salzburg, Department of Psychology, Centre for Cognitive Neuroscience, Salzburg, Austria
- Neuroscience Institute, Christian Doppler University Hospital, Paracelsus Medical University, Salzburg, Austria

2
Rubinstein JF, Singh M, Kowler E. Bayesian approaches to smooth pursuit of random dot kinematograms: effects of varying RDK noise and the predictability of RDK direction. J Neurophysiol 2024; 131:394-416. PMID: 38149327; PMCID: PMC11551001; DOI: 10.1152/jn.00116.2023.
Abstract
Smooth pursuit eye movements respond on the basis of both immediate and anticipated target motion, where anticipations may be derived from either memory or perceptual cues. To study the combined influence of both immediate sensory motion and anticipation, subjects pursued clear or noisy random dot kinematograms (RDKs) whose mean directions were chosen from Gaussian distributions with SDs = 10° (narrow prior) or 45° (wide prior). Pursuit directions were consistent with Bayesian theory in that transitions over time from dependence on the prior to near total dependence on immediate sensory motion (likelihood) took longer with the noisier RDKs and with the narrower, more reliable, prior. Results were fit to Bayesian models in which parameters representing the variability of the likelihood either were or were not constrained to be the same for both priors. The unconstrained model provided a statistically better fit, with the influence of the prior in the constrained model smaller than predicted from strict reliability-based weighting of prior and likelihood. Factors that may have contributed to this outcome include prior variability different from nominal values, low-level sensorimotor learning with the narrow prior, or departures of pursuit from strict adherence to reliability-based weighting. Although modifications of, or alternatives to, the normative Bayesian model will be required, these results, along with previous studies, suggest that Bayesian approaches are a promising framework to understand how pursuit combines immediate sensory motion, past history, and informative perceptual cues to accurately track the target motion that is most likely to occur in the immediate future.
NEW & NOTEWORTHY
Smooth pursuit eye movements respond on the basis of anticipated, as well as immediate, target motions. Bayesian models using reliability-based weighting of previous (prior) and immediate target motions (likelihood) accounted for many, but not all, aspects of pursuit of clear and noisy random dot kinematograms with different levels of predictability. Bayesian approaches may solve the long-standing problem of how pursuit combines immediate sensory motion and anticipation of future motion to configure an effective response.
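The reliability-based weighting referred to in this abstract combines prior and likelihood in proportion to their precisions (inverse variances). The sketch below illustrates that computation under assumed parameter values; it is not the paper's fitted model, and the way likelihood precision grows with viewing time is an assumption chosen only to show the prior-to-likelihood transition.

```python
# Sketch of reliability-based (precision-weighted) cue combination for pursuit
# direction, with likelihood precision accumulating over time. Parameter values
# are illustrative assumptions, not fits from the paper.
import numpy as np

def weighted_direction(mu_prior, sd_prior, mu_like, sd_like):
    """Posterior mean direction under a Gaussian prior and Gaussian likelihood."""
    w_prior = 1.0 / sd_prior**2
    w_like = 1.0 / sd_like**2
    return (w_prior * mu_prior + w_like * mu_like) / (w_prior + w_like)

# Example: prior centered on 0 deg, target actually moving at 30 deg.
for sd_prior, label in [(10.0, "narrow prior"), (45.0, "wide prior")]:
    for sd_rdk, noise in [(5.0, "clear RDK"), (40.0, "noisy RDK")]:
        # Assume likelihood SD shrinks as motion evidence accumulates over time.
        for t_ms in (50, 150, 400):
            sd_like = sd_rdk / np.sqrt(t_ms / 50.0)
            d = weighted_direction(0.0, sd_prior, 30.0, sd_like)
            print(f"{label}, {noise}, t={t_ms:3d} ms -> pursuit direction {d:5.1f} deg")
```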
Affiliation(s)
- Jason F Rubinstein
- Department of Psychology, Rutgers University, Piscataway, New Jersey, United States
- Manish Singh
- Department of Psychology, Rutgers University, Piscataway, New Jersey, United States
- Eileen Kowler
- Department of Psychology, Rutgers University, Piscataway, New Jersey, United States

3
Stewart EEM, Fleming RW. The eyes anticipate where objects will move based on their shape. Curr Biol 2023; 33:R894-R895. PMID: 37699342; DOI: 10.1016/j.cub.2023.07.028.
Abstract
Imagine staring into a clear river, starving, desperately searching for a fish to spear and cook. You see a dark shape lurking beneath the surface. It doesn't resemble any sort of fish you've encountered before - but you're hungry. To catch it, you need to anticipate which way it will move when you lunge for it, to compensate for your own sensory and motor processing delays [1,2,3]. Yet you know nothing about the behaviour of this creature, and do not know in which direction it will try to escape. What cues do you then use to drive such anticipatory responses? Fortunately, many species [4], including humans, have the remarkable ability to predict the directionality of objects based on their shape - even if they are unfamiliar and so we cannot rely on semantic knowledge about their movements [5]. While it is known that such directional inferences can guide attention [5], we do not yet fully understand how such causal inferences are made, or the extent to which they enable anticipatory behaviours. Does the oculomotor system, which moves our eyes to optimise visual input, use directional inferences from shape to anticipate upcoming motion direction? Such anticipation is necessary to stabilise the moving object on the high-resolution fovea of the retina while tracking the shape, a primary goal of the oculomotor system [6], and to guide any future interactions [7,8]. Here, we leveraged a well-known behaviour of the oculomotor system: anticipatory smooth eye movements (ASEM), where an increase in eye velocity is observed in the direction of a stimulus' expected motion, before the stimulus actually moves [3], to show that the oculomotor system extracts directional information from shape, and uses this inference to predict and anticipate upcoming motion.
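Anticipatory smooth eye movements of the kind described here are commonly quantified as the mean eye velocity in a short window around (or just before) motion onset, signed with respect to the expected direction. The sketch below shows one such computation on simulated data; the window limits, sampling rate, and sign convention are assumptions, not the authors' analysis.

```python
# Sketch of how anticipatory smooth eye movements (ASEM) are typically
# quantified: mean horizontal eye velocity in a window just before motion
# onset, signed with respect to the direction the shape predicts.
# Window limits and the sign convention here are assumptions for illustration.
import numpy as np

def asem_velocity(eye_x_deg, t_ms, motion_onset_ms, window_ms=(-100, 50)):
    """Mean eye velocity (deg/s) in an anticipatory window around motion onset."""
    t = np.asarray(t_ms, dtype=float)
    x = np.asarray(eye_x_deg, dtype=float)
    vel = np.gradient(x, t / 1000.0)          # deg/s
    sel = (t >= motion_onset_ms + window_ms[0]) & (t <= motion_onset_ms + window_ms[1])
    return vel[sel].mean()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(-500, 500)                   # 1 kHz samples, motion onset at 0 ms
    drift = np.where(t > -150, 0.002 * (t + 150), 0.0)   # slow rightward drift before onset
    eye = drift + rng.normal(0, 0.005, t.size)
    v = asem_velocity(eye, t, motion_onset_ms=0)
    predicted_dir = +1                         # shape "points" rightward (assumed)
    print(f"ASEM velocity: {v:.2f} deg/s; toward predicted direction: {v * predicted_dir > 0}")
```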
Affiliation(s)
- Emma E M Stewart
- Department of Experimental Psychology, Justus Liebig University Giessen, Otto-Behaghel-Strasse 10 F, D-35394 Giessen, Germany; Centre for Mind, Brain and Behaviour (CMBB), University of Marburg and Justus Liebig University Giessen, Germany.
- Roland W Fleming
- Department of Experimental Psychology, Justus Liebig University Giessen, Otto-Behaghel-Strasse 10 F, D-35394 Giessen, Germany; Centre for Mind, Brain and Behaviour (CMBB), University of Marburg and Justus Liebig University Giessen, Germany

4
Castellotti S, Montagnini A, Del Viva MM. Information-optimal local features automatically attract covert and overt attention. Sci Rep 2022; 12:9994. PMID: 35705616; PMCID: PMC9200825; DOI: 10.1038/s41598-022-14262-2.
Abstract
In fast vision, local spatial properties of the visual scene can automatically capture the observer's attention. We used specific local features, predicted by a constrained maximum-entropy model to be optimal information-carriers, as candidate "salient features". Previous studies showed that participants choose these optimal features as "more salient" if explicitly asked. Here, we investigated the implicit saliency of these optimal features in two attentional tasks. In a covert-attention experiment, we measured the luminance-contrast threshold for discriminating the orientation of a peripheral Gabor. In a gaze-orienting experiment, we analyzed latency and direction of saccades towards a peripheral target. In both tasks, two brief peripheral cues, differing in saliency according to the model, preceded the target, presented on the same (valid trials) or the opposite side (invalid trials) of the optimal cue. Results showed reduced contrast thresholds, saccadic latencies, and direction errors in valid trials, and the opposite in invalid trials, compared to baseline values obtained with equally salient cues. Also, optimal features triggered more anticipatory saccades. Similar effects emerged in a luminance-control condition. Overall, in fast vision, optimal features automatically attract covert and overt attention, suggesting that saliency is determined by information maximization criteria coupled with computational limitations.
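The covert- and overt-attention effects reported here reduce to simple validity contrasts: performance on valid and invalid trials relative to a neutral baseline. The sketch below shows that contrast on made-up latency data; the numbers and function names are placeholders, not values from the study.

```python
# Sketch of the cue-validity contrast implied by the abstract: compare mean
# saccadic latency (or contrast threshold) on valid and invalid trials against
# a baseline with equally salient cues. The numbers below are made-up placeholders.
import numpy as np

def validity_effects(valid, invalid, baseline):
    """Return (benefit, cost) relative to baseline; positive = facilitation / cost."""
    benefit = np.mean(baseline) - np.mean(valid)     # faster / lower on valid trials
    cost = np.mean(invalid) - np.mean(baseline)      # slower / higher on invalid trials
    return benefit, cost

rng = np.random.default_rng(2)
lat_valid = rng.normal(165, 15, 200)      # ms, placeholder data
lat_invalid = rng.normal(190, 15, 200)
lat_baseline = rng.normal(175, 15, 200)
benefit, cost = validity_effects(lat_valid, lat_invalid, lat_baseline)
print(f"validity benefit: {benefit:.1f} ms, invalidity cost: {cost:.1f} ms")
```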
Affiliation(s)
- Anna Montagnini
- Institut de Neurosciences de la Timone, CNRS and Aix-Marseille Université, Marseille, France

5
Miyamoto T, Numasawa K, Ono S. Changes in visual speed perception induced by anticipatory smooth eye movements. J Neurophysiol 2022; 127:1198-1207. PMID: 35353633; DOI: 10.1152/jn.00498.2021.
Abstract
Expectations about forthcoming visual motion shaped by observers' experiences are known to induce anticipatory smooth eye movements (ASEM) and changes in visual perception. Previous studies have demonstrated discrete effects of expectations on the control of ASEM and perception. However, the tasks designed in these studies were not able to segregate the effects of expectations and execution of ASEM itself on perception. In the current study, we attempted to directly examine the effect of ASEM itself on visual speed perception using a two-alternative forced-choice task (2AFC task), in which observers were asked to track a pair of sequentially presented visual motion stimuli with their eyes and to judge whether the second stimulus (test stimulus) was faster or slower than the first (reference stimulus). Our results showed that observers' visual speed perception, quantified by a psychometric function, shifted according to ASEM velocity. This was the case, even though there was no difference in the steady-state eye velocity. Further analyses revealed that the observers' perceptual decisions could be explained by a difference in the magnitude of retinal slip velocity in the initial phase of ocular tracking when the reference and test stimuli were presented, rather than in the steady-state phase. Our results provide psychophysical evidence of the importance of initial ocular tracking in visual speed perception and the strong impact of ASEM.
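The psychometric quantification mentioned in this abstract typically means fitting a cumulative Gaussian to the proportion of "test faster" responses and reading off the point of subjective equality (PSE). The sketch below shows such a fit on invented data; the speeds, proportions, and use of scipy's curve_fit are assumptions for illustration.

```python
# Sketch of the 2AFC analysis the abstract describes: fit a cumulative-Gaussian
# psychometric function to "test judged faster" proportions and read off the
# point of subjective equality (PSE). Data values here are invented.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(speed, pse, sd):
    """Probability of judging the test stimulus faster than the reference."""
    return norm.cdf(speed, loc=pse, scale=sd)

test_speeds = np.array([8.0, 9.0, 10.0, 11.0, 12.0])       # deg/s, reference = 10 deg/s
p_faster = np.array([0.05, 0.20, 0.55, 0.85, 0.97])        # placeholder proportions

(pse, sd), _ = curve_fit(psychometric, test_speeds, p_faster, p0=[10.0, 1.0])
print(f"PSE = {pse:.2f} deg/s (shift of {pse - 10.0:+.2f} deg/s relative to the reference)")
```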
Affiliation(s)
- Takeshi Miyamoto
- Faculty of Health and Sport Sciences, University of Tsukuba, Ibaraki, Japan
- Kosuke Numasawa
- Graduate School of Comprehensive Human Sciences, University of Tsukuba, Ibaraki, Japan
- Seiji Ono
- Faculty of Health and Sport Sciences, University of Tsukuba, Ibaraki, Japan

6
Goettker A, Gegenfurtner KR. A change in perspective: The interaction of saccadic and pursuit eye movements in oculomotor control and perception. Vision Res 2021; 188:283-296. PMID: 34489101; DOI: 10.1016/j.visres.2021.08.004.
Abstract
Due to the close relationship between oculomotor behavior and visual processing, eye movements have been studied in many different areas of research over the last few decades. While these studies have brought interesting insights, specialization within each research area comes at the potential cost of a narrow and isolated view of the oculomotor system. In this review, we want to expand this perspective by looking at the interactions between the two most important types of voluntary eye movements: saccades and pursuit. Recent evidence indicates multiple interactions and shared signals at the behavioral and neurophysiological level for oculomotor control and for visual perception during pursuit and saccades. Oculomotor control seems to be based on shared position- and velocity-related information, which leads to multiple behavioral interactions and synergies. The distinction between position- and velocity-related information seems to be also present at the neurophysiological level. In addition, visual perception seems to be based on shared efferent signals about upcoming eye positions and velocities, which are to some degree independent of the actual oculomotor response. This review suggests an interactive perspective on the oculomotor system, based mainly on different types of sensory input, and less so on separate subsystems for saccadic or pursuit eye movements.
Affiliation(s)
- Alexander Goettker
- Abteilung Allgemeine Psychologie and Center for Mind, Brain & Behavior, Justus-Liebig University Giessen, Germany.
- Karl R Gegenfurtner
- Abteilung Allgemeine Psychologie and Center for Mind, Brain & Behavior, Justus-Liebig University Giessen, Germany

7
Castellotti S, Montagnini A, Del Viva MM. Early Visual Saliency Based on Isolated Optimal Features. Front Neurosci 2021; 15:645743. PMID: 33994923; PMCID: PMC8120310; DOI: 10.3389/fnins.2021.645743.
Abstract
Under fast viewing conditions, the visual system extracts salient and simplified representations of complex visual scenes. Saccadic eye movements optimize such visual analysis through the dynamic sampling of the most informative and salient regions in the scene. However, a general definition of saliency, as well as its role for natural active vision, is still a matter for discussion. Following the general idea that visual saliency may be based on the amount of local information, a recent constrained maximum-entropy model of early vision, applied to natural images, extracts a set of local optimal information-carriers, as candidate salient features. These optimal features proved to be more informative than others in fast vision, when embedded in simplified sketches of natural images. In the present study, for the first time, these features were presented in isolation, to investigate whether they can be visually more salient than other non-optimal features, even in the absence of any meaningful global arrangement (contour, line, etc.). In four psychophysics experiments, fast discriminability of a compound of optimal features (target) in comparison with a similar compound of non-optimal features (distractor) was measured as a function of their number and contrast. Results showed that the saliency predictions from the constrained maximum-entropy model are well verified in the data, even when the optimal features are presented in smaller numbers or at lower contrast. In the eye movements experiment, the target and the distractor compounds were presented in the periphery at different angles. Participants were asked to perform a simple choice-saccade task. Results showed that saccades can select informative optimal features spatially interleaved with non-optimal features even at the shortest latencies. Saccades' choice accuracy and landing position precision improved with SNR. In conclusion, the optimal features predicted by the reference model turn out to be more salient than others, despite the lack of any clues coming from a global meaningful structure, suggesting that they get preferential treatment during fast image analysis. Also, peripheral fast visual processing of these informative local features is able to guide gaze orientation. We speculate that active vision is efficiently adapted to maximize information in natural visual scenes.
Affiliation(s)
- Anna Montagnini
- Institut de Neurosciences de la Timone (UMR 7289), CNRS and Aix-Marseille Université, Marseille, France

8
Passive Motor Learning: Oculomotor Adaptation in the Absence of Behavioral Errors. eNeuro 2021; 8:ENEURO.0232-20.2020. PMID: 33593731; PMCID: PMC8009667; DOI: 10.1523/eneuro.0232-20.2020.
Abstract
Motor adaptation is commonly thought to be a trial-and-error process in which the accuracy of movement improves with repetition of behavior. We challenged this view by testing whether erroneous movements are necessary for motor adaptation. In the eye movement system, the association between movements and errors can be disentangled, since errors in the predicted stimulus trajectory can be perceived even without movements. We modified a smooth pursuit eye movement adaptation paradigm in which monkeys learn to make an eye movement that predicts an upcoming change in target direction. We trained the monkeys to fixate on a target while, covertly, an additional target initially moved in one direction and then changed direction after 250 ms. The monkeys showed a learned response to infrequent probe trials in which they were instructed to follow the moving target. Additional experiments confirmed that neither the probe trials themselves nor residual eye movements during fixation drove learning. These results show that motor adaptation can be elicited in the absence of movement and provide an animal model for studying the implementation of passive motor learning. Current models assume that the interaction between movement and error signals underlies adaptive motor learning. Our results point to other mechanisms that may drive learning in the absence of movement.

9
Wu X, Rothwell AC, Spering M, Montagnini A. Expectations about motion direction affect perception and anticipatory smooth pursuit differently. J Neurophysiol 2021; 125:977-991. PMID: 33534656; DOI: 10.1152/jn.00630.2020.
Abstract
Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit (eye movements follow the expected target direction or speed) and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward and leftward with a probability of 50%, 70%, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge motion direction of interleaved low-coherence RDKs (0%-15%). Perceptual judgments were compared with changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed direction expectation (attraction bias), whereas perceptual judgments were biased opposite to direction expectation (repulsion bias). Control experiments suggest that the repulsion bias in perception was not caused by retinal slip induced by anticipatory pursuit, or by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.
NEW & NOTEWORTHY
We show that expectations about motion direction that are based on long-term trial history affect perception and anticipatory pursuit differently. Whereas anticipatory pursuit direction was coherent with the expected motion direction (attraction bias), perception was biased opposite to the expected direction (repulsion bias). These opposite biases potentially reveal different ways in which perception and action utilize prior information and support the idea of different information processing for perception and pursuit.
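The contrast at the heart of this study, an attraction bias in anticipatory pursuit versus a repulsion bias in perception, can be written down as two opposite-signed functions of the block's direction probability. The sketch below is only a schematic of that contrast; the gains and the linear form are assumptions, not the paper's fitted model.

```python
# Schematic of the two contrasting effects described in the abstract:
# anticipatory pursuit velocity scaling with the block's direction probability
# (attraction), and the perceptual point of subjective equality shifting the
# opposite way (repulsion). Gains and numbers are illustrative assumptions.
import numpy as np

p_right = np.array([0.5, 0.7, 0.9])          # block-wise probability of rightward motion
signed_bias = 2 * p_right - 1                # -1..+1, 0 means no expectation

asem_gain = 1.5                              # deg/s per unit bias (assumed)
anticipatory_velocity = asem_gain * signed_bias        # attraction: follows expectation

perceptual_gain = -0.05                      # coherence units per unit bias (assumed)
pse_shift = perceptual_gain * signed_bias    # repulsion: biased against expectation

for p, v, pse in zip(p_right, anticipatory_velocity, pse_shift):
    print(f"P(right)={p:.1f}: anticipatory velocity {v:+.2f} deg/s, PSE shift {pse:+.3f}")
```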
Affiliation(s)
- Xiuyun Wu
- Graduate Program in Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Austin C Rothwell
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Miriam Spering
- Graduate Program in Neuroscience, University of British Columbia, Vancouver, British Columbia, Canada
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada
- Djavad Mowafaghian Center for Brain Health, University of British Columbia, Vancouver, British Columbia, Canada
- Institute for Computing, Information and Cognitive Systems, University of British Columbia, Vancouver, British Columbia, Canada
- Anna Montagnini
- Aix Marseille Univ, CNRS, INT, Inst Neurosci Timone, Marseille, France

10
Pasturel C, Montagnini A, Perrinet LU. Humans adapt their anticipatory eye movements to the volatility of visual motion properties. PLoS Comput Biol 2020; 16:e1007438. PMID: 32282790; PMCID: PMC7179935; DOI: 10.1371/journal.pcbi.1007438.
Abstract
Animal behavior constantly adapts to changes, for example when the statistical properties of the environment change unexpectedly. For an agent that interacts with this volatile setting, it is important to react accurately and as quickly as possible. It has already been shown that when a random sequence of motion ramps of a visual target is biased to one direction (e.g. right or left), human observers adapt their eye movements to accurately anticipate the target's expected direction. Here, we prove that this ability extends to a volatile environment where the probability bias could change at random switching times. In addition, we also recorded the explicit prediction of the next outcome as reported by observers using a rating scale. Both results were compared to the estimates of a probabilistic agent that is optimal in relation to the assumed generative model. Compared to the classical leaky integrator model, we found a better match between our probabilistic agent and the behavioral responses, both for the anticipatory eye movements and the explicit task. Furthermore, by controlling the level of preference between exploitation and exploration in the model, we were able to fit for each individual's experimental dataset the most likely level of volatility and analyze inter-individual variability across participants. These results prove that in such an unstable environment, human observers can still represent an internal belief about the environmental contingencies, and use this representation both for sensory-motor control and for explicit judgments. This work offers an innovative approach to more generically test the diversity of human cognitive abilities in uncertain and dynamic environments.
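The "classical leaky integrator" used here as a comparison model updates a running probability estimate with exponential forgetting of past trials. The sketch below implements that baseline on a simulated volatile sequence; the leak value and switching schedule are assumptions, not the study's parameters, and the authors' probabilistic agent is not reproduced here.

```python
# Sketch of the classical leaky-integrator baseline mentioned in the abstract:
# a running estimate of P(rightward) that is updated after each trial with
# exponential forgetting. The switching schedule and leak value are made-up
# examples, not the parameters used in the study.
import numpy as np

def leaky_integrator(outcomes, leak=0.1, p0=0.5):
    """Trial-by-trial estimate of the probability of rightward motion."""
    p_hat = np.empty(len(outcomes))
    p = p0
    for i, o in enumerate(outcomes):       # o = 1 for rightward, 0 for leftward
        p = (1 - leak) * p + leak * o      # exponential forgetting of old trials
        p_hat[i] = p
    return p_hat

rng = np.random.default_rng(3)
# Volatile environment: the true bias switches every 100 trials (assumed schedule).
true_bias = np.repeat([0.8, 0.2, 0.5, 0.9], 100)
outcomes = (rng.random(true_bias.size) < true_bias).astype(float)
estimate = leaky_integrator(outcomes, leak=0.1)
for t in (99, 199, 299, 399):
    print(f"trial {t + 1}: true P(right)={true_bias[t]:.1f}, estimate={estimate[t]:.2f}")
```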
Affiliation(s)
- Chloé Pasturel
- Institut de Neurosciences de la Timone (UMR 7289), Aix Marseille Univ, CNRS, Marseille, France
- Anna Montagnini
- Institut de Neurosciences de la Timone (UMR 7289), Aix Marseille Univ, CNRS, Marseille, France
- Laurent Udo Perrinet
- Institut de Neurosciences de la Timone (UMR 7289), Aix Marseille Univ, CNRS, Marseille, France

11
Affiliation(s)
- Katja Fiehler
- Department of Psychology, Justus Liebig University, Giessen, Germany
- Center for Mind, Brain, and Behavior (CMBB), Universities of Marburg and Giessen, Germany
- Eli Brenner
- Department of Human Movement Sciences, Vrije Universiteit Amsterdam, The Netherlands
- Miriam Spering
- Department of Ophthalmology & Visual Sciences, University of British Columbia, Vancouver, Canada

12
Abstract
Smooth pursuit eye movements maintain the line of sight on smoothly moving targets. Although often studied as a response to sensory motion, pursuit anticipates changes in motion trajectories, thus reducing harmful consequences due to sensorimotor processing delays. Evidence for predictive pursuit includes (a) anticipatory smooth eye movements (ASEM) in the direction of expected future target motion that can be evoked by perceptual cues or by memory for recent motion, (b) pursuit during periods of target occlusion, and (c) improved accuracy of pursuit with self-generated or biologically realistic target motions. Predictive pursuit has been linked to neural activity in the frontal cortex and in sensory motion areas. As behavioral and neural evidence for predictive pursuit grows and statistically based models augment or replace linear systems approaches, pursuit is being regarded less as a reaction to immediate sensory motion and more as a predictive response, with retinal motion serving as one of a number of contributing cues.
Affiliation(s)
- Eileen Kowler
- Department of Psychology, Rutgers University, Piscataway, New Jersey 08854, USA
- Jason F Rubinstein
- Department of Psychology, Rutgers University, Piscataway, New Jersey 08854, USA
- Elio M Santos
- Department of Psychology, Rutgers University, Piscataway, New Jersey 08854, USA
- Current affiliation: Department of Psychology, State University of New York, College at Oneonta, Oneonta, New York 13820, USA
- Jie Wang
- Department of Psychology, Rutgers University, Piscataway, New Jersey 08854, USA

13
Notaro G, van Zoest W, Altman M, Melcher D, Hasson U. Predictions as a window into learning: Anticipatory fixation offsets carry more information about environmental statistics than reactive stimulus-responses. J Vis 2019; 19:8. PMID: 30779844; DOI: 10.1167/19.2.8.
Abstract
A core question underlying neurobiological and computational models of behavior is how individuals learn environmental statistics and use them to make predictions. Most investigations of this issue have relied on reactive paradigms, in which inferences about predictive processes are derived by modeling responses to stimuli that vary in likelihood. Here we deployed a novel anticipatory oculomotor metric to determine how input statistics impact anticipatory behavior that is decoupled from the target-driven response. We implemented transition constraints between target locations, so that the probability of a target being presented on the same side as the previous trial was 70% in one condition (pret70) and 30% in the other (pret30). Rather than focus on responses to targets, we studied subtle endogenous anticipatory fixation offsets (AFOs) measured while participants fixated the screen center, awaiting a target. These AFOs were small (<0.4° from center on average), but strongly tracked global-level statistics. Speaking to learning dynamics, trial-by-trial fluctuations in AFO were well described by a learning model, which identified a lower learning rate in pret70 than pret30, corroborating prior suggestions that pret70 is subjectively treated as more regular. Most importantly, direct comparisons with saccade latencies revealed that AFOs: (a) reflected similar temporal integration windows, (b) carried more information about the statistical context than did saccade latencies, and (c) accounted for most of the information that saccade latencies also contained about input statistics. Our work demonstrates how strictly predictive processes reflect learning dynamics, and presents a new direction for studying learning and prediction.
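The trial-by-trial learning model referred to in this abstract can be illustrated with a simple delta rule that tracks the probability of a location repeat and maps it onto an anticipatory fixation offset. The sketch below is such an illustration; the learning rates, the linear mapping to degrees, and the condition-specific values are assumptions rather than the fitted model.

```python
# Sketch of the kind of trial-by-trial learning model the abstract refers to:
# a delta rule tracking the probability that the target repeats its previous
# location, with the anticipatory fixation offset (AFO) assumed to be
# proportional to that estimate. Learning rates and the gain are illustrative.
import numpy as np

def delta_rule(repeat_outcomes, alpha):
    """Running estimate of P(repeat) updated with learning rate alpha."""
    p = 0.5
    estimates = []
    for r in repeat_outcomes:              # r = 1 if the target repeated its side
        p += alpha * (r - p)               # prediction-error update
        estimates.append(p)
    return np.array(estimates)

rng = np.random.default_rng(4)
for p_repeat, alpha in [(0.7, 0.05), (0.3, 0.15)]:   # lower rate in the more regular block (assumed)
    repeats = (rng.random(300) < p_repeat).astype(float)
    p_hat = delta_rule(repeats, alpha)
    afo_deg = 0.4 * (2 * p_hat - 1)        # map estimate to a signed offset; gain assumed
    print(f"P(repeat)={p_repeat}: final estimate {p_hat[-1]:.2f}, AFO {afo_deg[-1]:+.2f} deg")
```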
Affiliation(s)
- Giuseppe Notaro
- Center for Mind/Brain Sciences (CIMeC), The University of Trento, Trento, Italy
- Wieske van Zoest
- Center for Mind/Brain Sciences (CIMeC), The University of Trento, Trento, Italy
- Magda Altman
- Center for Mind/Brain Sciences (CIMeC), The University of Trento, Trento, Italy
- David Melcher
- Center for Mind/Brain Sciences (CIMeC), The University of Trento, Trento, Italy
- Uri Hasson
- Center for Mind/Brain Sciences (CIMeC), The University of Trento, Trento, Italy
- Center for Practical Wisdom, The University of Chicago, Chicago, USA