1.
Linear Integration of Sensory Evidence over Space and Time Underlies Face Categorization. J Neurosci 2021; 41:7876-7893. PMID: 34326145. DOI: 10.1523/jneurosci.3055-20.2021.
Abstract
Visual object recognition relies on elaborate sensory processes that transform retinal inputs to object representations, but it also requires decision-making processes that read out object representations and function over prolonged time scales. The computational properties of these decision-making processes remain underexplored for object recognition. Here, we study these computations by developing a stochastic multifeature face categorization task. Using quantitative models and tight control of spatiotemporal visual information, we demonstrate that human subjects (five males, eight females) categorize faces through an integration process that first linearly adds the evidence conferred by task-relevant features over space to create aggregated momentary evidence and then linearly integrates it over time with minimum information loss. Discrimination of stimuli along different category boundaries (e.g., identity or expression of a face) is implemented by adjusting feature weights of spatial integration. This linear but flexible integration process over space and time bridges past studies on simple perceptual decisions to complex object recognition behavior.

Significance Statement: Although simple perceptual decision-making such as discrimination of random dot motion has been successfully explained as accumulation of sensory evidence, we lack rigorous experimental paradigms to study the mechanisms underlying complex perceptual decision-making such as discrimination of naturalistic faces. We develop a stochastic multifeature face categorization task as a systematic approach to quantify the properties and potential limitations of the decision-making processes during object recognition. We show that human face categorization could be modeled as a linear integration of sensory evidence over space and time.
Our framework to study object recognition as a spatiotemporal integration process is broadly applicable to other object categories and bridges past studies of object recognition and perceptual decision-making.
2.
Abstract
Viewing static images depicting movement can result in a motion aftereffect: people tend to categorise direction signals as moving in the opposite direction relative to the implied motion in still photographs. This finding could indicate that inferred motion direction can penetrate sensory processing and change perception. Equally possible, however, is that inferred motion changes decision processes, but not perception. Here we test these two possibilities. Since both categorical decisions and subjective confidence are informed by sensory information, confidence can be informative about whether an aftereffect probably results from changes to perceptual or decision processes. We therefore used subjective confidence as an additional measure of the implied motion aftereffect. In Experiment 1 (implied motion), we find support for decision-level changes only, with no change in subjective confidence. In Experiment 2 (real motion), we find equal changes to decisions and confidence. Our results suggest the implied motion aftereffect produces a bias in decision-making, but leaves perceptual processing unchanged.
3.
Fillinger MG, Hübner R. On the relation between perceived stability and aesthetic appreciation. Acta Psychol (Amst) 2020; 208:103082. PMID: 32534270. DOI: 10.1016/j.actpsy.2020.103082.
Abstract
Perceived stability is an important feature of pictures with respect to their aesthetic appreciation. Pictures whose composition is perceived as stable are usually liked more than those with unstable arrangements. However, there are exceptions. In a recent study, we found that unstable Japanese calligraphies were preferred to stable ones. From this result, we hypothesized that instability is liked when it implies movement. We therefore systematically tested instability with and without implied movement. In our first experiment, we used multiple-element pictures of varying stability as stimuli and showed that perceived instability has a negative effect on liking. In a second experiment, we used dynamic paintings by the artist K.O. Götz, which vary widely in implied movement. As expected, for these dynamic pictures, instability was positively related to liking. Taken together, our findings indicate that perceived instability reduces the aesthetic appreciation of a picture unless it implies movement.
4.
Abstract
Previous research has shown that the typical or memory color of an object is perceived in images of that object, even when the image is achromatic. We performed an experiment to investigate whether the implied color in greyscale images could influence the perceived color of subsequent, simple stimuli. We used a standard top-up adaptation technique along with a roving-pedestal, two-alternative spatial forced-choice method for measuring perceptual bias without contamination from any response or decision biases. Adaptors were achromatic images of natural objects that are normally seen with diagnostic color. We found that, in some circumstances, greyscale adapting images had a biasing effect, shifting the achromatic point toward the implied color, in comparison with phase-scrambled images. We interpret this effect as evidence of adaptation in chromatic signaling mechanisms that receive top-down input from knowledge of object color. This implied color adaptation effect was particularly strong from images of bananas, which are popular stimuli in memory color experiments. We also consider the effect in a color constancy context, in which the implied color is used by the visual system to estimate an illuminant, but find our results inconsistent with this explanation.
Affiliation(s)
- R. J. Lee, School of Psychology, University of Lincoln, Lincoln, UK
- G. Mather, School of Psychology, University of Lincoln, Lincoln, UK
5.
Gallagher RM, Suddendorf T, Arnold DH. Confidence as a diagnostic tool for perceptual aftereffects. Sci Rep 2019; 9:7124. PMID: 31073187. PMCID: PMC6509108. DOI: 10.1038/s41598-019-43170-1.
Abstract
Perceptual judgements are, by nature, a product of both sensation and the cognitive processes responsible for interpreting and reporting subjective experiences. Changed perceptual judgements may thus result from changes in how the world appears (perception) or in its subsequent interpretation (judgement). This ambiguity has led to persistent debates about how to interpret changes in decision-making, and about whether higher-order cognitions can change how the world looks, sounds, or feels. Here we introduce an approach that can help resolve these ambiguities. In three motion-direction experiments, we measured perceptual judgements and subjective confidence. We show that each measure is sensitive to sensory information and can index sensory adaptation. Each measure is also sensitive to decision biases, but response bias impacts the central tendencies of the decision and confidence distributions differently. Our findings show that subjective confidence, when measured alongside perceptual decisions, can supply important diagnostic information about the cause of aftereffects.
Affiliation(s)
- Regan M Gallagher, School of Psychology, The University of Queensland, Brisbane, Australia
- Thomas Suddendorf, School of Psychology, The University of Queensland, Brisbane, Australia
- Derek H Arnold, School of Psychology, The University of Queensland, Brisbane, Australia
6.
Linares D, Aguilar-Lleyda D, López-Moliner J. Decoupling sensory from decisional choice biases in perceptual decision making. eLife 2019; 8:e43994. PMID: 30916643. PMCID: PMC6459673. DOI: 10.7554/elife.43994.
Abstract
The contribution of sensory and decisional processes to perceptual decision making is still unclear, even in simple perceptual tasks. When decision makers need to select an action from a set of balanced alternatives, any tendency to choose one alternative more often (a choice bias) is consistent with a bias in the sensory evidence, but also with a preference to select that alternative independently of the sensory evidence. To decouple sensory from decisional biases, here we asked humans to perform a simple perceptual discrimination task with two symmetric alternatives under two different task instructions, which varied the response mapping between perception and the category of the alternatives. We found that, of 32 participants, 30 exhibited sensory biases and 15 decisional biases. The decisional biases were consistent with a criterion change in a simple signal detection theory model. Perceptual decision making, even in simple scenarios, is thus affected by both sensory and decisional choice biases.
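The criterion-change idea this abstract invokes can be made concrete with a small signal detection theory sketch. This is our own illustration, not the authors' analysis code, and the hit and false-alarm rates are invented: two observers with the same sensitivity (d') but different criteria (c) show different choice biases without any sensory difference.

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Standard signal detection theory indices from hit and
    false-alarm rates: sensitivity d' and criterion c."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, c

# Same underlying sensitivity, shifted criterion: a decisional
# choice bias with no change in the sensory evidence itself.
unbiased = sdt_measures(0.84, 0.16)  # d' ≈ 2.0, c ≈ 0.0
biased = sdt_measures(0.93, 0.31)    # similar d', c ≈ -0.49
```

A sensory bias, by contrast, would show up as a shift in the psychometric function that survives remapping the responses, which is the logic of the two-instruction design described above.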
Affiliation(s)
- Daniel Linares, Institut d'Investigacions Biomediques August Pi i Sunyer (IDIBAPS), Barcelona, Spain
- David Aguilar-Lleyda, Centre d'Économie de la Sorbonne (CNRS & Université Paris 1 Panthéon-Sorbonne), Paris, France
- Joan López-Moliner, VISCA Group, Department of Cognition, Development and Psychology of Education, Institut de Neurociències, Universitat de Barcelona, Barcelona, Spain
7.
Witthoft N, Sha L, Winawer J, Kiani R. Sensory and decision-making processes underlying perceptual adaptation. J Vis 2018; 18:10. PMID: 30140892. PMCID: PMC6108310. DOI: 10.1167/18.8.10.
Abstract
Perceptual systems adapt to their inputs. As a result, prolonged exposure to particular stimuli alters judgments about subsequent stimuli. This phenomenon is commonly assumed to be sensory in origin. Changes in the decision-making process, however, may also be a component of adaptation. Here, we quantify sensory and decision-making contributions to adaptation in a facial expression paradigm. As expected, exposure to happy or sad expressions shifts the psychometric function toward the adaptor. More surprisingly, response times show both an overall decline and an asymmetry, with faster responses opposite the adapting category, implicating a substantial change in the decision-making process. Specifically, we infer that sensory changes from adaptation are accompanied by changes in how much sensory information is accumulated for the two choices. We speculate that adaptation influences implicit expectations about the stimuli one will encounter, causing modifications in the decision-making process as part of a normative response to a change in context.
Affiliation(s)
- Nathan Witthoft, Department of Psychology, New York University, New York, NY, USA; Department of Psychology, Stanford University, Stanford, CA, USA
- Long Sha, Center for Neural Science, New York University, New York, NY, USA
- Jonathan Winawer, Department of Psychology and the Center for Neural Science, New York University, New York, NY, USA
- Roozbeh Kiani, Department of Psychology and the Center for Neural Science, New York University, New York, NY, USA; Neuroscience Institute, NYU Langone Medical Center, New York, NY, USA
8.
Mather G, Parsons T. Adaptation reveals sensory and decision components in the visual estimation of locomotion speed. Sci Rep 2018; 8:13059. PMID: 30158552. PMCID: PMC6115446. DOI: 10.1038/s41598-018-30230-1.
Abstract
Locomotion speed provides important social information about an individual's fitness, mood and intent. Visual estimation of locomotion speed is a complex task for the visual system because viewing distance must be taken into account, and the estimate has to be calibrated by recent experience of typical speeds. Little is known about how locomotion speed judgements are made. Previous research indicates that the human visual system possesses neurons that respond specifically to moving human forms. This research used point-light walker (PLW) displays that are known to activate these cells, in order to investigate the process mediating locomotion speed judgements. The results of three adaptation experiments show that these judgements involve both a low-level sensory component and a high-level decision component. A simple theoretical scheme is proposed, in which neurons sensitive to image flicker rate (temporal frequency) provide a sensory speed code, and a benchmark 'norm' value of the speed code, based on prevailing locomotion speeds, is used to make decisions about objective speed. The output of a simple computational model of the scheme successfully captured variations in locomotion speed in the stimuli used in the experiments. The theory offers a biologically-motivated account of how locomotion speed can be visually estimated.
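The proposed scheme, a temporal-frequency speed code referenced to an adaptable norm, can be sketched roughly as follows. This is our own illustrative reading of the abstract, not the authors' published model; the function names and parameter values are arbitrary assumptions.

```python
def perceived_speed(stimulus_tf, norm_tf):
    """Judge objective locomotion speed by comparing the sensory
    speed code (image temporal frequency, in Hz) to a benchmark
    norm: ratios above 1 look fast, below 1 look slow."""
    return stimulus_tf / norm_tf

def adapt_norm(norm_tf, recent_tfs, rate=0.5):
    """Shift the norm toward recently experienced speeds. After
    exposure to fast walkers the norm rises, so a standard walker
    subsequently appears to move too slowly (the aftereffect)."""
    mean_recent = sum(recent_tfs) / len(recent_tfs)
    return norm_tf + rate * (mean_recent - norm_tf)

norm = 4.0                                  # arbitrary baseline norm
before = perceived_speed(4.0, norm)         # 1.0: appears normal
norm = adapt_norm(norm, [6.0, 6.5, 5.5])    # adapt to fast locomotion
after = perceived_speed(4.0, norm)          # < 1.0: now appears slow
```

The key point of the scheme is that the aftereffect arises at the decision stage, by re-benchmarking the norm, rather than from a change in the sensory code itself.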
Affiliation(s)
- George Mather, School of Psychology, University of Lincoln, Lincoln, UK
- Todd Parsons, School of Psychology, University of Lincoln, Lincoln, UK
9.
Schreiber K, Morgan M. Aperture Synthesis Shows Perceptual Integration of Geometrical Form Across Saccades. Perception 2017; 47:239-253. PMID: 29212408. DOI: 10.1177/0301006617739804.
Abstract
We investigated the perceptual bias in perceived relative lengths in the Brentano version of the Müller-Lyer arrowheads figure. The magnitude of the bias was measured both under normal whole-figure viewing condition and under an aperture viewing condition, where participants moved their gaze around the figure but could see only one arrowhead at a time through a Gaussian-weighted contrast window. The extent of the perceptual bias was similar under the two conditions. The stimuli were presented on a CRT in a light-proof room with room-lights off, but visual context was provided by a rectangular frame surrounding the figure. The frame was either stationary with respect to the figure or moved in such a manner that the bias would be counteracted if the observer were locating features with respect to the frame. Biases were reduced in the latter condition. We conclude that integration occurs over saccades, but largely in an external visual framework, rather than in a body-centered frame using an extraretinal signal.
Affiliation(s)
- Kai Schreiber, Max-Planck Institute for Metabolism Research, Cologne, Germany
10.
Low-level mediation of directionally specific motion aftereffects: Motion perception is not necessary. Atten Percept Psychophys 2017; 78:2621-2632. PMID: 27392932. PMCID: PMC5110584. DOI: 10.3758/s13414-016-1160-1.
Abstract
Previous psychophysical experiments with normal human observers have shown that adaptation to a moving dot stream causes directionally specific repulsion in the perceived angle of a subsequently viewed moving probe. In this study, we used a two-alternative forced choice task with roving pedestals to determine the conditions that are necessary and sufficient for producing directionally specific repulsion with compound adaptors, each of which contains two oppositely moving, differently colored component streams. Experiment 1 provided a demonstration of repulsion between single-component adaptors and probes moving at approximately 90° or 270°. In Experiment 2, oppositely moving dots in the adaptor were paired to preclude the appearance of motion. Nonetheless, repulsion remained strong when the angle between each probe stream and one component was approximately 30°. In Experiment 3, adapting dot pairs were kept stationary during their limited lifetimes. Their orientation content alone proved insufficient for producing repulsion. In Experiments 4–6, the angle between the probe and both adapting components was approximately 90° or 270°. Directional repulsion was found when observers were asked to visually track one of the adapting components (Exp. 6), but not when they were asked to attentionally track it (Exp. 5), nor while they passively viewed the adaptor (Exp. 4). Our results are consistent with a low-level mechanism for motion adaptation. This mechanism is not selective for stimulus color and is not susceptible to attentional modulation. The most likely cortical locus of adaptation is area V1.
11.
Visual adaptation alters the apparent speed of real-world actions. Sci Rep 2017; 7:6738. PMID: 28751645. PMCID: PMC5532221. DOI: 10.1038/s41598-017-06841-5.
Abstract
The apparent physical speed of an object in the field of view remains constant despite variations in retinal velocity due to viewing conditions (velocity constancy). For example, people and cars appear to move across the field of view at the same objective speed regardless of distance. In this study a series of experiments investigated the visual processes underpinning judgements of objective speed using an adaptation paradigm and video recordings of natural human locomotion. Viewing a video played in slow-motion for 30 seconds caused participants to perceive subsequently viewed clips played at standard speed as too fast, so playback had to be slowed down in order for it to appear natural; conversely after viewing fast-forward videos for 30 seconds, playback had to be speeded up in order to appear natural. The perceived speed of locomotion shifted towards the speed depicted in the adapting video (‘re-normalisation’). Results were qualitatively different from those obtained in previously reported studies of retinal velocity adaptation. Adapting videos that were scrambled to remove recognizable human figures or coherent motion caused significant, though smaller shifts in apparent locomotion speed, indicating that both low-level and high-level visual properties of the adapting stimulus contributed to the changes in apparent speed.
12.
Forder L, Taylor O, Mankin H, Scott RB, Franklin A. Colour Terms Affect Detection of Colour and Colour-Associated Objects Suppressed from Visual Awareness. PLoS One 2016; 11:e0152212. PMID: 27023274. PMCID: PMC4811409. DOI: 10.1371/journal.pone.0152212.
Abstract
The idea that language can affect how we see the world continues to create controversy. A potentially important study in this field has shown that when an object is suppressed from visual awareness using continuous flash suppression (a form of binocular rivalry), detection of the object is affected differently depending on whether a preceding word prime matches or does not match the object. This may suggest that language can affect early stages of vision. We replicated this paradigm and further investigated whether colour terms likewise influence the detection of colours or colour-associated object images suppressed from visual awareness by continuous flash suppression. This method presents rapidly changing visual noise to one eye while the target stimulus is presented to the other; it has been shown to delay conscious perception of a target for up to several minutes. In Experiment 1 we presented greyscale photos of objects, each preceded by a congruent object label, an incongruent label, or white noise. Detection sensitivity (d') and hit rates were significantly poorer for suppressed objects preceded by an incongruent label than by a congruent label or noise. In Experiment 2, targets were coloured discs preceded by a colour term. Detection sensitivity was significantly worse for suppressed colour patches preceded by an incongruent colour term than by a congruent term or white noise. In Experiment 3, targets were suppressed greyscale object images preceded by an auditory presentation of a colour term; on congruent trials the colour term matched the object's stereotypical colour, and on incongruent trials it mismatched. Detection sensitivity was significantly poorer on incongruent trials than on congruent trials. Overall, these findings suggest that colour terms affect awareness of coloured stimuli and colour-associated objects, and provide new evidence for language-perception interaction in the brain.
Affiliation(s)
- Lewis Forder, The Sussex Colour Group, School of Psychology, University of Sussex, Falmer, Brighton, United Kingdom
- Olivia Taylor, The Sussex Colour Group, School of Psychology, University of Sussex, Falmer, Brighton, United Kingdom
- Helen Mankin, The Sussex Colour Group, School of Psychology, University of Sussex, Falmer, Brighton, United Kingdom
- Ryan B. Scott, School of Psychology, University of Sussex, Falmer, Brighton, United Kingdom
- Anna Franklin, The Sussex Colour Group, School of Psychology, University of Sussex, Falmer, Brighton, United Kingdom