1. Craighero L. An embodied approach to fetal and newborn perceptual and sensorimotor development. Brain Cogn 2024; 179:106184. [PMID: 38843762] [DOI: 10.1016/j.bandc.2024.106184]
Abstract
The embodied approach argues that interaction with the environment plays a crucial role in brain development and that the presence of sensory effects generated by movements is fundamental. The movement of the fetus is initially random. Then, the repeated execution of the movement creates a link between it and its sensory effects, allowing the selection of movements that produce expected sensations. During fetal life, the brain develops from a transitory fetal circuit to the permanent cortical circuit, which completes development after birth. Accordingly, this process must concern the interaction of the fetus with the intrauterine environment and of the newborn with the new aerial environment, which provides a new source of sensory stimulation: light. The goal of the present review is to provide suggestions for neuroscientific research capable of shedding light on the process of brain development by describing, from a functional point of view, the relationship between the motor and sensory abilities of fetuses and newborns and the increasing complexity of their interaction with objects in the womb and outside of it.
Affiliation(s)
- Laila Craighero: Department of Neuroscience and Rehabilitation, University of Ferrara, via Fossato di Mortara 19, 44121 Ferrara, Italy.
2. Todd JT, Bahrick LE. Individual Differences in Multisensory Attention Skills in Children with Autism Spectrum Disorder Predict Language and Symptom Severity: Evidence from the Multisensory Attention Assessment Protocol (MAAP). J Autism Dev Disord 2023; 53:4685-4710. [PMID: 36181648] [PMCID: PMC10065966] [DOI: 10.1007/s10803-022-05752-3]
Abstract
Children with autism spectrum disorder (ASD) show atypical attention, particularly for social events. The new Multisensory Attention Assessment Protocol (MAAP) assesses fine-grained individual differences in attention disengagement, attention maintenance, and audiovisual matching for social and nonsocial events. We investigated the effects of competing stimulation on attention, as well as relations with language and symptomatology, in children with ASD and typically developing controls. Findings revealed: (1) the MAAP differentiated children with ASD from controls, (2) greater attention to social events predicted better language for both groups and lower symptom severity in children with ASD, and (3) different pathways from attention to language were evident in children with ASD versus controls. The MAAP provides an ideal attention assessment for revealing diagnostic group differences and relations with outcomes.
Affiliation(s)
- James Torrence Todd: Department of Psychology, Florida International University, 11200 South West 8 Street, Miami, FL, 33199, USA.
- Lorraine E Bahrick: Department of Psychology, Florida International University, 11200 South West 8 Street, Miami, FL, 33199, USA.
3. Edgar EV, Todd JT, Bahrick LE. Intersensory processing of faces and voices at 6 months predicts language outcomes at 18, 24, and 36 months of age. Infancy 2023; 28:569-596. [PMID: 36760157] [PMCID: PMC10564323] [DOI: 10.1111/infa.12533]
Abstract
Intersensory processing of social events (e.g., matching the sights and sounds of audiovisual speech) is a critical foundation for language development. Two recently developed protocols, the Multisensory Attention Assessment Protocol (MAAP) and the Intersensory Processing Efficiency Protocol (IPEP), assess individual differences in intersensory processing at a sufficiently fine-grained level for predicting developmental outcomes. Recent research using the MAAP demonstrates that 12-month intersensory processing of face-voice synchrony predicts language outcomes at 18 and 24 months, holding traditional predictors (parent language input, SES) constant. Here, we build on these findings by testing younger infants using the IPEP, a more comprehensive, fine-grained index of intersensory processing. Using a longitudinal sample of 103 infants, we tested whether intersensory processing (speed, accuracy) of faces and voices at 3 and 6 months predicts language outcomes at 12, 18, 24, and 36 months, holding traditional predictors constant. Results demonstrate that intersensory processing of faces and voices at 6 months (but not 3 months) accounted for significant unique variance in language outcomes at 18, 24, and 36 months, beyond that of traditional predictors. Findings highlight the importance of intersensory processing of face-voice synchrony as a foundation for language development as early as 6 months and reveal that individual differences assessed by the IPEP predict language outcomes even 2.5 years later.
4. The temporal dynamics of labelling shape infant object recognition. Infant Behav Dev 2022; 67:101698. [DOI: 10.1016/j.infbeh.2022.101698]
5. Fiber tracing and microstructural characterization among audiovisual integration brain regions in neonates compared with young adults. Neuroimage 2022; 254:119141. [PMID: 35342006] [DOI: 10.1016/j.neuroimage.2022.119141]
Abstract
Audiovisual integration (AVI) has been associated with cognitive-processing and behavioral advantages, as well as with various socio-cognitive disorders. While some studies have identified brain regions instantiating this ability shortly after birth, little is known about the structural pathways connecting them. The goal of the present study was to reconstruct fiber tracts linking AVI regions in the newborn in-vivo brain and assess their adult-likeness by comparing them with analogous fiber tracts of young adults. We performed probabilistic tractography and compared connective probabilities between a sample of term-born neonates (N = 311; the Developing Human Connectome Project, dHCP, http://www.developingconnectome.org) and young adults (N = 311; the Human Connectome Project, https://www.humanconnectome.org/) by means of a classification algorithm. Furthermore, we computed Dice coefficients to assess the between-group spatial similarity of the reconstructed fibers and used diffusion metrics to characterize neonates' AVI brain network in terms of microstructural properties, interhemispheric differences, and associations with perinatal covariates and biological sex. Overall, our results indicate that the AVI fiber bundles were successfully reconstructed in the vast majority of neonates, similarly to adults. Connective probability distributional similarities and spatial overlaps of AVI fibers between the two groups differed across the reconstructed fibers. There was a rank-order correspondence of the fibers' connective strengths across the groups. Additionally, the study revealed patterns of diffusion metrics in line with early white matter developmental trajectories and a developmental advantage for females. Altogether, these findings deliver evidence of meaningful structural connections among AVI regions in the newborn in-vivo brain.
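For reference, the Dice coefficient mentioned above is simply twice the shared volume of two binary masks divided by their summed volumes. The minimal Python sketch below illustrates that standard definition only; it is not the authors' pipeline, and the random arrays are hypothetical stand-ins for thresholded tract masks.

    import numpy as np

    def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Dice overlap between two binary masks: 2*|A & B| / (|A| + |B|)."""
        a = mask_a.astype(bool)
        b = mask_b.astype(bool)
        denom = a.sum() + b.sum()
        if denom == 0:
            return 1.0  # convention: two empty masks count as perfectly overlapping
        return 2.0 * np.logical_and(a, b).sum() / denom

    # Purely illustrative masks; real use would pass co-registered, thresholded tractograms.
    rng = np.random.default_rng(0)
    neonate_mask = rng.random((91, 109, 91)) > 0.99
    adult_mask = rng.random((91, 109, 91)) > 0.99
    print(f"Dice = {dice_coefficient(neonate_mask, adult_mask):.3f}")

A Dice score of 1 indicates identical masks and 0 indicates no overlap, which is what makes it a convenient summary of between-group spatial similarity.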
6. Development of multisensory integration following prolonged early-onset visual deprivation. Curr Biol 2021; 31:4879-4885.e6. [PMID: 34534443] [DOI: 10.1016/j.cub.2021.08.060]
Abstract
Adult humans make effortless use of multisensory signals and typically integrate them in an optimal fashion [1]. This remarkable ability takes many years for normally sighted children to develop [2,3]. Would individuals born blind or with extremely low vision still be able to develop multisensory integration later in life when surgically treated for sight restoration? Late acquisition of such a capability would be a vivid example of the brain's ability to retain high levels of plasticity. We studied the development of multisensory integration in individuals suffering from congenital dense bilateral cataract, surgically treated years after birth. We assessed cataract-treated individuals' reliance on their restored visual abilities when estimating the size of an object simultaneously explored by touch. Within weeks to months after surgery, when combining information from vision and touch, they developed multisensory weighting behavior similar to that of matched typically sighted controls. Next, we tested whether cataract-treated individuals benefited from integrating vision with touch by increasing the precision of their size estimates, as occurs when signals are integrated in a statistically optimal fashion [1]. For participants retested multiple times, such a benefit developed within months after surgery, reaching levels of precision indistinguishable from optimal behavior. To summarize, the development of multisensory integration does not merely depend on age but requires extensive multisensory experience with the world, rendered possible by the improved post-surgical visual acuity. We conclude that early exposure to multisensory signals is not essential for the development of multisensory integration, which can still be acquired even after many years of visual deprivation.
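The "statistically optimal" benchmark invoked here is conventionally the maximum-likelihood cue-combination model, in which each estimate is weighted by its relative reliability. A standard textbook formulation (not specific to this study's analysis), for a visual size estimate \hat{S}_V and a haptic estimate \hat{S}_T with variances \sigma_V^2 and \sigma_T^2, is:

    \hat{S}_{VT} = w_V \hat{S}_V + w_T \hat{S}_T, \qquad
    w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_T^2}, \qquad w_T = 1 - w_V,

    \sigma_{VT}^2 = \frac{\sigma_V^2 \, \sigma_T^2}{\sigma_V^2 + \sigma_T^2} \le \min(\sigma_V^2, \sigma_T^2).

Because the combined variance can never exceed that of the more reliable single cue, a precision gain over the best unimodal estimate is the signature of optimal integration that such studies test for.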
7. Turoman N, Tivadar RI, Retsa C, Maillard AM, Scerif G, Matusz PJ. The development of attentional control mechanisms in multisensory environments. Dev Cogn Neurosci 2021; 48:100930. [PMID: 33561691] [PMCID: PMC7873372] [DOI: 10.1016/j.dcn.2021.100930]
Abstract
Outside the laboratory, people need to pay attention to relevant objects that are typically multisensory, but it remains poorly understood how the underlying neurocognitive mechanisms develop. We investigated when adult-like mechanisms controlling the attentional selection of visual and multisensory objects emerge across childhood. Five-, 7-, and 9-year-olds were compared with adults in their performance on a computer game-like multisensory spatial cueing task, while 129-channel EEG was simultaneously recorded. Markers of attentional control were behavioural spatial cueing effects and the N2pc ERP component (analysed traditionally and using a multivariate electrical neuroimaging framework). In behaviour, adult-like visual attentional control was present from age 7 onwards, whereas multisensory control was absent in all child groups. In the EEG, multivariate analyses of activity over the N2pc time-window revealed stable brain activity patterns in children. Adult-like visual-attentional control EEG patterns were present from age 7 onwards, while multisensory control activity patterns were found in 9-year-olds (although behavioural measures showed no such effects). By combining rigorous yet naturalistic paradigms with multivariate signal analyses, we demonstrated that visual attentional control seems to reach an adult-like state at around 7 years, before adult-like multisensory control, which emerges at around 9 years. These results enrich our understanding of how attention in naturalistic settings develops.
Affiliation(s)
- Nora Turoman: The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland; Information Systems Institute at the University of Applied Sciences Western Switzerland (HES-SO Valais), Sierre, 3960, Switzerland; Working Memory, Cognition and Development Lab, Department of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
- Ruxandra I Tivadar: The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland; Cognitive Computational Neuroscience Group, Institute of Computer Science, Faculty of Science, University of Bern, Bern, Switzerland
- Chrysa Retsa: The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Anne M Maillard: Service des Troubles du Spectre de l'Autisme et apparentés, Department of Psychiatry, University Hospital Center and University of Lausanne, Lausanne, Switzerland
- Gaia Scerif: Department of Experimental Psychology, University of Oxford, Oxfordshire, UK
- Pawel J Matusz: The LINE (Laboratory for Investigative Neurophysiology), Department of Radiology and Clinical Neurosciences, University Hospital Center and University of Lausanne, Lausanne, Switzerland; Information Systems Institute at the University of Applied Sciences Western Switzerland (HES-SO Valais), Sierre, 3960, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA
8. Ujiie Y, Kanazawa S, Yamaguchi MK. Development of the multisensory perception of water in infancy. J Vis 2020; 20:5. [PMID: 32749446] [PMCID: PMC7438635] [DOI: 10.1167/jov.20.8.5]
Abstract
Material perception is facilitated by multisensory interactions that enable us to associate the visual properties of a material with its auditory properties. Such interactions develop during infancy and are assumed to depend on the familiarity of materials. Here, we aimed to pinpoint the age at which infants acquire multisensory interactions for the perception of water, a material familiar to them. We presented two side-by-side movies of pouring water and ice while providing the corresponding sounds of water and ice, as well as silence. We found that infants older than 5 months of age looked longer at the water movie when they heard the sound of water. Conversely, they showed no corresponding preference for the ice movie when they heard the sound of ice. These results indicate that at approximately 5 months of age, infants develop multisensory interactions between the auditory and visual properties of water, but not of ice. The contrasting results between water and ice suggest that the development of multisensory material perception depends on the frequency of interactions with materials during infancy.
9. Amadeo MB, Campus C, Gori M. Time attracts auditory space representation during development. Behav Brain Res 2019; 376:112185. [PMID: 31472192] [DOI: 10.1016/j.bbr.2019.112185]
Abstract
Vision is the most accurate sense for spatial representation, whereas audition is the most accurate for temporal representation. However, how different sensory modalities shape the development of spatial and temporal representations is still unclear. Here, 45 children aged 11-13 years were tested to investigate their ability to evaluate spatial features of auditory stimuli during bisection tasks, while conflicting or non-conflicting spatial and temporal information was delivered. Since audition is fundamental for temporal representation, the hypothesis was that temporal information could influence the development of auditory spatial representation. Results show a strong interaction between the temporal and the spatial domain. Younger children are not able to build complex spatial representations when the temporal domain is uninformative about space. However, when the spatial information is coherent with the temporal information, children of all ages are able to decode complex spatial relationships. When spatial and temporal cues are conflicting, younger children are strongly attracted by the temporal rather than the spatial information, while older participants are unaffected by the cross-domain conflict. These findings suggest that during development the temporal representation of events is used to infer spatial coordinates of the environment, offering important opportunities for new teaching and rehabilitation strategies.
Affiliation(s)
- Maria Bianca Amadeo: Unit for Visually Impaired People (U-VIP), Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy; Università degli studi di Genova, Department of Informatics, Bioengineering, Robotics and Systems Engineering, Via all'Opera Pia 13, 16145 Genova, Italy
- Claudio Campus: Unit for Visually Impaired People (U-VIP), Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy
- Monica Gori: Unit for Visually Impaired People (U-VIP), Fondazione Istituto Italiano di Tecnologia, Via E. Melen 83, 16152 Genova, Italy
10. Lalonde K, Werner LA. Infants and Adults Use Visual Cues to Improve Detection and Discrimination of Speech in Noise. J Speech Lang Hear Res 2019; 62:3860-3875. [PMID: 31618097] [PMCID: PMC7201336] [DOI: 10.1044/2019_jslhr-h-19-0106]
Abstract
Purpose: This study assessed the extent to which 6- to 8.5-month-old infants and 18- to 30-year-old adults detect and discriminate auditory syllables in noise better in the presence of visual speech than in auditory-only conditions. In addition, we examined whether visual cues to the onset and offset of the auditory signal account for this benefit.
Method: Sixty infants and 24 adults were randomly assigned to speech detection or discrimination tasks and were tested using a modified observer-based psychoacoustic procedure. Each participant completed 1-3 conditions: auditory-only, with visual speech, and with a visual signal that only cued the onset and offset of the auditory syllable.
Results: Mixed linear modeling indicated that infants and adults benefited from visual speech on both tasks. Adults relied on the onset-offset cue for detection, but the same cue did not improve their discrimination. The onset-offset cue benefited infants for both detection and discrimination. Whereas the onset-offset cue improved detection similarly for infants and adults, the full visual speech signal benefited infants to a lesser extent than adults on the discrimination task.
Conclusions: These results suggest that infants' use of visual onset-offset cues is mature, but their ability to use more complex visual speech cues is still developing. Additional research is needed to explore differences in audiovisual enhancement (a) of speech discrimination across speech targets and (b) with increasingly complex tasks and stimuli.
Affiliation(s)
- Kaylah Lalonde: Department of Speech & Hearing Sciences, University of Washington, Seattle
- Lynne A. Werner: Department of Speech & Hearing Sciences, University of Washington, Seattle
11. Curtindale LM, Bahrick LE, Lickliter R, Colombo J. Effects of multimodal synchrony on infant attention and heart rate during events with social and nonsocial stimuli. J Exp Child Psychol 2019; 178:283-294. [PMID: 30445204] [PMCID: PMC6980371] [DOI: 10.1016/j.jecp.2018.10.006]
Abstract
Attention is a state of readiness or alertness, associated with behavioral and psychophysiological responses, that facilitates learning and memory. Multisensory and dynamic events have been shown to elicit more attention and produce greater sustained attention in infants than auditory or visual events alone. Such redundant and often temporally synchronous information guides selectivity and facilitates perception, learning, and memory of properties of events specified by redundancy. In addition, events involving faces or other social stimuli provide an extraordinary amount of redundant information that attracts and sustains attention. In the current study, 4- and 8-month-old infants were shown 2-min multimodal videos featuring social or nonsocial stimuli to determine the relative roles of synchrony and stimulus category in inducing attention. Behavioral measures included average looking time and peak look duration, and convergent measurement of heart rate (HR) allowed for the calculation of HR-defined phases of attention: Orienting (OR), sustained attention (SA), and attention termination (AT). The synchronous condition produced an earlier onset of SA (less time in OR) and a deeper state of SA than the asynchronous condition. Social stimuli attracted and held attention (longer duration of peak looks and lower HR than nonsocial stimuli). Effects of synchrony and the social nature of stimuli were additive, suggesting independence of their influence on attention. These findings are the first to demonstrate different HR-defined phases of attention as a function of intersensory redundancy, suggesting greater salience and deeper processing of naturalistic synchronous audiovisual events compared with asynchronous ones.
Affiliation(s)
- Lori M Curtindale: Department of Psychology, East Carolina University, Greenville, NC 27858, USA
- Lorraine E Bahrick: Department of Psychology, Florida International University, Miami, FL 33199, USA
- Robert Lickliter: Department of Psychology, Florida International University, Miami, FL 33199, USA
- John Colombo: Department of Psychology, University of Kansas, Lawrence, KS 66045, USA
12. Bahrick LE, Soska KC, Todd JT. Assessing individual differences in the speed and accuracy of intersensory processing in young children: The intersensory processing efficiency protocol. Dev Psychol 2018; 54:2226-2239. [PMID: 30346188] [PMCID: PMC6261800] [DOI: 10.1037/dev0000575]
Abstract
Detecting intersensory redundancy guides cognitive, social, and language development. Yet, researchers lack fine-grained, individual difference measures needed for studying how early intersensory skills lead to later outcomes. The intersensory processing efficiency protocol (IPEP) addresses this need. Across a number of brief trials, participants must find a sound-synchronized visual target event (social, nonsocial) amid five visual distractor events, simulating the "noisiness" of natural environments. Sixty-four 3- to 5-year-old children were tested using remote eye-tracking. Children showed intersensory processing by attending to the sound-synchronous event more frequently and longer than in a silent visual control, and more frequently than expected by chance. The IPEP provides a fine-grained, nonverbal method for characterizing individual differences in intersensory processing appropriate for infants and children.
Affiliation(s)
- Kasey C Soska: Department of Psychology, Florida International University
13. Hannon EE, Schachner A, Nave-Blodgett JE. Babies know bad dancing when they see it: Older but not younger infants discriminate between synchronous and asynchronous audiovisual musical displays. J Exp Child Psychol 2017; 159:159-174. [PMID: 28288412] [DOI: 10.1016/j.jecp.2017.01.006]
Abstract
Movement to music is a universal human behavior, yet little is known about how observers perceive audiovisual synchrony in complex musical displays such as a person dancing to music, particularly during infancy and childhood. In the current study, we investigated how perception of musical audiovisual synchrony develops over the first year of life. We habituated infants to a video of a person dancing to music and subsequently presented videos in which the visual track was matched (synchronous) or mismatched (asynchronous) with the audio track. In a visual-only control condition, we presented the same visual stimuli with no sound. In Experiment 1, we found that older infants (8-12 months) exhibited a novelty preference for the mismatched movie when both auditory information and visual information were available and showed no preference when only visual information was available. By contrast, younger infants (5-8 months) in Experiment 2 did not discriminate matching stimuli from mismatching stimuli. This suggests that the ability to perceive musical audiovisual synchrony may develop during the second half of the first year of infancy.
Affiliation(s)
- Erin E Hannon: Department of Psychology, University of Nevada, Las Vegas, Las Vegas, NV 89154, USA
- Adena Schachner: Department of Psychology, University of California, San Diego, La Jolla, CA 92093, USA
14. Hyde DC, Flom R, Porter CL. Behavioral and Neural Foundations of Multisensory Face-Voice Perception in Infancy. Dev Neuropsychol 2017; 41:273-292. [PMID: 28059567] [DOI: 10.1080/87565641.2016.1255744]
Abstract
In this article, we describe behavioral and neurophysiological evidence for infants' multimodal face-voice perception. We argue that the behavioral development of face-voice perception, like multimodal perception more broadly, is consistent with the intersensory redundancy hypothesis (IRH). Furthermore, we highlight that several recently observed features of the neural responses in infants converge with the behavioral predictions of the intersensory redundancy hypothesis. Finally, we discuss the potential benefits of combining brain and behavioral measures to study multisensory processing, as well as some applications of this work for atypical development.
Affiliation(s)
- Daniel C Hyde: Department of Psychology, University of Illinois at Urbana-Champaign, Champaign, Illinois
- Ross Flom: Department of Psychology, Brigham Young University, Provo, Utah
- Chris L Porter: School of Family Life, Brigham Young University, Provo, Utah
15. Bahrick LE, Todd JT, Castellanos I, Sorondo BM. Enhanced attention to speaking faces versus other event types emerges gradually across infancy. Dev Psychol 2016; 52:1705-1720. [PMID: 27786526] [PMCID: PMC5291072] [DOI: 10.1037/dev0000157]
Abstract
The development of attention to dynamic faces versus objects providing synchronous audiovisual versus silent visual stimulation was assessed in a large sample of infants. Maintaining attention to the faces and voices of people speaking is critical for perceptual, cognitive, social, and language development. However, no studies have systematically assessed when, if, or how attention to speaking faces emerges and changes across infancy. Two measures of attention maintenance, habituation time (HT) and look-away rate (LAR), were derived from cross-sectional data of 2- to 8-month-old infants (N = 801). Results indicated that attention to audiovisual faces and voices was maintained across age, whereas attention to each of the other event types (audiovisual objects, silent dynamic faces, silent dynamic objects) declined across age. This reveals a gradually emerging advantage in attention maintenance (longer HTs, lower LARs) for audiovisual speaking faces compared with the other 3 event types. At 2 months, infants showed no attentional advantage for faces (with greater attention to audiovisual than to visual events); at 3 months, they attended more to dynamic faces than objects (in the presence or absence of voices), and by 4 to 5 and 6 to 8 months, significantly greater attention emerged to temporally coordinated faces and voices of people speaking compared with all other event types. Our results indicate that selective attention to coordinated faces and voices over other event types emerges gradually across infancy, likely as a function of experience with multimodal, redundant stimulation from person and object events.
Affiliation(s)
- Irina Castellanos: Department of Otolaryngology – Head and Neck Surgery, The Ohio State University, Columbus, OH
- Barbara M. Sorondo: Florida International University Libraries, Florida International University, Miami, FL
16. Johnson KM, Woods RJ. Give Me a Hand: Adult Involvement During Object Exploration Affects Object Individuation in Infancy. Infant Child Dev 2016; 25:406-425. [PMID: 28082834] [PMCID: PMC5222598] [DOI: 10.1002/icd.1942]
Abstract
The development of object individuation, a fundamental ability that supports identification and discrimination of objects across discrete encounters, has been examined extensively by researchers. There are significant advancements in infants' ability to individuate objects during the first year and a half. Experimental work has established a timeline of object individuation abilities and revealed some mechanisms underlying this ability; however, the influence of adult assistance during object exploration has not yet been explored. The current study investigates the effect of adult involvement during object exploration on infants' object individuation abilities. In Experiments 1a and 1b, we examined 9.5-month-old infants' colour-based object individuation following adult-assisted multisensory object exploration. Two components of adult interaction were of particular interest: facilitation of object manipulation (grasping, rotating, and attention-getting behaviours) and social engagement (smiling, pointing, attention-getting verbalizations, and object-directed gaze). Experiments 2a and 2b assessed these components with 4.5-month-olds to examine their impact across development. The results showed that after adult-guided object exploration, both 9.5- and 4.5-month-old infants successfully individuated previously undifferentiated objects. Results of Experiments 1b and 2b have implications for the mechanisms underlying the scaffolding influence of adult interaction during infant behaviours.
Affiliation(s)
- Kristin M. Johnson: Department of Psychology and Neuroscience, Duke University, Durham, NC, USA
- Rebecca J. Woods: Department of Human Development and Family Science, North Dakota State University, Fargo, ND, USA
17. Audiovisual alignment of co-speech gestures to speech supports word learning in 2-year-olds. J Exp Child Psychol 2016; 145:1-10. [DOI: 10.1016/j.jecp.2015.12.002]
18. Greenfield K, Ropar D, Smith AD, Carey M, Newport R. Visuo-tactile integration in autism: atypical temporal binding may underlie greater reliance on proprioceptive information. Mol Autism 2015; 6:51. [PMID: 26380064] [PMCID: PMC4570750] [DOI: 10.1186/s13229-015-0045-9]
Abstract
Background: Evidence indicates that social functioning deficits and sensory sensitivities in autism spectrum disorder (ASD) are related to atypical sensory integration. The exact mechanisms underlying these integration difficulties are unknown; however, two leading accounts are (1) an over-reliance on proprioception and (2) atypical visuo-tactile temporal binding. We directly tested these theories by selectively manipulating proprioceptive alignment and visuo-tactile synchrony to assess the extent to which these impact upon body ownership.
Methods: Children with ASD and typically developing controls placed their hand into a multisensory illusion apparatus, which presented two identical live video images of their own hand in the same plane as their actual hand. One virtual hand was aligned proprioceptively with the actual hand (the veridical hand), and the other was displaced to the left or right. While a brushstroke was applied to the participants' actual (hidden) hand, they observed the two virtual images of their hand also being stroked and were asked to identify their real hand. During brushing, one of three different temporal delays was applied to either the displaced hand or the veridical hand. Thus, only one virtual hand had synchronous visuo-tactile inputs.
Results: Results showed that visuo-tactile synchrony overrides incongruent proprioceptive inputs in typically developing children but not in autistic children. Evidence for both temporally extended visuo-tactile binding and a greater reliance on proprioception is discussed.
Conclusions: This is the first study to provide definitive evidence for temporally extended visuo-tactile binding in ASD. This may result in reduced processing of amodal inputs (i.e. temporal synchrony) relative to modality-specific information (i.e. proprioception). This would likely lead to failures in appropriately binding information from related events, which would impact upon sensitivity to sensory stimuli, body representation, and social processes such as empathy and imitation.
Affiliation(s)
- Katie Greenfield: School of Psychology, The University of Nottingham, University Park, Nottingham, NG7 2RD UK
- Danielle Ropar: School of Psychology, The University of Nottingham, University Park, Nottingham, NG7 2RD UK
- Alastair D Smith: School of Psychology, The University of Nottingham, University Park, Nottingham, NG7 2RD UK
- Mark Carey: School of Psychology, The University of Nottingham, University Park, Nottingham, NG7 2RD UK
- Roger Newport: School of Psychology, The University of Nottingham, University Park, Nottingham, NG7 2RD UK
19. Bahrick LE, Lickliter R, Castellanos I, Todd JT. Intrasensory Redundancy Facilitates Infant Detection of Tempo: Extending Predictions of the Intersensory Redundancy Hypothesis. Infancy 2015; 20:377-404. [PMID: 26207101] [PMCID: PMC4508026] [DOI: 10.1111/infa.12081]
Abstract
Research has demonstrated that intersensory redundancy (stimulation synchronized across multiple senses) is highly salient and facilitates processing of amodal properties in multimodal events, bootstrapping early perceptual development. The present study is the first to extend this central principle of the intersensory redundancy hypothesis (IRH) to certain types of intrasensory redundancy (stimulation synchronized within a single sense). Infants were habituated to videos of a toy hammer tapping either silently (unimodal control), with intersensory redundancy (synchronized with a soundtrack), or with intrasensory redundancy (synchronized with another visual event: a light flashing or a bat tapping). In Experiment 1, 2-month-olds showed both intersensory and intrasensory facilitation (with respect to the unimodal control) for detecting a change in tempo. However, intrasensory facilitation was found when the hammer was synchronized with the light flashing (different motion) but not with the bat tapping (same motion). Experiment 2 tested 3-month-olds using a somewhat easier tempo contrast. Results supported a similarity hypothesis: intrasensory redundancy between two dissimilar events was more effective than that between two similar events for promoting processing of amodal properties. These findings extend the IRH and indicate that in addition to intersensory redundancy, intrasensory redundancy between two synchronized dissimilar visual events is also effective in promoting perceptual processing of amodal event properties.
Affiliation(s)
- Robert Lickliter: Department of Psychology, Florida International University, Miami, FL
- Irina Castellanos: Department of Otolaryngology Head and Neck Surgery, Indiana University School of Medicine, Indianapolis, IN
20. Gerson SA, Schiavio A, Timmers R, Hunnius S. Active Drumming Experience Increases Infants' Sensitivity to Audiovisual Synchrony during Observed Drumming Actions. PLoS One 2015; 10:e0130960. [PMID: 26111226] [PMCID: PMC4482535] [DOI: 10.1371/journal.pone.0130960]
Abstract
In the current study, we examined the role of active experience on sensitivity to multisensory synchrony in six-month-old infants in a musical context. In the first of two experiments, we trained infants to produce a novel multimodal effect (i.e., a drum beat) and assessed the effects of this training, relative to no training, on their later perception of the synchrony between audio and visual presentation of the drumming action. In a second experiment, we then contrasted this active experience with the observation of drumming in order to test whether observation of the audiovisual effect was as effective for sensitivity to multimodal synchrony as active experience. Our results indicated that active experience provided a unique benefit above and beyond observational experience, providing insights on the embodied roots of (early) music perception and cognition.
Affiliation(s)
- Sarah A. Gerson: University of St Andrews, School of Psychology & Neuroscience, St Andrews, United Kingdom; Donders Institute for Brain, Cognition, and Behaviour, Center for Cognition, Radboud University, Nijmegen, The Netherlands
- Andrea Schiavio: Music Mind Machine in Sheffield, Department of Music, The University of Sheffield, Sheffield, United Kingdom
- Renee Timmers: Music Mind Machine in Sheffield, Department of Music, The University of Sheffield, Sheffield, United Kingdom
- Sabine Hunnius: Donders Institute for Brain, Cognition, and Behaviour, Center for Cognition, Radboud University, Nijmegen, The Netherlands
21. Ghanouni P, Memari AH, Shayestehfar M, Moshayedi P, Gharibzadeh S, Ziaee V. Biological motion perception is affected by age and cognitive style in children aged 8-15. Neurol Res Int 2015; 2015:594042. [PMID: 25861473] [PMCID: PMC4378609] [DOI: 10.1155/2015/594042]
Abstract
The current paper addresses the question of how biological motion perception in different social contexts is influenced by age and by cognitive style. We examined developmental changes in biological motion perception among 141 school children aged 8-15 using point-light displays in monadic and dyadic social contexts. Furthermore, the cognitive styles of participants were assessed using empathizing-systemizing questionnaires. Results showed that age and empathizing ability strongly predicted improvement in action perception in both contexts. However, systemizing ability was an independent predictor of performance only in monadic contexts. Furthermore, accuracy of action perception increased significantly from 46.4% (SD = 16.1) in monadic to 62.5% (SD = 11.5) in dyadic social contexts. This study helps to identify the role of social context in biological motion perception and shows that children with different cognitive styles may differ in biological motion perception.
Affiliation(s)
- Parisa Ghanouni: Occupational Science and Occupational Therapy, Faculty of Medicine, University of British Columbia, Vancouver, Canada
- Amir Hossein Memari: Neuroscience Institute, Sports Medicine Research Center, Tehran University of Medical Sciences, Tehran, Iran
- Monir Shayestehfar: Neuroscience Institute, Sports Medicine Research Center, Tehran University of Medical Sciences, Tehran, Iran
- Pouria Moshayedi: Department of Neurology, University of California, Los Angeles, CA, USA
- Shahriar Gharibzadeh: Department of Biomedical Engineering, Amirkabir University of Technology, Tehran, Iran
- Vahid Ziaee: Growth and Development Research Center, Tehran University of Medical Sciences, Tehran, Iran
22. Dionne-Dostie E, Paquette N, Lassonde M, Gallagher A. Multisensory integration and child neurodevelopment. Brain Sci 2015; 5:32-57. [PMID: 25679116] [PMCID: PMC4390790] [DOI: 10.3390/brainsci5010032]
Abstract
A considerable number of cognitive processes depend on the integration of multisensory information. The brain integrates this information, providing a complete representation of our surrounding world and giving us the ability to react optimally to the environment. Infancy is a period of great changes in brain structure and function that are reflected in the increasing processing capacities of the developing child. However, it is unclear whether the optimal use of multisensory information is present early in childhood or develops only later, with experience. The first part of this review focuses on the typical development of multisensory integration (MSI). We describe the two hypotheses on the developmental process of MSI in neurotypical infants and children, and introduce MSI and its neuroanatomical correlates. The second section discusses the neurodevelopmental trajectory of MSI in cognitively challenged infants and children. A few studies have brought to light various difficulties in integrating sensory information in children with a neurodevelopmental disorder. Consequently, we describe possible neurophysiological relationships between MSI deficits and neurodevelopmental disorders, especially dyslexia and attention deficit disorder with/without hyperactivity.
Affiliation(s)
- Emmanuelle Dionne-Dostie: Sainte-Justine University Hospital Research Center, Montreal H3T1C5, QC, Canada; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Department of Psychology, University of Montreal, C.P. 6128, Montreal H3C3J7, QC, Canada
- Natacha Paquette: Sainte-Justine University Hospital Research Center, Montreal H3T1C5, QC, Canada; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Department of Psychology, University of Montreal, C.P. 6128, Montreal H3C3J7, QC, Canada
- Maryse Lassonde: Sainte-Justine University Hospital Research Center, Montreal H3T1C5, QC, Canada; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Department of Psychology, University of Montreal, C.P. 6128, Montreal H3C3J7, QC, Canada
- Anne Gallagher: Sainte-Justine University Hospital Research Center, Montreal H3T1C5, QC, Canada; Centre de Recherche en Neuropsychologie et Cognition (CERNEC), Department of Psychology, University of Montreal, C.P. 6128, Montreal H3C3J7, QC, Canada
23. Gogate L, Maganti M, Bahrick LE. Cross-cultural evidence for multimodal motherese: Asian Indian mothers' adaptive use of synchronous words and gestures. J Exp Child Psychol 2015; 129:110-126. [PMID: 25285369] [PMCID: PMC4252564] [DOI: 10.1016/j.jecp.2014.09.002]
Abstract
In a quasi-experimental study, 24 Asian Indian mothers were asked to teach novel (target) names for two objects and two actions to their children of three different levels of lexical mapping development: prelexical (5-8 months), early lexical (9-17 months), and advanced lexical (20-43 months). Target naming (n=1482) and non-target naming (other, n=2411) were coded for synchronous spoken words and object motion (multimodal motherese) and other naming styles. Indian mothers abundantly used multimodal motherese with target words to highlight novel word-referent relations, paralleling earlier findings from American mothers. They used it with target words more often for prelexical infants than for advanced lexical children and to name target actions later in children's development. Unlike American mothers, Indian mothers also abundantly used multimodal motherese to name target objects later in children's development. Finally, monolingual mothers who spoke a verb-dominant Indian language used multimodal motherese more often than bilingual mothers who also spoke noun-dominant English to their children. The findings suggest that within a dynamic and reciprocal mother-infant communication system, multimodal motherese adapts to unify novel words and referents across cultures. It adapts to children's level of lexical development and to ambient language-specific lexical dominance hierarchies.
Affiliation(s)
- Lakshmi Gogate: Psychology, Florida Gulf Coast University, Fort Myers, FL 33965
- Madhavilatha Maganti: University of Hyderabad, Center for Neural and Cognitive Sciences, Gachibowli, Hyderabad, Andhra Pradesh, India
- Lorraine E. Bahrick: Psychology, Florida International University, DM Building, Miami, Florida 33199
24. Brito NH, Grenell A, Barr R. Specificity of the bilingual advantage for memory: examining cued recall, generalization, and working memory in monolingual, bilingual, and trilingual toddlers. Front Psychol 2014; 5:1369. [PMID: 25520686] [PMCID: PMC4251311] [DOI: 10.3389/fpsyg.2014.01369]
Abstract
The specificity of the bilingual advantage in memory was examined by testing groups of monolingual, bilingual, and trilingual 24-month-olds on tasks tapping cued recall, memory generalization and working memory. For the cued recall and memory generalization conditions, there was a 24-h delay between time of encoding and time of retrieval. In addition to the memory tasks, parent-toddler dyads completed a picture-book reading task, in order to observe emotional responsiveness, and a parental report of productive vocabulary. Results indicated no difference between language groups on cued recall, working memory, emotional responsiveness, or productive vocabulary, but a significant difference was found in the memory generalization condition with only the bilingual group outperforming the baseline control group. These results replicate and extend results from past studies (Brito and Barr, 2012, 2014; Brito et al., 2014) and suggest a bilingual advantage specific to memory generalization.
Affiliation(s)
- Natalie H. Brito: Robert Wood Johnson Foundation Health and Society Scholars, Columbia University in the City of New York, New York, NY, USA
- Amanda Grenell: Department of Psychology, Georgetown University, Washington, DC, USA
- Rachel Barr: Department of Psychology, Georgetown University, Washington, DC, USA
25. ter Schure S, Mandell DJ, Escudero P, Raijmakers MEJ, Johnson SP. Learning Stimulus-Location Associations in 8- and 11-Month-Old Infants: Multimodal versus Unimodal Information. Infancy 2014; 19:476-495. [PMID: 25147483] [PMCID: PMC4136389] [DOI: 10.1111/infa.12057]
Abstract
Research on the influence of multimodal information on infants' learning is inconclusive. While one line of research finds that multimodal input has a negative effect on learning, another finds positive effects. The present study aims to shed some new light on this discussion by studying the influence of multimodal information and accompanying stimulus complexity on the learning process. We assessed the influence of multimodal input on the trial-by-trial learning of 8- and 11-month-old infants. Using an anticipatory eye movement paradigm, we measured how infants learn to anticipate the correct stimulus-location associations when exposed to visual-only, auditory-only (unimodal), or auditory and visual (multimodal) information. Our results show that infants in both the multimodal and visual-only conditions learned the stimulus-location associations. Although infants in the visual-only condition appeared to learn in fewer trials, infants in the multimodal condition showed better anticipating behavior: as a group, they had a higher chance of anticipating correctly on more consecutive trials than infants in the visual-only condition. These findings suggest that effects of multimodal information on infant learning operate chiefly through effects on infants' attention.
Affiliation(s)
- Paola Escudero: Cognitive Science Center Amsterdam, University of Amsterdam; MARCS Institute, University of Western Sydney
26. Gogate L, Maganti M, Perenyi A. Preterm and term infants' perception of temporally coordinated syllable–object pairings: implications for lexical development. J Speech Lang Hear Res 2014; 57:187-198. [PMID: 24023374] [DOI: 10.1044/1092-4388(2013/12-0403)]
Abstract
Purpose: This experimental study examined term infants (n = 34) and low-risk near-term preterm infants (gestational age 32–36 weeks) at 2 months chronological age (n = 34) and corrected age (n = 16). The study investigated whether the preterm infants presented with a delay in their sensitivity to synchronous syllable–object pairings when compared with term infants.
Method: First, infants were habituated to a single syllable, [tah] or [gah], spoken in synchrony with the motions of 1 of 4 toy objects: a crab, a porcupine, a star, or a lamb chop. Next, the infants received 2 syllable- and 2 object-change test trials, counterbalanced for order.
Results: After factoring out differential looking time during habituation, the study found that preterm infants showed attenuated looks to the change in the object and the change in the syllable relative to term infants.
Conclusions: These findings suggest that even near-term preterm infants present with a delay in their sensitivity to synchrony in syllable–object pairings relative to term infants. Given the important role that synchrony plays in word mapping at 6–9 months, this early delay in sensitivity to synchrony might be an indicator of the word mapping delays found in older preterm infants.
27. Bahrick LE, Lickliter R, Castellanos I. The development of face perception in infancy: intersensory interference and unimodal visual facilitation. Dev Psychol 2013; 49:1919-1930. [PMID: 23244407] [PMCID: PMC3975831] [DOI: 10.1037/a0031238]
Abstract
Although research has demonstrated impressive face perception skills of young infants, little attention has focused on conditions that enhance versus impair infant face perception. The present studies tested the prediction, generated from the intersensory redundancy hypothesis (IRH), that face discrimination, which relies on detection of visual featural information, would be impaired in the context of intersensory redundancy provided by audiovisual speech and enhanced in the absence of intersensory redundancy (unimodal visual and asynchronous audiovisual speech) in early development. Later in development, following improvements in attention, faces should be discriminated in both redundant audiovisual and nonredundant stimulation. Results supported these predictions. Two-month-old infants discriminated a novel face in unimodal visual and asynchronous audiovisual speech but not in synchronous audiovisual speech. By 3 months, face discrimination was evident even during synchronous audiovisual speech. These findings indicate that infant face perception is enhanced and emerges developmentally earlier following unimodal visual than synchronous audiovisual exposure and that intersensory redundancy generated by naturalistic audiovisual speech can interfere with face processing.
28. Kopp F, Dietrich C. Neural dynamics of audiovisual synchrony and asynchrony perception in 6-month-old infants. Front Psychol 2013; 4:2. [PMID: 23346071] [PMCID: PMC3549545] [DOI: 10.3389/fpsyg.2013.00002]
Abstract
Young infants are sensitive to multisensory temporal synchrony relations, but the neural dynamics of temporal interactions between vision and audition in infancy are not well understood. We investigated audiovisual synchrony and asynchrony perception in 6-month-old infants using event-related brain potentials (ERP). In a prior behavioral experiment (n = 45), infants were habituated to an audiovisual synchronous stimulus and tested for recovery of interest by presenting an asynchronous test stimulus in which the visual stream was delayed with respect to the auditory stream by 400 ms. Infants who behaviorally discriminated the change in temporal alignment were included in further analyses. In the EEG experiment (final sample: n = 15), synchronous and asynchronous stimuli (visual delay of 400 ms) were presented in random order. Results show latency shifts in the auditory ERP components N1 and P2 as well as the infant ERP component Nc. Latencies in the asynchronous condition were significantly longer than in the synchronous condition. After video onset but preceding the auditory onset, amplitude modulations propagating from posterior to anterior sites and related to the Pb component of infants' ERP were observed. Results suggest temporal interactions between the two modalities. Specifically, they point to the significance of anticipatory visual motion for auditory processing, and indicate young infants' predictive capacities for audiovisual temporal synchrony relations.
Collapse
Affiliation(s)
- Franziska Kopp
- Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany
| | - Claudia Dietrich
- Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany
| |
Collapse
|
29
|
|
30
|
Lickliter R, Bahrick LE. The concept of homology as a basis for evaluating developmental mechanisms: exploring selective attention across the life-span. Dev Psychobiol 2013; 55:76-83. [PMID: 22711341 PMCID: PMC3962041 DOI: 10.1002/dev.21037] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/07/2011] [Accepted: 03/28/2012] [Indexed: 11/11/2022]
Abstract
Research with human infants as well as non-human animal embryos and infants has consistently demonstrated the benefits of intersensory redundancy for perceptual learning and memory for redundantly specified information during early development. Studies of infant affect discrimination, face discrimination, numerical discrimination, sequence detection, abstract rule learning, and word comprehension and segmentation have all shown that intersensory redundancy promotes earlier detection of these properties when compared to unimodal exposure to the same properties. Here we explore the idea that such intersensory facilitation is evident across the life-span and that this continuity is an example of a developmental behavioral homology. We present evidence that intersensory facilitation is most apparent during early phases of learning for a variety of tasks, regardless of developmental level, including domains that are novel or tasks that require discrimination of fine detail or speeded responses. Under these conditions, infants, children, and adults all show intersensory facilitation, suggesting a developmental homology. We discuss the challenge and propose strategies for establishing appropriate guidelines for identifying developmental behavioral homologies. We conclude that evaluating the extent to which continuities observed across development are homologous can contribute to a better understanding of the processes of development.
Collapse
Affiliation(s)
- Robert Lickliter
- Department of Psychology, Florida International University, Miami, FL, USA.
| | | |
Collapse
|
31
|
Kirkham NZ, Richardson DC, Wu R, Johnson SP. The importance of “what”: Infants use featural information to index events. J Exp Child Psychol 2012; 113:430-9. [DOI: 10.1016/j.jecp.2012.07.001] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/20/2011] [Revised: 06/27/2012] [Accepted: 07/02/2012] [Indexed: 11/26/2022]
|
32
|
Bremner JG, Slater AM, Johnson SP, Mason UC, Spring J. The effects of auditory information on 4-month-old infants' perception of trajectory continuity. Child Dev 2012; 83:954-64. [PMID: 22364395 PMCID: PMC3342422 DOI: 10.1111/j.1467-8624.2012.01739.x] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
Young infants perceive an object's trajectory as continuous across occlusion provided the temporal or spatial gap in perception is small. In 3 experiments involving 72 participants the authors investigated the effects of different forms of auditory information on 4-month-olds' perception of trajectory continuity. Provision of dynamic auditory information about the object's trajectory enhanced perception of trajectory continuity. However, a smaller positive effect was also obtained when the sound was continuous but provided no information about the object's location. Finally, providing discontinuous auditory information or auditory information that was dislocated relative to vision had negative effects on trajectory perception. These results are discussed relative to the intersensory redundancy hypothesis and emphasize the need to take an intersensory approach to infant perception.
Collapse
Affiliation(s)
- J Gavin Bremner
- Psychology Department, Centre for Research in Human Development and Learning, Lancaster University, Lancaster, UK.
| | | | | | | | | |
Collapse
|
34
|
Abstract
This article is an attempt to synthesize current knowledge about synaesthesia from many fields, such as literature, the arts, multimedia, medicine, and psychology. The main goal of this paper is to classify various types and forms of synaesthesia. Besides developmental synaesthesia (constitutional or neonatal synaesthesia), which is likely to play a crucial role in the development of cognitive functions, there are types of synaesthesia acquired during adulthood (e.g., phantom or artificial synaesthesia), momentary synaesthesia triggered temporarily in people who do not otherwise show signs of synaesthesia (e.g., virtual, narcotic, or posthypnotic synaesthesia), and associational synaesthesia, which, semantically speaking, refers to some universal sense relations (e.g., literary, artistic, and multimedia synaesthesia). One hypothesis holds that each kind of synaesthesia serves a different function, either compensatory or integrative. It has also been suggested that synaesthesia can be described along a single dimension reflecting the intensity of the phenomenon. The stronger types of synaesthesia are semantic, conceptual, intermodal, synthetic, comprehensive, external, and bidirectional; the weaker types are sensory, perceptual, intramodal, analytic, partial, internal, and unidirectional. There are large individual differences in how synaesthesia presents itself. Including a classification of the kinds, types, and forms of synaesthesia in future experimental research will ensure a better understanding of the nature of this phenomenon, its mechanisms, and the role it plays in developing cognitive processes.
Collapse
|
35
|
|
36
|
Flom R, Johnson S. The effects of adults' affective expression and direction of visual gaze on 12-month-olds' visual preferences for an object following a 5-minute, 1-day, or 1-month delay. British Journal of Developmental Psychology 2011; 29:64-85. [DOI: 10.1348/026151010x512088] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
|
37
|
Hyde DC, Jones BL, Flom R, Porter CL. Neural signatures of face-voice synchrony in 5-month-old human infants. Dev Psychobiol 2011; 53:359-70. [DOI: 10.1002/dev.20525] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2010] [Accepted: 12/07/2010] [Indexed: 11/10/2022]
|
38
|
Bahrick LE, Lickliter R, Castellanos I, Vaillant-Molina M. Increasing task difficulty enhances effects of intersensory redundancy: testing a new prediction of the Intersensory Redundancy Hypothesis. Dev Sci 2010; 13:731-7. [PMID: 20712739 PMCID: PMC2931424 DOI: 10.1111/j.1467-7687.2009.00928.x] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
Prior research has demonstrated intersensory facilitation for perception of amodal properties of events such as tempo and rhythm in early development, supporting predictions of the Intersensory Redundancy Hypothesis (IRH). Specifically, infants discriminate amodal properties in bimodal, redundant stimulation but not in unimodal, nonredundant stimulation in early development, whereas later in development infants can detect amodal properties in both redundant and nonredundant stimulation. The present study tested a new prediction of the IRH: that effects of intersensory redundancy on attention and perceptual processing are most apparent in tasks of high difficulty relative to the skills of the perceiver. We assessed whether by increasing task difficulty, older infants would revert to patterns of intersensory facilitation shown by younger infants. Results confirmed our prediction and demonstrated that in difficult tempo discrimination tasks, 5-month-olds perform like 3-month-olds, showing intersensory facilitation for tempo discrimination. In contrast, in tasks of low and moderate difficulty, 5-month-olds discriminate tempo changes in both redundant audiovisual and nonredundant unimodal visual stimulation. These findings indicate that intersensory facilitation is most apparent for tasks of relatively high difficulty and may therefore persist across the lifespan.
Collapse
Affiliation(s)
- Lorraine E Bahrick
- Department of Psychology, Florida International University, Miami, FL 33199, USA
| | | | | | | |
Collapse
|
39
|
Wilcox T, Smith TR. The development of infants' use of property-poor sounds to individuate objects. Infant Behav Dev 2010; 33:596-604. [PMID: 20701977 DOI: 10.1016/j.infbeh.2010.07.011] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2009] [Revised: 06/29/2010] [Accepted: 07/16/2010] [Indexed: 10/19/2022]
Abstract
There is evidence that infants as young as 4.5 months use property-rich but not property-poor sounds as the basis for individuating objects (Wilcox, Woods, Tuggy, & Napoli, 2006). The current research sought to identify the age at which infants demonstrate the capacity to use property-poor sounds. Infants aged 7 and 9 months were tested using the task of Wilcox et al. The results revealed that 9- but not 7-month-olds demonstrated sensitivity to property-poor sounds (electronic tones) in an object individuation task. Additional results confirmed that the younger infants were sensitive to property-rich sounds (rattle sounds). These are the first positive results obtained with property-poor sounds in infants, and they lay the foundation for future research to identify the underlying basis for the developmental hierarchy favoring property-rich over property-poor sounds and possible mechanisms for change.
Collapse
Affiliation(s)
- Teresa Wilcox
- Department of Psychology, Texas A&M University, College Station, TX 77843, USA.
| | | |
Collapse
|
40
|
Mitchel AD, Weiss DJ. What's in a face? Visual contributions to speech segmentation. Language and Cognitive Processes 2010. [DOI: 10.1080/01690960903209888] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
|
41
|
Jaime M, Bahrick L, Lickliter R. The Critical Role of Temporal Synchrony in the Salience of Intersensory Redundancy During Prenatal Development. Infancy 2010; 15:61-82. [PMID: 21479154 PMCID: PMC3071537 DOI: 10.1111/j.1532-7078.2009.00008.x] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
Abstract
We explored the amount and timing of temporal synchrony necessary to facilitate prenatal perceptual learning using an animal model, the bobwhite quail. Quail embryos were exposed to various audiovisual combinations of a bobwhite maternal call paired with patterned light during the late stages of prenatal development and were tested postnatally for evidence of prenatal auditory learning of the familiarized call. Results revealed that a maternal call paired with a single pulse of light synchronized with one note of the five-note call was sufficient to facilitate embryos' prenatal perceptual learning of the entire call. A synchronous note occurring at the onset of the call burst was most effective at facilitating learning. These findings highlight quail embryos' remarkable sensitivity to temporal synchrony and indicate its role in promoting learning of redundantly specified stimulus properties during prenatal development.
Collapse
Affiliation(s)
- Mark Jaime
- Department of Psychology, University of Miami
| | | | | |
Collapse
|
42
|
Barutchu A, Danaher J, Crewther SG, Innes-Brown H, Shivdasani MN, Paolini AG. Audiovisual integration in noise by children and adults. J Exp Child Psychol 2010; 105:38-50. [DOI: 10.1016/j.jecp.2009.08.005] [Citation(s) in RCA: 68] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/23/2008] [Revised: 08/31/2009] [Accepted: 08/31/2009] [Indexed: 11/28/2022]
|
43
|
Scofield J, Hernandez-Reif M, Keith AB. Preschool Children's Multimodal Word Learning. Journal of Cognition and Development 2009. [DOI: 10.1080/15248370903417662] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
|
44
|
Adachi I, Kuwahata H, Fujita K, Tomonaga M, Matsuzawa T. Plasticity of ability to form cross-modal representations in infant Japanese macaques. Dev Sci 2009; 12:446-52. [PMID: 19371369 DOI: 10.1111/j.1467-7687.2008.00780.x] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
In a previous study, Adachi, Kuwahata, Fujita, Tomonaga & Matsuzawa demonstrated that infant Japanese macaques (Macaca fuscata) form cross-modal representations of conspecifics but not of humans. However, because the subjects in the experiment were raised in a large social group and had considerably less exposure to humans than to conspecifics, it was an open question whether their lack of cross-modal representation of humans simply reflected their lower levels of exposure to humans or was caused by some innate restrictions on the ability. To answer the question, we used the same procedure but tested infant Japanese macaques with more extensive experience of humans in daily life. Briefly, we presented monkeys with a photograph of either a monkey or a human face on an LCD monitor after playing a vocalization of one of these two species. The subjects looked at the monitor longer when a voice and a face were mismatched than when they were matched, irrespective of whether the preceding vocalization was a monkey's or a human's. This suggests that once monkeys have extensive experience with humans, they will form a cross-modal representation of humans as well as of conspecifics.
Collapse
Affiliation(s)
- Ikuma Adachi
- Department of Psychology, Graduate School of Letters, Kyoto University, Inuyama-city, Aichi 484-8506, Japan.
| | | | | | | | | |
Collapse
|
45
|
|
46
|
Flom R, Gentile DA, Pick AD. Infants’ discrimination of happy and sad music. Infant Behav Dev 2008; 31:716-28. [DOI: 10.1016/j.infbeh.2008.04.004] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/19/2007] [Revised: 04/09/2008] [Accepted: 04/14/2008] [Indexed: 10/22/2022]
|
47
|
Jones EJH, Herbert JS. The Effect of Learning Experiences and Context on Infant Imitation and Generalization. Infancy 2008. [DOI: 10.1080/15250000802458773] [Citation(s) in RCA: 15] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
|
48
|
Guihou A, Vauclair J. Intermodal matching of vision and audition in infancy: A proposal for a new taxonomy. European Journal of Developmental Psychology 2008. [DOI: 10.1080/17405620600760409] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
|
49
|
Schmuckler MA, Jewell DT. Infants' visual-proprioceptive intermodal perception with imperfect contingency information. Dev Psychobiol 2007; 49:387-98. [PMID: 17455236 DOI: 10.1002/dev.20214] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
Two experiments explored 5-month-old infants' recognition of self-movement in the context of imperfect contingencies between felt and seen movement. Previous work has shown that infants can discriminate a display of another child's movements from an on-line video display of their own movements, even when featural information is removed. These earlier findings were extended by demonstrating self versus other discrimination when the visual information for movement was an unrelated object (a fluorescent mobile) directly attached to the child's leg, thus producing imperfect spatial and temporal contingency information. In contrast, intermodal recognition failed when the mobile was indirectly attached to infants' legs, thus eliminating spatial contingencies altogether and further weakening temporal contingencies. Together, these studies reveal that even imperfect contingency information can drive intermodal perception, given appropriate levels of spatial and temporal contingency information.
Collapse
Affiliation(s)
- Mark A Schmuckler
- Department of Psychology, University of Toronto at Scarborough, Scarborough, Ontario, Canada M1C 1A4.
| | | |
Collapse
|
50
|
Taga G, Asakawa K. Selectivity and localization of cortical response to auditory and visual stimulation in awake infants aged 2 to 4 months. Neuroimage 2007; 36:1246-52. [PMID: 17524672 DOI: 10.1016/j.neuroimage.2007.04.037] [Citation(s) in RCA: 55] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2006] [Revised: 03/23/2007] [Accepted: 04/02/2007] [Indexed: 11/20/2022] Open
Abstract
To better understand the development of multimodal perception, we examined the selectivity and localization of cortical responses to auditory and visual stimuli in young infants. Near-infrared optical topography with 24 channels was used to measure event-related cerebral oxygenation changes in the bilateral temporal cortex of 15 infants aged 2 to 4 months while they were exposed to speech sounds lasting 3 s and checkerboard pattern reversals lasting 3 s, presented asynchronously with different alternating intervals. Group analysis revealed focal increases in oxy-hemoglobin and decreases in deoxy-hemoglobin in both hemispheres in response to auditory, but not to visual, stimulation. These results indicate that localized areas of the primary auditory cortex and the auditory association cortex are involved in auditory perception in infants as young as 2 months of age. In contrast to the hypothesis that perception in distinct sensory modalities may not be well separated in young infants because of cross talk across the immature cortex, the present study suggests that unrelated visual events do not influence the auditory perception of awake infants.
Collapse
Affiliation(s)
- Gentaro Taga
- Graduate School of Education, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033, Japan.
| | | |
Collapse
|