1
Sulfaro AA, Robinson AK, Carlson TA. Properties of imagined experience across visual, auditory, and other sensory modalities. Conscious Cogn 2024; 117:103598. [PMID: 38086154] [DOI: 10.1016/j.concog.2023.103598]
Abstract
Little is known about the perceptual characteristics of mental images, or how they vary across sensory modalities. We conducted an exhaustive survey into how mental images are experienced across modalities, mainly targeting visual and auditory imagery of a single stimulus, the letter "O", to facilitate direct comparisons. We investigated temporal properties of mental images (e.g. onset latency, duration), spatial properties (e.g. apparent location), effort (e.g. ease, spontaneity, control), movement requirements (e.g. eye movements), real-imagined interactions (e.g. inner speech while reading), beliefs about imagery norms and terminologies, as well as respondent confidence. Participants also reported on the five traditional senses and their prominence during thinking, imagining, and dreaming. Overall, visual and auditory experiences dominated mental events, although auditory mental images were superior to visual mental images on almost every metric tested except regarding spatial properties. Our findings suggest that modality-specific differences in mental imagery may parallel those of other sensory neural processes.
Affiliation(s)
- Alexander A Sulfaro
- School of Psychology, Griffith Taylor Building, The University of Sydney, Camperdown 2006, New South Wales, Australia.
- Amanda K Robinson
- School of Psychology, Griffith Taylor Building, The University of Sydney, Camperdown 2006, New South Wales, Australia; Queensland Brain Institute, The University of Queensland, St Lucia 4072, Queensland, Australia.
- Thomas A Carlson
- School of Psychology, Griffith Taylor Building, The University of Sydney, Camperdown 2006, New South Wales, Australia.
2
Körner A, Strack F. Articulation posture influences pitch during singing imagery. Psychon Bull Rev 2023; 30:2187-2195. [PMID: 37221280] [PMCID: PMC10728233] [DOI: 10.3758/s13423-023-02306-1]
Abstract
Facial muscle activity contributes to singing and to articulation: in articulation, mouth shape can alter vowel identity; and in singing, facial movement correlates with pitch changes. Here, we examine whether mouth posture causally influences pitch during singing imagery. Based on perception-action theories and embodied cognition theories, we predict that mouth posture influences pitch judgments even when no overt utterances are produced. In two experiments (total N = 160), mouth posture was manipulated to resemble the articulation of either /i/ (as in English meet; retracted lips) or /o/ (as in French rose; protruded lips). Holding this mouth posture, participants were instructed to mentally "sing" given songs (which were all positive in valence) while listening with their inner ear and, afterwards, to assess the pitch of their mental chant. As predicted, compared to the o-posture, the i-posture led to higher pitch in mental singing. Thus, bodily states can shape experiential qualities, such as pitch, during imagery. This extends embodied music cognition and demonstrates a new link between language and music.
Affiliation(s)
- Anita Körner
- Department of Psychology, University of Kassel, Holländische Straße 36-38, 34127 Kassel, Germany.
- Fritz Strack
- Department of Psychology, University of Würzburg, Würzburg, Germany.
3
Sulfaro AA, Robinson AK, Carlson TA. Modelling perception as a hierarchical competition differentiates imagined, veridical, and hallucinated percepts. Neurosci Conscious 2023; 2023:niad018. [PMID: 37621984] [PMCID: PMC10445666] [DOI: 10.1093/nc/niad018]
Abstract
Mental imagery is a process by which thoughts become experienced with sensory characteristics. Yet, it is not clear why mental images appear diminished compared to veridical images, nor how mental images are phenomenologically distinct from hallucinations, another type of non-veridical sensory experience. Current evidence suggests that imagination and veridical perception share neural resources. If so, we argue that considering how neural representations of externally generated stimuli (i.e. sensory input) and internally generated stimuli (i.e. thoughts) might interfere with one another can sufficiently differentiate between veridical, imaginary, and hallucinatory perception. We here use a simple computational model of a serially connected, hierarchical network with bidirectional information flow to emulate the primate visual system. We show that modelling even first approximations of neural competition can more coherently explain imagery phenomenology than non-competitive models. Our simulations predict that, without competing sensory input, imagined stimuli should ubiquitously dominate hierarchical representations. However, with competition, imagination should dominate high-level representations but largely fail to outcompete sensory inputs at lower processing levels. To interpret our findings, we assume that low-level stimulus information (e.g. in early visual cortices) contributes most to the sensory aspects of perceptual experience, while high-level stimulus information (e.g. towards temporal regions) contributes most to its abstract aspects. Our findings therefore suggest that ongoing bottom-up inputs during waking life may prevent imagination from overriding veridical sensory experience. In contrast, internally generated stimuli may be hallucinated when sensory input is dampened or eradicated. Our approach can explain individual differences in imagery, along with aspects of daydreaming, hallucinations, and non-visual mental imagery.
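The abstract describes the model only at a high level, and the authors' actual implementation is not reproduced here. As a rough, hypothetical sketch of the competition principle it describes (a serial hierarchy in which bottom-up sensory evidence and a top-down imagined stimulus compete for representation at every level), one might write something like the following. All names and parameter values (`simulate`, `gain`, the divisive-normalization step) are illustrative assumptions, not the paper's code:

```python
import numpy as np

def simulate(levels=5, sensory=1.0, imagined=1.0, gain=0.8, steps=200):
    """Toy hierarchical competition between an external and an imagined stimulus.

    ext[k] and img[k] hold the relative strength of the external vs. imagined
    stimulus representation at hierarchy level k (0 = lowest / sensory end,
    levels-1 = highest / abstract end).
    """
    ext = np.zeros(levels)
    img = np.zeros(levels)
    for _ in range(steps):
        for k in range(levels):
            # Bottom-up drive: raw sensory input at the lowest level,
            # otherwise an attenuated copy of the level below.
            bottom_up = sensory if k == 0 else gain * ext[k - 1]
            # Top-down drive: the imagined stimulus enters at the top,
            # otherwise an attenuated copy of the level above.
            top_down = imagined if k == levels - 1 else gain * img[k + 1]
            # Divisive normalization: the two stimuli compete for a fixed
            # amount of representational activity at each level.
            total = bottom_up + top_down + 1e-9
            ext[k] = bottom_up / total
            img[k] = top_down / total
    return ext, img
```

With both inputs present, this sketch leaves the external stimulus dominant at the lowest level while the imagined stimulus dominates the highest level; removing the sensory input (`sensory=0.0`) lets the imagined stimulus dominate every level, mirroring the abstract's two predictions about waking imagery versus hallucination.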
Affiliation(s)
- Alexander A Sulfaro
- School of Psychology, Griffith Taylor Building, The University of Sydney, Camperdown, NSW 2006, Australia
- Amanda K Robinson
- School of Psychology, Griffith Taylor Building, The University of Sydney, Camperdown, NSW 2006, Australia
- Queensland Brain Institute, QBI Building 79, The University of Queensland, St Lucia, QLD 4072, Australia
- Thomas A Carlson
- School of Psychology, Griffith Taylor Building, The University of Sydney, Camperdown, NSW 2006, Australia
4
May L, Halpern AR, Paulsen SD, Casey MA. Imagined Musical Scale Relationships Decoded from Auditory Cortex. J Cogn Neurosci 2022; 34:1326-1339. [PMID: 35554552] [DOI: 10.1162/jocn_a_01858]
Abstract
Notes in a musical scale convey different levels of stability or incompleteness, forming what is known as a tonal hierarchy. Levels of stability conveyed by these scale degrees are partly responsible for generating expectations as a melody proceeds, for emotions deriving from fulfillment (or not) of those expectations, and for judgments of overall melodic well-formedness. These functions can be extracted even during imagined music. We investigated whether patterns of neural activity in fMRI could be used to identify heard and imagined notes, and if patterns associated with heard notes could identify notes that were merely imagined. We presented trained musicians with the beginning of a scale (key and timbre were varied). The next note in the scale was either heard or imagined. A probe tone task assessed sensitivity to the tonal hierarchy, and state and trait measures of imagery were included as predictors. Multivoxel classification yielded above-chance results in primary auditory cortex (Heschl's gyrus) for heard scale-degree decoding. Imagined scale-degree decoding was successful in multiple cortical regions spanning bilateral superior temporal, inferior parietal, precentral, and inferior frontal areas. The right superior temporal gyrus yielded successful cross-decoding of heard-to-imagined scale-degree, indicating a shared pathway between tonal-hierarchy perception and imagery. Decoding in right and left superior temporal gyrus and right inferior frontal gyrus was more successful in people with more differentiated tonal hierarchies, and in left inferior frontal gyrus among people with higher self-reported auditory imagery vividness, providing a link between behavioral traits and success of neural decoding. These results point to the neural specificity of imagined auditory experiences, even of such functional knowledge, but also document informative individual differences in the precision of that neural response.
5
Copelli F, Rovetti J, Ammirante P, Russo FA. Human mirror neuron system responsivity to unimodal and multimodal presentations of action. Exp Brain Res 2021; 240:537-548. [PMID: 34817643] [DOI: 10.1007/s00221-021-06266-7]
Abstract
This study aims to clarify unresolved questions from two earlier studies by McGarry et al. (Exp Brain Res 218:527-538, 2012) and Kaplan and Iacoboni (Cogn Process 8:103-113, 2007) on human mirror neuron system (hMNS) responsivity to multimodal presentations of actions. These questions are: (1) whether the two frontal areas originally identified by Kaplan and Iacoboni (ventral premotor cortex [vPMC] and inferior frontal gyrus [IFG]) are both part of the hMNS (i.e., do they respond to execution as well as observation), (2) whether both areas yield effects of biologicalness (biological, control) and modality (audio, visual, audiovisual), and (3) whether the vPMC is preferentially responsive to multimodal input. To resolve these questions about the hMNS, we replicated and extended McGarry et al.'s electroencephalography (EEG) study, while incorporating advanced source localization methods. Participants were asked to execute movements (ripping paper) as well as observe those movements across the same three modalities (audio, visual, and audiovisual), all while 64-channel EEG data were recorded. Two frontal sources consistent with those identified in prior studies showed mu event-related desynchronization (mu-ERD) under execution and observation conditions. These sources also showed a greater response to biological movement than to control stimuli as well as a distinct visual advantage, with greater responsivity to visual and audiovisual compared to audio conditions. Exploratory analyses of mu-ERD in the vPMC under visual and audiovisual observation conditions suggest that the hMNS tracks the magnitude of visual movement over time.
Affiliation(s)
- Fran Copelli
- Department of Psychology, Ryerson University, Toronto, ON, Canada
- Joseph Rovetti
- Department of Psychology, Ryerson University, Toronto, ON, Canada
- Paolo Ammirante
- Department of Psychology, Ryerson University, Toronto, ON, Canada
- Frank A Russo
- Department of Psychology, Ryerson University, Toronto, ON, Canada
6
What do less accurate singers remember? Pitch-matching ability and long-term memory for music. Atten Percept Psychophys 2021; 84:260-269. [PMID: 34796466] [DOI: 10.3758/s13414-021-02391-1]
Abstract
We have only a partial understanding of how people remember nonverbal information such as melodies. Although once learned, melodies can be retained well over long periods of time, remembering newly presented melodies is on average quite difficult. People vary considerably, however, in their level of success in both memory situations. Here, we examine a skill we anticipated would be correlated with memory for melodies: the ability to accurately reproduce pitches. Such a correlation would constitute evidence that melodic memory involves at least covert sensorimotor codes. Experiment 1 looked at episodic memory for new melodies among nonmusicians, both overall and with respect to the Vocal Memory Advantage (VMA): the superiority in remembering melodies presented as sung on a syllable compared to rendered on an instrument. Although we replicated the VMA, our prediction that better pitch matchers would have a larger VMA was not supported, although there was a modest correlation with memory for melodies presented in a piano timbre. Experiment 2 examined long-term memory for the starting pitch of familiar recorded music. Participants selected the starting note of familiar songs on a keyboard, without singing. Nevertheless, we found that better pitch-matchers were more accurate in reproducing the correct starting note. We conclude that sensorimotor coding may be used in storing and retrieving exact melodic information, but is not so useful during early encounters with melodies, as initial coding seems to involve more derived properties such as pitch contour and tonality.
7
Marion G, Di Liberto GM, Shamma SA. The Music of Silence. Part I: Responses to Musical Imagery Encode Melodic Expectations and Acoustics. J Neurosci 2021; 41:7435-7448. [PMID: 34341155] [PMCID: PMC8412990] [DOI: 10.1523/jneurosci.0183-21.2021]
Abstract
Musical imagery is the voluntary internal hearing of music in the mind without the need for physical action or external stimulation. Numerous studies have already revealed brain areas activated during imagery. However, it remains unclear to what extent imagined music responses preserve the detailed temporal dynamics of the acoustic stimulus envelope and, crucially, whether melodic expectations play any role in modulating responses to imagined music, as they prominently do during listening. These modulations are important as they reflect aspects of the human musical experience, such as its acquisition, engagement, and enjoyment. This study explored the nature of these modulations in imagined music based on EEG recordings from 21 professional musicians (6 females and 15 males). Regression analyses were conducted to demonstrate that imagined neural signals can be predicted accurately, similarly to the listening task, and were sufficiently robust to allow for accurate identification of the imagined musical piece from the EEG. In doing so, our results indicate that imagery and listening tasks elicited an overlapping but distinctive topography of neural responses to sound acoustics, which is in line with previous fMRI literature. Melodic expectation, however, evoked very similar frontal spatial activation in both conditions, suggesting that they are supported by the same underlying mechanisms. Finally, neural responses induced by imagery exhibited a specific transformation from the listening condition, which primarily included a relative delay and a polarity inversion of the response. This transformation demonstrates the top-down predictive nature of the expectation mechanisms arising during both listening and imagery.
SIGNIFICANCE STATEMENT: It is well known that the human brain is activated during musical imagery: the act of voluntarily hearing music in our mind without external stimulation. It is unclear, however, what the temporal dynamics of this activation are, as well as what musical features are precisely encoded in the neural signals. This study uses an experimental paradigm with high temporal precision to record and analyze the cortical activity during musical imagery. This study reveals that neural signals encode music acoustics and melodic expectations during both listening and imagery. Crucially, it is also found that a simple mapping based on a time-shift and a polarity inversion could robustly describe the relationship between listening and imagery signals.
Affiliation(s)
- Guilhem Marion
- Laboratoire des Systèmes Perceptifs, Département d'Étude Cognitive, École Normale Supérieure, PSL, 75005 Paris, France
- Giovanni M Di Liberto
- Laboratoire des Systèmes Perceptifs, Département d'Étude Cognitive, École Normale Supérieure, PSL, 75005 Paris, France
- Trinity Centre for Biomedical Engineering, Trinity College Institute of Neuroscience, Department of Mechanical, Manufacturing and Biomedical Engineering, Trinity College, University of Dublin, D02 PN40, Dublin 2, Ireland
- School of Electrical and Electronic Engineering and UCD Centre for Biomedical Engineering, University College Dublin, D04 V1W8, Dublin 4, Ireland
- Shihab A Shamma
- Laboratoire des Systèmes Perceptifs, Département d'Étude Cognitive, École Normale Supérieure, PSL, 75005 Paris, France
- Institute for Systems Research, Electrical and Computer Engineering, University of Maryland, College Park, MD 20742, USA
8
Liikkanen LA, Jakubowski K. Involuntary musical imagery as a component of ordinary music cognition: A review of empirical evidence. Psychon Bull Rev 2020; 27:1195-1217. [PMID: 32583211] [PMCID: PMC7704448] [DOI: 10.3758/s13423-020-01750-7]
Abstract
Involuntary musical imagery (INMI) refers to a conscious mental experience of music that occurs without deliberate efforts to initiate or sustain it. This experience often consists of the repetition of a short fragment of a melody, colloquially called an "earworm." Here, we present the first comprehensive, qualitative review of published empirical research on INMI to date. We performed an extensive literature search and discovered, in total, 47 studies from 33 peer-reviewed articles that met the inclusion criteria for the review. In analyzing the content of these studies, we identified four major research themes, which concern the phenomenology, dynamics, individual differences, and musical features of INMI. The findings answer many questions of scientific interest: for instance, what is typical in terms of INMI frequency, duration, and content; which factors influence INMI onset; and whether demographic and personality factors can explain individual differences in susceptibility and responses to INMI. This review showcases INMI as a well-established phenomenon in light of a substantial body of empirical studies that have accumulated consistent results. Although the populations under study show an unfavorable bias towards Western, educated participants, the evidence depicts INMI as a universal psychological phenomenon, the possible function of which we do not yet fully understand. The concluding section introduces several suggestions for future research to expand on the topic.
Affiliation(s)
- Lassi A Liikkanen
- Department of Digital Humanities, University of Helsinki, Helsinki, Finland
9
Costantino A, Di Stefano N, Taffoni F, Di Pino G, Casale M, Keller F. Embodying melody through a conducting baton: a pilot comparison between musicians and non-musicians. Exp Brain Res 2020; 238:2279-2291. [PMID: 32725358] [DOI: 10.1007/s00221-020-05890-z]
Abstract
Finger-tapping tasks have been widely adopted to investigate auditory-motor synchronization, i.e., the coupling of movement with an external auditory rhythm. However, the discrete nature of these movements usually limits their application to the study of beat perception in the context of isochronous rhythms. The purpose of the present pilot study was to test an innovative task that allows investigating bodily responses to complex, non-isochronous rhythms. A conductor's baton was provided to 16 healthy subjects, divided into 2 different groups depending on the years of musical training they had received (musicians or non-musicians). Ad hoc-created melodies, including notes of different durations, were played to the subjects. Each subject was asked to move the baton up and down according to the changes in pitch contour. Software for video analysis and modelling (Tracker®) was used to track the movement of the baton tip. The main parameters used for the analysis were the velocity peaks in the vertical axis. In the musician group, the number of velocity peaks exactly matched the number of notes, while in the non-musician group, the number of velocity peaks exceeded the number of notes. An exploratory data analysis using Poincaré plots suggested a greater degree of coupling between hand-arm movements and melody in musicians both with isochronous and non-isochronous rhythms. The calculated root mean square error (RMSE) between the note onset times and the velocity peaks, and the analysis of the distribution of velocity peaks in relationship to note onset times confirmed the effect of musical training. Notwithstanding the small number of participants, these results suggest that this novel behavioural task could be used to investigate auditory-motor coupling in the context of music in an ecologically valid setting. Furthermore, the task may be used for rhythm training and rehabilitation in neurological patients with movement disorders.
Affiliation(s)
- Andrea Costantino
- Integrated Sleep Surgery Team UCBM, Unit of Otolaryngology - Integrated Therapies in Otolaryngology, Campus Bio-Medico University, Rome, Italy
- Nicola Di Stefano
- Department of Philosophy and Cultural Heritage, Ca' Foscari University of Venice, Venice, Italy
- FAST, Institute of Philosophy of Scientific and Technological Practice, Campus Bio-Medico University, Rome, Italy
- Fabrizio Taffoni
- Advanced Robotics and Human-Centred Technologies - CREO Lab, Campus Bio-Medico University, Rome, Italy
- Giovanni Di Pino
- Research Unit of Neurophysiology and Neuroengineering of Human-Technology Interaction (NeXTlab), Campus Bio-Medico University, Rome, Italy
- Manuele Casale
- Integrated Sleep Surgery Team UCBM, Unit of Otolaryngology - Integrated Therapies in Otolaryngology, Campus Bio-Medico University, Rome, Italy
- Flavio Keller
- FAST, Institute of Philosophy of Scientific and Technological Practice, Campus Bio-Medico University, Rome, Italy
- Laboratory of Developmental Neuroscience and Neural Plasticity, Campus Bio-Medico University, Rome, Italy
10
Vocal-motor interference eliminates the memory advantage for vocal melodies. Brain Cogn 2020; 145:105622. [PMID: 32949847] [DOI: 10.1016/j.bandc.2020.105622]
Abstract
Spontaneous motor cortical activity during passive perception of action has been interpreted as a sensorimotor simulation of the observed action. There is currently interest in how sensorimotor simulation can support higher-order cognitive functions, such as memory, but this is relatively unexplored in the auditory domain. In the present study, we examined whether the established memory advantage for vocal melodies over non-vocal melodies is attributable to stronger sensorimotor simulation during perception of vocal relative to non-vocal action. Participants listened to 24 unfamiliar folk melodies presented in vocal or piano timbres. These were encoded during three interference conditions: whispering (vocal-motor interference), tapping (non-vocal motor interference), and no-interference. Afterwards, participants heard the original 24 melodies presented among 24 foils and judged whether melodies were old or new. A vocal-memory advantage was found in the no-interference and tapping conditions; however, the advantage was eliminated in the whispering condition. This suggests that sensorimotor simulation during the perception of vocal melodies is responsible for the observed vocal-memory advantage.
11
Nalborczyk L, Grandchamp R, Koster EHW, Perrone-Bertolotti M, Lœvenbruck H. Can we decode phonetic features in inner speech using surface electromyography? PLoS One 2020; 15:e0233282. [PMID: 32459800] [PMCID: PMC7252628] [DOI: 10.1371/journal.pone.0233282]
Abstract
Although the question has a long history of scrutiny in experimental psychology, it is still controversial whether wilful inner speech (covert speech) production is accompanied by specific activity in speech muscles. We present the results of a preregistered experiment looking at the electromyographic correlates of both overt speech and inner speech production of two phonetic classes of nonwords. An automatic classification approach was undertaken to discriminate between two articulatory features contained in nonwords uttered in both overt and covert speech. Although this approach led to reasonable accuracy rates during overt speech production, it failed to discriminate inner speech phonetic content based on surface electromyography signals. However, exploratory analyses conducted at the individual level revealed that it seemed possible to distinguish between rounded and spread nonwords covertly produced, in two participants. We discuss these results in relation to the existing literature and suggest alternative ways of testing the engagement of the speech motor system during wilful inner speech production.
Affiliation(s)
- Ladislas Nalborczyk
- Univ. Grenoble Alpes, CNRS, LPNC, Grenoble, France
- Department of Experimental Clinical and Health Psychology, Ghent University, Ghent, Belgium
- Ernst H. W. Koster
- Department of Experimental Clinical and Health Psychology, Ghent University, Ghent, Belgium
12
Abstract
Individuals with autism spectrum disorder (ASD) reportedly possess preserved or superior music-processing skills compared to their typically developing counterparts. We examined auditory imagery and earworms (tunes that get "stuck" in the head) in adults with ASD and controls. Both groups completed a short earworm questionnaire together with the Bucknell Auditory Imagery Scale. Results showed poorer auditory imagery in the ASD group for all types of auditory imagery. However, the ASD group did not report fewer earworms than matched controls. These data suggest that poor auditory imagery may underlie poor prosody in ASD, but they also highlight a separability between auditory imagery and control of musical memories. This separability is present in the ASD group but not in typically developing individuals.
Affiliation(s)
- Alex Bacon
- School of Psychology and Clinical Language Sciences, University of Reading, Earley Gate, Whiteknights, Reading, RG6 6AL, UK
- C Philip Beaman
- School of Psychology and Clinical Language Sciences, University of Reading, Earley Gate, Whiteknights, Reading, RG6 6AL, UK
- Fang Liu
- School of Psychology and Clinical Language Sciences, University of Reading, Earley Gate, Whiteknights, Reading, RG6 6AL, UK