1
Körner A, Strack F. Articulation posture influences pitch during singing imagery. Psychon Bull Rev 2023; 30:2187-2195. [PMID: 37221280 PMCID: PMC10728233 DOI: 10.3758/s13423-023-02306-1] [Accepted: 05/03/2023] [Indexed: 05/25/2023]
Abstract
Facial muscle activity contributes to singing and to articulation: in articulation, mouth shape can alter vowel identity; and in singing, facial movement correlates with pitch changes. Here, we examine whether mouth posture causally influences pitch during singing imagery. Based on perception-action and embodied cognition theories, we predict that mouth posture influences pitch judgments even when no overt utterances are produced. In two experiments (total N = 160), mouth posture was manipulated to resemble the articulation of either /i/ (as in English meet; retracted lips) or /o/ (as in French rose; protruded lips). Holding this mouth posture, participants were instructed to mentally "sing" given songs (which were all positive in valence) while listening with their inner ear and, afterwards, to assess the pitch of their mental chant. As predicted, compared to the o-posture, the i-posture led to higher pitch in mental singing. Thus, bodily states can shape experiential qualities, such as pitch, during imagery. This extends embodied music cognition and demonstrates a new link between language and music.
Affiliation(s)
- Anita Körner
- Department of Psychology, University of Kassel, Holländische Straße 36-38, 34127, Kassel, Germany.
- Fritz Strack
- Department of Psychology, University of Würzburg, Würzburg, Germany
2
Teshima K, Ishida K, Nittono H. Auditory perceptual processing during musical imagery: An event-related potential study. Neurosci Lett 2021; 762:136148. [PMID: 34339803 DOI: 10.1016/j.neulet.2021.136148] [Received: 06/23/2021] [Revised: 07/25/2021] [Accepted: 07/28/2021] [Indexed: 10/20/2022]
Abstract
The perceptual processing of a sound is facilitated when the sound matches auditory imagery. Previous studies have shown that auditory imagery and actual sound activate the auditory cortex in a similar fashion. To investigate whether auditory imagery is a modality-specific representation or an amodal representation, the current study examined how watching silent music videos affected the auditory processing of sound excerpts. Twenty university students were asked to form musical imagery of Japanese popular songs while watching the official music videos. Event-related brain potentials were recorded in response to short sound excerpts from the on-screen video or from a different video. The results showed that the amplitude of the exogenous N1 component (90-110 ms) was smaller for imagery-matched than for unmatched sound excerpts. The electrical source of this difference was estimated to lie in the auditory cortex. After the N1, the matched excerpts elicited a larger late positive potential (400-800 ms) than the unmatched excerpts. These findings suggest that auditory imagery involves modality-specific neural processing and that imagery-matched sounds are processed efficiently at an early stage, inducing additional cognitive processing at a later stage.
Affiliation(s)
- Konomi Teshima
- Graduate School of Human Sciences, Osaka University, Japan
- Kai Ishida
- Graduate School of Human Sciences, Osaka University, Japan
3
Heaton P, Tsang WF, Jakubowski K, Mullensiefen D, Allen R. Discriminating autism and language impairment and specific language impairment through acuity of musical imagery. Res Dev Disabil 2018; 80:52-63. [PMID: 29913330 DOI: 10.1016/j.ridd.2018.06.001] [Received: 02/17/2018] [Revised: 06/05/2018] [Accepted: 06/07/2018] [Indexed: 06/08/2023]
Abstract
Deficits in auditory short-term memory have been widely reported in children with Specific Language Impairment (SLI), and recent evidence suggests that children with Autism Spectrum Disorder and co-morbid language impairment (ALI) experience similar difficulties. Music, like language, relies on auditory memory, and the aim of the study was to extend work investigating the impact of auditory short-term memory impairments to musical perception in children with neurodevelopmental disorders. Groups of children with SLI and ALI were matched on chronological age (CA), receptive vocabulary, non-verbal intelligence, and digit span, and compared with CA-matched typically developing (TD) controls on tests of pitch and temporal acuity within a voluntary musical imagery paradigm. The SLI participants performed at significantly lower levels than the ALI and TD groups on both conditions of the task, and their musical imagery and digit span scores were positively correlated. In contrast, ALI participants performed as well as TD controls on the tempo condition and better than TD controls on the pitch condition of the task. Whilst auditory short-term memory and receptive vocabulary impairments were similar across ALI and SLI groups, these were not associated with a deficit in voluntary musical imagery performance in the ALI group.
Affiliation(s)
- Pamela Heaton
- Psychology, Goldsmiths University of London, New Cross, London, SE14 6NW, United Kingdom.
- Wai Fung Tsang
- Psychology, Goldsmiths University of London, New Cross, London, SE14 6NW, United Kingdom
- Kelly Jakubowski
- Music, University of Durham, Palace Green, Durham, DH1 3RL, United Kingdom
- Daniel Mullensiefen
- Psychology, Goldsmiths University of London, New Cross, London, SE14 6NW, United Kingdom
- Rory Allen
- Psychology, Goldsmiths University of London, New Cross, London, SE14 6NW, United Kingdom
4
Jakubowski K, Bashir Z, Farrugia N, Stewart L. Involuntary and voluntary recall of musical memories: A comparison of temporal accuracy and emotional responses. Mem Cognit 2018; 46:741-756. [PMID: 29380139 DOI: 10.3758/s13421-018-0792-x] [Indexed: 11/29/2022]
Abstract
Comparisons between involuntarily and voluntarily retrieved autobiographical memories have revealed similarities in encoding and maintenance, with differences in terms of specificity and emotional responses. Our study extended this research area into the domain of musical memory, which afforded a unique opportunity to compare the same memory as accessed both involuntarily and voluntarily. Specifically, we compared instances of involuntary musical imagery (INMI, or “earworms”)—the spontaneous mental recall and repetition of a tune—to deliberate recall of the same tune as voluntary musical imagery (VMI) in terms of recall accuracy and emotional responses. Twenty participants completed two 3-day tasks. In an INMI task, participants recorded information about INMI episodes as they occurred; in a VMI task, participants were prompted via text message to deliberately imagine each tune they had previously experienced as INMI. In both tasks, tempi of the imagined tunes were recorded by tapping to the musical beat while wearing an accelerometer and additional information (e.g., tune name, emotion ratings) was logged in a diary. Overall, INMI and VMI tempo measurements for the same tune were strongly correlated. Tempo recall for tunes that have definitive, recorded versions was relatively accurate, and tunes that were retrieved deliberately (VMI) were not recalled more accurately in terms of tempo than spontaneous and involuntary instances of imagined music (INMI). Some evidence that INMI elicited stronger emotional responses than VMI was also revealed. These results demonstrate several parallels to previous literature on involuntary memories and add new insights on the phenomenology of INMI.
5
Gabriel D, Wong TC, Nicolier M, Giustiniani J, Mignot C, Noiret N, Monnin J, Magnin E, Pazart L, Moulin T, Haffen E, Vandel P. Don't forget the lyrics! Spatiotemporal dynamics of neural mechanisms spontaneously evoked by gaps of silence in familiar and newly learned songs. Neurobiol Learn Mem 2016; 132:18-28. [PMID: 27131744 DOI: 10.1016/j.nlm.2016.04.011] [Received: 10/06/2015] [Revised: 04/18/2016] [Accepted: 04/24/2016] [Indexed: 10/21/2022]
Abstract
The vast majority of people experience musical imagery, the sensation of reliving a song in the absence of any external stimulation. Internal perception of a song can be deliberate and effortful, but it may also occur involuntarily and spontaneously. Moreover, musical imagery is also used involuntarily to automatically complete missing parts of the music or lyrics of a familiar song. The aim of our study was to explore the onset of the musical imagery dynamics that lead to the automatic completion of missing lyrics. High-density electroencephalography was used to record the cerebral activity of twenty healthy volunteers while they passively listened to unfamiliar songs, very familiar songs, and songs previously listened to for two weeks. Silent gaps inserted into these songs elicited a series of neural activations encompassing perceptual, attentional, and cognitive mechanisms (100-500 ms range). Familiarity and learning effects emerged as early as 100 ms and lasted 400 ms after silence occurred. Although participants reported more easily mentally imagining lyrics in familiar than in passively learnt songs, the onset of the neural mechanisms and the power spectrum underlying musical imagery were similar for both types of songs. This study offers new insights into the musical imagery dynamics evoked by gaps of silence and into the role of familiarity and learning processes in the generation of these dynamics. The automatic and effortless method presented here is a potentially useful tool for understanding failures in the familiarity and learning processes of pathological populations.
Affiliation(s)
- Damien Gabriel
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France.
- Thian Chiew Wong
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France
- Magali Nicolier
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France; Service de psychiatrie de l'adulte, CHRU Besançon, F-25000 Besançon, France
- Julie Giustiniani
- Service de psychiatrie de l'adulte, CHRU Besançon, F-25000 Besançon, France
- Coralie Mignot
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France
- Nicolas Noiret
- Centre Mémoire de Ressource et de Recherche de Franche-Comté, CHRU Besançon, F-25000 Besançon, France; Laboratoire de psychologie EA 3188, Université de Franche-Comté, Besançon, France
- Julie Monnin
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France; Service de psychiatrie de l'adulte, CHRU Besançon, F-25000 Besançon, France
- Eloi Magnin
- Centre Mémoire de Ressource et de Recherche de Franche-Comté, CHRU Besançon, F-25000 Besançon, France; Service de neurologie, CHRU Besançon, F-25000 Besançon, France
- Lionel Pazart
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France
- Thierry Moulin
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France; Service de neurologie, CHRU Besançon, F-25000 Besançon, France
- Emmanuel Haffen
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France; Service de psychiatrie de l'adulte, CHRU Besançon, F-25000 Besançon, France
- Pierre Vandel
- Centre d'investigation Clinique-Innovation Technologique CIC-IT 1431, Inserm, CHRU Besançon, F-25000 Besançon, France; Neurosciences intégratives et cliniques EA 481, Univ. Franche-Comté, Univ. Bourgogne Franche-Comté, F-25000 Besançon, France; Service de psychiatrie de l'adulte, CHRU Besançon, F-25000 Besançon, France; Centre Mémoire de Ressource et de Recherche de Franche-Comté, CHRU Besançon, F-25000 Besançon, France
6
Beaty RE, Burgin CJ, Nusbaum EC, Kwapil TR, Hodges DA, Silvia PJ. Music to the inner ears: exploring individual differences in musical imagery. Conscious Cogn 2013; 22:1163-1173. [PMID: 24021845 DOI: 10.1016/j.concog.2013.07.006] [Received: 12/28/2012] [Revised: 07/13/2013] [Accepted: 07/20/2013] [Indexed: 11/24/2022]
Abstract
In two studies, we explored the frequency and phenomenology of musical imagery. Study 1 used retrospective reports of musical imagery to assess the contribution of individual differences to imagery characteristics. Study 2 used an experience sampling design to assess the phenomenology of musical imagery over the course of one week in a sample of musicians and non-musicians. Both studies found episodes of musical imagery to be common and positive: people rarely wanted such experiences to end and often heard music that was personally meaningful. Several variables predicted musical imagery, including personality, musical preferences, and positive mood. Musicians tended to hear musical imagery more often, but they reported less frequent episodes of deliberately generated imagery. Taken together, the present research provides new insights into individual differences in musical imagery, and it supports the emerging view that such experiences are common, positive, and more voluntary than previously recognized.
Affiliation(s)
- Roger E Beaty
- Department of Psychology, University of North Carolina at Greensboro, United States.