1
Lozano-Goupil J, Raffard S, Capdevielle D, Aigoin E, Marin L. Gesture-speech synchrony in schizophrenia: A pilot study using a kinematic-acoustic analysis. Neuropsychologia 2022; 174:108347. PMID: 35970254. DOI: 10.1016/j.neuropsychologia.2022.108347.
Abstract
Severe impairment of social functioning is a core feature of schizophrenia that persists despite treatment and contributes to chronic functional disability. Abnormal non-verbal behaviors have been reported during interpersonal interactions, but the temporal coordination of co-speech gestures with language has been poorly studied to date in this pathology. Working within the dynamical systems framework, the goal of the current study was to investigate whether gesture-speech synchrony is impaired in schizophrenia, exploring a new approach to characterizing communication skill disorders. Performing the first continuous kinematic-acoustic analysis in individuals with schizophrenia, we examined gesture-speech synchrony in solo spontaneous speech and in a sensorimotor synchronization task. The experimental group consisted of twenty-eight participants with a diagnosis of schizophrenia; the control group consisted of twenty-four healthy participants matched for age, gender, and education. The results showed that spontaneous gesture-speech synchrony was preserved while intentional finger tapping-speech synchrony was impaired: in the sensorimotor synchronization task, the schizophrenia group displayed greater asynchronies between finger taps and uttered syllables and lower stability of coordination patterns. These findings suggest a specific deficit in the timing of information circulation and processing, especially in explicit functions. Investigating intrapersonal coordination in schizophrenia may thus offer a promising window onto the dynamic brain-behavior relationship.
Affiliation(s)
- Juliette Lozano-Goupil: EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Montpellier, France.
- Stéphane Raffard: Univ Paul Valéry Montpellier 3, EPSYLON EA 4556, Montpellier, France; University Department of Adult Psychiatry, CHU Montpellier, Montpellier, France.
- Emilie Aigoin: EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Montpellier, France.
- Ludovic Marin: EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Montpellier, France.
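The asynchrony and stability measures reported in this entry can be illustrated with a minimal sketch (the event times below are made up; this is not the study's pipeline): each finger tap is paired with the nearest syllable onset, and the signed offsets are summarized by their mean and standard deviation.

```python
# Hypothetical event times in seconds; negative asynchrony = tap led the voice.
def asynchronies(tap_times, syllable_times):
    """Signed offset of each tap from the nearest syllable onset."""
    return [t - min(syllable_times, key=lambda s: abs(s - t)) for t in tap_times]

def mean_sd(xs):
    """Mean asynchrony (accuracy) and its SD (stability of coordination)."""
    m = sum(xs) / len(xs)
    return m, (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

taps = [0.48, 1.02, 1.51, 2.05, 2.49]    # finger-tap onsets
sylls = [0.50, 1.00, 1.50, 2.00, 2.50]   # syllable onsets
offsets = asynchronies(taps, sylls)
mean_async, sd_async = mean_sd(offsets)  # larger values = weaker coupling
```

A group difference like the one reported above would appear as larger `mean_async` and `sd_async` values in the patient group.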
2
Pearson L, Pouw W. Gesture-vocal coupling in Karnatak music performance: A neuro-bodily distributed aesthetic entanglement. Ann N Y Acad Sci 2022; 1515:219-236. PMID: 35730069. DOI: 10.1111/nyas.14806.
Abstract
In many musical styles, vocalists manually gesture while they sing. Coupling between gesture kinematics and vocalization has been examined in speech contexts, but how the two couple in music making remains an open question. We examine this in a corpus of South Indian, Karnatak vocal music that includes motion-capture data. Through peak magnitude analysis (linear mixed regression) and continuous time-series analyses (generalized additive modeling), we assessed whether vocal trajectories around peaks in vertical velocity, speed, or acceleration were coupled with changes in vocal acoustics (namely, F0 and amplitude). Kinematic coupling was stronger for F0 change than for amplitude, pointing to F0's musical significance. Acceleration was the most predictive kinematic variable for F0 change and showed the most reliable magnitude coupling, following a one-third power relation. That acceleration, rather than the other kinematic variables, is maximally predictive of vocalization is interesting because acceleration entails force transfer onto the body. As a theoretical contribution, we argue that gesturing in musical contexts should be understood in relation to the physical connections between gesturing and vocal production, which are brought into harmony with the vocalists' (enculturated) performance goals. Gesture-vocal coupling should therefore be viewed as a neuro-bodily distributed aesthetic entanglement.
Affiliation(s)
- Lara Pearson: Department of Music, Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany.
- Wim Pouw: Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands; Max Planck Institute for Psycholinguistics, Nijmegen, The Netherlands.
3
Tolentino-Castro JW, Schroeger A, Cañal-Bruland R, Raab M. The impact of pitch on tempo-spatial accuracy and precision in intercepting a virtually moving ball. J Mot Behav 2021; 54:158-172. PMID: 34180782. DOI: 10.1080/00222895.2021.1933886.
Abstract
In two experiments, sounds moved along horizontally or vertically oriented parabolas. Participants had to touch a screen to indicate where and when a virtual moving ball would cross a visible line. We predicted that, given the auditory system's sensitivity to temporal information, manipulations of pitch should affect temporal errors more than spatial errors. Stimuli were sound sources at five different pitches moving along a parabola, produced through loudspeakers mounted around a touch screen. Results showed pitch effects on spatial constant and spatial variable errors when the parabola was horizontally oriented (Exp. 1), and on temporal constant errors when it was vertically oriented (Exp. 2). We conclude that temporal and spatial precision in interception tasks are affected differently by pitch manipulations, which should be taken into account in future studies assessing the impact of auditory information on catching virtually moving balls.
Affiliation(s)
- J Walter Tolentino-Castro: Department of Performance Psychology, Institute of Psychology, German Sport University Cologne, Cologne, Germany.
- Anna Schroeger: Department for the Psychology of Human Movement and Sport, Institute of Sport Science, Friedrich Schiller University Jena, Jena, Germany.
- Rouwen Cañal-Bruland: Department for the Psychology of Human Movement and Sport, Institute of Sport Science, Friedrich Schiller University Jena, Jena, Germany.
- Markus Raab: Department of Performance Psychology, Institute of Psychology, German Sport University Cologne, Cologne, Germany; School of Applied Sciences, London South Bank University, London, UK.
4
Pouw W, de Jonge-Hoekstra L, Harrison SJ, Paxton A, Dixon JA. Gesture-speech physics in fluent speech and rhythmic upper limb movements. Ann N Y Acad Sci 2021; 1491:89-105. PMID: 33336809. PMCID: PMC8246948. DOI: 10.1111/nyas.14532.
Abstract
It is commonly assumed that hand gesture and speech coordination in humans is culturally and cognitively acquired, rather than having a biological basis. Recently, however, the biomechanical coupling of arm movements to speech vocalization has been studied in steady-state vocalization and monosyllabic utterances, where forces produced during gesturing are transferred onto the tensioned body, leading to changes in respiratory-related activity and thereby affecting vocalization F0 and intensity. In the current experiment (n = 37), we extend this line of work to show that gesture-speech physics also impacts fluent speech. Compared with a no-movement condition, participants producing fluent self-formulated speech while rhythmically moving their limbs showed heightened F0 and amplitude envelope, and these effects were more pronounced for higher-impulse arm movements than for lower-impulse wrist movements. We replicate the finding that acoustic peaks arise especially at moments of peak impulse (i.e., the beat) of the movement, namely around its deceleration phases. Finally, higher deceleration rates of higher-mass arm movements were related to higher acoustic peaks. These results confirm that the physical impulses of gesture affect the speech system. We discuss the implications of gesture-speech physics for understanding the emergence of communicative gesture, both ontogenetically and phylogenetically.
Affiliation(s)
- Wim Pouw: Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut; Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, the Netherlands; Max Planck Institute for Psycholinguistics, Nijmegen, the Netherlands.
- Lisette de Jonge-Hoekstra: Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut; Faculty of Behavioral and Social Sciences, University of Groningen, Groningen, the Netherlands; Royal Dutch Kentalis, Sint-Michielsgestel, the Netherlands.
- Steven J. Harrison: Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut; Department of Kinesiology, University of Connecticut, Storrs, Connecticut.
- Alexandra Paxton: Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut; Department of Psychological Sciences, University of Connecticut, Storrs, Connecticut.
- James A. Dixon: Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, Connecticut; Department of Psychological Sciences, University of Connecticut, Storrs, Connecticut.
5
The quantification of gesture-speech synchrony: A tutorial and validation of multimodal data acquisition using device-based and video-based motion tracking. Behav Res Methods 2020; 52:723-740. PMID: 31659689. PMCID: PMC7148275. DOI: 10.3758/s13428-019-01271-9.
Abstract
There is increasing evidence that hand gestures and speech synchronize their activity on multiple dimensions and timescales. For example, gesture's kinematic peaks (e.g., maximum speed) are coupled with prosodic markers in speech. Such coupling operates on very short timescales at the level of syllables (200 ms), and therefore requires high-resolution measurement of gesture kinematics and speech acoustics. High-resolution speech analysis is common in gesture studies, given the field's classic ties with (psycho)linguistics. However, the field has lagged behind in the objective study of gesture kinematics (e.g., as compared to research on instrumental action). Kinematic peaks in gesture are often measured by eye, with a "moment of maximum effort" determined by several raters. In the present article, we provide a tutorial on more efficient methods to quantify the temporal properties of gesture kinematics, focusing on common challenges, and possible solutions, that come with the complexities of studying multimodal language. We further introduce and compare, using an actual gesture dataset (392 gesture events), the performance of two video-based motion-tracking methods (deep learning vs. pixel change) against a high-performance wired motion-tracking system (Polhemus Liberty). We show that the videography methods perform well in the temporal estimation of kinematic peaks, and thus provide a cheap alternative to expensive motion-tracking systems. We hope that the present article encourages gesture researchers to embark on the widespread objective study of gesture kinematics and their relation to speech.
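The core kinematic step this tutorial advocates (locating a gesture's kinematic peak from tracked position data so it can be compared against a speech landmark) can be sketched as follows. The trajectory, the 120-Hz sampling rate, and the F0 peak time are all made up for illustration; this is not the tutorial's actual pipeline:

```python
import numpy as np

# Toy 2-D gesture trajectory sampled at an assumed 120-Hz tracker rate.
fs = 120.0
t = np.arange(0, 1, 1 / fs)
x = np.cos(np.pi * t)                # a single stroke: x moves from +1 to -1
y = 0.3 * t                          # slight drift, to make the example 2-D

# Differentiate position to get velocity, then unsigned speed.
vx = np.gradient(x, 1 / fs)
vy = np.gradient(y, 1 / fs)
speed = np.hypot(vx, vy)

# Time of the kinematic peak (maximum speed) ...
peak_time = t[np.argmax(speed)]
# ... and its offset from a hypothetical prosodic landmark in speech.
f0_peak_time = 0.50                  # illustrative F0 peak time (s)
offset = peak_time - f0_peak_time
```

With real data, `x` and `y` would come from the motion tracker or video-based pose estimates, and `offset` distributions across gestures quantify gesture-speech synchrony.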
6
Pouw W, Harrison SJ, Esteve-Gibert N, Dixon JA. Energy flows in gesture-speech physics: The respiratory-vocal system and its coupling with hand gestures. J Acoust Soc Am 2020; 148:1231. PMID: 33003900. DOI: 10.1121/10.0001730.
Abstract
Expressive moments in communicative hand gestures often align with emphatic stress in speech. It has recently been found that acoustic markers of emphatic stress arise naturally during steady-state phonation when upper-limb movements impart physical impulses on the body, most likely affecting acoustics via respiratory activity. In this confirmatory study, participants (N = 29) repeatedly uttered consonant-vowel (/pa/) monosyllables while moving in particular phase relations with speech, or while not moving the upper limbs. This study shows that respiration-related activity is affected by (especially high-impulse) gesturing when vocalizations occur near peaks in physical impulse. It further shows that gesture-induced moments of bodily impulse increase the amplitude envelope of speech, while not similarly affecting the fundamental frequency (F0). Finally, tight relations between respiration-related activity and vocalization were observed even in the absence of movement, and more strongly when upper-limb movement was present. These findings expand a developing line of research showing that speech is modulated by functional biomechanical linkages between hand gestures and the respiratory system. This identification of gesture-speech biomechanics promises to provide an alternative phylogenetic, ontogenetic, and mechanistic explanation of why communicative upper-limb movements co-occur with speech in humans.
Affiliation(s)
- Wim Pouw: Center for the Ecological Study of Perception and Action, University of Connecticut, 406 Babbidge Road, Storrs, Connecticut 06269, USA.
- Steven J Harrison: Center for the Ecological Study of Perception and Action, University of Connecticut, 406 Babbidge Road, Storrs, Connecticut 06269, USA.
- Núria Esteve-Gibert: Psychology and Education Sciences, Universitat Oberta de Catalunya, Rambla del Poblenou 158, 08018 Barcelona, Spain.
- James A Dixon: Center for the Ecological Study of Perception and Action, University of Connecticut, 406 Babbidge Road, Storrs, Connecticut 06269, USA.
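The acoustic measure at stake in this entry and the previous ones, the amplitude envelope, is commonly computed by rectifying the waveform and smoothing it. A minimal sketch on a synthetic signal (the sampling rate, carrier, and modulation frequencies are all made up, and this is a generic envelope method, not the study's exact procedure):

```python
import numpy as np

# Synthetic "speech": a 200-Hz carrier amplitude-modulated at 4 Hz,
# standing in for syllable-scale loudness fluctuation.
fs = 8000                                  # sampling rate (Hz), assumed
t = np.arange(0, 0.5, 1 / fs)
audio = np.sin(2 * np.pi * 200 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 4 * t))

# Rectify, then smooth with a 25-ms moving average to get the envelope.
win = int(0.025 * fs)
envelope = np.convolve(np.abs(audio), np.ones(win) / win, mode="same")

# A gesture-induced impulse would show up as a local rise in this
# envelope around the moment of peak deceleration.
envelope_peak_time = t[np.argmax(envelope)]
```

Aligning such envelope peaks with moments of peak physical impulse is the kind of analysis these gesture-speech physics studies report.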
7
Pouw W, Dixon JA. Entrainment and modulation of gesture-speech synchrony under delayed auditory feedback. Cogn Sci 2019; 43:e12721. PMID: 30900288. PMCID: PMC6593786. DOI: 10.1111/cogs.12721.
Abstract
Gesture-speech synchrony re-stabilizes when hand movement or speech is disrupted by a delayed-feedback manipulation, suggesting strong bidirectional coupling between gesture and speech. Yet it has also been argued, from case studies in perceptual-motor pathology, that hand gestures are a special kind of action that does not require closed-loop re-afferent feedback to maintain synchrony with speech. In the current pre-registered within-subject study, we used motion tracking to conceptually replicate McNeill's (1992) classic study on gesture-speech synchrony under normal and 150-ms delayed auditory feedback of speech (NO DAF vs. DAF). Consistent with, and extending, McNeill's original results, we obtained evidence that (a) gesture-speech synchrony is more stable under DAF than under NO DAF (an increased-coupling effect), (b) gesture and speech variably entrain to the external auditory delay, as indicated by a consistent shift in gesture-speech synchrony offsets (an entrainment effect), and (c) the coupling effect and the entrainment effect are co-dependent. We suggest, therefore, that gesture-speech synchrony provides a way for the cognitive system to stabilize rhythmic activity under interfering conditions.
Affiliation(s)
- Wim Pouw: Center for the Ecological Study of Perception and Action, University of Connecticut; Department of Psychology, Education and Child Studies, Erasmus University Rotterdam.
- James A Dixon: Center for the Ecological Study of Perception and Action, University of Connecticut.
8
Zelic G, Varlet M, Wishart J, Kim J, Davis C. The dual influence of pacer continuity and pacer pattern for visuomotor synchronisation. Neurosci Lett 2018; 683:150-159. DOI: 10.1016/j.neulet.2018.07.044.
9
A flexible and accurate method to estimate the mode and stability of spontaneous coordinated behaviors: The index-of-stability (IS) analysis. Behav Res Methods 2018; 50:182-194. PMID: 28236217. DOI: 10.3758/s13428-017-0861-2.
Abstract
Patterns of coordination result from the interaction between (at least) two oscillatory components. This interaction is typically characterized by two variables: the mode, which expresses the form of the interaction, and the stability, which is the robustness of the interaction in that mode. A potent way of investigating coordinated behaviors is to examine the extent to which patterns of coordination arise spontaneously. However, a prominent issue faced by researchers is that, to date, no standard method exists for fairly assessing the stability of spontaneous coordination. In the present study, we introduce a new method called the index-of-stability (IS) analysis. We developed this method from the phase-coupling (PC) analysis that has traditionally been used for examining locomotion-respiration coordination. We compared the extent to which both methods estimate the stability of simulated coordinated behaviors: computer-generated time series were used to simulate the coordination of two rhythmic components according to a selected mode m:n and a selected degree of stability. The IS analysis proved superior to the PC analysis in three ways. First, the estimation of stability itself was more accurate and more reliable with the IS analysis. Second, the IS analysis is not constrained by the limitations of the PC analysis. Third, the IS analysis offers more flexibility, and so can be adapted to the user's needs.
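The quantity such an analysis estimates can be illustrated with a simple phase-based stability index for m:n coordination (a generic sketch in the spirit of this entry, not the IS procedure itself; the oscillators and noise levels below are made up):

```python
import numpy as np

def stability(phase_a, phase_b, m, n):
    """Resultant length of the generalized relative phase
    psi = n*phase_a - m*phase_b for an m:n mode:
    1 = perfectly phase-locked, 0 = no coupling."""
    psi = n * np.asarray(phase_a) - m * np.asarray(phase_b)
    return float(np.abs(np.mean(np.exp(1j * psi))))

# Two noisy oscillators in a 1:2 frequency relation (1 Hz and 2 Hz).
t = np.linspace(0, 10, 2000, endpoint=False)
rng = np.random.default_rng(0)
phase_a = 2 * np.pi * 1.0 * t + 0.1 * rng.standard_normal(t.size)
phase_b = 2 * np.pi * 2.0 * t + 0.1 * rng.standard_normal(t.size)

locked = stability(phase_a, phase_b, m=1, n=2)    # tested mode matches
unlocked = stability(phase_a, phase_b, m=1, n=3)  # tested mode does not
```

Scanning candidate m:n ratios and keeping the one with the largest index recovers both the mode and its stability, which is the general strategy of this family of methods.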
10
Zelic G, Varlet M, Kim J, Davis C. Influence of pacer continuity on continuous and discontinuous visuo-motor synchronisation. Acta Psychol (Amst) 2016; 169:61-70. PMID: 27232554. DOI: 10.1016/j.actpsy.2016.05.008.
Abstract
Previous research has reported that synchronising movements with an external pacer, known as sensorimotor synchronisation (SMS), is more stable when the movements are discrete/discontinuous rather than continuous. The standard explanation holds that more efficient mechanisms regulate synchronisation when discontinuous movements are produced. To date, however, only discontinuous pacers (e.g., metronomes) have been used to compare discontinuous and continuous SMS. We propose an alternative explanation whereby discontinuous SMS has benefited from the match between the (dis)continuous nature of the pacer and the (dis)continuous nature of the synchronising movements. The present experiment tested this explanation by examining the relative stability of discontinuous and continuous SMS when synchronising with a continuous pacer. Twelve participants finger-tapped (discontinuous SMS) or continuously oscillated their forearm (continuous SMS) in synchrony with an oscillatory visual target. The continuity of the pacer was manipulated by varying the kinematics (harmonic to Rayleigh-like oscillations) and the frequency (0.5 and 1 Hz) of the target oscillations. Overall, the results showed more stable continuous than discontinuous SMS. Furthermore, the stability of discontinuous SMS improved when the discontinuity of the target displacements increased (highly nonlinear kinematics and low frequency), showing an interaction between movement type and pacer continuity in SMS.
11
Zelic G, Mottet D, Lagarde J. Perceptuo-motor compatibility governs multisensory integration in bimanual coordination dynamics. Exp Brain Res 2015; 234:463-474. DOI: 10.1007/s00221-015-4476-5.