1. The spontaneous emergence of rhythmic coordination in turn taking. Sci Rep 2023;13:3259. PMID: 36828878; PMCID: PMC9958099; DOI: 10.1038/s41598-022-18480-6.
Abstract
Turn-taking is a feature of many social interactions, such as group music-making, where partners must alternate turns with high precision and accuracy. In two studies of musical rhythm coordination, we investigated how joint action partners learn to coordinate the timing of turn-taking. Musically inexperienced individuals learned to tap at the rate of a pacing cue either individually or jointly (in turn with a partner), where each tap produced the next tone in a melodic sequence. In Study 1, partners alternated turns every tap, whereas in Study 2 partners alternated turns every two taps. Findings revealed that partners did not achieve the same level of accuracy or precision of inter-tap intervals (ITIs) when producing tapping sequences jointly as when producing them individually, despite showing learning (increased ITI accuracy and precision across the experiment) in both tasks. Strikingly, partners imposed rhythmic patterns onto jointly produced sequences that captured the temporal structure of turns. Together, these findings suggest that learning to produce novel temporal sequences in turn with a partner is more challenging than learning to produce the same sequences alone, and that partners may impose rhythmic structures onto turn-taking sequences as a strategy for facilitating coordination.
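As an illustration of the dependent measures described above, here is a minimal sketch (not from the paper) of how ITI accuracy and precision are conventionally computed from tap times; the 500-ms pacing interval and the tap data are hypothetical.

```python
import numpy as np

def iti_accuracy_precision(tap_times_s, target_iti_s):
    """Compute inter-tap-interval (ITI) accuracy and precision.

    Accuracy: mean signed deviation of ITIs from the target interval.
    Precision: variability of ITIs (SD), often also reported as the
    coefficient of variation (SD / mean ITI).
    """
    itis = np.diff(tap_times_s)              # intervals between successive taps
    accuracy = np.mean(itis - target_iti_s)  # signed error vs. pacing cue
    precision_sd = np.std(itis, ddof=1)      # SD of produced intervals
    cv = precision_sd / np.mean(itis)        # dimensionless variability
    return accuracy, precision_sd, cv

# Hypothetical taps aimed at a 500-ms pacing cue
taps = np.array([0.000, 0.505, 1.012, 1.498, 2.010, 2.507])
acc, sd, cv = iti_accuracy_precision(taps, target_iti_s=0.500)
print(f"accuracy {acc*1000:+.1f} ms, SD {sd*1000:.1f} ms, CV {cv:.3f}")
```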
2. Endogenous rhythms influence musicians' and non-musicians' interpersonal synchrony. Sci Rep 2022;12:12973. PMID: 35902677; PMCID: PMC9334298; DOI: 10.1038/s41598-022-16686-2.
Abstract
Individuals display considerable rate differences in the spontaneous production of rhythmic behaviors (such as speech, gait, and dance). Temporal precision in rhythmic behavior tends to be highest at individuals' spontaneous production rates; musically trained partners with similar spontaneous rates show increased synchrony in joint tasks, consistent with predictions based on intrinsic frequencies of coupled oscillators. We addressed whether partner-specific influences of intrinsic frequencies are evidenced in musically trained and untrained individuals who first tapped a familiar melody individually at a spontaneous (uncued) rate. Each individual then synchronized with a partner from the same musicianship group at an initially cued rate that matched the partner's spontaneous rate. Musically trained partners showed greater synchrony in joint tapping than musically untrained partners. Asynchrony increased in both groups as the difference in the partners' individual spontaneous rates increased, with greater impact on musically untrained pairs. Recurrence quantification analysis confirmed that musically untrained individuals demonstrated greater determinism (less flexibility) in their tapping than musically trained individuals. Furthermore, individuals with greater determinism in solo performances demonstrated reduced synchrony in joint performances. These findings suggest that musicians' increased temporal flexibility is associated with decreased endogenous constraints on production rate and greater interpersonal synchrony in musical tasks.
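A sketch of how determinism (%DET) can be computed from a recurrence plot of a tapping series, in the spirit of the recurrence quantification analysis mentioned above; the recurrence threshold and minimum line length are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def determinism(series, radius=0.05, min_len=2):
    """%DET: fraction of recurrent points that fall on diagonal
    lines of length >= min_len in the recurrence plot."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    # Recurrence matrix: pairs of points closer than `radius` recur
    rec = (np.abs(x[:, None] - x[None, :]) < radius).astype(int)
    recurrent = rec.sum() - n            # exclude the main diagonal
    diag_points = 0
    for k in range(1, n):                # each upper-triangle diagonal
        d = np.diagonal(rec, offset=k)
        run = 0
        for v in np.append(d, 0):        # lengths of runs of 1s
            if v:
                run += 1
            else:
                if run >= min_len:
                    diag_points += run
                run = 0
    diag_points *= 2                     # lower triangle is symmetric
    return diag_points / recurrent if recurrent else 0.0
```

Higher %DET indicates more rigid, predictable tapping; the paper links greater solo determinism to reduced joint synchrony.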
3. Blum S, Hölle D, Bleichner MG, Debener S. Pocketable Labs for Everyone: Synchronized Multi-Sensor Data Streaming and Recording on Smartphones with the Lab Streaming Layer. Sensors (Basel) 2021;21:8135. PMID: 34884139; PMCID: PMC8662410; DOI: 10.3390/s21238135.
Abstract
The streaming and recording of smartphone sensor signals is desirable for mHealth, telemedicine, environmental monitoring, and other applications. Time-series data gathered in these fields typically benefit from the time-synchronized integration of different sensor signals. However, solutions for this synchronization are mostly available only for stationary setups. We contribute to the emerging field of portable data acquisition by presenting open-source Android applications for both the synchronized streaming (Send-a) and the recording (Record-a) of multiple sensor data streams. We validate the applications in terms of functionality, flexibility, and precision in fully mobile setups and in hybrid setups combining mobile and desktop hardware. Our results show that the fully mobile solution is equivalent to well-established desktop versions. With the streaming application Send-a and the recording application Record-a, purely smartphone-based setups for mobile research and personal health settings can be realized on off-the-shelf Android devices.
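For readers unfamiliar with the Lab Streaming Layer, the sketch below shows its core synchronization model using the desktop Python bindings (pylsl), not the Android apps themselves: every sample carries a timestamp on the sender's LSL clock, and inlets can estimate the clock offset. Stream names and parameters are placeholders.

```python
from pylsl import (StreamInfo, StreamOutlet, StreamInlet,
                   resolve_byprop, local_clock)

# --- Sender side: declare and publish a 3-channel accelerometer stream ---
info = StreamInfo(name='phone-accel', type='ACC', channel_count=3,
                  nominal_srate=50, channel_format='float32',
                  source_id='device-1234')
outlet = StreamOutlet(info)
outlet.push_sample([0.01, -0.02, 9.81], timestamp=local_clock())

# --- Receiver side: find the stream and pull time-stamped samples ---
streams = resolve_byprop('name', 'phone-accel', timeout=5.0)
inlet = StreamInlet(streams[0])
offset = inlet.time_correction()      # sender-to-local clock offset estimate
sample, ts = inlet.pull_sample(timeout=1.0)
if sample is not None:
    print('sample', sample, 'at local time', ts + offset)
```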
Affiliation(s)
- Sarah Blum: Neuropsychology Lab, Department of Psychology, University of Oldenburg, 26111 Oldenburg, Germany; Cluster of Excellence Hearing4all, 26111 Oldenburg, Germany
- Daniel Hölle: Neurophysiology of Everyday Life Group, Department of Psychology, University of Oldenburg, 26111 Oldenburg, Germany
- Martin Georg Bleichner: Neurophysiology of Everyday Life Group, Department of Psychology, University of Oldenburg, 26111 Oldenburg, Germany
- Stefan Debener: Neuropsychology Lab, Department of Psychology, University of Oldenburg, 26111 Oldenburg, Germany; Cluster of Excellence Hearing4all, 26111 Oldenburg, Germany
4. Zamm A, Palmer C, Bauer AKR, Bleichner MG, Demos AP, Debener S. Behavioral and Neural Dynamics of Interpersonal Synchrony Between Performing Musicians: A Wireless EEG Hyperscanning Study. Front Hum Neurosci 2021;15:717810. PMID: 34588966; PMCID: PMC8473838; DOI: 10.3389/fnhum.2021.717810.
Abstract
Interpersonal synchrony refers to the temporal coordination of actions between individuals and is a common feature of social behaviors, from team sport to ensemble music performance. Interpersonal synchrony of many rhythmic (periodic) behaviors displays dynamics of coupled biological oscillators. The current study addresses oscillatory dynamics on the levels of brain and behavior between music duet partners performing at spontaneous (uncued) rates. Wireless EEG was measured from N = 20 pairs of pianists as they performed a melody first in Solo performance (at their spontaneous rate of performance), and then in Duet performances at each partner's spontaneous rate. Influences of partners' spontaneous rates on interpersonal synchrony were assessed by correlating differences in partners' spontaneous rates of Solo performance with Duet tone onset asynchronies. Coupling between partners' neural oscillations was assessed by correlating amplitude envelope fluctuations of cortical oscillations at the Duet performance frequency between observed partners and between surrogate (re-paired) partners, who performed the same melody but at different times. Duet synchronization was influenced by partners' spontaneous rates in Solo performance. The size and direction of the difference in partners' spontaneous rates were mirrored in the size and direction of the Duet asynchronies. Moreover, observed Duet partners showed greater inter-brain correlations of oscillatory amplitude fluctuations than did surrogate partners, suggesting that performing in synchrony with a musical partner is reflected in coupled cortical dynamics at the performance frequency. The current study provides evidence that dynamics of oscillator coupling are reflected in both behavioral and neural measures of temporal coordination during musical joint action.
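A minimal sketch of the amplitude-envelope approach described above: band-pass each partner's EEG around the performance frequency, extract the Hilbert envelope, and correlate envelopes across partners. The filter design, the 2-Hz performance frequency, and the random stand-in data are assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def amplitude_envelope(eeg, fs, f_perf, half_bw=0.5):
    """Band-pass around the performance frequency, then Hilbert envelope."""
    b, a = butter(4, [(f_perf - half_bw) / (fs / 2),
                      (f_perf + half_bw) / (fs / 2)], btype='band')
    narrow = filtfilt(b, a, eeg)          # zero-phase narrow-band signal
    return np.abs(hilbert(narrow))        # instantaneous amplitude

fs, f_perf = 250.0, 2.0                   # assumed sampling and performance rates
t = np.arange(0, 60, 1 / fs)
eeg_a = np.random.randn(t.size)           # stand-ins for two partners' channels
eeg_b = np.random.randn(t.size)

env_a = amplitude_envelope(eeg_a, fs, f_perf)
env_b = amplitude_envelope(eeg_b, fs, f_perf)
aec = np.corrcoef(env_a, env_b)[0, 1]     # inter-brain envelope correlation
print(f"AEC = {aec:.3f}")
```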
Affiliation(s)
- Anna Zamm: Sequence Production Laboratory, Department of Psychology, McGill University, Montreal, QC, Canada
- Caroline Palmer: Sequence Production Laboratory, Department of Psychology, McGill University, Montreal, QC, Canada
- Anna-Katharina R. Bauer: Neuropsychology Laboratory, Institute for Psychology, European Medical School, University of Oldenburg, Oldenburg, Germany
- Martin G. Bleichner: Neuropsychology Laboratory, Institute for Psychology, European Medical School, University of Oldenburg, Oldenburg, Germany
- Alexander P. Demos: Sequence Production Laboratory, Department of Psychology, McGill University, Montreal, QC, Canada
- Stefan Debener: Neuropsychology Laboratory, Institute for Psychology, European Medical School, University of Oldenburg, Oldenburg, Germany; Cluster of Excellence Hearing4All Oldenburg, University of Oldenburg, Oldenburg, Germany
5. Mathias B, Zamm A, Gianferrara PG, Ross B, Palmer C. Rhythm Complexity Modulates Behavioral and Neural Dynamics During Auditory–Motor Synchronization. J Cogn Neurosci 2020;32:1864-1880. DOI: 10.1162/jocn_a_01601.
Abstract
We addressed how rhythm complexity influences auditory–motor synchronization in musically trained individuals who perceived and produced complex rhythms while EEG was recorded. Participants first listened to two-part auditory sequences (Listen condition). Each part featured a single pitch presented at a fixed rate; the integer ratio formed between the two rates varied in rhythmic complexity from low (1:1) to moderate (1:2) to high (3:2). One of the two parts occurred at a constant rate across conditions. Then, participants heard the same rhythms as they synchronized their tapping at a fixed rate (Synchronize condition). Finally, they tapped at the same fixed rate (Motor condition). Auditory feedback from their taps was present in all conditions. Behavioral effects of rhythmic complexity were evidenced in all tasks; detection of missing beats (Listen) worsened in the most complex (3:2) rhythm condition, and tap durations (Synchronize) were most variable and least synchronous with stimulus onsets in the 3:2 condition. EEG power spectral density was lowest at the fixed rate during the 3:2 rhythm and greatest during the 1:1 rhythm (Listen and Synchronize). ERP amplitudes corresponding to an N1 time window were smallest for the 3:2 rhythm and greatest for the 1:1 rhythm (Listen). Finally, synchronization accuracy (Synchronize) decreased as amplitudes in the N1 time window became more positive during the high rhythmic complexity condition (3:2). Thus, measures of neural entrainment corresponded to synchronization accuracy, and rhythmic complexity modulated the behavioral and neural measures similarly.
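A sketch of how spectral power at the fixed production rate might be read off a Welch power spectral density estimate, as a stand-in for the neural entrainment measure described above; the sampling rate, tapping rate, and window length are assumptions.

```python
import numpy as np
from scipy.signal import welch

def power_at_rate(eeg, fs, rate_hz, nperseg=4096):
    """Welch PSD, then power at the frequency bin nearest the tapping rate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=nperseg)
    idx = np.argmin(np.abs(freqs - rate_hz))
    return freqs[idx], psd[idx]

fs, rate = 500.0, 1.25                    # assumed EEG rate and fixed tap rate
eeg = np.random.randn(int(fs * 120))      # stand-in for one condition's data
f, p = power_at_rate(eeg, fs, rate)
print(f"power at {f:.2f} Hz: {p:.3e}")
```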
Affiliation(s)
- Brian Mathias: McGill University; Max Planck Institute for Human Cognitive and Brain Sciences
- Anna Zamm: McGill University; Central European University, Budapest, Hungary
6
|
Graña M, Aguilar-Moreno M, De Lope Asiain J, Araquistain IB, Garmendia X. Improved Activity Recognition Combining Inertial Motion Sensors and Electroencephalogram Signals. Int J Neural Syst 2020; 30:2050053. [DOI: 10.1142/s0129065720500537] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/22/2023]
Abstract
Human activity recognition and neural activity analysis are the basis for human computational neuroethology research, which deals with the simultaneous analysis of behavioral ethogram descriptions and neural activity measurements. Wireless electroencephalography (EEG) and wireless inertial measurement units (IMUs) allow experimental data recording with improved ecological validity, where subjects can carry out natural activities while data recording remains minimally invasive. Specifically, we aim to show that EEG and IMU data fusion allows improved human activity recognition in a natural setting. We defined an experimental protocol composed of natural sitting, standing, and walking activities, and we recruited subjects at two sites, forming in-house and out-house populations with different demographics. Experimental protocol data capture was carried out with validated commercial systems. Classifier model training and validation were carried out with the scikit-learn open-source machine learning Python package. EEG features consist of the amplitudes of the standard EEG frequency bands. Inertial features were the instantaneous positions of the tracked body points after moving-average smoothing to remove noise. We carried out three validation processes: (a) 10-fold cross-validation per experimental protocol repetition, (b) inference of the ethograms, and (c) transfer learning from each experimental protocol repetition to the remaining repetitions. The in-house accuracy results were lower and much more variable than the out-house session results. In general, random forest was the best-performing classifier model. The best cross-validation results, ethogram accuracy, and transfer learning were achieved with the fusion of EEG and IMU data. Transfer learning behaved poorly compared to classification on the same protocol repetition, but its accuracy was still greater than 0.75 on average for the out-house data sessions. Transfer learning accuracy among repetitions of the same subject was above 0.88 on average. Ethogram prediction accuracy was above 0.96 on average. We therefore conclude that wireless EEG and IMUs allow the definition of natural experimental designs with high ecological validity for human computational neuroethology research, and that the fusion of EEG and IMU signals improves activity and ethogram recognition.
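The fusion step amounts to concatenating EEG and inertial feature vectors before classification. Below is a sketch with scikit-learn (the package the authors used), with feature shapes and data as placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 600                                        # placeholder number of windows
eeg_feats = rng.normal(size=(n, 20))           # e.g., band amplitudes per channel
imu_feats = rng.normal(size=(n, 12))           # e.g., smoothed tracked positions
y = rng.integers(0, 3, size=n)                 # sit / stand / walk labels

fused = np.hstack([eeg_feats, imu_feats])      # early fusion: concatenate features

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, fused, y, cv=10) # 10-fold CV, as in the paper
print(f"mean accuracy: {scores.mean():.2f}")
```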
Affiliation(s)
- Manuel Graña: Computational Intelligence Group, University of the Basque Country (UPV/EHU), San Sebastian, Spain
- Marina Aguilar-Moreno: Computational Intelligence Group, University of the Basque Country (UPV/EHU), San Sebastian, Spain
- Javier De Lope Asiain: Department of Artificial Intelligence, Universidad Politécnica de Madrid (UPM), Madrid, Spain
- Xavier Garmendia: Computational Intelligence Group, University of the Basque Country (UPV/EHU), San Sebastian, Spain
7. Fachner JC, Maidhof C, Grocke D, Nygaard Pedersen I, Trondalen G, Tucek G, Bonde LO. "Telling me not to worry…" Hyperscanning and Neural Dynamics of Emotion Processing During Guided Imagery and Music. Front Psychol 2019;10:1561. PMID: 31402880; PMCID: PMC6673756; DOI: 10.3389/fpsyg.2019.01561.
Abstract
To analyze how emotions and imagery are shared, processed, and recognized in Guided Imagery and Music, we measured the brain activity of an experienced therapist ("Guide") and client ("Traveler") with dual-EEG in a real therapy session about the potential death of family members. Synchronously with the EEG, the session was videotaped and then micro-analyzed. Four raters identified therapeutically important moments of interest (MOIs) and moments of no interest (MONIs), which were transcribed and annotated. Several indices of emotion- and imagery-related processing were analyzed: frontal and parietal alpha asymmetry, frontal midline theta, and occipital alpha activity. Session ratings showed overlaps across all raters, confirming the importance of these MOIs, which showed different cortical activity in visual areas compared to resting state. MOI 1 was a pivotal moment including an important image with a message of hope from a close family member, while in the second MOI the Traveler sent a message to an unborn baby. Generally, results indicated that the emotions of Traveler and Guide during important moments were not positive, pleasurable, or relaxed when compared to resting state, confirming that both were dealing with negative emotions and anxiety that had to be contained in the interpersonal process. However, the temporal dynamics of emotion-related markers suggested shifts in emotional valence and intensity during these important, personally meaningful moments; for example, while the Traveler received the message of hope, an increase in frontal alpha asymmetry was observed, reflecting increased positive emotional processing. EEG source localization during the message suggested a peak activation in the left middle temporal gyrus. Interestingly, peaks in emotional markers in the Guide partly paralleled the Traveler's peaks; for example, during the Guide's strong feeling of mutuality in MOI 2, the time series of frontal alpha asymmetries showed a significant cross-correlation, indicating similar emotional processing in Traveler and Guide. Investigating the moment-to-moment interaction in music therapy showed how asymmetry peaks align with the situated cognition of Traveler and Guide along the emotional contour of the music, representing the highs and lows of the therapy process. Combining dual-EEG with detailed audiovisual and qualitative data appears to be a promising approach for further research into music therapy.
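A sketch of the two markers highlighted above — frontal alpha asymmetry and its cross-correlation between participants; the electrode pairing (e.g., F4/F3), the windowing, and the stand-in data are assumptions.

```python
import numpy as np
from scipy.signal import correlate

def frontal_alpha_asymmetry(alpha_left, alpha_right):
    """FAA = ln(right alpha power) - ln(left alpha power) per time window.
    Because alpha is inversely related to cortical activity, more positive
    values are conventionally read as relatively greater left-frontal
    activation (approach / positive affect)."""
    return np.log(alpha_right) - np.log(alpha_left)

# Placeholder alpha-power time courses (e.g., F4/F3) for Traveler and Guide
rng = np.random.default_rng(1)
faa_t = frontal_alpha_asymmetry(rng.uniform(1, 2, 300), rng.uniform(1, 2, 300))
faa_g = frontal_alpha_asymmetry(rng.uniform(1, 2, 300), rng.uniform(1, 2, 300))

# Normalized cross-correlation between the two asymmetry time series
a = (faa_t - faa_t.mean()) / faa_t.std()
b = (faa_g - faa_g.mean()) / faa_g.std()
xcorr = correlate(a, b, mode='full') / len(a)
lags = np.arange(-len(a) + 1, len(a))
print(f"peak r = {xcorr.max():.2f} at lag {lags[xcorr.argmax()]} windows")
```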
Affiliation(s)
- Jörg C Fachner: Cambridge Institute for Music Therapy Research, Anglia Ruskin University, Cambridge, United Kingdom; Josef Ressel Centre for Personalised Music Therapy, IMC University of Applied Sciences Krems, Krems an der Donau, Austria
- Clemens Maidhof: Cambridge Institute for Music Therapy Research, Anglia Ruskin University, Cambridge, United Kingdom; Josef Ressel Centre for Personalised Music Therapy, IMC University of Applied Sciences Krems, Krems an der Donau, Austria
- Denise Grocke: Melbourne Conservatorium of Music, University of Melbourne, Melbourne, VIC, Australia
- Inge Nygaard Pedersen: Department of Communication and Psychology, The Faculty of Humanities, Aalborg University, Aalborg, Denmark
- Gro Trondalen: Centre for Research in Music and Health, Norwegian Academy of Music, Oslo, Norway
- Gerhard Tucek: Josef Ressel Centre for Personalised Music Therapy, IMC University of Applied Sciences Krems, Krems an der Donau, Austria
- Lars O Bonde: Department of Communication and Psychology, The Faculty of Humanities, Aalborg University, Aalborg, Denmark; Centre for Research in Music and Health, Norwegian Academy of Music, Oslo, Norway
8. Zamm A, Wang Y, Palmer C. Musicians' Natural Frequencies of Performance Display Optimal Temporal Stability. J Biol Rhythms 2018;33:432-440. PMID: 29940801; DOI: 10.1177/0748730418783651.
Abstract
Many human action sequences, such as speaking and performing music, are inherently rhythmic: Sequence events are produced at quasi-regular temporal intervals. A wide range of interindividual variation has been noted in spontaneous production rates of these rhythmic action sequences. Dynamical theories of motor coordination suggest that individuals spontaneously produce rhythmic sequences at a natural frequency characterized by minimal energy expenditure and maximal temporal stability, relative to other frequencies. We tested this hypothesis by comparing the temporal variability with which musicians performed rhythmic melodies at their natural spontaneous rate with variability in their performances at faster and slower rates. Musicians' temporal variability was lowest during performances at their spontaneous rate; in addition, performers' tempo drift during trials at other rates showed bias toward their spontaneous rate. This study provides the first direct evidence that spontaneous rates of motor coordination represent optimally stable natural frequencies of endogenous rhythms.
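The standard index behind "temporal variability" here is the coefficient of variation of inter-onset intervals; a minimal sketch with hypothetical performance data follows.

```python
import numpy as np

def temporal_variability(onsets_s):
    """Coefficient of variation of inter-onset intervals (IOIs):
    a dimensionless index of timing variability (lower = more stable)."""
    iois = np.diff(onsets_s)
    return np.std(iois, ddof=1) / np.mean(iois)

# Hypothetical tone onsets from one melody performance at a ~400-ms IOI
rng = np.random.default_rng(2)
onsets = np.cumsum(np.full(48, 0.400) + rng.normal(0, 0.008, 48))
print(f"CV = {temporal_variability(onsets):.3f}")
```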
Affiliation(s)
- Caroline Palmer: Department of Psychology, McGill University, Montreal, QC, Canada
9. Zamm A, Debener S, Bauer AKR, Bleichner MG, Demos AP, Palmer C. Amplitude envelope correlations measure synchronous cortical oscillations in performing musicians. Ann N Y Acad Sci 2018;1423:251-263. PMID: 29756657; DOI: 10.1111/nyas.13738.
Abstract
A major question facing cognitive neuroscience is how to measure interbrain synchrony between individuals performing joint actions. We describe the application of a novel method for measuring musicians' interbrain synchrony: amplitude envelope correlations (AECs). Amplitude envelopes (AEs) reflect energy fluctuations in cortical oscillations over time; AE correlations measure the degree to which two envelopes' fluctuations are temporally correlated, such as those of cortical oscillations arising from two individuals performing a joint action. Wireless electroencephalography was recorded from two pianists performing a musical duet; an analysis pipeline is described for computing AEs of cortical oscillations at the duet performance frequency (the number of tones produced per second) to test whether these oscillations reflect the temporal dynamics of partners' performances. The pianists' AE correlations were compared with correlations based on a distribution of AEs simulated from white-noise signals using the same methods. The AE method was also applied to the temporal characteristics of the pianists' performances to show that the observed pair's AEs reflect the temporal dynamics of their performance. AE correlations offer a promising approach for assessing interbrain correspondences in cortical activity associated with performing joint tasks.
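A sketch of the surrogate logic described above: envelopes computed from independent white-noise signals, processed through the same band-pass/Hilbert pipeline, form a null distribution against which an observed AEC can be compared. All parameters and the observed value are placeholders.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope(x, fs, f_perf, half_bw=0.5):
    """Band-pass around the performance frequency, then Hilbert envelope."""
    b, a = butter(4, [(f_perf - half_bw) / (fs / 2),
                      (f_perf + half_bw) / (fs / 2)], btype='band')
    return np.abs(hilbert(filtfilt(b, a, x)))

fs, f_perf, n = 250.0, 2.0, int(250 * 60)      # assumed parameters
rng = np.random.default_rng(3)

# Null distribution: AECs between envelopes of independent white-noise signals
null = []
for _ in range(1000):
    e1 = envelope(rng.standard_normal(n), fs, f_perf)
    e2 = envelope(rng.standard_normal(n), fs, f_perf)
    null.append(np.corrcoef(e1, e2)[0, 1])
null = np.array(null)

observed_aec = 0.35                            # placeholder observed value
p = (np.sum(null >= observed_aec) + 1) / (null.size + 1)
print(f"one-sided p = {p:.4f} vs. white-noise surrogates")
```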
Affiliation(s)
- Anna Zamm: Sequence Production Laboratory, Department of Psychology, McGill University, Montreal, Canada
- Stefan Debener: Neuropsychology Laboratory, European Medical School, Institute for Psychology, University of Oldenburg, Oldenburg, Germany; Cluster of Excellence Hearing4all Oldenburg, University of Oldenburg, Oldenburg, Germany
- Anna-Katharina R Bauer: Neuropsychology Laboratory, European Medical School, Institute for Psychology, University of Oldenburg, Oldenburg, Germany
- Martin G Bleichner: Neuropsychology Laboratory, European Medical School, Institute for Psychology, University of Oldenburg, Oldenburg, Germany; Cluster of Excellence Hearing4all Oldenburg, University of Oldenburg, Oldenburg, Germany
- Alexander P Demos: Sequence Production Laboratory, Department of Psychology, McGill University, Montreal, Canada
- Caroline Palmer: Sequence Production Laboratory, Department of Psychology, McGill University, Montreal, Canada
10.
Abstract
The Musical Instrument Digital Interface (MIDI) was readily adopted for auditory sensorimotor synchronization experiments. These experiments typically use MIDI percussion pads to collect responses, a MIDI–USB converter (or MIDI–PCI interface) to record responses on a PC and manipulate feedback, and an external MIDI sound module to generate auditory feedback. Previous studies have suggested that auditory feedback latencies can be introduced by these devices. The Schultz MIDI Benchmarking Toolbox (SMIDIBT) is an open-source, Arduino-based package designed to measure the point-to-point latencies incurred by several devices used in the generation of response-triggered auditory feedback. Experiment 1 showed that MIDI messages are sent and received within 1 ms (on average) in the absence of any external MIDI device. Latencies decreased when the baud rate increased above the MIDI protocol default (31,250 bps). Experiment 2 benchmarked the latencies introduced by different MIDI–USB and MIDI–PCI interfaces. MIDI–PCI was superior to MIDI–USB, primarily because MIDI–USB is subject to USB polling. Experiment 3 tested three MIDI percussion pads. Both the audio and MIDI message latencies were significantly greater than 1 ms for all devices, and there were significant differences between percussion pads and instrument patches. Experiment 4 benchmarked four MIDI sound modules. Audio latencies were significantly greater than 1 ms, and there were significant differences between sound modules and instrument patches. These experiments suggest that millisecond accuracy might not be achievable with MIDI devices. The SMIDIBT can be used to benchmark a range of MIDI devices, thus allowing researchers to make informed decisions when choosing testing materials and to arrive at an acceptable latency at their discretion.
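The SMIDIBT itself is Arduino-based hardware; as a software-only illustration of the same round-trip idea, the sketch below uses the mido Python library to time messages through a MIDI loopback. The port names are placeholders, and timing resolution is limited by the host OS, so this is a rough check rather than a hardware-grade benchmark.

```python
import time
import mido  # pip install mido python-rtmidi

OUT_PORT = 'loopback-out'   # placeholder names: point these at a physical
IN_PORT = 'loopback-in'     # or virtual MIDI loopback connection

with mido.open_output(OUT_PORT) as out, mido.open_input(IN_PORT) as inp:
    latencies = []
    for note in range(60, 72):
        t0 = time.perf_counter()
        out.send(mido.Message('note_on', note=note, velocity=64))
        msg = inp.receive()                 # blocks until the echo arrives
        latencies.append((time.perf_counter() - t0) * 1000)
        time.sleep(0.05)                    # brief gap between trials

print(f"mean round-trip: {sum(latencies) / len(latencies):.2f} ms "
      f"(min {min(latencies):.2f}, max {max(latencies):.2f})")
```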