1. Sakalidis KE, Menting SGP, Hettinga FJ. Influence of intellectual disability on exercise regulation: exploring verbal, auditory and visual guidance to contribute to promote inclusive exercise environments. BMJ Open Sport Exerc Med 2024; 10:e001765. PMID: 38196941; PMCID: PMC10773414; DOI: 10.1136/bmjsem-2023-001765.
Abstract
Objective: The role of intellectual disability (ID) in exercise regulation has remained largely unexplored, yet recent studies have indicated cognitively related impairments in the pacing skills of people with ID. In a well-controlled laboratory environment, this study aimed to (1) establish the role of ID in pacing and explore the ability of people with and without ID to maintain a steady pace, and to investigate whether (2) verbal feedback and/or (3) the presence of a pacer can improve the ability of people with ID to maintain a preplanned submaximal velocity. Methods: Participants with (n=10) and without ID (n=10) were recruited and performed 7 min submaximal trials on a cycle ergometer (Velotron). Participants with ID also performed a cycling trial with a pacer (a virtual avatar). Results: Non-parametric tests for repeated-measures data (p≤0.05) showed that (1) people with ID deviated more from the targeted pace than people without ID, (2) verbal feedback did not influence their ability to keep a steady pace, and (3) they deviated less from the targeted pace when a visual pacer was introduced. Conclusion: The results revealed the difficulties of people with ID in planning and monitoring their exercise, and in responding appropriately to auditory and verbal feedback. Coaches and stakeholders who want to offer inclusive exercise pathways should consider that people with ID perform and pace themselves better when supported by intuitive, visual and personally meaningful stimuli such as other cyclists (avatars).
2. Lapenta OM, Keller PE, Nozaradan S, Varlet M. Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation. Exp Brain Res 2023; 241:875-887. PMID: 36788141; PMCID: PMC9985575; DOI: 10.1007/s00221-023-06569-x.
Abstract
Human movement synchronisation with moving objects strongly relies on visual input. However, auditory information also plays an important role, since real environments are intrinsically multimodal. We used electroencephalography (EEG) frequency tagging to investigate the selective neural processing and integration of visual and auditory information during motor tracking, and tested the effects of spatial and temporal congruency between audiovisual modalities. EEG was recorded while participants tracked with their index finger a red flickering dot (rate fV = 15 Hz) oscillating horizontally on a screen. The simultaneous auditory stimulus was modulated in pitch (rate fA = 32 Hz) and lateralised between the left and right audio channels to induce the perception of a periodic displacement of the sound source. Audiovisual congruency was manipulated in terms of space in Experiment 1 (no motion, same direction or opposite direction) and timing in Experiment 2 (no delay, medium delay or large delay). In both experiments, significant EEG responses were elicited at the fV and fA tagging frequencies. It was also hypothesised that intermodulation products corresponding to the nonlinear integration of the visual and auditory stimuli at frequencies fV ± fA would be elicited due to audiovisual integration, especially in congruent conditions. However, these components were not observed. Moreover, synchronisation and EEG results were not influenced by the congruency manipulations, which invites further exploration of the conditions that may modulate audiovisual processing and the motor tracking of moving objects.
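The frequency-tagging logic above can be made concrete with a short sketch (an assumption about the analysis idea, not the authors' code): nonlinear audiovisual integration would surface as intermodulation components at sums and differences of the two tagging frequencies.

```python
# Sketch (hypothetical helper, not from the paper): with frequency tagging,
# visual and auditory stimuli are tagged at fV = 15 Hz and fA = 32 Hz, and
# nonlinear integration would appear at intermodulation (IM) frequencies.
fV, fA = 15.0, 32.0  # tagging frequencies (Hz), from the abstract

def intermod_frequencies(fv, fa, order=1):
    """Intermodulation products |n*fv - m*fa| and n*fv + m*fa for n, m <= order."""
    freqs = set()
    for n in range(1, order + 1):
        for m in range(1, order + 1):
            freqs.add(abs(n * fv - m * fa))
            freqs.add(n * fv + m * fa)
    return sorted(freqs)

print(intermod_frequencies(fV, fA))  # → [17.0, 47.0]
```

With the abstract's fV = 15 Hz and fA = 32 Hz, the first-order components fall at 17 Hz and 47 Hz, which is where such responses would have been expected in the EEG spectrum.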
Affiliation(s)
- Olivia Morgan Lapenta
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia.
- Psychological Neuroscience Lab, Center for Investigation in Psychology, University of Minho, Rua da Universidade, 4710-057, Braga, Portugal.
- Peter E Keller
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Center for Music in the Brain, Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Sylvie Nozaradan
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- Institute of Neuroscience, Université Catholique de Louvain, Woluwe-Saint-Lambert, Belgium
- Manuel Varlet
- The MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Penrith, Australia
- School of Psychology, Western Sydney University, Penrith, Australia
3. Guo J, Liu T, Wang J. Effects of auditory feedback on fine motor output and corticomuscular coherence during a unilateral finger pinch task. Front Neurosci 2022; 16:896933. DOI: 10.3389/fnins.2022.896933.
Abstract
Auditory feedback is important for reducing movement error and improving motor performance during precise motor tasks. Accurate motion guided by auditory feedback may rely on the neuromuscular transmission pathway between the sensorimotor area and the effector muscle. However, it remains unclear how neural activities and sensorimotor loops contribute to enhancing performance. The present study used an auditory feedback system, simultaneously recording the electroencephalogram (EEG), electromyography (EMG) and exerted force, to measure corticomuscular coherence (CMC), neural activity and motor performance during a precise unilateral right-hand pinch between the thumb and the index finger, with and without auditory feedback. This study confirms three results. First, auditory feedback decreased movement errors compared with no auditory feedback. Second, auditory feedback decreased the power spectrum in the beta band over the bilateral sensorimotor cortex and in the alpha band over the ipsilateral sensorimotor cortex. Finally, CMC was computed between the effector muscle of the right hand and the contralateral sensorimotor cortex; analyses reveal that beta-band CMC decreased significantly in the auditory feedback condition compared with the no-feedback condition. The results indicate that auditory feedback decreases spectral power in the alpha and beta bands and weakens the corticospinal connection in the beta band during precise hand control. This study provides a new perspective on the effect of auditory feedback on behaviour and brain activity, and offers a new idea for designing more suitable and effective rehabilitation and training strategies to improve fine motor performance.
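As a rough illustration of the CMC measure named above (a generic sketch of the method on synthetic signals, not the study's pipeline; all parameters are invented), coherence can be estimated between an EEG channel and an EMG signal that share a beta-band component:

```python
import numpy as np
from scipy.signal import coherence

# Corticomuscular coherence sketch: coherence between a simulated EEG channel
# over the contralateral sensorimotor cortex and the EMG of the effector
# muscle, inspected in the beta band (~15-30 Hz). Parameters are hypothetical.
fs = 1000                       # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)    # 30 s of data
rng = np.random.default_rng(0)

beta_drive = np.sin(2 * np.pi * 22 * t)               # shared 22 Hz cortical drive
eeg = beta_drive + rng.standard_normal(t.size)         # EEG = drive + noise
emg = 0.5 * beta_drive + rng.standard_normal(t.size)   # EMG shares the drive

f, Cxy = coherence(eeg, emg, fs=fs, nperseg=2048)
beta = (f >= 15) & (f <= 30)
print(f"peak beta-band CMC: {Cxy[beta].max():.2f} "
      f"at {f[beta][Cxy[beta].argmax()]:.1f} Hz")
```

A drop in this beta-band peak with auditory feedback would correspond to the weakened corticospinal connection the study reports.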
4. Cellists' sound quality is shaped by their primary postural behavior. Sci Rep 2020; 10:13882. PMID: 32807898; PMCID: PMC7431865; DOI: 10.1038/s41598-020-70705-8.
Abstract
During the last 20 years, the role of musicians’ body movements has emerged as a central question in instrument practice: Why do musicians make so many postural movements, for instance, with their torsos and heads, while playing musical instruments? The musical significance of such ancillary gestures is still an enigma and therefore remains a major pedagogical challenge, since one does not know if these movements should be considered essential embodied skills that improve musical expressivity. Although previous studies established clear connections between musicians’ body movements and musical structures (particularly for clarinet, piano or violin performances), no evidence of direct relationships between body movements and the quality of the produced timbre has ever been found. In this study, focusing on the area of bowed-string instruments, we address the problem by showing that cellists use a set of primary postural directions to develop fluid kinematic bow features (velocity, acceleration) that prevent the production of poor quality (i.e., harsh, shrill, whistling) sounds. By comparing the body-related angles between normal and posturally constrained playing situations, our results reveal that the chest rotation and vertical inclination made by cellists act as coordinative support for the kinematics of the bowing gesture. These findings support the experimental works of Alexander, especially those that showed the role of head movements with respect to the upper torso (the so-called primary control) in ensuring the smooth transmission of fine motor control in musicians all the way to the produced sound. More generally, our research highlights the importance of focusing on this fundamental postural sense to improve the quality of human activities across different domains (music, dance, sports, rehabilitation, working positions, etc.).
5.
Abstract
Reaching movements are usually initiated by visual events and controlled visually and kinesthetically. Recent studies have focused on the possible benefit of auditory information for localisation tasks and for movement control. This explorative study aimed to investigate whether reaching space can be coded purely by auditory information; therefore, the precision of reaching movements to merely acoustically coded target positions was analysed. We studied the efficacy of acoustically effect-based instruction and feedback, and of additional acoustically performance-based instruction and feedback, as well as the role of visual movement control. Twenty-four participants executed reaching movements to merely acoustically presented, invisible target positions in three mutually perpendicular planes in front of them. Effector-endpoint trajectories were tracked using inertial sensors. Kinematic data regarding the three spatial dimensions and the movement velocity were sonified, thus providing acoustic instruction and real-time feedback on the movement trajectories and the target position of the hand. The subjects were able to align their reaching movements to the merely acoustically instructed targets. Reaching space can therefore be coded merely acoustically, and additional visual movement control does not enhance reaching performance. On the basis of these results, a remarkable benefit of kinematic movement acoustics for the neuromotor rehabilitation of everyday motor skills can be assumed.
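A minimal sketch of the kind of kinematic sonification mapping described (the mapping choices, parameter names and ranges are hypothetical, not the study's actual design): the three spatial coordinates and the movement velocity are each mapped onto an audible parameter.

```python
# Hypothetical sonification mapping: one kinematic sample (normalised
# coordinates and speed) -> sound-synthesis parameters. All ranges assumed.
def sonify_sample(x, y, z, speed):
    """Map one kinematic sample to sound-synthesis parameters."""
    pitch_hz = 220.0 + 440.0 * max(0.0, min(y, 1.0))   # height -> pitch (220-660 Hz)
    pan = max(-1.0, min(x, 1.0))                        # left/right -> stereo pan
    brightness = max(0.0, min(z, 1.0))                  # depth -> filter opening
    loudness = max(0.0, min(speed / 2.0, 1.0))          # speed -> amplitude
    return {"pitch_hz": pitch_hz, "pan": pan,
            "brightness": brightness, "loudness": loudness}

print(sonify_sample(x=0.2, y=0.5, z=0.8, speed=1.0))
```

Streaming such parameters to a synthesiser at the sensor rate yields the continuous acoustic instruction and feedback the abstract describes.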
6. Cuppone AV, Cappagli G, Gori M. Audio-Motor Training Enhances Auditory and Proprioceptive Functions in the Blind Adult. Front Neurosci 2019; 13:1272. PMID: 31824258; PMCID: PMC6883219; DOI: 10.3389/fnins.2019.01272.
Abstract
Several reports indicate that spatial perception in blind individuals can be impaired as the lack of visual experience severely affects the development of multisensory spatial correspondences. Despite the growing interest in the development of technological devices to support blind people in their daily lives, very few studies have assessed the benefit of interventions that help to refine sensorimotor perception. In the present study, we directly investigated the impact of a short audio-motor training on auditory and proprioceptive spatial perception in blind individuals. Our findings indicate that auditory and proprioceptive spatial capabilities can be enhanced through interventions designed to foster sensorimotor perception in the form of audio-motor correspondences, demonstrating the importance of the early introduction of sensorimotor training in therapeutic intervention for blind individuals.
Affiliation(s)
- Anna Vera Cuppone
- Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
- Giulia Cappagli
- Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy; IRCCS Fondazione Istituto Neurologico C. Mondino, Pavia, Italy
- Monica Gori
- Unit for Visually Impaired People, Fondazione Istituto Italiano di Tecnologia, Genoa, Italy
7. Barramuño M, Valdés-Badilla P, Guevara E. Variations in glenohumeral movement control when implementing an auditory feedback system: a pilot study. Revista de la Facultad de Medicina 2019. DOI: 10.15446/revfacmed.v67n4.69456.
Abstract
Introduction: Human motor control requires a learning process and can be trained by means of various sensory feedback sources. Objective: To determine variations in glenohumeral movement control achieved through learning in young adults exposed to an auditory feedback system while performing object-translation tasks classified by difficulty level. Materials and methods: The study involved 45 volunteers of both sexes (22 women), aged between 18 and 32 years. Glenohumeral movement control was measured by means of the root mean square (RMS) of the accelerometry signal, while task execution speed (TES) was measured using an accelerometer during the execution of the task according to its difficulty (easy, moderate and hard) in four randomised intervention stages (control, pre-exposure, exposure with auditory feedback, and post-exposure). Results: Statistically significant differences (p<0.001) were found between the pre-exposure and exposure stages and between the pre-exposure and post-exposure stages. A significant increase (p<0.001) in TES was identified between the pre-exposure and exposure stages for tasks classified as easy and hard. Conclusion: The use of an auditory feedback system in young adults without pathologies enhanced learning and glenohumeral movement control without reducing TES. This effect was maintained after the feedback was removed, so this type of feedback system could be a useful strategy for training motor control of the shoulder in healthy individuals.
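The RMS measure used here to quantify movement control is straightforward to compute; a minimal sketch (the sample values are hypothetical):

```python
import math

# RMS of an accelerometry signal, as used in the study to quantify
# glenohumeral movement control. Sample data below are invented.
def rms(samples):
    """Root mean square of a sequence of accelerometer samples."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

accel = [0.1, -0.3, 0.2, -0.1, 0.4]  # hypothetical acceleration samples (m/s^2)
print(f"RMS = {rms(accel):.3f}")
```

Lower RMS of the residual acceleration around the intended trajectory would indicate smoother, better-controlled movement.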
8. Rand MK. Effects of auditory feedback on movements with two-segment sequence and eye-hand coordination. Exp Brain Res 2018; 236:3131-3148. PMID: 30159590; DOI: 10.1007/s00221-018-5366-4.
Abstract
The present study investigated the effect of auditory feedback on the planning and control of two-segment reaching movements and on eye-hand coordination. In particular, it was examined whether additional auditory information indicating the progression of the initial reach (i.e., passing the midway point and contacting the target) affects the performance of that reach and the gaze shift to the second target at the transition between the two segments. Young adults performed a rapid two-segment reaching task, in which both the first and second segments had two target sizes. One of three auditory feedback conditions included the reach-progression information: a continuous tone was delivered at a consistent timing during the initial reach, from the midway point to the target contact. The other two were control conditions: a continuous tone was delivered at a random timing in one condition, or not delivered in the other. The results showed that the initial reach became more accurate with the auditory reach-progression cue than without any auditory cue. When that cue was available, the movement time of the initial reach decreased, accompanied by an increased peak velocity and a decreased time to peak velocity. These findings suggest that the auditory reach-progression feedback enhanced the preplanned control of the initial reach. The deceleration time of that reach also decreased with auditory feedback, but this was observed regardless of whether the sound contained the reach-progression information. At the transition between the two segments, the onset latencies of both the gaze shift and the reach to the second target became shorter with the auditory reach-progression cue, an effect that was pronounced when the initial reach had a higher terminal accuracy constraint. This suggests that the reach-progression cue enhanced verification of the termination of the initial reach, thereby facilitating the initiation of eye and hand movements to the second target. Taken together, additional auditory information about reach progression enhances the planning and control of multi-segment reaches and eye-hand coordination at the segment transition.
Affiliation(s)
- Miya K Rand
- Leibniz Research Centre for Working Environment and Human Factors (IfADo), Ardeystraße 67, 44139, Dortmund, Germany.
9. Lee BC, Fung A, Thrasher TA. The Effects of Coding Schemes on Vibrotactile Biofeedback for Dynamic Balance Training in Parkinson's Disease and Healthy Elderly Individuals. IEEE Trans Neural Syst Rehabil Eng 2017; 26:153-160. PMID: 29053448; DOI: 10.1109/tnsre.2017.2762239.
Abstract
The coding schemes of earlier vibrotactile biofeedback systems for balance-related applications were primarily binary in nature, either on or off at a given threshold (range of postural tilt), making them unable to convey information about error magnitude. The purpose of this study was to explore the effects of two coding schemes (binary versus continuous) for vibrotactile biofeedback during dynamic weight-shifting exercises, which are balance exercises commonly recommended by physical therapists in clinical settings. Nine individuals with idiopathic Parkinson's disease and nine healthy elderly individuals participated. All participants performed dynamic weight-shifting exercises assisted by either binary or continuous vibrotactile biofeedback, delivered using vibrating actuators (tactors) in either the anterior-posterior or medial-lateral direction. Participants' limits of stability before and after the exercises were compared to evaluate the effects of the exercises on their range of motion. The continuous coding scheme produced significantly better performance than the binary scheme in both groups during dynamic weight-shifting balance exercises with assistive vibrotactile biofeedback. The results have implications for maximising the effects of error-driven motor learning and increasing performance in balance rehabilitation training combined with vibrotactile biofeedback.
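The contrast between the two coding schemes can be sketched as follows (the threshold and scaling values are assumptions for illustration, not those of the study):

```python
# Two hypothetical tilt-error -> vibration-intensity mappings. A binary scheme
# conveys only threshold crossing; a continuous scheme also conveys magnitude.
def binary_coding(tilt_error, threshold=2.0):
    """On/off vibration: full intensity past a tilt threshold (degrees)."""
    return 1.0 if abs(tilt_error) > threshold else 0.0

def continuous_coding(tilt_error, max_error=10.0):
    """Vibration intensity proportional to tilt error, clipped to [0, 1]."""
    return min(abs(tilt_error) / max_error, 1.0)

for err in (1.0, 3.0, 8.0):
    print(err, binary_coding(err), continuous_coding(err))
```

Because the continuous mapping preserves error magnitude, it supports the error-driven corrections that the study found to yield better performance.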
10. Dyer JF, Stapleton P, Rodger M. Mapping Sonification for Perception and Action in Motor Skill Learning. Front Neurosci 2017; 11:463. PMID: 28871218; PMCID: PMC5566964; DOI: 10.3389/fnins.2017.00463.
Affiliation(s)
- John F Dyer
- School of Psychology, Queen's University Belfast, Antrim, United Kingdom
- Paul Stapleton
- Sonic Arts Research Centre, School of Arts, English and Languages, Queen's University Belfast, Antrim, United Kingdom
- Matthew Rodger
- School of Psychology, Queen's University Belfast, Antrim, United Kingdom
11. Boyer EO, Portron A, Bevilacqua F, Lorenceau J. Continuous Auditory Feedback of Eye Movements: An Exploratory Study toward Improving Oculomotor Control. Front Neurosci 2017; 11:197. PMID: 28487626; PMCID: PMC5403913; DOI: 10.3389/fnins.2017.00197.
Abstract
As eye movements are mostly automatic and overtly generated to attain visual goals, individuals have poor metacognitive knowledge of their own eye movements. We present an exploratory study on the effects of real-time continuous auditory feedback generated by eye movements. We considered both a tracking task and a production task in which smooth pursuit eye movements (SPEM) can be endogenously generated. In particular, we used a visual paradigm that enables SPEM to be generated and controlled in the absence of a moving visual target. We investigated whether real-time auditory feedback of eye movement dynamics might improve learning in both tasks, through a training protocol over 8 days. The results indicate that real-time sonification of eye movements can indeed modify oculomotor behaviour and reinforce intrinsic oculomotor perception. Nevertheless, large inter-individual differences were observed, preventing us from reaching a strong conclusion on sensorimotor learning improvements.
Affiliation(s)
- Eric O Boyer
- STMS Lab, IRCAM - Centre National de la Recherche Scientifique - UPMC, Paris, France.
- Arthur Portron
- Laboratoire des Systèmes Perceptifs, LSP Centre National de la Recherche Scientifique (CNRS), UMR8248, Département d'Etudes Cognitives, Ecole Normale Supérieure-PSL, Paris, France.
- Frederic Bevilacqua
- STMS Lab, IRCAM - Centre National de la Recherche Scientifique - UPMC, Paris, France.
- Jean Lorenceau
- Laboratoire des Systèmes Perceptifs, LSP Centre National de la Recherche Scientifique (CNRS), UMR8248, Département d'Etudes Cognitives, Ecole Normale Supérieure-PSL, Paris, France.
12. van Vugt FT, Kafczyk T, Kuhn W, Rollnik JD, Tillmann B, Altenmüller E. The role of auditory feedback in music-supported stroke rehabilitation: a single-blinded randomised controlled intervention. Restor Neurol Neurosci 2016; 34:297-311. PMID: 26923616; DOI: 10.3233/rnn-150588.
Abstract
PURPOSE: Learning to play musical instruments such as the piano was previously shown to benefit post-stroke motor rehabilitation. Previous work hypothesised that the mechanism of this rehabilitation is that patients use auditory feedback to correct their movements and thereby show motor learning. We tested this hypothesis by manipulating the timing of auditory feedback in a way that should disrupt such error-based learning. METHODS: We contrasted a patient group undergoing music-supported therapy on a piano that emits sounds immediately (as in previous studies) with a group whose sounds were presented after a jittered delay. The delay was not noticeable to patients. Thirty-four patients in early stroke rehabilitation with moderate motor impairment and no previous musical background learned to play the piano using simple finger exercises and familiar children's songs. RESULTS: Rehabilitation outcome was not impaired in the jitter group relative to the normal group. On the contrary, some clinical tests suggest the jitter group outperformed the normal group. CONCLUSIONS: Auditory feedback-based motor learning is not the beneficial mechanism of music-supported therapy, and immediate auditory feedback therapy may be suboptimal. A jittered delay may increase the efficacy of the therapy and allow patients to benefit fully from the motivational factors of music training. Our study shows a novel way to test hypotheses concerning music training in a single-blinded way, an important improvement over existing unblinded tests of music interventions.
Affiliation(s)
- F T van Vugt
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Emmichplatz, Hannover, Germany; Lyon Neuroscience Research Center, Auditory Cognition and Psychoacoustics Team, CNRS-UMR 5292, INSERM U1028, University Lyon-1, 50 av Tony Garnier, Lyon, France
- T Kafczyk
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Emmichplatz, Hannover, Germany
- W Kuhn
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Emmichplatz, Hannover, Germany
- J D Rollnik
- Institute for Neurorehabilitational Research (InFo), BDH-Clinic Teaching Hospital of Hannover Medical School (MHH), Greitstrasse 18, Hessisch Oldendorf, Germany
- B Tillmann
- Lyon Neuroscience Research Center, Auditory Cognition and Psychoacoustics Team, CNRS-UMR 5292, INSERM U1028, University Lyon-1, 50 av Tony Garnier, Lyon, France
- E Altenmüller
- Institute of Music Physiology and Musicians' Medicine, University of Music, Drama and Media, Emmichplatz, Hannover, Germany
13. Investigating three types of continuous auditory feedback in visuo-manual tracking. Exp Brain Res 2016; 235:691-701. PMID: 27858128; DOI: 10.1007/s00221-016-4827-x.
Abstract
The use of continuous auditory feedback for motor control and learning is still understudied and deserves more attention regarding both fundamental mechanisms and applications. This paper presents the results of three experiments studying the contribution of task-, error- and user-related sonification to visuo-manual tracking, and assessing its benefits for sensorimotor learning. The first results show that sonification can help decrease the tracking error and increase the energy in participants' movements. In the second experiment, when feedback presence was alternated, the user-related sonification did not show feedback-dependency effects, contrary to the error- and task-related feedback. In the third experiment, a reduced exposure of 50% diminished the positive effect of sonification on performance, whereas the increase in average movement energy with sound remained significant. In a retention test performed on the next day without auditory feedback, movement energy was still higher for the groups previously trained with the feedback. Although performance was not affected by sound, a learning effect was measurable in both sessions, and the user-related group also improved its performance in the retention test. These results confirm that continuous auditory feedback can be beneficial for movement training and also show an interesting effect of sonification on movement energy. User-related sonification can prevent feedback dependency and increase retention. Consequently, sonification of the user's own motion appears to be a promising solution to support movement learning with interactive feedback.
14. Oscari F, Oboe R, Daud Albasini OA, Masiero S, Rosati G. Design and Construction of a Bilateral Haptic System for the Remote Assessment of the Stiffness and Range of Motion of the Hand. Sensors 2016; 16:1633. PMID: 27706085; PMCID: PMC5087421; DOI: 10.3390/s16101633.
Abstract
The use of haptic devices in the rehabilitation of impaired limbs has become rather popular, given the proven effectiveness in promoting recovery. In a standard framework, such devices are used in rehabilitation centers, where patients interact with virtual tasks, presented on a screen. To track their sessions, kinematic/dynamic parameters or performance scores are recorded. However, as Internet access is now available at almost every home and in order to reduce the hospitalization time of the patient, the idea of doing rehabilitation at home is gaining wide consent. Medical care programs can be synchronized with the home rehabilitation device; patient data can be sent to the central server that could redirect to the therapist laptop (tele-healthcare). The controversial issue is that the recorded data do not actually represent the clinical conditions of the patients according to the medical assessment scales, forcing them to frequently undergo clinical tests at the hospital. To respond to this demand, we propose the use of a bilateral master/slave haptic system that could allow the clinician, who interacts with the master, to assess remotely and in real time the clinical conditions of the patient that uses the home rehabilitation device as the slave. In this paper, we describe a proof of concept to highlight the main issues of such an application, limited to one degree of freedom, and to the measure of the stiffness and range of motion of the hand.
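The stiffness assessment at the core of this proof of concept can be illustrated with a simple least-squares estimate (a sketch of the general idea of fitting F = k·x + b to paired force/displacement samples; the function and data below are invented, not the authors' implementation):

```python
# Hypothetical stiffness estimate from paired displacement/force samples
# collected through a master/slave haptic link: least-squares slope k of
# the linear model F = k * x + b.
def estimate_stiffness(displacements, forces):
    """Least-squares slope (N/m) of force vs. displacement."""
    n = len(displacements)
    mx = sum(displacements) / n
    mf = sum(forces) / n
    num = sum((x - mx) * (f - mf) for x, f in zip(displacements, forces))
    den = sum((x - mx) ** 2 for x in displacements)
    return num / den

x = [0.00, 0.01, 0.02, 0.03]   # hand displacement (m), invented samples
F = [0.0, 0.5, 1.0, 1.5]       # measured restoring force (N), invented samples
print(f"stiffness ~ {estimate_stiffness(x, F):.1f} N/m")
```

Run remotely, such an estimate would let the clinician at the master side quantify the patient's hand stiffness from the slave device's sensor streams.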
Affiliation(s)
- Fabio Oscari
- Department of Management and Engineering, University of Padova, Stradella S. Nicola 3, 36100 Vicenza, Italy.
- Roberto Oboe
- Department of Management and Engineering, University of Padova, Stradella S. Nicola 3, 36100 Vicenza, Italy.
- Omar Andres Daud Albasini
- Center for the Development of Nanoscience and Nanotechnology, Universidad de Santiago de Chile, Av. Lib. Bernardo O'higgins, 3363 Santiago, Chile.
- Stefano Masiero
- Department of Neuroscience, University-General Hospital of Padova, Via Giustiniani 2, 35128 Padova, Italy.
- Giulio Rosati
- Department of Management and Engineering, University of Padova, Stradella S. Nicola 3, 36100 Vicenza, Italy.
15. Bevilacqua F, Boyer EO, Françoise J, Houix O, Susini P, Roby-Brami A, Hanneton S. Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies. Front Neurosci 2016; 10:385. PMID: 27610071; PMCID: PMC4996990; DOI: 10.3389/fnins.2016.00385.
Abstract
This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe the different research fields that have contributed to movement sonification, from music technology (including gesture-controlled sound synthesis and sonic interaction design) to research on sensori-motor learning with auditory feedback. In particular, we propose distinguishing between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We describe several research questions and recently published results on movement control, learning and perception. In particular, we studied the effect of auditory feedback on movements in several cases: from experiments on pointing and visuo-motor tracking to more complex tasks where interactive sound feedback can guide movements, and cases of sensory substitution where auditory feedback can inform about object shapes. We also developed specific methodologies and technologies for designing sonic feedback and movement sonification. We conclude with a discussion of key future research challenges in sensori-motor learning with movement sonification, and point toward promising applications such as rehabilitation, sport training and product design.
Affiliations
- Eric O. Boyer: STMS, Ircam-Centre National de la Recherche Scientifique-UPMC, Paris, France; UMR 7222 ISIR, Université Pierre et Marie Curie, Paris, France
- Jules Françoise: STMS, Ircam-Centre National de la Recherche Scientifique-UPMC, Paris, France
- Olivier Houix: STMS, Ircam-Centre National de la Recherche Scientifique-UPMC, Paris, France
- Patrick Susini: STMS, Ircam-Centre National de la Recherche Scientifique-UPMC, Paris, France
- Sylvain Hanneton: UMR 8242, Centre National de la Recherche Scientifique - Université Paris Descartes, Paris, France
16. Oscari F, Finetto C, Kautz SA, Rosati G. Changes in muscle coordination patterns induced by exposure to a viscous force field. J Neuroeng Rehabil 2016; 13:58. PMID: 27305944; PMCID: PMC4910356; DOI: 10.1186/s12984-016-0164-3.
Abstract
Background Robotic neurorehabilitation aims to promote the recovery of lost function after neurological injury by leveraging strategies of motor learning. One important aspect of the rehabilitation process is the improvement of muscle coordination patterns, which can be drastically altered after stroke. However, it is not fully understood if and how robotic therapy can address these deficits. The aim of our study was to determine how muscle coordination, analyzed from the perspective of motor modules, changes during motor adaptation to a dynamic environment generated by a haptic interface. Methods In our experiment we exposed subjects to the traditional viscous force field paradigm while they grasped the handle of an actuated joystick during reaching movements (participants moved directly forward and back by 30 cm). EMG signals from ten muscles of the tested arm were recorded. We extracted motor modules from the pooled EMG data of all subjects and analyzed the muscle coordination patterns. Results We found that participants reacted with a coordination strategy that could be explained by a change in the activation of the motor modules used during free motion plus two complementary modules. These complementary modules aggregated the changes in muscle coordination and evolved throughout the experiment, eventually maintaining a comparable structure until the late phase of re-adaptation. Conclusions This result suggests that motor adaptation induced by interaction with a robotic device can lead to changes in the subject's muscle coordination patterns.
Affiliations
- Fabio Oscari: Department of Management and Engineering, University of Padua, Via Venezia 1, Padua, 35135, Italy
- Christian Finetto: Department of Health Sciences and Research, Medical University of South Carolina, 77 President Street, MSC 700, Charleston, SC 29425, USA
- Steve A Kautz: Department of Health Sciences and Research, Medical University of South Carolina, 77 President Street, MSC 700, Charleston, SC 29425, USA; Ralph H. Johnson VA Medical Center, Charleston, SC 29425, USA
- Giulio Rosati: Department of Management and Engineering, University of Padua, Via Venezia 1, Padua, 35135, Italy
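Entry 16 above extracts motor modules from pooled EMG. Motor modules in this literature are most often obtained with non-negative matrix factorization (NMF); the abstract does not state the exact algorithm, so the following is only an illustrative numpy sketch of multiplicative-update NMF applied to synthetic EMG envelopes (the function name, module count, and data are all hypothetical):

```python
import numpy as np

def extract_motor_modules(emg, n_modules, n_iter=500, seed=0):
    # Factor a non-negative EMG envelope matrix (samples x muscles) into
    # module activations W (samples x modules) and muscle weightings H
    # (modules x muscles) using multiplicative updates (Lee-Seung NMF).
    rng = np.random.default_rng(seed)
    n_samples, n_muscles = emg.shape
    W = rng.random((n_samples, n_modules)) + 1e-6
    H = rng.random((n_modules, n_muscles)) + 1e-6
    eps = 1e-9  # avoids division by zero
    for _ in range(n_iter):
        H *= (W.T @ emg) / (W.T @ W @ H + eps)
        W *= (emg @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic "EMG" built from 3 known modules across 10 muscles
# (the study recorded 10 arm muscles).
rng = np.random.default_rng(1)
emg = rng.random((200, 3)) @ rng.random((3, 10))
W, H = extract_motor_modules(emg, n_modules=3)
# Variance accounted for by the 3-module reconstruction.
vaf = 1.0 - np.linalg.norm(emg - W @ H) ** 2 / np.linalg.norm(emg) ** 2
```

The number of modules is typically chosen as the smallest rank whose reconstruction exceeds a variance-accounted-for (VAF) threshold, commonly around 90 percent.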
17. Colomer C, Llorens R, Noé E, Alcañiz M. Effect of a mixed reality-based intervention on arm, hand, and finger function on chronic stroke. J Neuroeng Rehabil 2016; 13:45. PMID: 27169462; PMCID: PMC4864937; DOI: 10.1186/s12984-016-0153-6.
Abstract
Background Virtual and mixed reality systems have been suggested to promote motor recovery after stroke. Based on the existing evidence on motor learning, we have developed a portable and low-cost mixed reality tabletop system that transforms a conventional table into a virtual environment for upper limb rehabilitation. The system allows intensive and customized training of a wide range of arm, hand, and finger movements and enables interaction with tangible objects, while providing audiovisual feedback on the participants' performance in gamified tasks. This study evaluates the clinical effectiveness and the acceptance of an experimental intervention with the system in chronic stroke survivors. Methods Thirty individuals with stroke were included in a reversal (A-B-A) study. Phase A consisted of 30 sessions of conventional physical therapy. Phase B consisted of 30 training sessions with the experimental system. Both interventions involved flexion and extension of the elbow, wrist, and fingers, and grasping of different objects. Sessions were 45 min long and were administered three to five days a week. Body structures (Modified Ashworth Scale), body functions (Motricity Index, Fugl-Meyer Assessment Scale), activities (Manual Function Test, Wolf Motor Function Test, Box and Blocks Test, Nine Hole Peg Test), and participation (Motor Activity Log) were assessed before and after each phase. Acceptance of the system was also assessed after phase B (System Usability Scale, Intrinsic Motivation Inventory). Results Significant improvement was detected after the intervention with the system in the activity domain, both in arm function measured by the Wolf Motor Function Test (p < 0.01) and in finger dexterity measured by the Box and Blocks Test (p < 0.01) and the Nine Hole Peg Test (p < 0.01), and in participation (p < 0.01); these gains were maintained to the end of the study. The experimental system was reported as highly usable, enjoyable, and motivating.
Conclusions Our results support the clinical effectiveness of mixed reality interventions that satisfy the motor learning principles for upper limb rehabilitation in chronic stroke survivors. This characteristic, together with the low cost of the system, its portability, and its acceptance, could promote the integration of these systems into clinical practice as an alternative to more expensive systems, such as robotic instruments.
Affiliations
- Carolina Colomer: Servicio de Neurorrehabilitación y Daño Cerebral de los Hospitales NISA, Fundación Hospitales NISA, Valencia, Spain
- Roberto Llorens: Servicio de Neurorrehabilitación y Daño Cerebral de los Hospitales NISA, Fundación Hospitales NISA, Valencia, Spain; Instituto Interuniversitario de Investigación en Bioingeniería y Tecnología Orientada al Ser Humano, Universitat Politècnica de València, Camino de Vera s/n, Valencia, 46022, Spain
- Enrique Noé: Servicio de Neurorrehabilitación y Daño Cerebral de los Hospitales NISA, Fundación Hospitales NISA, Valencia, Spain
- Mariano Alcañiz: Instituto Interuniversitario de Investigación en Bioingeniería y Tecnología Orientada al Ser Humano, Universitat Politècnica de València, Camino de Vera s/n, Valencia, 46022, Spain; CIBER Fisiopatología Obesidad y Nutrición, CB06/03, Instituto de Salud Carlos III, Av. Sos Baynat s/n, University of Jaume I, Castellón, 12071, Spain
18. Auditory feedback in error-based learning of motor regularity. Brain Res 2015; 1606:54-67. DOI: 10.1016/j.brainres.2015.02.026.
19. Rosati G, Oscari F, Pacchierotti C, Prattichizzo D. Effects of kinesthetic and cutaneous stimulation during the learning of a viscous force field. IEEE Trans Haptics 2014; 7:251-263. PMID: 24968386; DOI: 10.1109/toh.2013.2296312.
Abstract
Haptic stimulation can help humans learn perceptual motor skills, but the precise way in which it influences the learning process has not yet been clarified. This study investigates the role of the kinesthetic and cutaneous components of haptic feedback during the learning of a viscous curl field, also taking into account the influence of visual feedback. We present the results of an experiment in which 17 subjects were asked to make reaching movements while grasping a joystick and wearing a pair of cutaneous devices. Each device was able to provide cutaneous contact forces through a moving platform. The subjects received visual feedback about the joystick's position. During the experiment, the system delivered a perturbation through (1) full haptic stimulation, (2) kinesthetic stimulation alone, (3) cutaneous stimulation alone, (4) altered visual feedback, or (5) altered visual feedback plus cutaneous stimulation. Conditions 1, 2, and 3 were also tested with the cancellation of the visual feedback of position error. Results indicate that kinesthetic stimuli played a primary role during motor adaptation to the viscous field, a fundamental premise for motor learning and rehabilitation. On the other hand, cutaneous stimulation alone did not produce significant direct or adaptation effects, although it helped reduce direct effects when used in addition to kinesthetic stimulation. The experimental conditions with visual cancellation of position error showed slower adaptation rates, indicating that visual feedback actively contributes to the formation of internal models. However, modest learning effects were detected when the visual information was used to render the viscous field.
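The viscous curl field used in studies such as entry 19 is conventionally modeled as a force proportional and perpendicular to hand velocity, F = Bv with an antisymmetric gain matrix. A minimal numpy sketch (the gain value is illustrative, not the study's actual parameter):

```python
import numpy as np

def curl_field_force(velocity, b=20.0):
    # Viscous curl field: F = B @ v with antisymmetric B, so the force is
    # always perpendicular to the velocity (b in N*s/m, illustrative value).
    B = np.array([[0.0, b], [-b, 0.0]])
    return B @ velocity

v = np.array([0.0, 0.25])   # hand moving straight ahead at 0.25 m/s
F = curl_field_force(v)     # sideways push, no component along v
```

Because B is antisymmetric, the field does no work along the movement direction; it only deflects the trajectory sideways, which is what makes it a clean probe of motor adaptation.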
20. Rosati G, Rodà A, Avanzini F, Masiero S. On the role of auditory feedback in robot-assisted movement training after stroke: review of the literature. Comput Intell Neurosci 2013; 2013:586138. PMID: 24382952; PMCID: PMC3871505; DOI: 10.1155/2013/586138.
Abstract
The goal of this paper is to address a topic that is rarely investigated in the literature on technology-assisted motor rehabilitation: the integration of auditory feedback in the rehabilitation device. After a brief introduction to rehabilitation robotics, the main concepts of auditory feedback are presented, together with relevant approaches, techniques, and technologies available in this domain. Current uses of auditory feedback in the context of technology-assisted rehabilitation are then reviewed. In particular, a comparative quantitative analysis over a large corpus of the recent literature suggests that the potential of auditory feedback in rehabilitation systems is currently largely underexploited. Finally, several scenarios are proposed in which the use of auditory feedback may help overcome some of the main limitations of current rehabilitation systems, in terms of user engagement, development of acute-phase and home rehabilitation devices, learning of more complex motor tasks, and improvement of activities of daily living.
Affiliations
- Giulio Rosati: Department of Management and Engineering, University of Padova, Via Venezia 1, 35131 Padova, Italy
- Antonio Rodà: Department of Information Engineering, University of Padova, Via Gradenigo 6/A, 35131 Padova, Italy
- Federico Avanzini: Department of Information Engineering, University of Padova, Via Gradenigo 6/A, 35131 Padova, Italy
- Stefano Masiero: Department of Medical and Surgical Sciences, University of Padova, Via Giustiniani 2, 35121 Padova, Italy
21. Robotic technologies and rehabilitation: new tools for stroke patients' therapy. Biomed Res Int 2013; 2013:153872. PMID: 24350244; PMCID: PMC3852950; DOI: 10.1155/2013/153872.
Abstract
Introduction. The role of robotics in poststroke patients' rehabilitation has been investigated intensively. This paper presents the state of the art and the possible future role of robotics in poststroke rehabilitation, for both upper and lower limbs. Materials and Methods. We performed a comprehensive search of the PubMed, Cochrane, and PEDro databases using the keywords "robot AND stroke AND rehabilitation." Results and Discussion. In upper limb robotic rehabilitation, training seems to improve arm function in activities of daily living. In addition, electromechanical gait training after stroke seems to be effective. It is still unclear whether robot-assisted arm training improves muscle strength, and which electromechanical gait-training device is most effective for walking training. Conclusions. In the field of robotic technologies for stroke rehabilitation we identified current growing points and timely areas for further research. Among the growing points is the development of new, easily transportable, wearable devices that could extend rehabilitation beyond discharge, in an outpatient or home-based setting. For future research, efforts are being made to establish the ideal type of treatment, the length and amount of the training protocol, and the characteristics of patients who can be successfully enrolled in this treatment.
22. Thorp EB, Larson E, Stepp CE. Combined auditory and vibrotactile feedback for human-machine-interface control. IEEE Trans Neural Syst Rehabil Eng 2013; 22:62-8. PMID: 23912500; DOI: 10.1109/tnsre.2013.2273177.
Abstract
The purpose of this study was to determine the effect of the addition of binary vibrotactile stimulation to continuous auditory feedback (vowel synthesis) for human-machine interface (HMI) control. Sixteen healthy participants controlled facial surface electromyography to achieve 2-D targets (vowels). Eight participants used only real-time auditory feedback to locate targets whereas the other eight participants were additionally alerted to having achieved targets with confirmatory vibrotactile stimulation at the index finger. All participants trained using their assigned feedback modality (auditory alone or combined auditory and vibrotactile) over three sessions on three days and completed a fourth session on the third day using novel targets to assess generalization. Analyses of variance performed on the 1) percentage of targets reached and 2) percentage of trial time at the target revealed a main effect for feedback modality: participants using combined auditory and vibrotactile feedback performed significantly better than those using auditory feedback alone. No effect was found for session or the interaction of feedback modality and session, indicating a successful generalization to novel targets but lack of improvement over training sessions. Future research is necessary to determine the cognitive cost associated with combined auditory and vibrotactile feedback during HMI control.
23. Boyer EO, Babayan BM, Bevilacqua F, Noisternig M, Warusfel O, Roby-Brami A, Hanneton S, Viaud-Delmon I. From ear to hand: the role of the auditory-motor loop in pointing to an auditory source. Front Comput Neurosci 2013; 7:26. PMID: 23626532; PMCID: PMC3631711; DOI: 10.3389/fncom.2013.00026.
Abstract
Studies of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand; to accomplish this, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in the heard hand position. Localization errors were exacerbated by short target presentations but were not modified by auditory feedback of hand position. Long target presentations gave rise to a higher level of accuracy and were accompanied by early automatic head-orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of dynamic changes in the acoustic cues caused by changes in head orientation. The design of informative acoustic feedback needs to be studied carefully to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space.
Affiliations
- Eric O. Boyer: STMS IRCAM-CNRS-UPMC, IRCAM, Paris, France; Laboratoire de Neurophysique et Physiologie, CNRS UMR 8119, UFR Biomédicale des Saints Pères, Université Paris Descartes, Paris, France
- Agnes Roby-Brami: Institut des Systèmes Intelligents et de Robotique, CNRS UMR 7222, UPMC, Paris, France
- Sylvain Hanneton: Laboratoire de Neurophysique et Physiologie, CNRS UMR 8119, UFR Biomédicale des Saints Pères, Université Paris Descartes, Paris, France
24. Zanotto D, Rosati G, Spagnol S, Stegall P, Agrawal SK. Effects of complementary auditory feedback in robot-assisted lower extremity motor adaptation. IEEE Trans Neural Syst Rehabil Eng 2013; 21:775-86. PMID: 23529102; DOI: 10.1109/tnsre.2013.2242902.
Abstract
This study investigates how complementary auditory feedback may affect short-term gait modifications induced by four training sessions with a robotic exoskeleton. Healthy subjects walked on a treadmill and were instructed to match a modified gait pattern derived from their natural one, while receiving assistance from the robot (kinetic guidance). The main question we wanted to answer was whether the most commonly used combination of feedback (i.e., haptic and visual) could be either enhanced by adding auditory feedback or successfully substituted with a combination of kinetic guidance and auditory feedback. Participants were randomly assigned to one of four groups, all of which received kinetic guidance. The control group received additional visual feedback, while the three experimental groups were each provided with a different modality of auditory feedback. The third experimental group also received the same visual feedback as the control group. Differences among the training modalities in gait kinematics, timing and symmetry were assessed in three post-training sessions.
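Entry 24 assesses gait symmetry among its outcomes. The abstract does not specify the authors' metric; a common choice in the gait literature is the symmetry index, sketched here purely for illustration:

```python
def symmetry_index(right, left):
    # Symmetry index (%) for any bilateral gait parameter (e.g. step
    # length or stance time): 0 means perfect left-right symmetry.
    return 100.0 * 2.0 * (right - left) / (right + left)

# Example: step lengths of 0.62 m (right) and 0.58 m (left).
si = symmetry_index(0.62, 0.58)
```

Positive values indicate the right-side parameter exceeds the left; the sign convention and the exact formula vary between studies, so comparisons across papers need care.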