1
He J, Yang T. In the era of long COVID, can we seek new techniques for better rehabilitation? Chronic Dis Transl Med 2022; 8:149-153. [PMID: 36161203; PMCID: PMC9481878; DOI: 10.1002/cdt3.42]
Affiliation(s)
- Jiaze He
- Graduate School of Capital Medical University, Beijing, China
- Department of Pulmonary and Critical Care Medicine, Center of Respiratory Medicine, China-Japan Friendship Hospital; National Center for Respiratory Medicine; Institute of Respiratory Medicine, Chinese Academy of Medical Sciences; National Clinical Research Center for Respiratory Diseases; WHO Collaborating Centre for Tobacco Cessation and Respiratory Diseases Prevention, Beijing, China
- Ting Yang
- Department of Pulmonary and Critical Care Medicine, Center of Respiratory Medicine, China-Japan Friendship Hospital; National Center for Respiratory Medicine; Institute of Respiratory Medicine, Chinese Academy of Medical Sciences; National Clinical Research Center for Respiratory Diseases; WHO Collaborating Centre for Tobacco Cessation and Respiratory Diseases Prevention, Beijing, China
2
Agres KR, Foubert K, Sridhar S. Music Therapy During COVID-19: Changes to the Practice, Use of Technology, and What to Carry Forward in the Future. Front Psychol 2021; 12:647790. [PMID: 34093330; PMCID: PMC8177049; DOI: 10.3389/fpsyg.2021.647790]
Abstract
In recent years, the field of music therapy (MT) has increasingly embraced the use of technology for conducting therapy sessions and enhancing patient outcomes. Amidst a worldwide pandemic, we sought to examine whether this is now true to an even greater extent, as many music therapists have had to approach and conduct their work differently. The purpose of this survey study is to observe trends in how music therapists from different regions around the world have had to alter their practice, especially in relation to their use of technology during the COVID-19 pandemic, because of limited options to conduct in-person therapy due to social distancing measures. Further, the findings aim to clarify music therapists' perspectives on the benefits and limitations of technology in MT, as well as online MT. In addition, this survey investigated what changes have been necessary to administer MT during COVID-19, in terms of virtual therapy and online tools, and how the changes made now may affect MT in the future. We also explored music therapists' views on whether special technology-focused training might be helpful to support the practice of MT in the future. This is the first survey, to our knowledge, to break down opinions of and trends in technology use based on geographical region (North America, Europe, and Asia), and several noteworthy differences were apparent across regions. We hope our findings provide useful information, guidance, and a global reference point for music therapists on effectively continuing the practice of MT during times of crisis, and can encourage reflection and improvement in administering MT.
Affiliation(s)
- Kat R Agres
- Yong Siew Toh Conservatory of Music, National University of Singapore, Singapore
- Katrien Foubert
- LUCA School of Arts, Leuven, Belgium
- Faculty of Biomedical Sciences, Department of Development and Regeneration, KU Leuven, Leuven, Belgium
- University Psychiatric Centre, Kortenberg, Belgium
3
Belo J, Clerc M, Schön D. EEG-Based Auditory Attention Detection and Its Possible Future Applications for Passive BCI. Front Comput Sci 2021. [DOI: 10.3389/fcomp.2021.661178]
Abstract
The ability to discriminate and attend to one specific sound source in a complex auditory environment is a fundamental skill for efficient communication: it allows us to follow a family conversation or chat with a friend in a bar. This ability is challenged in hearing-impaired individuals, and particularly in those with a cochlear implant (CI): because of the implant's limited spectral resolution, auditory perception remains quite poor in noisy environments or in the presence of simultaneous auditory sources. Recent methodological advances now make it possible to detect, on the basis of neural signals, which auditory stream an individual is attending to within a set of multiple concurrent streams. This approach, called EEG-based auditory attention detection (AAD), builds on fundamental research findings demonstrating that, in a multi-speech scenario, cortical tracking of the envelope of the attended speech is enhanced compared to the unattended speech. Following these findings, other studies showed that it is possible to use EEG/MEG (electroencephalography/magnetoencephalography) to explore auditory attention during speech listening in a cocktail-party-like scenario. Overall, these findings make it possible to conceive next-generation hearing aids combining customary technology with AAD. Importantly, AAD also has great potential in the context of passive BCI, in the educational context, and in the context of interactive music performances. In this mini review, we first present the different approaches to AAD and the main limitations of the overall concept. We then present its potential applications in the world of non-clinical passive BCI.
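The decoding idea summarised in this abstract, enhanced cortical tracking of the attended speech envelope, is commonly operationalised as stimulus reconstruction: a linear "backward" decoder maps EEG to an estimated envelope, which is then correlated against each candidate stream. The sketch below illustrates that pipeline on synthetic data; the single-window least-squares decoder, mixing model, and noise levels are simplifying assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: two competing speech envelopes and 64-channel EEG that
# tracks the attended envelope more strongly than the unattended one.
n_samples, n_channels = 2000, 64
smooth = np.ones(50) / 50
env_a = np.convolve(np.abs(rng.standard_normal(n_samples)), smooth, mode="same")
env_b = np.convolve(np.abs(rng.standard_normal(n_samples)), smooth, mode="same")
mix_a = rng.standard_normal(n_channels)
mix_b = rng.standard_normal(n_channels)
eeg = (np.outer(env_a, mix_a)                  # strong tracking: attended stream
       + 0.3 * np.outer(env_b, mix_b)          # weaker tracking: unattended stream
       + 0.5 * rng.standard_normal((n_samples, n_channels)))  # sensor noise

# Backward model: least-squares decoder from EEG channels to the attended
# envelope, trained on the first half of the recording.
half = n_samples // 2
w, *_ = np.linalg.lstsq(eeg[:half], env_a[:half], rcond=None)

# Test: reconstruct the envelope from held-out EEG, correlate it with each
# candidate stream, and decode attention as the stream with the larger r.
recon = eeg[half:] @ w
r_att = np.corrcoef(recon, env_a[half:])[0, 1]
r_unatt = np.corrcoef(recon, env_b[half:])[0, 1]
decoded = "A" if r_att > r_unatt else "B"
print(f"r(attended)={r_att:.2f}  r(unattended)={r_unatt:.2f}  decoded: {decoded}")
```

Real AAD decoders use multiple time lags and regularisation, but the decision rule (pick the stream with the higher reconstruction correlation) is the same.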
4
Accessible Digital Musical Instruments—A Review of Musical Interfaces in Inclusive Music Practice. Multimodal Technologies and Interaction 2019. [DOI: 10.3390/mti3030057]
Abstract
Current advancements in music technology enable the creation of customized Digital Musical Instruments (DMIs). This paper presents a systematic review of Accessible Digital Musical Instruments (ADMIs) in inclusive music practice. The history of research concerned with facilitating inclusion in music-making is outlined, and the current state of developments and trends in the field are discussed. Although the use of music technology in music therapy contexts has attracted more attention in recent years, the topic has been relatively unexplored in the Computer Music literature. This review investigates a total of 113 publications focusing on ADMIs. Based on the 83 instruments in this dataset, ten control interface types were identified: tangible controllers, touchless controllers, Brain–Computer Music Interfaces (BCMIs), adapted instruments, wearable controllers or prosthetic devices, mouth-operated controllers, audio controllers, gaze controllers, touchscreen controllers and mouse-controlled interfaces. The majority of the ADMIs were tangible or physical controllers. Although the haptic modality could potentially play an important role in musical interaction for many user groups, relatively few of the ADMIs (14.5%) incorporated vibrotactile feedback. Aspects judged to be important for successful ADMI design were instrument adaptability and customization, user participation, iterative prototyping, and interdisciplinary development teams.
5
Sanyal S, Nag S, Banerjee A, Sengupta R, Ghosh D. Music of brain and music on brain: a novel EEG sonification approach. Cogn Neurodyn 2019; 13:13-31. [PMID: 30728868; PMCID: PMC6339862; DOI: 10.1007/s11571-018-9502-4]
Abstract
Can we hear the sound of our brain? Is there any technique that can enable us to hear the neuro-electrical impulses originating from the different lobes of the brain? The answer to these questions is yes. In this paper we present a novel method for sonifying electroencephalogram (EEG) data recorded in a "control" state as well as under the influence of a simple acoustic stimulus: a tanpura drone. The tanpura has a very simple construction, yet its drone exhibits very complex acoustic features and is generally used to create an ambience during a musical performance. For this pilot project we chose to study the nonlinear correlations between musical stimuli (the tanpura drone as well as music clips) and the sonified EEG data. To date, no study has dealt with the direct correlation between a bio-signal and its acoustic counterpart, or examined how that correlation varies under the influence of different types of stimuli. This study tries to bridge that gap, looking for a direct correlation between music signals and EEG data using a robust mathematical microscope called Multifractal Detrended Cross-Correlation Analysis (MFDXA). We took EEG data from 10 participants in a 2 min "control" condition (i.e., with white noise) and a 2 min tanpura-drone (musical stimulus) listening condition. The same experimental paradigm was repeated for two emotional pieces, "Chayanat" and "Darbari Kanada", well-known Hindustani classical ragas that conventionally portray contrasting emotional attributes, as verified from human response data. Next, the EEG signals from the different electrodes were sonified, and the MFDXA technique was used to assess the degree of correlation (the cross-correlation coefficient γx) between the EEG signals and the music clips. The variation of γx across different lobes of the brain during the course of the experiment provides interesting new information about the extraordinary ability of music stimuli to engage several areas of the brain significantly, unlike other stimuli (which engage specific domains only).
Affiliation(s)
- Shankha Sanyal
- Sir C.V. Raman Centre for Physics and Music, Jadavpur University, Kolkata, India
- Department of Physics, Jadavpur University, Kolkata, India
- Sayan Nag
- Department of Electrical Engineering, Jadavpur University, Kolkata, India
- Archi Banerjee
- Sir C.V. Raman Centre for Physics and Music, Jadavpur University, Kolkata, India
- Department of Physics, Jadavpur University, Kolkata, India
- Ranjan Sengupta
- Sir C.V. Raman Centre for Physics and Music, Jadavpur University, Kolkata, India
- Dipak Ghosh
- Sir C.V. Raman Centre for Physics and Music, Jadavpur University, Kolkata, India
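The sonification step this abstract describes, turning an EEG time series into an audible signal whose acoustic properties can then be compared with the music stimulus, can be sketched as a simple parameter mapping. The mapping rule below (windowed mean amplitude quantised onto a pentatonic scale, rendered as sine tones) is an illustrative assumption, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
fs_eeg = 256                              # EEG sampling rate (Hz)
eeg = rng.standard_normal(fs_eeg * 4)     # 4 s of synthetic single-channel EEG

# Map each 125 ms EEG window to a pitch: normalise the window's mean
# amplitude to [0, 1] and quantise onto a pentatonic scale.
scale_hz = np.array([220.0, 261.6, 293.7, 329.6, 392.0])
win = fs_eeg // 8
amps = np.array([np.abs(eeg[i:i + win]).mean()
                 for i in range(0, len(eeg) - win + 1, win)])
norm = (amps - amps.min()) / (amps.max() - amps.min() + 1e-12)
notes = scale_hz[np.round(norm * (len(scale_hz) - 1)).astype(int)]

# Render each note as a short sine tone and concatenate into an audio signal.
fs_audio = 8000
t = np.arange(int(0.125 * fs_audio)) / fs_audio
audio = np.concatenate([np.sin(2 * np.pi * f * t) for f in notes])
print(f"{len(notes)} notes, {audio.size} audio samples")
```

The resulting audio array can be written to a WAV file or fed directly into the same correlation analysis as the original music signal.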
6
Partesotti E, Peñalba A, Manzolli J. Digital instruments and their uses in music therapy. Nordic Journal of Music Therapy 2018. [DOI: 10.1080/08098131.2018.1490919]
Affiliation(s)
- Elena Partesotti
- Interdisciplinary Nucleus for Sound Studies, Instituto de Artes, Universidade Estadual de Campinas, Campinas, Brazil
- Alicia Peñalba
- Department of Didactics of Bodily, Plastic and Musical Expression, Facultad de Filosofia y Letras, Universidad de Valladolid, Valladolid, Spain
- Jônatas Manzolli
- Interdisciplinary Nucleus for Sound Studies, Instituto de Artes, Universidade Estadual de Campinas, Campinas, Brazil
7
Deuel TA, Pampin J, Sundstrom J, Darvas F. The Encephalophone: A Novel Musical Biofeedback Device using Conscious Control of Electroencephalogram (EEG). Front Hum Neurosci 2017; 11:213. [PMID: 28491030; PMCID: PMC5405117; DOI: 10.3389/fnhum.2017.00213]
Abstract
A novel musical instrument and biofeedback device was created using electroencephalogram (EEG) posterior dominant rhythm (PDR) or mu rhythm to control a synthesized piano, which we call the Encephalophone. Alpha-frequency (8–12 Hz) signal power from PDR in the visual cortex or from mu rhythm in the motor cortex was used to create a power scale which was then converted into a musical scale, which could be manipulated by the individual in real time. Subjects could then generate different notes of the scale by activation (event-related synchronization) or de-activation (event-related desynchronization) of the PDR or mu rhythms in visual or motor cortex, respectively. Fifteen novice healthy subjects were tested on their ability to hit target notes presented within a 5-min trial period. All 15 subjects were able to perform more accurately (average of 27.4 hits, 67.1% accuracy for visual cortex/PDR signaling; average of 20.6 hits, 57.1% accuracy for mu signaling) than random note generation (19.03% accuracy). Moreover, PDR control was significantly more accurate than mu control. This shows that novice healthy individuals can control music with better-than-random accuracy, with no prior training on the device, and that PDR control is more accurate than mu control for these novices. More years of musical training showed a moderate positive correlation with PDR accuracy, but not with mu accuracy. The Encephalophone may have potential applications both as a novel musical instrument requiring no movement and as a therapeutic biofeedback device for patients suffering from motor deficits (e.g., amyotrophic lateral sclerosis (ALS), brainstem stroke, traumatic amputation).
Affiliation(s)
- Thomas A Deuel
- Department of Neurology, Swedish Neuroscience Institute, Seattle, WA, USA
- Center for Digital Arts and Experimental Media (DXARTS), University of Washington, Seattle, WA, USA
- Juan Pampin
- Center for Digital Arts and Experimental Media (DXARTS), University of Washington, Seattle, WA, USA
- School of Music, University of Washington, Seattle, WA, USA
- Felix Darvas
- Department of Neurosurgery, University of Washington, Seattle, WA, USA
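The core mapping this abstract describes, alpha-band (8–12 Hz) power quantised onto a musical scale so that synchronization and desynchronization select higher or lower notes, can be sketched as follows. The FFT-based band-power estimate, the fixed power bounds, and the eight-note C-major scale are assumptions for illustration, not the Encephalophone's actual signal chain:

```python
import numpy as np

def alpha_power(eeg_window: np.ndarray, fs: float) -> float:
    """Mean spectral power in the 8-12 Hz (alpha) band."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return float(spectrum[band].mean())

def power_to_note(power: float, p_min: float, p_max: float,
                  scale=("C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5")) -> str:
    """Quantise alpha power onto a scale: low power (desynchronization)
    maps to low notes, high power (synchronization) to high notes."""
    frac = np.clip((power - p_min) / (p_max - p_min + 1e-12), 0.0, 1.0)
    return scale[int(round(frac * (len(scale) - 1)))]

# Demo: a strong 10 Hz oscillation lands at the top of the scale,
# low-amplitude broadband noise at the bottom.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(2)
strong_alpha = 5.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
weak_alpha = 0.1 * rng.standard_normal(fs)
p_lo = alpha_power(weak_alpha, fs)
p_hi = alpha_power(strong_alpha, fs)
print(power_to_note(p_hi, p_lo, p_hi), power_to_note(p_lo, p_lo, p_hi))
```

In a real-time system the power bounds would be calibrated per user, and the selected note would drive a synthesizer rather than a print statement.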
8
Daly I, Williams D, Kirke A, Weaver J, Malik A, Hwang F, Miranda E, Nasuto SJ. Affective brain–computer music interfacing. J Neural Eng 2016; 13:046022. [DOI: 10.1088/1741-2560/13/4/046022]
9
Daly I, Williams D, Hallowell J, Hwang F, Kirke A, Malik A, Weaver J, Miranda E, Nasuto SJ. Music-induced emotions can be predicted from a combination of brain activity and acoustic features. Brain Cogn 2015; 101:1-11. [PMID: 26544602; DOI: 10.1016/j.bandc.2015.08.003]
Abstract
It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions a piece of music will induce in a given individual. We attempt to predict the music-induced emotional response of a listener by measuring activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants, with a correlation between the actual and predicted responses of up to r = 0.234 (p < 0.001). This regression fit suggests that over 20% of the variance of the participants' music-induced emotions can be predicted from their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, the combination of measures of brain activity and acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracy than either feature type alone (p < 0.01).
Affiliation(s)
- Ian Daly
- Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
- Duncan Williams
- Interdisciplinary Centre for Music Research, University of Plymouth, Plymouth, UK
- James Hallowell
- Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
- Faustina Hwang
- Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
- Alexis Kirke
- Interdisciplinary Centre for Music Research, University of Plymouth, Plymouth, UK
- Asad Malik
- Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
- James Weaver
- Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
- Eduardo Miranda
- Interdisciplinary Centre for Music Research, University of Plymouth, Plymouth, UK
- Slawomir J Nasuto
- Brain Embodiment Lab, School of Systems Engineering, University of Reading, Reading, UK
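The modelling idea in this abstract, regressing emotion ratings on EEG features combined with acoustic descriptors and checking that the combination beats a single modality, can be illustrated on synthetic data. The feature counts, the closed-form ridge regression, and the noise level below are illustrative assumptions, not the authors' actual feature set or model:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200  # number of music excerpts (synthetic stand-ins)

# Synthetic features: EEG-derived and acoustic descriptors each carry
# partial, complementary information about the felt-emotion rating.
eeg_feats = rng.standard_normal((n, 10))
audio_feats = rng.standard_normal((n, 6))
emotion = (0.7 * eeg_feats.sum(axis=1)
           + 1.5 * audio_feats.sum(axis=1)
           + 4.0 * rng.standard_normal(n))  # heavy noise, as in real EEG work

def ridge_predict(X_tr, y_tr, X_te, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^(-1) X'y."""
    w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1]),
                        X_tr.T @ y_tr)
    return X_te @ w

half = n // 2
X = np.hstack([eeg_feats, audio_feats])  # combined feature set
r_combined = np.corrcoef(ridge_predict(X[:half], emotion[:half], X[half:]),
                         emotion[half:])[0, 1]
r_eeg_only = np.corrcoef(ridge_predict(eeg_feats[:half], emotion[:half],
                                       eeg_feats[half:]),
                         emotion[half:])[0, 1]
print(f"combined r={r_combined:.3f}, EEG-only r={r_eeg_only:.3f}")
```

Because the acoustic features carry signal the EEG features do not, the combined model's held-out correlation exceeds the EEG-only model's, mirroring the paper's finding that the two feature types are complementary.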
10
Botrel L, Holz E, Kübler A. Brain Painting V2: evaluation of P300-based brain-computer interface for creative expression by an end-user following the user-centered design. Brain-Computer Interfaces 2015. [DOI: 10.1080/2326263x.2015.1100038]
11
Eaton J, Williams D, Miranda E. The Space Between Us: evaluating a multi-user affective brain-computer music interface. Brain-Computer Interfaces 2015. [DOI: 10.1080/2326263x.2015.1101922]
12
Aparicio A. Immobilis in mobili: performing arts, BCI, and locked-in syndrome. Brain-Computer Interfaces 2015. [DOI: 10.1080/2326263x.2015.1100366]
13
14
Daly I, Williams D, Hwang F, Kirke A, Malik A, Roesch E, Weaver J, Miranda E, Nasuto SJ. Investigating music tempo as a feedback mechanism for closed-loop BCI control. Brain-Computer Interfaces 2014. [DOI: 10.1080/2326263x.2014.979728]
15
Ball P. Music is all in the mind. Nature 2011. [DOI: 10.1038/news.2011.113]