1
An S, Lyu H, Seong D, Yoon H, Kim IS, Lee H, Shin M, Hwang KC, Son D. A Water-Resistant, Self-Healing Encapsulation Layer for a Stable, Implantable Wireless Antenna. Polymers (Basel) 2023; 15:3391. [PMID: 37631448] [PMCID: PMC10457836] [DOI: 10.3390/polym15163391]
Abstract
Polymers for implantable devices are desirable for biomedical engineering applications. This study introduces a water-resistant, self-healing fluoroelastomer (SHFE) as an encapsulation material for implantable antennas. The SHFE exhibits a tissue-like modulus (approximately 0.4 MPa), stretchability (at least 450%, even after self-healing in an underwater environment), self-healability, and water resistance (WVTR: 17.8610 g m⁻² day⁻¹). Because the SHFE self-heals underwater via dipole-dipole interactions, encapsulated devices are protected from the penetration of biofluids and can withstand external damage. By combining the SHFE with antennas designed to operate inside the body, we fabricated implantable wireless antennas that transmit information from inside the body to an external reader coil. Because the antennas are designed around the dielectric constant of their surroundings, the uniformity of the encapsulation layer is crucial; a uniform, homogeneous interface is formed by simply overlapping two SHFE films. Wireless communication in vivo was demonstrated in rodent experiments over 4 weeks, during which the maximum communication distance (15 mm) was maintained without chemical or physical deformation of the SHFE layer. This study illustrates the applicability of fluoroelastomers in vivo and is expected to contribute to the stable operation of high-performance implantable devices.
Affiliation(s)
- Soojung An
- Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; (S.A.); (H.L.); (D.S.); (H.Y.)
- Hyunsang Lyu
- Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; (S.A.); (H.L.); (D.S.); (H.Y.)
- Duhwan Seong
- Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; (S.A.); (H.L.); (D.S.); (H.Y.)
- Hyun Yoon
- Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; (S.A.); (H.L.); (D.S.); (H.Y.)
- In Soo Kim
- Nanophotonics Research Center, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea
- Hyojin Lee
- Biomaterials Research Center, Biomedical Research Institute, Korea Institute of Science and Technology (KIST), Seoul 02792, Republic of Korea
- Division of Bio-Medical Science & Technology, KIST School—Korea University of Science and Technology (UST), Seoul 02792, Republic of Korea
- Mikyung Shin
- Department of Biomedical Engineering, Sungkyunkwan University (SKKU), Suwon 16419, Republic of Korea
- Department of Intelligent Precision Healthcare Convergence, Sungkyunkwan University (SKKU), Suwon 16419, Republic of Korea
- Keum Cheol Hwang
- Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; (S.A.); (H.L.); (D.S.); (H.Y.)
- Donghee Son
- Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Republic of Korea; (S.A.); (H.L.); (D.S.); (H.Y.)
- Department of Superintelligence Engineering, Sungkyunkwan University (SKKU), Suwon 16419, Republic of Korea
2
Ramirez-Melendez R, Reija X. The Creative Drummer: An EEG-Based Pilot Study on the Correlates of Emotions and Creative Drum Playing. Brain Sci 2023; 13:88. [PMID: 36672069] [PMCID: PMC9856948] [DOI: 10.3390/brainsci13010088]
Abstract
It is reasonable to assume that emotional processes are involved in creative tasks and the generation of creative ideas. In this pilot study, we investigate the emotional correlates of different degrees of creative music playing in professional drummers. Ten participants performed three tasks while their EEG activity was recorded: repetitive rhythmic drum playing, pattern-based improvisation, and attention-intensive free improvisation. Arousal and valence levels were estimated from the EEG data at baseline and for the three tasks. Results show significantly increased valence (i.e., increased prefrontal right alpha power relative to prefrontal left alpha power) during pattern-based and free improvisation compared with baseline, and significantly increased valence during free improvisation compared with pattern-based improvisation. These results seem to indicate that positive emotion (characterized as increased valence) is associated with the creation of original ideas in drum playing, and that the freer the creative process, the greater the positive effect. These findings may be of particular relevance to music-based therapeutic interventions and music pedagogy.
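The valence measure described in this abstract is the usual frontal alpha-asymmetry index. The sketch below is a minimal illustration of such an estimate from two prefrontal channels, not the authors' code; the channel names, sampling rate, alpha band limits, and the Welch-based band-power routine are all assumptions.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # sampling rate in Hz (assumed)

def band_power(signal, fs, band=(8.0, 12.0)):
    """Average power of `signal` within a frequency band, via Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def valence_index(left_prefrontal, right_prefrontal, fs=FS):
    """Frontal alpha-asymmetry valence estimate: higher right-hemisphere alpha
    relative to left-hemisphere alpha is read as more positive valence."""
    alpha_left = band_power(left_prefrontal, fs)
    alpha_right = band_power(right_prefrontal, fs)
    return np.log(alpha_right) - np.log(alpha_left)

# Example with synthetic data standing in for a left/right prefrontal pair:
rng = np.random.default_rng(0)
left, right = rng.standard_normal(FS * 60), rng.standard_normal(FS * 60)
print(f"valence index: {valence_index(left, right):+.3f}")
```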
3
Ramirez-Melendez R, Matamoros E, Hernandez D, Mirabel J, Sanchez E, Escude N. Music-Enhanced Emotion Identification of Facial Emotions in Autistic Spectrum Disorder Children: A Pilot EEG Study. Brain Sci 2022; 12:704. [PMID: 35741590] [PMCID: PMC9221118] [DOI: 10.3390/brainsci12060704]
Abstract
Autistic Spectrum Disorder (ASD) is characterized by difficulty in expressing and interpreting others' emotions. In particular, people with ASD have difficulty interpreting emotions encoded in facial expressions. Music interventions have previously been shown to improve the emotional and social skills of autistic individuals. The present pilot study explores the usefulness of music as a tool for improving autistic children's recognition of emotions in facial expressions. Twenty-five children (mean age = 8.8 y, SD = 1.24) with high-functioning ASD and normal hearing participated in four weekly sessions of 15 min each. The participants were randomly divided into an experimental group (N = 14) and a control group (N = 11). During each session, participants in the experimental group were exposed to images of facial expressions for four emotions (happy, sad, angry, and fear). Images were shown in three conditions, with the second condition pairing the images with music of congruent emotion. Participants in the control group were shown only the images in all three conditions. For six participants in each group, EEG data were acquired during the sessions, and instantaneous emotional responses (arousal and valence values) were extracted from the EEG data. Inter- and intra-session improvement in emotion identification was measured in terms of verbal response accuracy, and EEG response differences were analyzed. A comparison of the experimental group's verbal responses pre- and post-intervention showed a significant (p = 0.001) average improvement in emotion identification accuracy of 26% (SD = 3.4). Furthermore, the emotional responses of the experimental group at the end of the study correlated more strongly with the emotional stimuli being presented than did their responses at the beginning of the study. No similar improvement in verbal responses or EEG-stimuli correlation was found in the control group. These results seem to indicate that music can be used to improve both emotion identification in facial expressions and emotion induction through facial stimuli in children with high-functioning ASD.
Affiliation(s)
- Elisabet Matamoros
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08018 Barcelona, Spain; (E.M.); (D.H.)
- Davinia Hernandez
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, 08018 Barcelona, Spain; (E.M.); (D.H.)
- Julia Mirabel
- Centre Carrilet, 08031 Barcelona, Spain; (J.M.); (E.S.)
- Nuria Escude
- Institut Catalá de Musicoterapia, 08021 Barcelona, Spain
4
Ramirez R, Planas J, Escude N, Mercade J, Farriols C. EEG-Based Analysis of the Emotional Effect of Music Therapy on Palliative Care Cancer Patients. Front Psychol 2018; 9:254. [PMID: 29551984] [PMCID: PMC5840261] [DOI: 10.3389/fpsyg.2018.00254]
Abstract
Music is known to have the power to induce strong emotions. The present study assessed, based on electroencephalography (EEG) data, the emotional response of terminally ill cancer patients to a music therapy intervention in a randomized controlled trial. A sample of 40 participants from the palliative care unit of the Hospital del Mar in Barcelona was randomly assigned to two groups of 20. The first group [experimental group (EG)] participated in a session of music therapy (MT), and the second group [control group (CG)] was provided with company. Based on our previous work on EEG-based emotion detection, instantaneous emotional indicators, in the form of coordinates in the arousal-valence plane, were extracted from the participants' EEG data. The emotional indicators were analyzed to quantify (1) the overall emotional effect of MT on the patients compared with controls, and (2) the relative effect of the different MT techniques applied during each session. During each MT session, five conditions were considered: I (initial patient state before MT starts), C1 (passive listening), C2 (active listening), R (relaxation), and F (final patient state). EEG data analysis showed a significant increase in valence (p = 0.0004) and arousal (p = 0.003) between I and F in the EG. No significant changes were found in the CG. These results can be interpreted as a positive emotional effect of MT in advanced cancer patients. In addition, according to pre- and post-intervention questionnaire responses, participants in the EG also showed a significant decrease in tiredness, anxiety, and breathing difficulties, as well as an increase in well-being. No equivalent changes were observed in the CG.
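The within-session comparison reported above (valence and arousal between conditions I and F) amounts to a paired pre/post test. The sketch below is a hypothetical illustration rather than the study's analysis pipeline: the per-patient indicator values are synthetic, and the choice of a Wilcoxon signed-rank test is an assumption.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-patient mean valence for the initial (I) and final (F)
# conditions of each session; real values would come from the EEG pipeline.
rng = np.random.default_rng(1)
valence_I = rng.normal(0.0, 0.2, size=20)               # 20 EG patients
valence_F = valence_I + rng.normal(0.15, 0.1, size=20)  # simulated increase

def pre_post_test(pre, post):
    """Paired non-parametric comparison of an emotional indicator
    between the initial and final condition of a session."""
    stat, p = wilcoxon(post, pre)
    return np.mean(post - pre), p

delta, p = pre_post_test(valence_I, valence_F)
print(f"mean valence change I -> F: {delta:+.3f} (p = {p:.4f})")
```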
Affiliation(s)
- Rafael Ramirez
- Music and Machine Learning Lab, Department of Information and Communication Technologies, Pompeu Fabra University, Barcelona, Spain
- Josep Planas
- Palliative Care Unit, Oncology Service, Parc de Salut Mar, Instituto Mar de Investigaciones Médicas, Barcelona, Spain
- Nuria Escude
- Catalan Institute of Music Therapy, University of Barcelona, Barcelona, Spain
- Jordi Mercade
- Catalan Institute of Music Therapy, University of Barcelona, Barcelona, Spain
- Cristina Farriols
- Palliative Care Unit, Oncology Service, Parc de Salut Mar, Instituto Mar de Investigaciones Médicas, Barcelona, Spain
5
Affiliation(s)
- Cinthi Pillai
- Massachusetts Eye and Ear Infirmary, Harvard Medical School, Boston, MA, USA
- John W. Gittinger
- Massachusetts Eye and Ear Infirmary, Harvard Medical School, Boston, MA, USA
6
Ramirez R, Palencia-Lefler M, Giraldo S, Vamvakousis Z. Musical neurofeedback for treating depression in elderly people. Front Neurosci 2015; 9:354. [PMID: 26483628] [PMCID: PMC4591427] [DOI: 10.3389/fnins.2015.00354]
Abstract
We introduce a new neurofeedback approach, which allows users to manipulate expressive parameters in music performances using their emotional state, and we present the results of a pilot clinical experiment applying the approach to alleviate depression in elderly people. Ten adults (9 female and 1 male, mean age = 84, SD = 5.8) with normal hearing participated in the neurofeedback study, which consisted of 10 sessions (2 sessions per week) of 15 min each. EEG data were acquired using the Emotiv EPOC EEG device. In all sessions, subjects were asked to sit in a comfortable chair facing two loudspeakers, to close their eyes, and to avoid moving during the experiment. Participants listened to music pieces preselected according to their music preferences and were encouraged to increase the loudness and tempo of the pieces through their arousal and valence levels. The neurofeedback system was tuned so that increased arousal, computed as the beta-to-alpha activity ratio over the frontal cortex, corresponded to increased loudness, and increased valence, computed as relative frontal alpha activity in the right lobe compared to the left lobe, corresponded to increased tempo. Pre- and post-intervention evaluation of six participants using the BDI depression test showed an average improvement of 17.2% (1.3) in their BDI scores at the end of the study. In addition, an analysis of the participants' EEG data showed a significant decrease of relative alpha activity in their left frontal lobe (p = 0.00008), which may be interpreted as an improvement in their depression.
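The neurofeedback mapping described in this abstract (frontal beta/alpha ratio for arousal, right-versus-left frontal alpha for valence, driving loudness and tempo) can be sketched roughly as follows. This is an illustration rather than the authors' implementation; the band limits, channel handling, and loudness/tempo scaling ranges are assumptions.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # Emotiv EPOC sampling rate in Hz

def band_power(x, fs, lo, hi):
    """Average power of signal x in the [lo, hi] Hz band (Welch PSD)."""
    f, psd = welch(x, fs=fs, nperseg=fs * 2)
    sel = (f >= lo) & (f <= hi)
    return psd[sel].mean()

def arousal_valence(left_frontal, right_frontal, fs=FS):
    """Arousal: beta/alpha ratio over the frontal channels.
    Valence: right-minus-left relative frontal alpha."""
    frontal = np.vstack([left_frontal, right_frontal]).mean(axis=0)
    arousal = band_power(frontal, fs, 12, 28) / band_power(frontal, fs, 8, 12)
    valence = (np.log(band_power(right_frontal, fs, 8, 12))
               - np.log(band_power(left_frontal, fs, 8, 12)))
    return arousal, valence

def feedback_parameters(arousal, valence):
    """Map the indicators to playback parameters: higher arousal -> louder,
    higher valence -> faster (gain in [0.5, 1.5], tempo factor in [0.8, 1.2])."""
    gain = np.clip(0.5 + 0.5 * arousal, 0.5, 1.5)
    tempo = np.clip(1.0 + 0.2 * np.tanh(valence), 0.8, 1.2)
    return gain, tempo
```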
Affiliation(s)
- Rafael Ramirez
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Sergio Giraldo
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
- Zacharias Vamvakousis
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
7
Yau SH, McArthur G, Badcock NA, Brock J. Case study: auditory brain responses in a minimally verbal child with autism and cerebral palsy. Front Neurosci 2015; 9:208. [PMID: 26150768] [PMCID: PMC4473003] [DOI: 10.3389/fnins.2015.00208]
Abstract
An estimated 30% of individuals with autism spectrum disorders (ASD) remain minimally verbal into late childhood, yet research on cognition and brain function in ASD focuses almost exclusively on those with good or only moderately impaired language. Here we present a case study investigating auditory processing in GM, a nonverbal child with ASD and cerebral palsy. At the age of 8 years, GM was tested using magnetoencephalography (MEG) whilst passively listening to speech sounds and complex tones. Whereas typically developing children and verbal autistic children all demonstrated similar brain responses to speech and nonspeech sounds, GM produced much stronger responses to nonspeech than to speech, particularly in the 65-165 ms (M50/M100) time window after stimulus onset. GM was retested at the age of 10 years using electroencephalography (EEG) whilst passively listening to pure tone stimuli. Consistent with her MEG response to complex tones, GM showed an unusually early and strong response to pure tones in her EEG. The consistency of the MEG and EEG data in this single case demonstrates both the potential and the feasibility of these methods in the study of minimally verbal children with ASD. Further research is required to determine whether GM's atypical auditory responses are characteristic of other minimally verbal children with ASD or of other individuals with cerebral palsy.
Affiliation(s)
- Shu H. Yau
- ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, Australia
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Genevieve McArthur
- ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, Australia
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Nicholas A. Badcock
- ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, Australia
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Jon Brock
- ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, Australia
- Department of Cognitive Science, Macquarie University, Sydney, Australia
- Department of Psychology, Macquarie University, Sydney, Australia
8
Badcock NA, Mousikou P, Mahajan Y, de Lissa P, Thie J, McArthur G. Validation of the Emotiv EPOC® EEG gaming system for measuring research quality auditory ERPs. PeerJ 2013; 1:e38. [PMID: 23638374] [PMCID: PMC3628843] [DOI: 10.7717/peerj.38]
Abstract
Background. Auditory event-related potentials (ERPs) have proved useful in investigating the role of auditory processing in cognitive disorders such as developmental dyslexia, specific language impairment (SLI), attention deficit hyperactivity disorder (ADHD), schizophrenia, and autism. However, laboratory recordings of auditory ERPs can be lengthy, uncomfortable, or threatening for some participants, particularly children. Recently, a commercial gaming electroencephalography (EEG) system has been developed that is portable, inexpensive, and easy to set up. In this study we tested whether auditory ERPs measured using a gaming EEG system (Emotiv EPOC®, www.emotiv.com) were equivalent to those measured by a widely used, laboratory-based research EEG system (Neuroscan). Methods. We simultaneously recorded EEGs with the research and gaming EEG systems whilst presenting 21 adults with 566 standard (1000 Hz) and 100 deviant (1200 Hz) tones under passive (non-attended) and active (attended) conditions. The onset of each tone was marked in the EEGs using a parallel port pulse (Neuroscan) or a stimulus-generated electrical pulse injected into the O1 and O2 channels (Emotiv EPOC®). These markers were used to calculate late auditory ERPs (P1, N1, P2, N2, and P3 peaks) and the mismatch negativity (MMN) for both systems in the active and passive listening conditions for each participant. Results. Analyses were restricted to frontal sites, as these are most commonly reported in auditory ERP research. Intra-class correlations (ICCs) indicated that the morphology of the late auditory ERP waveforms was similar for the research and gaming EEG systems across all participants, but that the MMN waveforms from the two systems were only similar for participants with non-noisy MMN waveforms (N = 11 out of 21). Peak amplitude and latency measures revealed no significant differences in the size or timing of the auditory P1, N1, P2, N2, P3, and MMN peaks. Conclusions. Our findings suggest that the gaming EEG system may prove a valid alternative to laboratory ERP systems for recording reliable late auditory ERPs (P1, N1, P2, N2, and P3) over the frontal cortices. In the future, the gaming EEG system may also prove useful for measuring less reliable ERPs, such as the MMN, if the reliability of such ERPs can be boosted to the same level as late auditory ERPs.
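The marker-based ERP computation described in the Methods (epoching around tone onsets, averaging, and measuring peak amplitude and latency) can be illustrated with a small sketch. This is not the study's analysis code; the sampling rate, epoch window, baseline period, and peak-search windows are assumptions.

```python
import numpy as np

FS = 128  # sampling rate in Hz (assumed)

def erp_average(eeg, marker_samples, fs=FS, pre=0.1, post=0.5):
    """Average event-locked epochs from a single channel.
    eeg: 1-D array; marker_samples: sample indices of tone onsets."""
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for m in marker_samples:
        if m - n_pre < 0 or m + n_post > len(eeg):
            continue  # skip events too close to the recording edges
        epoch = eeg[m - n_pre:m + n_post]
        epoch = epoch - epoch[:n_pre].mean()  # baseline-correct to the pre-stimulus mean
        epochs.append(epoch)
    times = (np.arange(-n_pre, n_post) / fs) * 1000.0  # ms relative to onset
    return times, np.mean(epochs, axis=0)

def peak_in_window(times, erp, lo_ms, hi_ms, positive=True):
    """Amplitude and latency of the largest (or most negative) deflection in a
    window, e.g. roughly 50-100 ms for P1 or 80-150 ms for N1."""
    sel = (times >= lo_ms) & (times <= hi_ms)
    idx = np.argmax(erp[sel]) if positive else np.argmin(erp[sel])
    return erp[sel][idx], times[sel][idx]
```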
Affiliation(s)
- Nicholas A Badcock
- ARC Centre of Excellence in Cognition and its Disorders, Macquarie University, Sydney, NSW, Australia
9
Singman EL. Automating the assessment of visual dysfunction after traumatic brain injury. ACTA ACUST UNITED AC 2013. [DOI: 10.7243/2052-6962-1-3]