1. Cortical activations associated with spatial remapping of finger touch using EEG. Cereb Cortex 2024; 34:bhae161. [PMID: 38642106; DOI: 10.1093/cercor/bhae161]
Abstract
The spatial coding of tactile information is functionally essential for touch-based shape perception and motor control. However, the spatiotemporal dynamics of how tactile information is remapped from the somatotopic reference frame of the primary somatosensory cortex to a spatiotopic reference frame remain unclear. This study investigated how hand position in space and hand posture influence cortical somatosensory processing. Twenty-two healthy subjects received electrical stimulation to the right thumb (D1) or little finger (D5) in three position conditions: palm down on the right side of the body (baseline), hand crossing the body midline (effect of position), and palm up (effect of posture). Somatosensory-evoked potentials (SEPs) were recorded using electroencephalography. One early-, two mid-, and two late-latency neurophysiological components were identified for both fingers: P50, P1, N125, P200, and N250. D1 and D5 showed different cortical activation patterns: compared with baseline, the crossing condition showed significant clustering at P1 for D1, and at P50 and N125 for D5; the change in posture showed a significant cluster at N125 for D5. Clusters predominated at centro-parietal electrodes. These results suggest that tactile remapping of the fingers after electrical stimulation occurs around 100-125 ms in the parietal cortex.
2. Hand Posture and Force Estimation Using Surface Electromyography and an Artificial Neural Network. Hum Factors 2023; 65:382-402. [PMID: 34006135; DOI: 10.1177/00187208211016695]
Abstract
OBJECTIVE The purpose of this study was to develop an approach to predict hand posture (pinch versus grip) and grasp force using forearm surface electromyography (sEMG) and artificial neural networks (ANNs) during tasks that varied repetition rate and duty cycle. BACKGROUND Prior studies have used electromyography with machine learning models to predict grip force, but relatively few have assessed whether both hand posture and force can be predicted, particularly at varying levels of duty cycle and repetition rate. METHOD Fourteen individuals participated in this experiment. sEMG data for five forearm muscles and force output data were collected. Calibration data (25, 50, 75, and 100% of maximum voluntary contraction (MVC)) were used to train ANN models to predict hand posture (pinch versus grip) and force magnitude during tasks that varied load, repetition rate, and duty cycle. RESULTS Across all participants, overall hand posture prediction accuracy was 79% (0.79 ± 0.08), whereas overall hand force prediction accuracy was 73% (0.73 ± 0.09). Accuracy ranged between 0.65 and 0.93 depending on repetition rate and duty cycle. CONCLUSION Hand posture and force prediction were possible using sEMG and ANNs, though there were important differences in prediction accuracy based on task characteristics, including duty cycle and repetition rate. APPLICATION The results of this study could be applied to the development of a dosimeter for distal upper extremity biomechanical exposure measurement, risk assessment, job (re)design, and return-to-work programs.
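As an illustration of the pipeline such studies describe, the sketch below computes two classic time-domain sEMG features (root mean square and mean absolute value) per channel, the kind of vector typically fed to an ANN classifier. This is a hypothetical example, not the authors' code; the window length and five-channel layout are assumptions.

```python
import numpy as np

def semg_features(window: np.ndarray) -> np.ndarray:
    """Time-domain features per channel from one sEMG window.

    window: (n_samples, n_channels) raw sEMG.
    Returns the flat vector [RMS_1..RMS_C, MAV_1..MAV_C].
    """
    rms = np.sqrt(np.mean(window ** 2, axis=0))  # root mean square
    mav = np.mean(np.abs(window), axis=0)        # mean absolute value
    return np.concatenate([rms, mav])

# Hypothetical 200-sample window from five forearm channels
rng = np.random.default_rng(0)
window = rng.standard_normal((200, 5))
feats = semg_features(window)
print(feats.shape)  # (10,)
```

A vector like `feats`, stacked over calibration trials at 25-100% MVC, would then serve as training input for a posture/force ANN.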
3. sEMG-Based Hand Posture Recognition and Visual Feedback Training for the Forearm Amputee. Sensors (Basel) 2022; 22:7984. [PMID: 36298335; PMCID: PMC9608765; DOI: 10.3390/s22207984]
Abstract
sEMG-based gesture recognition is useful for human-computer interactions, especially for technology supporting rehabilitation training and the control of electric prostheses. However, high variability in the sEMG signals of untrained users degrades the performance of gesture recognition algorithms. In this study, a hand posture recognition algorithm and radar plot-based visual feedback training were developed using multichannel sEMG sensors. Ten healthy adults and one bilateral forearm amputee participated, repeating twelve hand postures ten times each. The visual feedback training was performed for two days in the healthy adults and five days in the forearm amputee. Artificial neural network classifiers were trained with two types of feature vectors: a single feature vector and a combination of feature vectors. The classification accuracy of the forearm amputee increased significantly after three days of hand posture training. These results indicate that the visual feedback training efficiently improved the performance of sEMG-based hand posture recognition by reducing variability in the sEMG signal. Furthermore, the bilateral forearm amputee was able to participate in rehabilitation training by using the radar plot, suggesting that radar plot-based visual feedback training could help amputees control various electric prostheses.
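The radar-plot feedback described above presumably displays one axis per sEMG channel. A minimal sketch of such a per-channel activation profile, normalized by calibration maxima so all axes share a 0-1 range, might look like this (hypothetical values and normalization scheme, not the authors' implementation):

```python
import numpy as np

def radar_profile(mav: np.ndarray, calib_max: np.ndarray) -> np.ndarray:
    """Scale each channel's mean absolute value by its calibration
    maximum so every radar-plot axis spans 0-1."""
    return np.clip(mav / calib_max, 0.0, 1.0)

mav = np.array([0.2, 0.5, 0.1, 0.4])    # hypothetical per-channel MAVs
calib = np.array([0.5, 0.5, 0.5, 0.5])  # hypothetical calibration maxima
profile = radar_profile(mav, calib)
print(profile)  # [0.4 1.  0.2 0.8]
```

Showing the user this profile after each repetition would let them see, and reduce, trial-to-trial variability in their muscle activation pattern.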
4. Hand Pose Recognition Using Parallel Multi Stream CNN. Sensors (Basel) 2021; 21:8469. [PMID: 34960562; PMCID: PMC8708730; DOI: 10.3390/s21248469]
Abstract
Recently, several computer applications have adopted operating modes based on pointing fingers, waving hands, or body movements instead of mouse, keyboard, audio, or touch input; examples include sign language recognition, robot control, games, appliance control, and smart surveillance. With the increase of hand-pose-based applications, new challenges in this domain have also emerged. Support vector machines and neural networks have been used extensively in this domain with conventional RGB data, which are not very effective for adequate performance. Recently, depth data have become popular owing to their better representation of posture attributes. In this study, a multiple parallel stream 2D CNN (two-dimensional convolutional neural network) model is proposed to recognize hand postures. The proposed model comprises multiple steps and layers to detect hand poses from image maps obtained from depth data. The hyperparameters of the proposed model are tuned through experimental analysis. Three publicly available benchmark datasets, Kaggle, First Person, and Dexter, are used independently to train and test the proposed approach. The accuracy of the proposed method is 99.99%, 99.48%, and 98% on the Kaggle hand posture dataset, First Person hand posture dataset, and Dexter dataset, respectively. Further, the F1 and AUC scores are also near-optimal. Comparative analysis shows that the proposed model outperforms previous state-of-the-art methods.
5. sEMG-Based Hand Posture Recognition Considering Electrode Shift, Feature Vectors, and Posture Groups. Sensors (Basel) 2021; 21:7681. [PMID: 34833756; PMCID: PMC8624257; DOI: 10.3390/s21227681]
Abstract
Surface electromyography (sEMG)-based gesture recognition systems provide intuitive and accurate recognition of various gestures in human-computer interaction. In this study, an sEMG-based hand posture recognition algorithm was developed that considers three main problems: electrode shift, feature vectors, and posture groups. The sEMG signal was measured using an armband sensor subject to electrode shift. An artificial neural network classifier was trained using 21 feature vectors for seven different posture groups. The inter-session and inter-feature Pearson correlation coefficients (PCCs) were calculated. The results indicate that classification performance improved with the number of electrode-shift training sessions. Four sessions were necessary for efficient training, and feature vectors with a high inter-session PCC (r > 0.7) exhibited high classification accuracy. Similarities between postures within a posture group decreased classification accuracy. Our results indicate that classification accuracy could be improved by adding electrode-shift training sessions and that the PCC is useful for selecting feature vectors. Furthermore, hand posture selection was as important as feature vector selection. These findings will help in optimizing sEMG-based pattern recognition algorithms more easily and quickly.
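The inter-session PCC criterion (r > 0.7) can be illustrated directly with NumPy's `corrcoef`; the data below are synthetic, and the 0.7 threshold is the only detail taken from the abstract:

```python
import numpy as np

def inter_session_pcc(feat_a: np.ndarray, feat_b: np.ndarray) -> float:
    """Pearson correlation of one feature's values across two recording
    sessions; r > 0.7 would mark the feature as shift-robust."""
    return float(np.corrcoef(feat_a, feat_b)[0, 1])

rng = np.random.default_rng(1)
session_a = rng.standard_normal(50)                 # feature values, session A
stable = session_a + 0.1 * rng.standard_normal(50)  # nearly repeatable feature
unstable = rng.standard_normal(50)                  # unrelated feature
r_stable = inter_session_pcc(session_a, stable)
r_unstable = inter_session_pcc(session_a, unstable)
print(f"stable r = {r_stable:.2f}, unstable r = {r_unstable:.2f}")
```

Features whose values survive an electrode shift (high r across sessions) are the ones worth keeping in the classifier's input vector.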
6. Behavioral and Physiological Evidence of a Favored Hand Posture in the Body Representation for Action. Cereb Cortex 2021; 31:3299-3310. [PMID: 33611384; PMCID: PMC8196246; DOI: 10.1093/cercor/bhab011]
Abstract
Motor planning and execution require a representational map of our body. Since the body can assume different postures, it is not known how it is represented in this map. Moreover, is the generation of the motor command favored by some body configurations? We investigated the existence of a centrally favored posture of the hand for action, in search of physiological and behavioral advantages conferred by central motor processing. We tested two opposite hand pinch grips, equally difficult and commonly used: a forearm-pronated, thumb-down, index-up pinch against the same grip performed with the thumb up. The former revealed faster movement onset, a sign of faster neural computation, and faster target reaching. It induced increased corticospinal excitability, independently of pre-stimulus tonic muscle contraction. Remarkably, motor excitability also increased when the thumb-down pinch was only observed, imagined, or prepared, with the hand actually kept at rest. The motor advantages were independent of any concurrent modulation due to somatosensory input, as shown by testing afferent inhibition. The results provide strong behavioral and physiological evidence for a preferred hand posture favoring brain motor control, independently of somatosensory processing. This suggests the existence of a baseline postural representation that may serve as an a priori spatial reference for body-space interaction.
7. Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. J Imaging 2020; 6:73. [PMID: 34460688; PMCID: PMC8321080; DOI: 10.3390/jimaging6080073]
Abstract
Hand gestures are a form of nonverbal communication that can be used in several fields such as communication between deaf-mute people, robot control, human-computer interaction (HCI), home automation, and medical applications. Research papers based on hand gestures have adopted many different techniques, including those based on instrumented sensor technology and computer vision. Hand signs can be classified under several headings, such as posture versus gesture, dynamic versus static, or a hybrid of the two. This paper reviews the literature on hand gesture techniques and introduces their merits and limitations under different circumstances. In addition, it tabulates the performance of these methods, focusing on computer vision techniques, covering their similarities and differences, the hand segmentation technique used, classification algorithms and drawbacks, the number and types of gestures, the dataset used, the detection range (distance), and the type of camera used. This paper is a thorough general overview of hand gesture methods with a brief discussion of some possible applications.
8.
Abstract
The motor system plays a role in some object mental rotation tasks, and researchers have reported that people may use a motor simulation strategy to mentally rotate objects. In this study, we used images of a hand with a letter printed on the palm to directly determine whether a hand image can be automatically rotated during the deliberate mental rotation of an object and whether the hand and object are rotated along the same trajectory. A total of 41 participants were shown the stimuli and asked to decide whether the letters, which were upright or tilted at specific angles, were normal or mirrored. The hand images in the background showed either a left or a right hand in palm view, with the fingers pointing upwards, medially, downwards, or laterally. Reaction times and error rates were measured to determine the speed and accuracy of mental rotation. A complex interaction between hand posture and letter orientation revealed that the hand image was mentally rotated automatically, together with the deliberate mental rotation of the letter. The biomechanical constraints of the hand also influenced reaction times, suggesting the involvement of the motor system in the concomitant mental rotation of the hand image. Consistent with the motor simulation theory, the participants seemed to imagine the hand carrying the object in its movement. These behavioural data support the motor simulation theory and elucidate specific processes of mental rotation that have not been addressed by neuroimaging studies.
9. Functional connectivity associated with hand shape generation: Imitating novel hand postures and pantomiming tool grips challenge different nodes of a shared neural network. Hum Brain Mapp 2015; 36:3426-3440. [PMID: 26095674; DOI: 10.1002/hbm.22853]
Abstract
Clinical research suggests that imitating meaningless hand postures and pantomiming tool-related hand shapes rely on different neuroanatomical substrates. We investigated the BOLD responses to different tasks of hand posture generation in 14 right-handed volunteers. Conjunction and contrast analyses were applied to select regions that were either common or sensitive to the imitation and/or pantomime tasks. The selection included bilateral areas of medial and lateral extrastriate cortex, superior and inferior regions of the lateral and medial parietal lobe, primary motor and somatosensory cortex, and left dorsolateral prefrontal and ventral and dorsal premotor cortices. Functional connectivity analysis revealed that during hand shape generation the BOLD response of every region correlated significantly with that of every other area regardless of the hand posture task performed, although some regions were more involved in some hand posture tasks than others. Based on between-task differences in functional connectivity, we predict that imitation of novel hand postures would suffer most from left superior parietal disruption and that pantomiming hand postures for tools would be impaired following left frontal damage, whereas both tasks would be sensitive to inferior parietal dysfunction. We also found that the posterior temporal cortex is committed to pantomiming tool grips, but that the involvement of this region in the execution of hand postures in general appears limited. We conclude that the generation of hand postures is subserved by a highly interconnected task-general neural network. Depending on task requirements, some nodes/connections will be more engaged than others, and these task-sensitive findings are in general agreement with recent lesion studies.
10. Hand proximity differentially affects visual working memory for color and orientation in a binding task. Front Psychol 2014; 5:318. [PMID: 24795671; PMCID: PMC4001000; DOI: 10.3389/fpsyg.2014.00318]
Abstract
Observers determined whether two sequentially presented arrays of six lines were the same or different. Differences, when present, involved either a swap in the color of two lines or a swap in the orientation of two lines. Thus, accurate change detection required the binding of color and orientation information for each line within visual working memory. Holding viewing distance constant, the proximity of the arrays to the hands was manipulated. Placing the hands near the to-be-remembered array decreased participants’ ability to remember color information, but increased their ability to remember orientation information. This pair of results indicates that hand proximity differentially affects the processing of various types of visual information, a conclusion broadly consistent with functional and anatomical differences in the magnocellular and parvocellular pathways. It further indicates that hand proximity affects the likelihood that various object features will be encoded into integrated object files.
11. Validity of a simple videogrammetric method to measure the movement of all hand segments for clinical purposes. Proc Inst Mech Eng H 2014; 228:182-189. [PMID: 24503512; DOI: 10.1177/0954411914522023]
Abstract
Hand movement measurement is important in the clinical, ergonomics, and biomechanical fields. Videogrammetric techniques allow the measurement of hand movement without interfering with natural hand behaviour. However, an accurate measurement of hand movement requires a high number of markers, which limits its applicability in clinical practice (60 markers would be needed for the hand and wrist). In this work, a simple method that uses a reduced number of markers (29), based on a simplified kinematic model of the hand, is proposed and evaluated. A set of experiments was performed to evaluate the errors associated with the kinematic simplification, together with its accuracy, repeatability, and reproducibility. The global error attributed to the kinematic simplification was 6.68°. The method has small errors in repeatability and reproducibility (3.43° and 4.23°, respectively) and shows no statistically significant difference compared with electronic goniometers. The relevance of the work lies in the ability to measure all degrees of freedom of the hand with a reduced number of markers without interfering with natural hand behaviour, which makes it suitable for clinical applications, as well as for ergonomic and biomechanical purposes.
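Although the paper's kinematic model is not reproduced here, the basic videogrammetric computation, recovering a joint angle from three 3D marker positions, can be sketched as follows (the marker layout is hypothetical):

```python
import numpy as np

def joint_angle(p_prox, p_joint, p_dist) -> float:
    """Angle in degrees at a joint defined by three marker positions;
    180 means the two segments are collinear (fully extended)."""
    u = np.asarray(p_prox, float) - np.asarray(p_joint, float)
    v = np.asarray(p_dist, float) - np.asarray(p_joint, float)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Hypothetical markers: proximal segment along +x, distal along +y
print(joint_angle([1, 0, 0], [0, 0, 0], [0, 1, 0]))  # 90.0
```

Repeating this over tracked marker trajectories, one triplet per joint, yields the joint-angle time series against which goniometer readings would be compared.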
12.
Abstract
The body is closely tied to the processing of social and emotional information. The purpose of this study was to determine whether a relationship exists between the emotions and social attitudes conveyed through gestures. We tested the effect of pro-social (i.e., happy face) and anti-social (i.e., angry face) emotional primes on the ability to detect socially relevant hand postures (i.e., pictures depicting an open or closed hand). Participants were required to establish, as quickly as possible, whether the test stimulus (a hand posture) was the same as or different from the reference stimulus (a hand posture) previously displayed on the computer screen. Results show that facial primes, displayed between the reference and the test stimuli, influence the recognition of hand postures according to the social attitude implicitly related to the stimulus. We found that perception of the pro-social (happy face) prime resulted in slower RTs when detecting the open hand posture than the closed hand posture. Conversely, perception of the anti-social (angry face) prime resulted in slower RTs when detecting the closed hand posture than the open hand posture. These results suggest that the social attitude implicitly conveyed by the displayed stimuli might represent the conceptual link between emotions and gestures.
13.
Abstract
A series of visual search experiments conducted by Abrams et al. (2008) indicates that disengagement of visual attention is slower when the array of objects to be searched is close to the hands (hands on the monitor) than when it is not (hands in the lap). These experiments establish the impact one's hands can have on visual attentional processing. In the current paper, we examine these two hand postures more closely with the goal of pinpointing which characteristics are crucial for the observed differences in attentional processing. Specifically, in a set of four experiments we investigated additional hand postures and additional modes of response. We replicated the original Abrams et al. (2008) effect when only the two original postures were used; surprisingly, however, the effect was extinguished with the new range of postures and response modes, and this extinction persisted across different populations (German and English students) and different experimental hardware. Furthermore, analyses indicated that the extinction of the effect was unlikely to be caused by increased practice due to the additional blocks of trials or by an increased probability that participants could guess the purpose of the experiment. As such, our results suggest that, in addition to the nature of the hand postures, the number of postures is a further important factor influencing the impact the hands have on visual processing.
14. Grasp posture modulates attentional prioritization of space near the hands. Front Psychol 2013; 4:312. [PMID: 23755037; PMCID: PMC3668266; DOI: 10.3389/fpsyg.2013.00312]
Abstract
Changes in visual processing near the hands may assist observers in evaluating items that are candidates for actions. If altered vision near the hands reflects adaptations linked to effective action production, then positioning the hands for different types of actions could lead to different visual biases. I examined the influence of hand posture on attentional prioritization to test this hypothesis. Participants placed one of their hands on a visual display and detected targets appearing either near or far from the hand. Replicating previous findings, detection near the hand was facilitated when participants positioned their hand on the display in a standard open palm posture affording a power grasp (Experiments 1 and 3). However, when participants instead positioned their hand in a pincer grasp posture with the thumb and forefinger resting on the display, they were no faster to detect targets appearing near their hand than targets appearing away from their hand (Experiments 2 and 3). These results demonstrate that changes in visual processing near the hands rely on the hands' posture. Although hands positioned to afford power grasps facilitate rapid onset detection, a pincer grasp posture that affords more precise action does not.
15. Hand posture effects on handedness recognition as revealed by the Simon effect. Front Hum Neurosci 2009; 3:59. [PMID: 20011220; PMCID: PMC2791032; DOI: 10.3389/neuro.09.059.2009]
Abstract
We investigated the influence of hand posture in handedness recognition, while varying the spatial correspondence between stimulus and response in a modified Simon task. Drawings of the left and right hands were displayed either in a back or palm view while participants discriminated stimulus handedness by pressing either a left or right key with their hands resting either in a prone or supine posture. As a control, subjects performed a regular Simon task using simple geometric shapes as stimuli. Results showed that when hands were in a prone posture, the spatially corresponding trials (i.e., stimulus and response located on the same side) were faster than the non-corresponding trials (i.e., stimulus and response on opposite sides). In contrast, for the supine posture, there was no difference between corresponding and non-corresponding trials. Control experiments with the regular Simon task showed that the posture of the responding hand had no influence on performance. When the stimulus is the drawing of a hand, however, the posture of the responding hand affects the spatial correspondence effect because response location is coded based on multiple reference points, including the body of the hand.
16. Brain regions controlling nonsynergistic versus synergistic movement of the digits: a functional magnetic resonance imaging study. J Neurosci 2002; 22:5074-5080. [PMID: 12077202; PMCID: PMC6757747]
Abstract
Human hand dexterity depends on the ability to move digits independently and to combine these movements in various coordinative patterns. It is well established that the primary motor cortex (M1) is important for skillful digit actions but less is known about the role played by the nonprimary motor centers. Here we use functional magnetic resonance imaging to examine the hypothesis that nonprimary motor areas and the posterior parietal cortex are strongly activated when healthy humans move the right digits in a skillful coordination pattern involving relatively independent digit movements. A task in which flexion of the thumb is accompanied by extension of the fingers and vice versa, i.e., a learned "nonsynergistic" coordination pattern, is contrasted with a task in which all digits flex and extend simultaneously in an innate synergistic coordination pattern (opening and closing the fist). The motor output is the same in the two conditions. Thus, the difference when contrasting the nonsynergistic and synergistic tasks represents the requirement to fractionate the movements of the thumb and fingers and to combine these movements in a learned coordinative pattern. The supplementary (and cingulate) motor area, the bilateral dorsal premotor area, the bilateral lateral cerebellum, the bilateral cortices of the postcentral sulcus, and the left intraparietal cortex showed stronger activity when the subjects made the nonsynergistic flexion-extension movements of the digits than when the synergistic movements were made. These results suggest that the human neural substrate for skillful digit movement includes a sensorimotor network of nonprimary frontoparietal areas and the cerebellum that, in conjunction with M1, control the movements of the digits.