1
Flechsenhar A, Levine S, Bertsch K. Threat induction biases processing of emotional expressions. Front Psychol 2022; 13:967800. [PMID: 36507050] [PMCID: PMC9730731] [DOI: 10.3389/fpsyg.2022.967800]
Abstract
Threats can derive from our physical or social surroundings and bias the way we perceive and interpret a given situation. They can be signaled by peers through facial expressions, as expressed anger or fear can represent the source of perceived threat. The current study investigates the enhanced attentional state and defensive reflexes associated with contextual threat, induced through aversive sounds presented during an emotion recognition paradigm. In a sample of 120 healthy participants, response and gaze behavior revealed differences in perceiving emotional facial expressions between threat and safety conditions: Responses were slower and less accurate under threat. Happy and neutral facial expressions were classified correctly more often in a safety context and misclassified more often as fearful under threat. This unidirectional misclassification suggests that threat applies a negative filter to the perception of neutral and positive information. Eye movements were initiated later under threat, but fixation changes were more frequent and dwell times shorter compared to a safety context. These findings demonstrate that such experimental paradigms can provide insight into how context alters emotion processing at cognitive, physiological, and behavioral levels. Such alterations may derive from evolutionary adaptations necessary for biasing cognitive processing to survive disadvantageous situations. This perspective sets up new testable hypotheses regarding how such levels of explanation may be dysfunctional in patient populations.
Affiliation(s)
- Aleya Flechsenhar
- Clinical Psychology and Psychotherapy, Department of Psychology, LMU Munich, Munich, Germany; NeuroImaging Core Unit Munich (NICUM), University Hospital LMU, Munich, Germany
- Seth Levine
- Clinical Psychology and Psychotherapy, Department of Psychology, LMU Munich, Munich, Germany; NeuroImaging Core Unit Munich (NICUM), University Hospital LMU, Munich, Germany
- Katja Bertsch
- Clinical Psychology and Psychotherapy, Department of Psychology, LMU Munich, Munich, Germany; NeuroImaging Core Unit Munich (NICUM), University Hospital LMU, Munich, Germany; Department of General Psychiatry, Center for Psychosocial Medicine, Heidelberg University, Heidelberg, Germany
2
Zinchenko A, Kotz SA, Schröger E, Kanske P. Moving towards dynamics: Emotional modulation of cognitive and emotional control. Int J Psychophysiol 2020; 147:193-201. [DOI: 10.1016/j.ijpsycho.2019.10.018]
3
Abstract
While audiovisual integration is well known in speech perception, faces and speech are also informative with respect to speaker recognition. To date, audiovisual integration in the recognition of familiar people has never been demonstrated. Here we show systematic benefits and costs for the recognition of familiar voices when these are combined with time-synchronized articulating faces, of corresponding or noncorresponding speaker identity, respectively. While these effects were strong for familiar voices, they were smaller or nonsignificant for unfamiliar voices, suggesting that the effects depend on the previous creation of a multimodal representation of a person's identity. Moreover, the effects were reduced or eliminated when voices were combined with the same faces presented as static pictures, demonstrating that the effects do not simply reflect the use of facial identity as a “cue” for voice recognition. This is the first direct evidence for audiovisual integration in person recognition.
4
Zinchenko A, Obermeier C, Kanske P, Schröger E, Villringer A, Kotz SA. The Influence of Negative Emotion on Cognitive and Emotional Control Remains Intact in Aging. Front Aging Neurosci 2017; 9:349. [PMID: 29163132] [PMCID: PMC5671981] [DOI: 10.3389/fnagi.2017.00349]
Abstract
Healthy aging is characterized by a gradual decline in cognitive control and inhibition of interferences, while emotional control is either preserved or facilitated. Emotional control regulates the processing of emotional conflicts such as irony in speech, and cognitive control resolves conflict between non-affective tendencies. While negative emotion can trigger control processes and speed up resolution of both cognitive and emotional conflicts, we know little about how aging affects the interaction of emotion and control. In two EEG experiments, we compared the influence of negative emotion on cognitive and emotional conflict processing in groups of younger adults (mean age = 25.2 years) and older adults (mean age = 69.4 years). Participants viewed short video clips and either categorized spoken vowels (cognitive conflict) or their emotional valence (emotional conflict), while the visual facial information was congruent or incongruent. Results show that negative emotion modulates both cognitive and emotional conflict processing in younger and older adults, as indicated by reduced response times and/or enhanced event-related potentials (ERPs). In emotional conflict processing, we observed a valence-specific N100 ERP component in both age groups. In cognitive conflict processing, we observed an interaction of emotion by congruence in the N100 responses in both age groups, and a main effect of congruence in the P200 and N200. Thus, the influence of emotion on conflict processing remains intact in aging, despite a marked decline in cognitive control. Older adults may prioritize emotional wellbeing and preserve the role of emotion in cognitive and emotional control.
Affiliation(s)
- Artyom Zinchenko
- International Max Planck Research School on Neuroscience of Communication, Leipzig, Germany; Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Department Psychologie, Ludwig-Maximilians-Universität München, Munich, Germany
- Christian Obermeier
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Philipp Kanske
- Department of Social Neuroscience, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Institute of Clinical Psychology and Psychotherapy, Department of Psychology, Technische Universität Dresden, Dresden, Germany
- Erich Schröger
- Institute of Psychology, University of Leipzig, Leipzig, Germany
- Arno Villringer
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Sonja A Kotz
- Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; Department of Neuropsychology and Psychopharmacology, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
5
Affiliation(s)
- Stefan R. Schweinberger
- Department of General Psychology, Friedrich Schiller University and DFG Research Unit Person Perception, Jena, Germany
- David M.C. Robertson
- Department of General Psychology, Friedrich Schiller University and DFG Research Unit Person Perception, Jena, Germany
6
Ben-Yosef D, Anaki D, Golan O. Context processing in adolescents with autism spectrum disorder: How complex could it be? Autism Res 2016; 10:520-530. [PMID: 27484258] [DOI: 10.1002/aur.1676]
Abstract
The ability of individuals with Autism Spectrum Disorder (ASD) to process context has long been debated: According to the Weak Central Coherence theory, ASD is characterized by poor global processing and, consequently, poor context processing. In contrast, the Social Cognition theory argues that individuals with ASD present difficulties only in social context processing. The complexity theory of autism suggests that context processing in ASD depends on task complexity. The current study examined this controversy through two priming tasks, one presenting human stimuli (facial expressions) and the other presenting non-human stimuli (animal faces). Both tasks presented visual targets, preceded by congruent, incongruent, or neutral auditory primes. Local and global processing were examined by presenting the visual targets in three spatial frequency conditions: high frequency, low frequency, and broadband. Tasks were administered to 16 adolescents with high functioning ASD and 16 matched typically developing adolescents. Reaction time and accuracy were measured for each task in each condition. Results indicated that individuals with ASD processed context for both human and non-human stimuli, except in one condition, in which human stimuli had to be processed globally (i.e., target presented in low frequency). The task demands presented in this condition, and the resulting performance deficit shown in the ASD group, could be understood in terms of cognitive overload. These findings provide support for the complexity theory of autism and extend it. Our results also demonstrate how associative priming could support intact context processing of human and non-human stimuli in individuals with ASD.
Affiliation(s)
- Dekel Ben-Yosef
- Department of Psychology, Bar-Ilan University, Ramat-Gan, 5290002, Israel
- David Anaki
- Department of Psychology, Bar-Ilan University, Ramat-Gan, 5290002, Israel; Gonda Brain Research Center, Bar-Ilan University, Ramat-Gan, 5290002, Israel
- Ofer Golan
- Department of Psychology, Bar-Ilan University, Ramat-Gan, 5290002, Israel
7
Facial, vocal and cross-modal emotion processing in early-onset schizophrenia spectrum disorders. Schizophr Res 2015; 168:252-9. [PMID: 26297473] [DOI: 10.1016/j.schres.2015.07.039]
Abstract
Recognition of emotional expressions plays an essential role in children's healthy development. Anomalies in these skills may result in empathy deficits, social interaction difficulties, and premorbid emotional problems in children and adolescents with schizophrenia. Twenty-six subjects with early onset schizophrenia spectrum (EOSS) disorders and twenty-eight matched healthy controls (HC) were instructed to identify five basic emotions and a neutral expression. The assessment entailed presenting visual, auditory, and congruent cross-modal stimuli. Using a generalized linear mixed model, we found no significant association for handedness, age, or gender. However, significant associations emerged for emotion type, perception modality, and group. EOSS patients performed worse than HC in uni- and cross-modal emotional tasks, with a specific impairment in processing negative emotions. There was no relationship between emotion identification scores and positive or negative symptoms, self-reported empathy traits, or a positive history of developmental disorders. However, we found a significant association between emotion identification scores and nonverbal communication impairments. We conclude that cumulative dysfunctions in both nonverbal communication and emotion processing contribute to the social vulnerability and morbidity found in youths with EOSS disorders.
8
|
Tseng HH, Bossong MG, Modinos G, Chen KM, McGuire P, Allen P. A systematic review of multisensory cognitive–affective integration in schizophrenia. Neurosci Biobehav Rev 2015; 55:444-52. [DOI: 10.1016/j.neubiorev.2015.04.019] [Citation(s) in RCA: 57] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/17/2014] [Revised: 03/25/2015] [Accepted: 04/26/2015] [Indexed: 11/16/2022]
|
9
de Borst AW, de Gelder B. Is it the real deal? Perception of virtual characters versus humans: an affective cognitive neuroscience perspective. Front Psychol 2015; 6:576. [PMID: 26029133] [PMCID: PMC4428060] [DOI: 10.3389/fpsyg.2015.00576]
Abstract
Recent developments in neuroimaging research support the increased use of naturalistic stimulus material such as film, avatars, or androids. These stimuli allow for a better understanding of how the brain processes information in complex situations while maintaining experimental control. While avatars and androids are well suited to study human cognition, they should not be equated with human stimuli. For example, the uncanny valley hypothesis theorizes that artificial agents with high human-likeness may evoke feelings of eeriness in the human observer. Here we review if, when, and how the perception of human-like avatars and androids differs from the perception of humans and consider how this influences their utilization as stimulus material in social and affective neuroimaging studies. First, we discuss how the appearance of virtual characters affects perception. When stimuli are morphed across categories from non-human to human, the most ambiguous stimuli, rather than the most human-like stimuli, show prolonged classification times and increased eeriness. Human-like to human stimuli show a positive linear relationship with familiarity. Second, we show that expressions of emotions in human-like avatars can be perceived similarly to human emotions, with corresponding behavioral, physiological, and neuronal activations, with the exception of physical dissimilarities. Subsequently, we consider if and when one perceives differences in action representation by artificial agents versus humans. Motor resonance and predictive coding models may account for empirical findings, such as an interference effect on action for observed human-like, naturally moving characters. However, the expansion of these models to explain more complex behavior, such as empathy, still needs to be investigated in more detail. Finally, we broaden our outlook to social interaction, where virtual reality stimuli can be utilized to imitate complex social situations.
Affiliation(s)
- Aline W de Borst
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Beatrice de Gelder
- Brain and Emotion Laboratory, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
10
Kret ME, Ploeger A. Emotion processing deficits: A liability spectrum providing insight into comorbidity of mental disorders. Neurosci Biobehav Rev 2015; 52:153-71. [DOI: 10.1016/j.neubiorev.2015.02.011]
11
Asymmetries of influence: differential effects of body postures on perceptions of emotional facial expressions. PLoS One 2013; 8:e73605. [PMID: 24039996] [PMCID: PMC3769306] [DOI: 10.1371/journal.pone.0073605]
Abstract
The accuracy and speed with which emotional facial expressions are identified is influenced by body postures. Two influential models predict that these congruency effects will be largest when the emotion displayed in the face is similar to that displayed in the body: the emotional seed model and the dimensional model. These models differ in whether similarity is based on physical characteristics or underlying dimensions of valence and arousal. Using a 3-alternative forced-choice task in which stimuli were presented briefly (Exp 1a) or for an unlimited time (Exp 1b) we provide evidence that congruency effects are more complex than either model predicts; the effects are asymmetrical and cannot be accounted for by similarity alone. Fearful postures are especially influential when paired with facial expressions, but not when presented in a flanker task (Exp 2). We suggest refinements to each model that may account for our results and suggest that additional studies be conducted prior to drawing strong theoretical conclusions.
12
Mondloch CJ, Horner M, Mian J. Wide eyes and drooping arms: Adult-like congruency effects emerge early in the development of sensitivity to emotional faces and body postures. J Exp Child Psychol 2013; 114:203-16. [DOI: 10.1016/j.jecp.2012.06.003]
13
Sad or fearful? The influence of body posture on adults' and children's perception of facial displays of emotion. J Exp Child Psychol 2011; 111:180-96. [PMID: 21939983] [DOI: 10.1016/j.jecp.2011.08.003]
Abstract
The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception.
14
Multistage audiovisual integration of speech: dissociating identification and detection. Exp Brain Res 2010; 208:447-57. [DOI: 10.1007/s00221-010-2495-9]
15
Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation. Neuroimage 2010; 54:2973-82. [PMID: 21073970] [DOI: 10.1016/j.neuroimage.2010.11.017]
Abstract
BACKGROUND The powerful emotion-inducing properties of music are well known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that neural mechanisms involved in listening to music may differ when music is presented together with visual stimuli that convey the same emotion as the music, compared to visual stimuli with incongruent emotional content. METHODS We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces), congruent or incongruent for emotional content, on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods: music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces, where the music excerpt was played while presenting either congruent or incongruent emotional faces. RESULTS We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as happier (p=0.051) and sad faces were rated as less sad (p=0.030). When presented with sad music, happy faces were rated as less happy (p=0.008) and sad faces were rated as sadder (p=0.002). INTERPRETATION Happy-sad congruence across modalities may enhance activity in auditory regions, while incongruence appears to impact the perception of visual affect, leading to increased activation in face processing regions such as the FG. We suggest that greater understanding of the neural bases of happy-sad congruence across modalities can shed light on basic mechanisms of affective perception and experience and may lead to novel insights in the study of emotion regulation and therapeutic use of music.
16
van der Smagt MJ, van Engeland H, Kemner C. Brief report: can you see what is not there? Low-level auditory-visual integration in autism spectrum disorder. J Autism Dev Disord 2007; 37:2014-9. [PMID: 17273934] [DOI: 10.1007/s10803-006-0346-0]
Abstract
Patients diagnosed with Autism Spectrum Disorder show impaired integration of information across different senses. The processing level from which this impairment originates, however, remains unclear. We investigated low-level integration of auditory and visual stimuli in subjects with Autism Spectrum Disorder. High-functioning adult subjects with Autism Spectrum Disorder, as well as age- and IQ-matched adults, were tested using a task that evokes illusory visual stimuli by presenting sounds concurrently with visual flashes. In both groups the number of sounds presented significantly affected the number of flashes perceived, yet there was no difference between groups. This finding implies that any problems with integrating auditory and visual information must stem from higher processing stages in high-functioning adults with Autism Spectrum Disorder.
Affiliation(s)
- Maarten J van der Smagt
- Experimental Psychology, Helmholtz Institute & Utrecht University, Heidelberglaan 2, Utrecht 3584 CS, The Netherlands
17
de Gelder B, Morris JS, Dolan RJ. Unconscious fear influences emotional awareness of faces and voices. Proc Natl Acad Sci U S A 2005; 102:18682-7. [PMID: 16352717] [PMCID: PMC1317960] [DOI: 10.1073/pnas.0509179102]
Abstract
Nonconscious recognition of facial expressions opens an intriguing possibility that two emotions can be present together in one brain, with unconsciously and consciously perceived inputs interacting. We investigated this interaction in three experiments by using a hemianope patient with residual nonconscious vision. During simultaneous presentation of facial expressions to the intact and the blind field, we measured interactions between consciously and nonconsciously recognized images. Fear-specific congruence effects were expressed as enhanced neuronal activity in fusiform gyrus, amygdala, and pulvinar. Nonconscious facial expressions also influenced processing of consciously recognized emotional voices. Emotional congruency between visual and auditory input enhanced activity in amygdala and superior colliculus for blind, relative to intact, field presentation of faces. Our findings indicate that recognition of fear is mandatory and independent of awareness. Most importantly, unconscious fear recognition remains robust even in the light of a concurrent incongruent happy facial expression or an emotional voice of which the observer is aware.
Affiliation(s)
- B de Gelder
- Cognitive and Affective Neuroscience Laboratory, Tilburg University, P.O. Box 90153, 5000 LE Tilburg, The Netherlands
18
Partan SR, Marler P. Issues in the Classification of Multimodal Communication Signals. Am Nat 2005; 166:231-45. [PMID: 16032576] [DOI: 10.1086/431246]
Abstract
Communication involves complex behavior in multiple sensory channels, or "modalities." We provide an overview of multimodal communication and its costs and benefits, place examples of signals and displays from an array of taxa, sensory systems, and functions into our signal classification system, and consider issues surrounding the categorization of multimodal signals. The broadest level of classification is between signals with redundant and nonredundant components, with finer distinctions in each category. We recommend that researchers gather information on responses to each component of a multimodal signal as well as the response to the signal as a whole. We discuss the choice of categories, whether to categorize signals on the basis of the signal or the response, and how to classify signals if data are missing. The choice of behavioral assay may influence the outcome, as may the context of the communicative event. We also consider similarities and differences between multimodal and unimodal composite signals and signals that are sequentially, rather than simultaneously, multimodal.
Affiliation(s)
- Sarah R Partan
- Department of Psychology, University of South Florida, St. Petersburg, Florida 33701, USA