1. Kwon J, Kim JY. Meaning of Gaze Behaviors in Individuals' Perception and Interpretation of Commercial Interior Environments: An Experimental Phenomenology Approach Involving Eye-Tracking. Front Psychol 2021; 12:581918. PMID: 34484018; PMCID: PMC8415749; DOI: 10.3389/fpsyg.2021.581918.
Abstract
A critical question in interior design is how multisensory information is integrated into occupants' perception and interpretation of environmental contexts and meanings. Although there have been efforts to identify and theorize the visual perception of interior factors or features (e.g., colors, fixtures, and signs), the hidden meanings behind visual attention and behaviors have been neglected in interior design research. This experimental phenomenological study investigates the impact of auditory stimuli on individuals' gaze behaviors and the hidden meanings of their audio-visual perceptions of commercial interiors. Using eye-tracking and open-ended interviews, the study explored how neurophysiological and phenomenological methods, applied in a complementary way, can serve interior design research on the meaning of gaze behaviors. The study used a convenience sample of 26 participants, three coffee-shop interior images, and two musical stimuli. Essential to the study is the interpretive analysis of corresponding eye-tracking and interview data. The results show that visual perception is affected by auditory stimuli and by other interior elements and factors associated with personal experiences; however, no distinct gaze pattern was identified by the type of auditory stimulus. The fixation patterns reflected participants' perceptions in mixed ways; for example, a single fixation pattern could reflect both likes and dislikes. The findings include six essential meanings of participants' gaze behaviors. The study suggests that auditory and visual stimuli are reciprocal in individuals' perceptions: rather than one simply affecting the other, the interaction between sensory stimuli contributes to the complexity and intensity of the multisensory stimuli that people associate with their experiences and conceptualize through the meanings they establish.
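The abstract mentions fixation patterns but does not describe how the eye-tracking data were summarized. As a rough, generic illustration of how fixations are often aggregated per area of interest (AOI), here is a minimal sketch; the AOI names, rectangle coordinates, and data layout are assumptions for illustration only, not the authors' pipeline.

```python
# Minimal sketch: summarizing fixations per area of interest (AOI).
# The AOI rectangles, names, and data layout are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float         # horizontal gaze position in pixels
    y: float         # vertical gaze position in pixels
    duration: float  # fixation duration in milliseconds

# Hypothetical AOIs for a coffee-shop interior image: (x_min, y_min, x_max, y_max)
AOIS = {
    "counter": (0, 300, 640, 720),
    "signage": (640, 0, 1280, 200),
    "seating": (640, 200, 1280, 720),
}

def summarize_fixations(fixations):
    """Return fixation count and total dwell time (ms) per AOI."""
    summary = {name: {"count": 0, "dwell_ms": 0.0} for name in AOIS}
    for fix in fixations:
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= fix.x < x1 and y0 <= fix.y < y1:
                summary[name]["count"] += 1
                summary[name]["dwell_ms"] += fix.duration
                break
    return summary

if __name__ == "__main__":
    demo = [Fixation(100, 400, 250), Fixation(900, 100, 180), Fixation(700, 500, 320)]
    print(summarize_fixations(demo))
```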
Affiliation(s)
- Jain Kwon, Interior Architecture and Design, Department of Design and Merchandising, Colorado State University, Fort Collins, CO, United States
- Ju Yeon Kim, Interior Architectural Design, School of Architecture, Soongsil University, Seoul, South Korea
2. Di Cicco F, van Zuijlen MJP, Wijntjes MWA, Pont SC. Soft like velvet and shiny like satin: Perceptual material signatures of fabrics depicted in 17th century paintings. J Vis 2021; 21:10. PMID: 33978685; PMCID: PMC8132013; DOI: 10.1167/jov.21.5.10.
Abstract
Dutch 17th century painters were masters at depicting materials and their properties in a convincing way. Here, we studied the perception of the material signatures and key image features of different depicted fabrics, such as satin and velvet. We also tested whether the perception of fabrics depicted in paintings relies on local or global cues by cropping the stimuli. In Experiment 1, roughness, warmth, softness, heaviness, hairiness, and shininess were rated for stimuli shown either full figure or cropped. In the full figure, all attributes except shininess were rated higher for velvet, whereas shininess was rated higher for satin. This distinction was less clear in the cropped condition, and some properties were rated significantly differently between the two conditions. In Experiment 2, we tested whether this difference was due to the choice of the cropped area. On the basis of the results of Experiment 1, shininess and softness were rated for multiple crops from each fabric. Most crops from the same fabric differed significantly in shininess, but not in softness perception. Perceived shininess correlated positively with the mean luminance of the crops and the coverage of highlights. Experiment 1 showed that painted velvet and satin triggered distinct perceptions, indicative of robust material signatures of the two fabrics. The results of Experiment 2 suggest that the presence of local image cues affects the perception of optical properties such as shininess, but not mechanical properties such as softness.
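The abstract relates perceived shininess to two image statistics, mean luminance and highlight coverage. As a rough illustration of how such statistics can be computed for an image crop, here is a minimal sketch; the grayscale conversion weights and the highlight threshold are assumptions, not the authors' exact analysis.

```python
# Minimal sketch: mean luminance and highlight coverage of an image crop.
# The luma weights and the fixed highlight threshold are illustrative assumptions.

import numpy as np

def luminance_stats(rgb_crop, highlight_thresh=0.95):
    """rgb_crop: H x W x 3 array with values in [0, 1]."""
    # Rec. 709 luma weights as a simple luminance approximation
    luma = rgb_crop @ np.array([0.2126, 0.7152, 0.0722])
    mean_luminance = float(luma.mean())
    # Highlight coverage: fraction of pixels brighter than the threshold
    highlight_coverage = float((luma > highlight_thresh).mean())
    return mean_luminance, highlight_coverage

if __name__ == "__main__":
    crop = np.random.rand(128, 128, 3)  # stand-in for a painting crop
    print(luminance_stats(crop))
```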
Affiliation(s)
- Francesca Di Cicco, Perceptual Intelligence Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Delft, The Netherlands
- Mitchell J P van Zuijlen, Perceptual Intelligence Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Delft, The Netherlands
- Maarten W A Wijntjes, Perceptual Intelligence Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Delft, The Netherlands
- Sylvia C Pont, Perceptual Intelligence Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Delft, The Netherlands
3. Badde S, Navarro KT, Landy MS. Modality-specific attention attenuates visual-tactile integration and recalibration effects by reducing prior expectations of a common source for vision and touch. Cognition 2020; 197:104170. PMID: 32036027; DOI: 10.1016/j.cognition.2019.104170.
Abstract
At any moment in time, streams of information reach the brain through the different senses. Given this wealth of noisy information, it is essential that we select information of relevance - a function fulfilled by attention - and infer its causal structure to eventually take advantage of redundancies across the senses. Yet, the role of selective attention during causal inference in cross-modal perception is unknown. We tested experimentally whether the distribution of attention across vision and touch enhances cross-modal spatial integration (visual-tactile ventriloquism effect, Expt. 1) and recalibration (visual-tactile ventriloquism aftereffect, Expt. 2) compared to modality-specific attention, and then used causal-inference modeling to isolate the mechanisms behind the attentional modulation. In both experiments, we found stronger effects of vision on touch under distributed than under modality-specific attention. Model comparison confirmed that participants used Bayes-optimal causal inference to localize visual and tactile stimuli presented as part of a visual-tactile stimulus pair, whereas simultaneously collected unity judgments - indicating whether the visual-tactile pair was perceived as spatially-aligned - relied on a sub-optimal heuristic. The best-fitting model revealed that attention modulated sensory and cognitive components of causal inference. First, distributed attention led to an increase of sensory noise compared to selective attention toward one modality. Second, attending to both modalities strengthened the stimulus-independent expectation that the two signals belong together, the prior probability of a common source for vision and touch. Yet, only the increase in the expectation of vision and touch sharing a common source was able to explain the observed enhancement of visual-tactile integration and recalibration effects with distributed attention. In contrast, the change in sensory noise explained only a fraction of the observed enhancements, as its consequences vary with the overall level of noise and stimulus congruency. Increased sensory noise leads to enhanced integration effects for visual-tactile pairs with a large spatial discrepancy, but reduced integration effects for stimuli with a small or no cross-modal discrepancy. In sum, our study indicates a weak a priori association between visual and tactile spatial signals that can be strengthened by distributing attention across both modalities.
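The abstract states that localization followed Bayes-optimal causal inference, with attention modulating sensory noise and the prior probability of a common source. As a minimal sketch of the standard causal-inference formulation with Gaussian noise and model averaging (e.g., Körding et al., 2007), not the authors' fitted model, the example below shows how these two parameters enter the computation; all numerical values are illustrative assumptions.

```python
# Minimal sketch of Bayesian causal inference for a visual-tactile location pair.
# Standard Gaussian formulation with model averaging; parameter values are
# illustrative assumptions, not fitted values from the paper.

import numpy as np

def causal_inference_estimate(x_v, x_t, sigma_v, sigma_t, sigma_p, p_common):
    """Tactile location estimate given noisy visual (x_v) and tactile (x_t)
    measurements, sensory noise SDs, a zero-mean spatial prior with SD sigma_p,
    and the prior probability of a common source (p_common)."""
    var_v, var_t, var_p = sigma_v**2, sigma_t**2, sigma_p**2

    # Likelihood of the measurement pair under a common source (C = 1)
    var_sum_c1 = var_v * var_t + var_v * var_p + var_t * var_p
    like_c1 = np.exp(-0.5 * ((x_v - x_t)**2 * var_p + x_v**2 * var_t + x_t**2 * var_v)
                     / var_sum_c1) / (2 * np.pi * np.sqrt(var_sum_c1))

    # Likelihood under independent sources (C = 2)
    like_c2 = (np.exp(-0.5 * x_v**2 / (var_v + var_p)) / np.sqrt(2 * np.pi * (var_v + var_p))
               * np.exp(-0.5 * x_t**2 / (var_t + var_p)) / np.sqrt(2 * np.pi * (var_t + var_p)))

    # Posterior probability of a common source
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted location estimates under each causal structure
    s_c1 = (x_v / var_v + x_t / var_t) / (1 / var_v + 1 / var_t + 1 / var_p)
    s_c2 = (x_t / var_t) / (1 / var_t + 1 / var_p)

    # Model averaging: weight the estimates by the posterior over causal structures
    return post_c1 * s_c1 + (1 - post_c1) * s_c2

if __name__ == "__main__":
    print(causal_inference_estimate(x_v=2.0, x_t=0.0, sigma_v=1.0, sigma_t=2.0,
                                    sigma_p=10.0, p_common=0.5))
```

Raising p_common in this sketch pulls the tactile estimate toward the visual measurement, which is the mechanism the abstract identifies as driving the enhanced ventriloquism effect under distributed attention.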
Affiliation(s)
- Stephanie Badde, Department of Psychology and Center of Neural Science, New York University, 6 Washington Place, New York, NY 10003, USA
- Karen T Navarro, Department of Psychology, University of Minnesota, 75 E River Rd., Minneapolis, MN 55455, USA
- Michael S Landy, Department of Psychology and Center of Neural Science, New York University, 6 Washington Place, New York, NY 10003, USA
4. Takahashi C, Watt SJ. Optimal visual-haptic integration with articulated tools. Exp Brain Res 2017; 235:1361-1373. PMID: 28214998; PMCID: PMC5380699; DOI: 10.1007/s00221-017-4896-5.
Abstract
When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc.) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye), and therefore expressed in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world (seeing and feeling the same thing) and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
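The benchmark the abstract refers to, a maximum-likelihood integrator, predicts a reliability-weighted average of the visual and haptic size estimates and a combined precision better than either cue alone. Here is a minimal sketch of that standard prediction, assuming independent Gaussian noise on each cue; the numbers are illustrative, not data from the paper.

```python
# Minimal sketch: maximum-likelihood (reliability-weighted) visual-haptic integration.
# Assumes independent Gaussian noise on the two size estimates; values are illustrative.

def mle_integration(size_v, size_h, sigma_v, sigma_h):
    """Return the combined size estimate and its predicted standard deviation."""
    w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2)   # weight on vision
    w_h = 1.0 - w_v                                # weight on haptics
    combined = w_v * size_v + w_h * size_h
    # Predicted SD of the combined estimate is lower than either cue alone
    sigma_vh = (sigma_v**2 * sigma_h**2 / (sigma_v**2 + sigma_h**2)) ** 0.5
    return combined, sigma_vh

if __name__ == "__main__":
    # e.g., vision suggests 52 mm (SD 4 mm), haptics suggests 48 mm (SD 6 mm)
    print(mle_integration(52.0, 48.0, 4.0, 6.0))
```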
Affiliation(s)
- Chie Takahashi, School of Computer Science, University of Birmingham, Birmingham, UK
- Simon J Watt, Wolfson Centre for Cognitive Neuroscience, School of Psychology, Bangor University, Penrallt Rd., Bangor, Gwynedd, LL57 2AS, UK
5. Mental Reactivation and Pleasantness Judgment of Experience Related to Vision, Hearing, Skin Sensations, Taste and Olfaction. PLoS One 2016; 11:e0159036. PMID: 27400090; PMCID: PMC4939968; DOI: 10.1371/journal.pone.0159036.
Abstract
Language acquisition is based on our knowledge about the world and forms through multiple sensory-motor interactions with the environment. We link the properties of individual experience formed at different stages of ontogeny with the phased development of sensory modalities and with the acquisition of words describing the corresponding forms of sensitivity. To test whether early-formed experience related to skin sensations, olfaction and taste differs from later-formed experience related to vision and hearing, we asked Russian-speaking participants to categorize, or to assess the pleasantness of, experience mentally reactivated by sense-related adjectives found in common dictionaries. Categorizing adjectives related to vision, hearing and skin sensations took longer than categorizing adjectives related to olfaction and taste. In addition, experience described by adjectives predominantly related to vision, hearing and skin sensations took more time for the pleasantness judgment and generated less intense emotions than experience described by adjectives predominantly related to olfaction and taste. Interestingly, the dynamics of skin resistance corresponded to the intensity and pleasantness of reported emotions. We also found that sense-related experience described by early-acquired adjectives took less time for the pleasantness judgment and generated more intense and more positive emotions than experience described by later-acquired adjectives. Correlations were found between the time of the pleasantness judgment, the intensity and pleasantness of reported emotions, and the age of acquisition, frequency, imageability and length of sense-related adjectives. All in all, these findings support the hypothesis that early-formed experience is less differentiated than later-formed experience.
6. Adams WJ, Kerrigan IS, Graf EW. Touch influences perceived gloss. Sci Rep 2016; 6:21866. PMID: 26915492; PMCID: PMC4768155; DOI: 10.1038/srep21866.
Abstract
Identifying an object's material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Visual cues to material qualities such as gloss have recently received attention, but how they interact with haptic (touch) information has been largely overlooked. Here, we show that touch modulates gloss perception: objects that feel slippery are perceived as glossier (more shiny). Participants explored virtual objects that varied in look and feel. A discrimination paradigm (Experiment 1) revealed that observers integrate visual gloss with haptic information. Observers could easily detect an increase in glossiness when it was paired with a decrease in friction. In contrast, increased glossiness coupled with decreased slipperiness produced a small perceptual change: the visual and haptic changes counteracted each other. Subjective ratings (Experiment 2) reflected a similar interaction - slippery objects were rated as glossier and vice versa. The sensory system treats visual gloss and haptic friction as correlated cues to surface material. Although friction is not a perfect predictor of gloss, the visual system appears to know and use a probabilistic relationship between these variables to bias perception - a sensible strategy given the ambiguity of visual cues to gloss.
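The abstract describes gloss and friction being treated as correlated cues that bias perception. As a rough illustration of how a negatively correlated Gaussian prior over gloss and friction can shift a gloss estimate when the haptic signal changes, here is a minimal sketch; the linear-Gaussian model form, the correlation value, and all numbers are assumptions, not the authors' analysis.

```python
# Minimal sketch: biasing a gloss estimate via a correlated gloss-friction prior.
# Linear-Gaussian model with a bivariate prior over (gloss, friction) that is
# negatively correlated (glossier surfaces tend to feel more slippery, i.e.,
# lower friction). All parameter values are illustrative assumptions.

import numpy as np

def posterior_gloss(x_gloss, x_friction, sigma_gloss=1.0, sigma_friction=1.0):
    """Posterior mean of gloss given noisy visual-gloss and haptic-friction cues."""
    mu0 = np.array([0.0, 0.0])                    # prior means (arbitrary units)
    prior_cov = np.array([[1.0, -0.7],            # assumed negative correlation
                          [-0.7, 1.0]])
    obs_cov = np.diag([sigma_gloss**2, sigma_friction**2])

    post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + np.linalg.inv(obs_cov))
    post_mean = post_cov @ (np.linalg.inv(prior_cov) @ mu0
                            + np.linalg.inv(obs_cov) @ np.array([x_gloss, x_friction]))
    return post_mean[0]   # estimated gloss

if __name__ == "__main__":
    # Same visual gloss signal; a slipperier feel (lower friction reading)
    # yields a higher gloss estimate under the correlated prior.
    print(posterior_gloss(x_gloss=0.5, x_friction=0.5))
    print(posterior_gloss(x_gloss=0.5, x_friction=-0.5))
```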
Affiliation(s)
- Wendy J Adams, Psychology, University of Southampton, Southampton, SO17 1BJ, England
- Iona S Kerrigan, Psychology, University of Southampton, Southampton, SO17 1BJ, England
- Erich W Graf, Psychology, University of Southampton, Southampton, SO17 1BJ, England
7. Shi Z, Müller HJ. Multisensory perception and action: development, decision-making, and neural mechanisms. Front Integr Neurosci 2013; 7:81. PMID: 24319414; PMCID: PMC3836185; DOI: 10.3389/fnint.2013.00081.
Affiliation(s)
- Zhuanghua Shi, Department of Psychology, Experimental Psychology, Ludwig-Maximilians-Universität München, Munich, Germany
8. Grasping with the eyes of your hands: Hapsis and vision modulate hand preference. Exp Brain Res 2013; 232:385-393. PMID: 24162864; DOI: 10.1007/s00221-013-3746-3.