Audio-visual integration in noise: Influence of auditory and visual stimulus degradation on eye movements and perception of the McGurk effect
Jemaine E. Stacey 1,2 · Christina J. Howard 1 · Suvobrata Mitra 1 · Paula C. Stacey 1
© The Author(s) 2020
Abstract

Seeing a talker’s face can aid audiovisual (AV) integration when speech is presented in noise. However, few studies have simultaneously manipulated auditory and visual degradation. We aimed to establish how degrading the auditory and visual signals affected AV integration. Where people look on the face in this context is also of interest; Buchan, Paré and Munhall (Brain Research, 1242, 162–171, 2008) found that fixations on the mouth increased in the presence of auditory noise, whilst Wilson, Alsius, Paré and Munhall (Journal of Speech, Language, and Hearing Research, 59(4), 601–615, 2016) found that mouth fixations decreased with decreasing visual resolution. In Condition 1, participants listened to clear speech, and in Condition 2, participants listened to vocoded speech designed to simulate the information provided by a cochlear implant. Speech was presented in three levels of auditory noise and three levels of visual blurring. Adding noise to the auditory signal increased McGurk responses, while blurring the visual signal decreased McGurk responses. Participants fixated the mouth more on trials in which the McGurk effect was perceived. Adding auditory noise led people to fixate the mouth more, while visual degradation led people to fixate the mouth less. Combined, the results suggest that modality preference, and where people look during AV integration of incongruent syllables, vary according to the quality of the information available.

Keywords McGurk effect · Eye movements · Multisensory perception · Audio-visual integration
Electronic supplementary material The online version of this article (https://doi.org/10.3758/s13414-020-02042-x) contains supplementary material, which is available to authorized users.

* Jemaine E. Stacey
[email protected]

1 Department of Psychology, Nottingham Trent University, Nottingham NG1 4BU, UK

2 National Institute for Health Research, Nottingham Biomedical Research Centre, Nottingham NG1 5DU, UK

Introduction

In our everyday environment we are bombarded with information from our senses; multisensory integration is essential for helping to consolidate information and make sense of the world. Multisensory information is often complementary; for example, to understand the person speaking during a conversation, the auditory element (the voice of the speaker) and the visual element (the face of the speaker) are combined into a single percept. It has been suggested that this occurs because sensory pathways in the brain are cross-modal, meaning they can be influenced by other modalities (Shimojo & Shams, 2001). This idea is underpinned in part by evidence from audiovisual perceptual illusions that arise when synchronized, incongruent information is presented in the auditory and visual modalities. Research has shown that auditory