RESEARCH ARTICLE
 
Silent articulation modulates auditory and audiovisual speech perception

Marc Sato · Emilie Troille · Lucie Ménard · Marie-Agnès Cathiard · Vincent Gracco

Received: 21 November 2012 / Accepted: 3 April 2013 / Published online: 17 April 2013
© Springer-Verlag Berlin Heidelberg 2013

Abstract  The concept of an internal forward model that internally simulates the sensory consequences of an action is a central idea in speech motor control. Consistent with this hypothesis, silent articulation has been shown to modulate activity of the auditory cortex and to improve the auditory identification of concordant speech sounds embedded in white noise. In the present study, we replicated and extended this behavioral finding by showing that silently articulating a syllable in synchrony with the presentation of a concordant auditory and/or visually ambiguous speech stimulus improves its identification. Our results further demonstrate that, even in the case of perfect perceptual identification, concurrent mouthing of a syllable speeds up the perceptual processing of a concordant speech stimulus. These results reflect multisensory-motor interactions during speech perception and provide new behavioral arguments for internally generated sensory predictions during silent speech production.

Keywords  Speech perception · Speech production · Silent speech · Audiovisual speech perception · Internal forward models · Sensory-motor interactions · Efference copy · McGurk effect

Aspects of this work were presented at the 2008 International Conference on Auditory-Visual Speech Processing, Tangalooma, Australia.

M. Sato (*) · E. Troille: GIPSA-LAB, UMR CNRS 5216, Département Parole et Cognition, Grenoble Université, 1180, Avenue centrale, BP 25, 38040 Grenoble Cedex 9, France. e-mail: [email protected]
M. Sato · L. Ménard · V. Gracco: Centre for Research on Brain, Language and Music, McGill University, Montreal, Canada
E. Troille · M.-A. Cathiard: Centre de Recherche sur l'Imaginaire, Université Stendhal, Grenoble, France
L. Ménard: Département de Linguistique, Université du Québec à Montréal, Montréal, Canada
V. Gracco: School of Communication Sciences and Disorders, McGill University, Montreal, Canada
V. Gracco: Haskins Laboratories, New Haven, CT, USA
 
Introduction

Speech production is a complex multistage process that converts an intended linguistic message, through specific articulatory movements, into an acoustic speech signal that can be perceived and understood by a listener (Levelt 1989). Beyond higher-order linguistic conceptualization of the intended message, speech production requires phonemic encoding of the articulatory plans, as well as the initiation and coordination of sequences of movements produced by the combined actions of the respiratory system, the larynx and the supra-laryngeal vocal tract. Online auditory and somatosensory feedback control mechanisms also play a key role in speech production. During the phonemic encoding stage of the intended linguistic message, it is proposed that segmental speech mov…