Is visual information represented in auditory sensory memory?

Besle, J., and Giard, M.-H.
Unité 280, Institut National de la Santé et de la Recherche Médicale, Lyon, France
E-mail: besle@lyon.inserm.fr

The Mismatch Negativity (MMN) is elicited when an incoming sound is detected as deviating from a neural representation of acoustic regularities in auditory cortex; it can thus be used as a probe of the representation of sounds in transient auditory sensory memory (ASM). However, several studies have shown that the auditory MMN may be sensitive to visual information: an auditory-like MMN can be elicited by an illusory auditory phonetic deviation produced by infrequent mismatching lip movements (McGurk effect), even though the physical auditory signal does not differ between deviants and standards.

The question then arises whether this visual influence on the auditory representation in ASM is speech-specific, or can also be observed with non-speech audio-visual events. Indeed, there is growing evidence for cross-modal interactions at early sensory processing stages, suggesting that auditory and visual information may be at least partially integrated before the MMN process occurs.

To test this hypothesis, we compared the MMNs (amplitude, latency and topography) elicited by stimuli deviating from audio-visual (AV) standards on the visual (AV'), the auditory (A'V), or both (A'V') dimensions. Auditory-only (A) and visual-only (V) oddball paradigms were conducted as additional controls.
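For illustration only, the structure of these five oddball conditions can be sketched in a few lines of code; the deviant probability, trial count, and constraint against consecutive deviants below are assumptions of the sketch, not parameters reported here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each condition pairs one standard stimulus with one deviant type.
CONDITIONS = {
    "AV'":  ("AV", "AV'"),   # deviance on the visual dimension only
    "A'V":  ("AV", "A'V"),   # deviance on the auditory dimension only
    "A'V'": ("AV", "A'V'"),  # deviance on both dimensions
    "A":    ("A",  "A'"),    # auditory-only control
    "V":    ("V",  "V'"),    # visual-only control
}

def oddball_sequence(standard, deviant, n_trials=600, p_deviant=0.15):
    """Pseudo-random oddball sequence, never presenting two deviants in a row."""
    seq, prev_was_deviant = [], False
    for _ in range(n_trials):
        is_deviant = (not prev_was_deviant) and rng.random() < p_deviant
        seq.append(deviant if is_deviant else standard)
        prev_was_deviant = is_deviant
    return seq

for name, (std, dev) in CONDITIONS.items():
    seq = oddball_sequence(std, dev)
    print(f"{name:>5}: {seq.count(dev):3d} deviants out of {len(seq)} trials")
```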

The main hypothesis is that if the ASM trace includes the visual information of a bimodal stimulus, then (1) an AV'-MMN should exist and (2) the A'V'-MMN and the A'V-MMN should have different characteristics. Further comparisons between the different audiovisual and sensory-specific MMNs should allow us to characterize the possible bimodal nature of the neural representations stored in ASM and involved in the MMN process.
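As a minimal sketch of how these two predictions could be evaluated from the recorded responses, assuming epoched EEG arrays per stimulus type (the array layout, channel choice and analysis window are illustrative, not the analysis pipeline used here):

```python
import numpy as np

def difference_wave(deviant_epochs, standard_epochs):
    """MMN estimate: mean deviant ERP minus mean standard ERP.

    Both inputs are arrays of shape (n_trials, n_channels, n_times)."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

def mmn_peak(mmn, times, channel, t_min=0.10, t_max=0.25):
    """Most negative amplitude and its latency within the analysis window."""
    window = (times >= t_min) & (times <= t_max)
    trace = mmn[channel, window]
    i = trace.argmin()
    return trace[i], times[window][i]

# e.g., with a dict of epoched data per stimulus type:
#   mmn_av_prime       = difference_wave(epochs["AV'"],  epochs["AV"])   # prediction 1
#   mmn_a_prime_v      = difference_wave(epochs["A'V"],  epochs["AV"])
#   mmn_a_prime_vprime = difference_wave(epochs["A'V'"], epochs["AV"])   # prediction 2
# Prediction 1 holds if mmn_av_prime shows a reliable negativity; prediction 2
# holds if mmn_a_prime_vprime and mmn_a_prime_v differ in amplitude, latency
# (mmn_peak) or scalp topography.
```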