Keitel, C., Schröger, E., Saupe, K., & Müller, M. M. (2011). Sustained selective intermodal attention modulates processing of language-like stimuli. Experimental Brain Research, 213, 221-227.

Sustained selective intermodal attention modulates processing of language-like stimuli

Keitel, C., Schröger, E., Saupe, K., & Müller, M. M.

Intermodal attention (IA) is assumed to allocate limited neural processing resources to input from one specific sensory modality. We investigated effects of sustained IA on the amplitudes of a 40-Hz auditory steady-state response (ASSR) and a 4.3-Hz visual steady-state response (VSSR). To this end, we concurrently presented amplitude-modulated multi-speech babble and a stream of nonsense letter sets to elicit the respective brain responses. Subjects were cued on a trial-by-trial basis to selectively attend to one of the streams for several seconds, during which they had to perform a lexical decision task on occasionally occurring words and pseudowords. Attention to the auditory stream led to greater ASSR amplitudes than attention to the visual stream. Conversely, the VSSR amplitude was greater when the visual stream was attended. We demonstrate that IA research by means of frequency tagging can be extended to complex stimuli such as those used in the current study. Furthermore, we show not only that IA selectively modulates processing of concurrent multisensory input but also that this modulation occurs under trial-by-trial cueing of IA. Frequency tagging may therefore be suitable for studying the role of IA in more naturalistic setups that comprise a larger number of multisensory signals.
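
The frequency-tagging logic behind these measurements can be illustrated with a short analysis sketch: each stimulus stream is modulated at its own frequency (40 Hz auditory, 4.3 Hz visual), and the corresponding steady-state response amplitude is read out of the EEG spectrum at exactly that frequency. The sketch below uses only NumPy on a synthetic single-channel epoch; the sampling rate, epoch duration, and signal amplitudes are illustrative assumptions, not values reported in the study.

```python
# Minimal sketch of frequency-tagged amplitude extraction (illustrative only;
# sampling rate, epoch length, and the synthetic signal are assumptions,
# not parameters from Keitel et al., 2011).
import numpy as np

fs = 500.0                       # assumed EEG sampling rate in Hz
t = np.arange(0, 10.0, 1.0 / fs) # one 10-s epoch -> 0.1-Hz frequency resolution

# Synthetic epoch: a 40-Hz ASSR and a 4.3-Hz VSSR buried in broadband noise.
eeg = (0.5 * np.sin(2 * np.pi * 40.0 * t)
       + 0.8 * np.sin(2 * np.pi * 4.3 * t)
       + np.random.randn(t.size))

# Amplitude spectrum via FFT; with 0.1-Hz resolution both tagging
# frequencies (40 Hz and 4.3 Hz) fall on exact spectral bins.
spectrum = np.abs(np.fft.rfft(eeg)) / t.size * 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

def amplitude_at(f_target):
    """Return the spectral amplitude at the bin closest to f_target."""
    return spectrum[np.argmin(np.abs(freqs - f_target))]

print(f"40.0-Hz ASSR amplitude: {amplitude_at(40.0):.2f}")
print(f"4.3-Hz  VSSR amplitude: {amplitude_at(4.3):.2f}")
```

In an attention experiment of this kind, such amplitudes would be compared across attend-auditory and attend-visual trials; larger values at the tagged frequency of the attended stream index the attentional modulation described in the abstract.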