Abstract
Stuckenberg, M. V., Schröger, E., & Widmann, A. (2019). Presentation Probability of Visual-Auditory Pairs Modulates Visually Induced Auditory Predictions. Journal of Cognitive Neuroscience, 31(8), 1-16.
Predictions about forthcoming auditory events can be established on the basis of preceding visual information. Sounds that are incongruent with predictive visual information have been found to elicit an enhanced negative ERP in the latency range of the auditory N1 compared with physically identical sounds preceded by congruent visual information. This so-called incongruency response (IR) is interpreted as reflecting a reduced prediction error for predicted sounds at the sensory level. The main purpose of this study was to examine the impact of probability manipulations on the IR. We manipulated the probability with which particular congruent visual-auditory pairs were presented (83/17 vs. 50/50 condition), yielding two conditions with different strengths of association between visual and auditory information. A visual cue was presented either above or below a fixation cross and was followed by either a high- or low-pitched sound. In 90% of trials, the visual cue correctly predicted the subsequent sound. In one condition, one of the sounds was presented more frequently (83% of trials), whereas in the other condition both sounds were presented with equal probability (50% of trials each). In the 83/17 condition, therefore, one congruent combination of visual cue and corresponding sound was presented more frequently than the other combinations, presumably leading to a stronger visual-auditory association. A significant IR for unpredicted compared with predicted but otherwise identical sounds was observed only in the 83/17 condition, not in the 50/50 condition, in which both congruent visual cue-sound combinations were presented with equal probability. We also tested whether the processing of the prediction violation depends on the task relevance of the visual information by contrasting a visual-auditory matching task with a pitch discrimination task. The task affected behavioral performance but not the prediction error signals. These results suggest that the generation of visual-to-auditory sensory predictions is facilitated by a strong association between the visual cue and the predicted sound (83/17 condition) but is not influenced by the task relevance of the visual information.