Understanding one another often requires more than identifying words in order. In conversation, we draw on cultural knowledge, social interactional cues, and gestures to help get our point across. Perhaps because combinatorial syntax is a unique feature of human communication systems, most electrophysiological investigations of language comprehension have addressed how the brain responds to syntactic and semantic information. However, as noted above, successful communication relies to a considerable extent on extra-linguistic information. In this talk, we describe a number of event-related brain potential (ERP) studies that address the processing of these sorts of communicative cues. We consider how language users integrate linguistic information with background knowledge about the physical and social world, how listeners process co-speech gestures, and how "back channel" information in speech disfluencies affects language comprehension.