Neuroimaging studies of written and spoken sentence processing report greater left hemisphere than right hemisphere activation. However, a large majority of our experience with language is face-to-face interaction, which is much richer in information. The current study examines the neural organization of audio-visual (AV) sentence processing using functional magnetic resonance imaging (fMRI) at 4 Tesla. Participants viewed the face and upper body of a speaker via a video screen while listening to her produce, in alternating blocks, English sentences and sentences composed of pronounceable non-words. Audio-visual sentence processing was associated with activation in the left hemisphere in Broca's area, dorsolateral prefrontal cortex, the superior precentral sulcus, anterior and middle portions of the lateral sulcus, middle superior portions of the temporal sulcus, supramarginal gyrus and angular gyrus. Further, AV sentence processing elicited activation in the right anterior and middle lateral sulcus. Between-hemisphere analyses revealed a left hemisphere dominant pattern of activation. The findings support the hypothesis that the left hemisphere may be biased to process language independently of the modality through which it is perceived. These results are discussed in the context of previous neuroimaging results using American Sign Language (ASL).

Original publication

DOI: 10.1016/j.cogbrainres.2003.10.014
Type: Journal article
Journal: Brain Res Cogn Brain Res
Publication Date: 07/2004
Volume: 20
Pages: 111-119
Keywords: Adult, Auditory Cortex, Auditory Perception, Brain Mapping, Dominance, Cerebral, Female, Humans, Magnetic Resonance Imaging, Male, Sign Language, Visual Cortex, Visual Perception