Elsevier

Brain Research Bulletin

Volume 75, Issue 5, 28 March 2008, Pages 513-525
Research report
Tactile–visual integration in the posterior parietal cortex: A functional magnetic resonance imaging study

https://doi.org/10.1016/j.brainresbull.2007.09.004

Abstract

To explore the neural substrates of visual–tactile crossmodal integration during motion direction discrimination, we conducted functional magnetic resonance imaging with 15 subjects. We initially performed independent unimodal visual and tactile experiments involving motion direction matching tasks. Visual motion discrimination activated the occipital cortex bilaterally, extending to the posterior portion of the superior parietal lobule, and the dorsal and ventral premotor cortex. Tactile motion direction discrimination activated the bilateral parieto-premotor cortices. The left superior parietal lobule, intraparietal sulcus, bilateral premotor cortices and right cerebellum were activated during both visual and tactile motion discrimination. Tactile discrimination deactivated the visual cortex, including the middle temporal/V5 area. To identify crossmodal interference of neural activity in both the unimodal and the multimodal areas, the same subjects also performed tactile and visual crossmodal experiments with event-related designs, carrying out crossmodal tactile–visual tasks and intramodal tactile–tactile and visual–visual matching tasks within the same session. The activity detected during intramodal tasks in the visual regions (including the middle temporal/V5 area) and the tactile regions was suppressed under crossmodal conditions compared with intramodal conditions. Within the polymodal areas, the left superior parietal lobule and the premotor areas were activated by crossmodal tasks. The left superior parietal lobule was more prominently activated under congruent event conditions than under incongruent conditions. These findings suggest that a reciprocal and competitive association between the unimodal and polymodal areas underlies the interaction between motion direction-related signals received simultaneously from different sensory modalities.

Introduction

Visual motion has been shown to strongly influence tactile motion judgments [5], [17]. When visual motion was presented simultaneously with, but in the opposite direction to, tactile motion, the accuracy of the tactile motion judgments was substantially reduced [17]. This decline in performance was observed whether the visual display was placed near to or at a distance from the tactile display, although the extent of the effect decreased as the degree of misalignment increased [17]. The substantial effect of visual motion, which depended on the relative direction of the two motions, went beyond a general visual motion effect [17]. This direction specificity means that a general perceptual conflict is unlikely to be the cause of the interference, and instead suggests that crossmodal interaction occurs during motion direction judgment [17]. However, the neural substrates of this interaction remain largely unknown [5].

Crossmodal motion direction discrimination requires both the coding of motion in the two sensory modalities and a decision stage that compares the two motion direction signals. Crossmodal interference could thus occur at either stage. One candidate locus for the integration of visual and tactile motion information during the coding stage is the human middle temporal (MT)/V5 area [5]. According to visual mediation heuristics [47], tactile inputs are translated into their corresponding visual representations (visually based imagery), which are further processed by the visual system. Recent neuroimaging studies have reported that tactile motion perception tasks activate part of the MT/V5 area [6], [34], independent of imagery of visual motion [4], [60]. Thus, the activity observed in the MT/V5 area during tactile stimulation might reflect either bottom-up sensory input or a top-down cognitive strategy (such as imagery). If the MT/V5 area is involved in tactile motion perception, it might be the site of the interaction between the visual and tactile modalities [5].

The integration of visual and tactile motion might also involve multisensory areas at the decision stage, because human spatial perception is highly integrated across modalities [46]. The ventral intraparietal sulcus (IPS) of non-human primates contains neurons that respond to both visual and tactile motion stimuli [20]. The caudal pole of the superior parietal lobule of non-human primates is currently considered to be a key region in the dorsal stream of signals linking somatosensory and visual input to the motor commands driving body movements [12], [74]. A recent functional magnetic resonance imaging (fMRI) study revealed that, in humans, there is a parietal face area containing head-centred visual and tactile maps that are aligned with one another [71]. Hence, an alternative candidate area for crossmodal integration is the multisensory posterior parietal cortex.

To explore these alternatives, we conducted an fMRI experiment. Our hypothesis was that the spatial analysis of the direction of movement via visual and tactile modalities activates both sensory-specific and multisensory areas. The former represent the modality-specific coding stage, while the latter include the neural substrates of the decision process that requires the comparison of the two signals coming from different modalities (i.e., crossmodal integration).

Brain areas participating in crossmodal integration should show signs of convergence and interaction [59]. We initially performed independent tactile and visual unimodal experiments involving motion direction matching tasks, in order to define the common multimodal areas that are activated during each of the independent tactile and visual tasks (i.e., convergence). We then carried out tactile and visual crossmodal experiments with event-related designs, in order to identify the areas in which the crossmodal response was enhanced [67], by comparing stimuli whose directions of motion were congruent and incongruent (i.e., interaction). Previous studies have suggested that the effect of crossmodal interaction is subtle [3]. We therefore restricted the search volume by first defining the polymodal areas (convergence), within which we then searched for the interaction effect.
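The convergence step described above amounts to a conjunction of thresholded activation maps: a voxel counts as polymodal only if it exceeds threshold in both unimodal maps. A minimal sketch of this logic, using toy NumPy arrays and a hypothetical threshold value (not the study's actual statistical maps or threshold), might look like:

```python
import numpy as np

def conjunction_mask(t_map_a, t_map_b, threshold):
    """Voxels significant in BOTH unimodal t-maps (logical-AND conjunction)."""
    return (t_map_a > threshold) & (t_map_b > threshold)

# Toy three-voxel example: only the middle voxel exceeds
# the (hypothetical) threshold in both modalities.
tactile_t = np.array([1.2, 4.0, 3.5])
visual_t = np.array([3.8, 4.2, 1.0])
mask = conjunction_mask(tactile_t, visual_t, threshold=3.0)
# mask is [False, True, False]: only the middle voxel is "polymodal"
```

In practice the t-maps would come from whole-brain SPM analyses and the threshold would be corrected for multiple comparisons; the sketch only illustrates the logical-AND structure of the convergence criterion.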

Semantically congruent and/or spatially coincident multisensory inputs in close temporal proximity lead to behavioural response enhancement, resulting in lower thresholds and reduced reaction times compared with unimodal stimuli [25], [41], [55]. By contrast, incongruent inputs slow response times and produce anomalous perceptions [52], [70], [77], [84]. These enhancements and reductions in behavioural responses are thought to be due to crossmodal integration. Additionally, the response properties of multisensory cells in non-human primates seem to reflect this pattern of crossmodal behavioural enhancement and reduction [54], [76], [77], [82]. Calvert et al. [13] postulated that response enhancement and depression are the hallmarks of intersensory interactions in humans. Thus, it should be possible to depict the neural substrates of crossmodal interaction by comparing congruent and incongruent sensory conditions [67]. This approach also allows us to subtract out the effects of attention, of ignoring a modality, and of differences between the cues [67]. Hence, the visuo-tactile crossmodal interaction we investigated should be considered a bottom-up process.

Section snippets

Subjects

In total, 15 healthy volunteers (seven men and eight women; mean age ± standard deviation [S.D.] = 27.9 ± 6.7 years) participated in this study. Among these subjects, 14 were right-handed and one was left-handed according to the Edinburgh handedness inventory [56]. None of the participants had a history of neurological or psychiatric illness. The protocol was approved by the Ethical Committee of the National Institute for Physiological Sciences, Japan. The experiments were undertaken in compliance

Unimodal block design experiment

In the unimodal experiments, the mean ± S.D. percentages of correct responses were 92.0 ± 5.5% for the T-task and 94.3 ± 7.8% for the V-task, which did not significantly differ (P = 0.09, paired t-test).
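The comparison above is a standard paired t-test on per-subject accuracies, since each subject performed both tasks. A minimal sketch with SciPy, using made-up accuracy values for five hypothetical subjects rather than the study's actual data:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-subject accuracies (%) for illustration only;
# the study's real data (T: 92.0 +/- 5.5%, V: 94.3 +/- 7.8%, n = 15)
# are not reproduced here.
t_task = np.array([92.0, 88.0, 95.0, 90.0, 96.0])
v_task = np.array([94.0, 87.0, 96.0, 93.0, 94.0])

# Paired test: the same subjects performed both tasks,
# so the per-subject differences are what is tested.
t_stat, p_value = ttest_rel(t_task, v_task)
```

With these toy values the difference is not significant, mirroring the null result reported above (P = 0.09); a paired design is appropriate here because within-subject differences remove between-subject variability in overall accuracy.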

Tactile motion direction discrimination activated the bilateral inferior parietal lobule (LPi), LPs, secondary somatosensory area (SII), dorsal premotor cortex (PMd), ventral premotor cortex (PMv), inferior frontal gyrus (GFi), insula, putamen, left primary sensorimotor area (SM1), postcentral gyrus

Task performance

The crossmodal experiment did not reveal a congruency effect during either the crossmodal or intramodal conditions. This might have been due to a ceiling effect, as the performance levels were relatively high. Performance in the TT condition was significantly worse than that during the VV or TV conditions. During this task, tactile stimuli were presented simultaneously, but independently, to the index and middle fingers. This might have made performance in the TT condition more difficult than

References (85)

  • T. Hanakawa et al.

    Differential activity in the premotor cortex subdivisions in humans during mental calculation and verbal rehearsal tasks: a functional magnetic resonance imaging study

    Neurosci. Lett.

    (2003)
  • R.N. Henson et al.

    Detecting latency differences in event-related BOLD responses: application to words versus nonwords and initial versus repeated face presentations

    Neuroimage

    (2002)
  • R. Kitada et al.

    Tactile estimation of the roughness of gratings yields a graded response in the human brain: an fMRI study

    Neuroimage

    (2005)
  • A. Mechelli et al.

    Estimating efficiency a priori: a comparison of blocked and randomized designs

    Neuroimage

    (2003)
  • R.C. Oldfield

    The assessment and analysis of handedness: the Edinburgh inventory

    Neuropsychologia

    (1971)
  • J. O'Shea et al.

    Functionally specific reorganization in human premotor cortex

    Neuron

    (2007)
  • T. Raij et al.

    Audiovisual integration of letters in the human brain

    Neuron

    (2000)
  • G. Rizzolatti et al.

    The organization of the cortical motor system: new concepts

    Electroencephalogr. Clin. Neurophysiol.

    (1998)
  • M.C. Saetti et al.

    Tactile morphagnosia secondary to spatial deficits

    Neuropsychologia

    (1999)
  • F. Scheperjans et al.

    Subdivisions of human parietal area 5 revealed by quantitative receptor autoradiography: a parietal region between motor, somatosensory, and cingulate cortical areas

    Neuroimage

    (2005)
  • F. Scheperjans et al.

    Transmitter receptors reveal segregation of cortical areas in the human superior parietal cortex: relations to visual and somatosensory regions

    Neuroimage

    (2005)
  • B. Stefanovic et al.

    Hemodynamic and metabolic responses to neuronal inhibition

    Neuroimage

    (2004)
  • M.C. Stoeckel et al.

    A fronto-parietal circuit for tactile object discrimination: an event-related fMRI study

    Neuroimage

    (2003)
  • M. Avillac et al.

    Reference frames for representing visual and tactile locations in parietal cortex

    Nat. Neurosci.

    (2005)
  • M.S. Beauchamp et al.

Human MST but not MT responds to tactile stimulation

    J. Neurosci.

    (2007)
  • S.J. Bensmaia et al.

    Influence of visual motion on tactile motion perception

    J. Neurophysiol.

    (2006)
  • R. Blake et al.

    Neural synergy between kinetic vision and touch

    Psychol. Sci.

    (2004)
  • G.J. Blatt et al.

    Visual receptive field organization and cortico-cortical connections of the lateral intraparietal area (area LIP) in the macaque

    J. Comp. Neurol.

    (1990)
  • A. Bodegard et al.

    Somatosensory areas in man activated by moving stimuli: cytoarchitectonic mapping and PET

    Neuroreport

    (2000)
  • D. Boussaoud et al.

Pathways for motion analysis: cortical connections of the medial superior temporal and fundus of the superior temporal visual areas in the macaque

    J. Comp. Neurol.

    (1990)
  • M. Brett et al.

    The problem of functional localization in the human brain

    Nat. Rev. Neurosci.

    (2002)
  • R. Breveglieri et al.

    Somatosensory cells in area PEc of macaque posterior parietal cortex

    J. Neurosci.

    (2006)
  • C. Cavada et al.

    Posterior parietal cortex in rhesus monkey: I. Parcellation of areas based on distinctive limbic and sensory corticocortical connections

    J. Comp. Neurol.

    (1989)
  • C.L. Colby et al.

    Ventral intraparietal area of the macaque: anatomic location and visual response properties

    J. Neurophysiol.

    (1993)
  • C.L. Colby et al.

    Topographical organization of cortical afferents to extrastriate visual area PO in the macaque: a dual tracer study

    J. Comp. Neurol.

    (1988)
  • J.C. Craig

    Visual motion interferes with tactile motion perception

    Perception

    (2006)
  • M.-P. Deiber et al.

    Frontal and parietal networks for conditional motor learning: a positron emission tomography study

    J. Neurophysiol.

    (1997)
  • J.R. Duhamel et al.

    Ventral intraparietal area of the macaque: congruent visual and somatic response properties

    J. Neurophysiol.

    (1998)
  • S.O. Dumoulin et al.

    A new anatomical landmark for reliable identification of human area V5/MT: a quantitative analysis of sulcal patterning

    Cereb. Cortex

    (2000)
  • A.C. Evans et al.

An MRI-based probabilistic atlas of neuroanatomy.

  • D.J. Felleman et al.

    Cortical connections of areas V3 and VP of macaque monkey extrastriate visual cortex

    J. Comp. Neurol.

    (1997)
  • M.A. Frens et al.

    Spatial and temporal factors determine auditory-visual interactions in human saccadic eye movements

    Percept. Psychophys.

    (1995)