Research report
Tactile–visual integration in the posterior parietal cortex: A functional magnetic resonance imaging study
Introduction
Visual motion has been shown to strongly influence tactile motion judgments [5], [17]. When visual motion was presented simultaneously with, but in the opposite direction to, tactile motion, the accuracy of tactile motion judgments was substantially reduced [17]. This decline in performance was observed whether the visual display was placed near to or at a distance from the tactile display, and the size of the effect decreased as the degree of misalignment increased [17]. Because the effect depended on the relative direction of the motion, it went beyond a general effect of visual motion [17]. This direction specificity makes a general perceptual conflict an unlikely cause of the interference, and instead suggests that a crossmodal interaction occurs during motion direction judgment [17]. However, the neural substrates of this interaction remain largely unknown [5].
Crossmodal motion direction discrimination requires both the coding of motion in the two sensory modalities and a decision stage that compares the two motion direction signals. Crossmodal interference could thus occur at either stage. One candidate locus for the integration of visual and tactile motion information during the coding stage is the human middle temporal (MT)/V5 area [5]. According to visual mediation heuristics [47], tactile inputs are translated into their corresponding visual representations (visually based imagery), which are further processed by the visual system. Recent neuroimaging studies have reported that tactile motion perception tasks activate part of the MT/V5 area [6], [34], independently of visual motion imagery [4], [60]. Thus, the activity observed in the MT/V5 area during tactile stimulation might reflect bottom-up sensory input or a top-down cognitive strategy (such as imagery). If the MT/V5 area is involved in tactile motion perception, it might thus be a site of interaction between the visual and tactile modalities [5].
The integration of visual and tactile motion might also involve multisensory areas at the decision stage, because human spatial perception is highly integrated across modalities [46]. The ventral intraparietal sulcus (IPS) of non-human primates contains neurons that respond to both visual and tactile motion stimuli [20]. The caudal pole of the superior parietal lobule of non-human primates is currently considered to be a key region in the dorsal stream of signals linking somatosensory and visual input to the motor commands driving body movements [12], [74]. A recent functional magnetic resonance imaging (fMRI) study revealed that, in humans, there is a parietal face area containing head-centred visual and tactile maps that are aligned with one another [71]. Hence, an alternative candidate area for crossmodal integration is the multisensory posterior parietal cortex.
To explore these alternatives, we conducted an fMRI experiment. Our hypothesis was that the spatial analysis of the direction of movement via visual and tactile modalities activates both sensory-specific and multisensory areas. The former represent the modality-specific coding stage, while the latter include the neural substrates of the decision process that requires the comparison of the two signals coming from different modalities (i.e., crossmodal integration).
Brain areas participating in crossmodal integration should show signs of convergence and interaction [59]. We first performed independent tactile and visual unimodal experiments involving motion direction matching tasks, in order to define the common multimodal areas activated by both the tactile and the visual tasks (i.e., convergence). We then carried out tactile and visual crossmodal experiments with event-related designs, in order to identify the areas in which the crossmodal response was enhanced [67], by comparing stimuli whose directions of motion were congruent and incongruent (i.e., interaction). Previous studies suggest that crossmodal interaction effects are subtle [3]. We therefore restricted the search volume by first defining the polymodal areas (convergence), within which we then searched for the interaction effect.
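The two-step logic above (convergence first, then interaction within the converged region) can be sketched as a masked voxel-wise analysis. This is a minimal illustration, not the authors' actual pipeline; the function names, the z-threshold, and the input maps are hypothetical placeholders for the unimodal statistical maps and congruency-contrast betas.

```python
import numpy as np

def conjunction_mask(tactile_z, visual_z, z_thresh=3.09):
    """Convergence: voxels significantly active in BOTH unimodal tasks.

    tactile_z, visual_z: hypothetical voxel-wise z-maps from the
    tactile and visual unimodal analyses (illustrative inputs).
    """
    return (tactile_z > z_thresh) & (visual_z > z_thresh)

def interaction_effect(congruent_beta, incongruent_beta, mask):
    """Interaction: congruent-minus-incongruent contrast,
    evaluated only within the polymodal (convergence) mask."""
    diff = congruent_beta - incongruent_beta
    return np.where(mask, diff, 0.0)
```

Restricting the interaction contrast to the conjunction mask is what keeps the search volume small, which matters because the crossmodal interaction effect is expected to be subtle.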
Semantically congruent and/or spatially coincident multisensory inputs in close temporal proximity lead to behavioural response enhancement, resulting in lower thresholds and reduced reaction times compared with unimodal stimuli [25], [41], [55]. By contrast, incongruent inputs slow response times and produce anomalous perceptions [52], [70], [77], [84]. These enhancements and reductions in behavioural responses are thought to be due to crossmodal integration. Additionally, the response properties of multisensory cells in non-human primates seem to reflect this pattern of crossmodal behavioural enhancement and reduction [54], [76], [82], [77]. Calvert et al. [13] postulated that response enhancement and depression are the hallmarks of intersensory interactions in humans. Thus, it should be possible to depict the neural substrates of crossmodal interaction by comparing congruent and incongruent sensory conditions [67]. This approach also allows us to subtract out the effects of attention and ignorance, as well as of the differences between the cues [67]. Hence, the visuo-tactile crossmodal interaction we are investigating should be considered a bottom-up process.
Subjects
In total, 15 healthy volunteers (seven men and eight women; mean age ± standard deviation [S.D.] = 27.9 ± 6.7 years) participated in this study. Among these subjects, 14 were right-handed and one was left-handed according to the Edinburgh handedness inventory [56]. None of the participants had a history of neurological or psychiatric illness. The protocol was approved by the Ethical Committee of the National Institute for Physiological Sciences, Japan. The experiments were undertaken in compliance
Unimodal block design experiment
In the unimodal experiments, the mean ± S.D. percentages of correct responses were 92.0 ± 5.5% for the T-task and 94.3 ± 7.8% for the V-task, which did not significantly differ (P = 0.09, paired t-test).
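The comparison above is a standard paired t-test on subject-level accuracies. As a minimal sketch of the statistic being reported (the individual subjects' scores are not given here, so the test values below are hypothetical):

```python
import numpy as np

def paired_t(a, b):
    """Paired t statistic and degrees of freedom for two matched samples,
    e.g. per-subject accuracies in the T-task vs. the V-task."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))  # mean difference / SEM of differences
    return t, n - 1
```

With 15 subjects the test has 14 degrees of freedom, and a two-tailed P of 0.09, as reported, falls short of the conventional 0.05 criterion.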
Tactile motion direction discrimination activated the bilateral inferior parietal lobule (LPi), LPs, secondary somatosensory area (SII), dorsal premotor cortex (PMd), ventral premotor cortex (PMv), inferior frontal gyrus (GFi), insula, putamen, left primary sensorimotor area (SM1), postcentral gyrus
Task performance
The crossmodal experiment did not reveal a congruency effect during either the crossmodal or intramodal conditions. This might have been due to a ceiling effect, as the performance levels were relatively high. Performance in the TT condition was significantly worse than that during the VV or TV conditions. During this task, tactile stimuli were presented simultaneously, but independently, to the index and middle fingers. This might have made performance in the TT condition more difficult than
References (85)
- et al. The inferential impact of global signal covariates in functional neuroimaging analyses. Neuroimage (1998)
- et al. Integration of auditory and visual information about objects in superior temporal sulcus. Neuron (2004)
- et al. Polymodal motion processing in posterior parietal and premotor cortex: a human fMRI study strongly implies equivalencies between humans and monkeys. Neuron (2001)
- et al. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex. Curr. Biol. (2000)
- et al. Cerebral processes related to visuomotor imagery and generation of simple finger movements studied with positron emission tomography. Neuroimage (1998)
- et al. Functional topography of the secondary somatosensory cortex for nonpainful and painful stimuli: an fMRI study. Neuroimage (2003)
- et al. Detecting activations in PET and fMRI: levels of inference and power. Neuroimage (1996)
- et al. How many subjects constitute a study? Neuroimage (1999)
- et al. Stochastic designs in event-related fMRI. Neuroimage (1999)
- et al. Thresholding of statistical maps in functional neuroimaging using the false discovery rate. Neuroimage (2002)