Journeying Beyond Classical Somatosensory Cortex
Visual cortical areas are involved in a variety of somatosensory tasks in the sighted, including tactile perception of two-dimensional patterns and motion, and haptic perception of three-dimensional objects. It is still unresolved whether visual imagery or modality-independent representations can better explain such cross-modal recruitment. However, these explanations are not necessarily in conflict with each other and might both be true, if imagery processes can access modality-independent representations. Greater visual cortical engagement in blind compared to sighted people is commonplace during language tasks, and also seems to occur during processing of tactile spatial information. Such engagement is even greater in the congenitally blind compared to the late blind, indicative of enhanced cross-modal plasticity during early development. At the other extreme, short-term visual deprivation of the normally sighted also leads to cross-modal plasticity. Altogether, the boundaries between sensory modalities appear to be flexible rather than immutable.
The seemingly radical idea that visual cortical areas are intimately involved in processing tactile information, both in the normally sighted and in the visually deprived, has garnered widespread acceptance, based largely on functional neuroimaging studies. However, the mechanisms underlying such cross-modal cortical recruitment still remain uncertain. One view favours visual imagery as the fundamental trigger while an alternative is that so-called “visual” cortical areas actually perform multisensory processing. Here we review the growing literature on visual cortical involvement in tactile perception of stimuli applied to static body parts, as well as haptic perception of stimuli explored actively with the hand, and consider the available evidence relating to the relevant neural mechanisms.
Visual Cortical Processing of Tactile and Haptic Input in Sighted Humans
Tactile Perception of Two-Dimensional Patterns
The earliest realization that areas of visual cortex might be normally active during tactile perception resulted from our positron emission tomographic (PET) study (Sathian, Zangaladze, Hoffman, & Grafton, 1997) employing discrimination of the orientation of gratings applied to the immobilized right index fingerpad. Relative to a control task calling for tactile discrimination of grating groove width, the orientation task activated a left parieto-occipital cortical (POC) region (Sathian et al., 1997). This POC region had previously been reported as active during visual discrimination of grating orientation (Sergent, Ohta, & MacDonald, 1992) and spatial mental imagery (Mellet et al., 1996), suggesting a commonality of processing across vision and touch and possible mediation by imagery. The POC activation site is near the human V6 complex of areas (Pitzalis et al., 2006), part of which is probably homologous to macaque area V6/PO, where a large proportion of neurons are orientation-selective (Galletti, Battaglini, & Fattori, 1991). The preferential association of macrospatial, compared to microspatial, tactile tasks with visual imagery (Klatzky, Lederman, & Reed, 1987) and the general superiority of vision over touch for macrospatial feature perception, with the reverse being true of microspatial features (Heller, 1989b), fit with our PET findings if one considers the orientation task as macrospatial and the control task as microspatial.
To evaluate the functional significance of the POC activation found in our PET study (Zangaladze, Epstein, Grafton, & Sathian, 1999), we used transcranial magnetic stimulation (TMS) to disrupt processing at this site. Single-pulse TMS, at a delay of 180 ms following the onset of the tactile stimulus, significantly impaired tactile discrimination of grating orientation, but had no effect on tactile discrimination of grating groove width. In contrast to this task-specific effect over POC, performance on both tasks was degraded by TMS over primary somatosensory cortex (S1) at a 30-ms delay. This study was the first to establish that extrastriate visual cortical activity is actually necessary for optimal tactile perception in normally sighted individuals, rather than being merely epiphenomenal. In a later functional magnetic resonance imaging (fMRI) study (Zhang et al., 2005) using a similar paradigm to our original PET study, we verified left POC activation during tactile discrimination of grating orientation. Activation specific to this task was also found in other cortical areas in this study, including the right postcentral sulcus (PCS) and left anterior intraparietal sulcus (aIPS). The PCS has been shown to correspond to Brodmann’s Area 2 (Grefkes, Geyer, Schormann, Roland, & Zilles, 2001), which is the most posterior (and highest-order) part of S1. Two other fMRI studies have confirmed a role for the aIPS in tactile discrimination of grating orientation, although opposite lateralization was reported in these two studies: One study found bilateral aIPS activity, greater on the left irrespective of which hand was used, when this task was contrasted with discrimination of microspatial changes in grating location (Van Boven, Ingeholm, Beauchamp, Bikle, & Ungerleider, 2005).
Again regardless of which hand was stimulated, right-lateralized activity was observed in the PCS-aIPS region during discrimination of the orientation of gratings scanned across the fingerpad, relative to discrimination of grating roughness (Kitada et al., 2006). This last study demonstrated multisensory processing in the right aIPS, since it was also more active during visual discrimination of grating orientation than colour.
Another TMS study (Merabet et al., 2004) applied repetitive TMS (rTMS) in 10-min trains at 1 Hz to decrease cortical excitability while subjects felt dot-patterns of varying interdot distance. The tasks were to scale either perceived interdot distance, which rises monotonically as physical interdot distance increases up to 8 mm, or perceived roughness, which peaks around 3 mm and then declines. Roughness judgments were disrupted by rTMS over S1, whereas perceived interdot distance ratings were impaired by rTMS over medial occipital cortex and were also affected in a congenitally blind patient with bilateral occipital infarcts who, however, performed normally on roughness judgments (Merabet et al., 2004). These findings are consistent with the greater tendency of macrospatial compared to microspatial tactile tasks to involve visual processing, as outlined earlier. An fMRI study from our group (Stoesz et al., 2003) corroborated this: A macrospatial tactile form condition requiring subjects to distinguish between the upside-down letters T and V was contrasted with a microspatial tactile condition, detection of a gap in a bar. This contrast revealed bilateral activation of the lateral occipital complex (LOC), a visual object-selective region (Malach et al., 1995) that is considered homologous with macaque inferotemporal cortex (Grill-Spector et al., 1998). Right LOC activity was also found in the same form task, relative to tactile discrimination of bar orientation, in a PET study from our laboratory (Prather, Votaw, & Sathian, 2004).
In a recent fMRI study, we confirmed that visual cortical activation during microspatial tasks is minimal (Sathian & Stilla, unpublished observations). This study used as a stimulus a 3-dot array oriented along the long axis of the immobilized right index fingerpad, with the central dot in the array being offset to the left or right by
Haptic Perception of Three-Dimensional Objects
Activity has consistently been found in the LOC during haptic perception of object shape in a number of fMRI studies (Amedi, Jacobson, Hendler, Malach, & Zohary, 2002; Amedi, Malach, Hendler, Peled, & Zohary, 2001; James et al., 2002; Reed, Shoham, & Halgren, 2004; Stoeckel et al., 2003; Zhang, Weisser, Stilla, Prather, & Sathian, 2004). A part of the LOC is object-selective in both vision and touch (Amedi et al., 2001, 2002); this region is particularly driven by graspable visual objects relative to other visual stimuli, but is not activated by the characteristic sounds of objects, suggesting its specialization for shape processing (Amedi et al., 2002). Neurological lesions involving the LOC impair haptic shape perception, demonstrating the functional importance of this area (Feinberg, Rothi, & Heilman, 1986; James, James, Humphrey, & Goodale, 2006). There is some evidence that visual and haptic shape perception are mediated by a common representation, including cross-modal priming effects observed in psychophysical (Easton, Greene, & Srinivas, 1997; Easton, Srinivas, & Greene, 1997; Reales & Ballesteros, 1999) and in fMRI studies (Amedi et al., 2001; James et al., 2002), and overlapping category-specific representations between the two modalities, at least for manmade objects (Pietrini et al., 2004). Our recent demonstration that the magnitude of right LOC activity evoked during visual and haptic shape perception is significantly correlated across subjects provides further support for the idea of a common visuo-haptic representation of shape, and suggests that such a modality-independent representation might reside in the right LOC (Stilla & Sathian, in press). The right lateralization is interesting, especially because haptic exploration was with the right hand.
We also found a number of bilateral parietal regions that were shape-selective for haptic as well as visual stimuli: These included the PCS and multiple parts of the IPS, including the aIPS, posterior IPS (pIPS), and ventral IPS (vIPS) (Peltier et al., 2007). These findings are in keeping with reports of multisensory shape-selectivity in the left aIPS (Grefkes, Weiss, Zilles, & Fink, 2002) and in a caudal region of the IPS (Saito, Okada, Morita, Yonekura, & Sadato, 2003), and multisensory responses in the IPS in monkeys (Iriki, Tanaka, & Iwamura, 1996). Paralleling the existence of multisensory texture-selectivity in V2 described above, the occurrence of multisensory shape processing in the PCS, which, since it corresponds to Brodmann’s Area 2 (Grefkes et al., 2001), is part of S1, emphasizes that multisensory processing extends into quite early areas of the sensory hierarchies.
Tactile Perception of Motion
Tactile motion stimuli, even without a task requirement, recruit the human MT complex (Blake, Sobel, & James, 2004; Hagen et al., 2002), an area that is important for visual motion and considered homologous with the macaque visual motion area MT/V5. Moreover, the tactually perceived direction of motion of a rotating globe can influence its visually perceived direction when this is ambiguous (Blake et al., 2004; James & Blake, 2004). When the direction of motion is unambiguous but incongruent between vision and touch, visual motion disrupts tactile motion perception (Craig, 2006). These observations suggest that, as for object shape, both modalities engage a common representation.
The Role of Visual Imagery in Parietal and Occipital Cortical Activity During Touch
In the preceding section, we reviewed studies indicating that tactile perception regularly elicits activity outside classical somatosensory cortex, not only in multisensory parietal regions but also in occipital cortex. Such cortical recruitment is not arbitrary, but rather is highly task-specific, so that extrastriate visual cortical areas known to mediate certain aspects of vision are also active during tactual performance of the corresponding tasks. To what extent might visual imagery be responsible for this? A well-known visual imagery task calls for mental rotation of visual stimuli: A classic finding in this task is the linear increase in response time for mirror-image discrimination as the angular disparity between the stimuli is increased (Shepard & Metzler, 1971). This is also true in the tactile modality (Carpenter & Eisenberg, 1978; Dellantonio & Spagnolo, 1990; Hollins, 1986; Marmor & Zaback, 1976; Prather & Sathian, 2002; Prather et al., 2004) and does not seem to require visual experience, since similar relationships obtain in early blind, late blind, and sighted individuals (Carpenter & Eisenberg, 1978; Röder & Rösler, 1998). A PET study from our laboratory (Prather et al., 2004) investigated mirror-image discrimination of tactile stimuli. When mental rotation was required (stimuli at a large angle with respect to the finger axis), compared to when it was not (stimuli not angled), activation was found in the left aIPS. This focus was also active during mental rotation of visual stimuli (Alivisatos & Petrides, 1997), reinforcing the idea that this region is multisensory. These psychophysical and imaging studies fit with the notion that similar spatial imagery processes operate in both vision and touch, at least in the case of mental rotation.
In support of the visual imagery hypothesis for visual cortical recruitment during touch, subjects in our laboratory consistently report mentally visualizing tactile stimuli, particularly when performing the macrospatial tasks associated with visual cortical recruitment but not their microspatial counterparts (Sathian et al., 1997; Stoesz et al., 2003; Zangaladze et al., 1999). The trigger for visual imagery could be unfamiliarity with the tactile stimuli or tasks; this could be an instantiation of a more general cross-modal translation of complex information into the most adept modality (Freides, 1974). An fMRI study from our laboratory (Zhang et al., 2004) found that interindividual variations in the strength of haptic shape-selective activity in the right LOC (ipsilateral to the stimulated hand) were strongly predicted by a multiple regression on two visual imagery scores, one using the Vividness of Visual Imagery Questionnaire (VVIQ; Marks, 1973) to assess imagery in common situations, and the other indexing the vividness of visual imagery specifically employed during haptic shape perception. However, activation strengths in the left LOC showed no relationship to visual imagery ratings, potentially implicating other factors in cross-modal visual cortical recruitment. In other imaging studies, left-lateralized LOC activity was reported during retrieval of either geometric or material object properties from memory based on cuing by visually presented words (Newman, Klatzky, Lederman, & Just, 2005) and during generation of mental images of shape triggered by familiar sounds, based on prior visual exposure in sighted subjects and haptic exposure in blind subjects (De Volder et al., 2001). Left lateralization in these studies may have stemmed from semantic (naming) requirements of the tasks used.
Some have argued against a role for visual imagery in the activation of the LOC by haptic perception, since visual imagery evoked only 20% of the LOC activity evoked during haptic object identification (Amedi et al., 2001). Although this finding might have been associated with the inability to verify active maintenance of images on-line during scanning, other possible explanations must be considered for visual cortical recruitment during tactile perception. For instance, the relevant “visual” cortical areas might actually perform multisensory processing. This could indicate the presence of bottom-up somatosensory projections to the visual cortical areas that are involved in tactile perception, whereas the visual imagery explanation presupposes top-down inputs into visual cortical areas. One way to disentangle these competing ideas is to investigate the pattern of connectivity between somatosensory, multisensory, and visual cortical areas. Recent work in the macaque brain, for example, shows multiple polysynaptic pathways between primary somatosensory and primary visual cortex (Négyessy, Nepusz, Kocsis, & Bazso, 2006). In the human brain, using exploratory structural equation modelling to evaluate all possible models for their fit to fMRI data comprising the time series across haptically shape-selective regions of interest – the PCS, multiple parts of the IPS, and the LOC – we found bidirectional influences consistent with a potential neural substrate for visual imagery as well as multisensory representations (Peltier et al., 2007).
Evidence for a common visuo-haptic shape representation has been reviewed earlier in this article. The nature of multisensory representations has been addressed by several behavioural studies in various cross-modal memory paradigms. Cross-modal memory performance correlates with visuo-spatial scores under instructions to use a visualization strategy for memorization, and with verbal ability scores under instructions to use a naming strategy, suggesting dual representations, both visual and verbal, for familiar objects (Johnson, Paivio, & Clark, 1989). Although language may play a facilitatory role (see below), work with preverbal infants and nonhuman animals shows that language is not a necessary condition for cross-modal interactions (Rose, 1994). Other researchers have taken a more direct route to distinguishing between competing alternatives for the nature of these representations by using interference techniques (Lacey & Campbell, 2006a,b; Newell, Woods, Mernagh, & Bülthoff, 2005). In one study (Lacey & Campbell, 2006a), participants were required to encode familiar and unfamiliar objects both visually (without being able to feel them) and haptically (without being able to see them) whilst performing a concurrent visual, verbal, or haptic interference task. Visual and verbal interference during encoding significantly reduced subsequent cross-modal recognition of the unfamiliar objects, but not the familiar objects. The haptic interference task had no effect. Since visual interference disrupted haptic encoding as much as visual encoding, the inference was that visual processes were active during encoding in either modality.
Given that the visual interference task used in this study (dynamic visual noise) is known to disrupt visual imagery mnemonics for recall of word-lists (Quinn & McConnell, 1996) and the use of visual imagery in symbolic comparison tasks (Dean, Dewhurst, Morris, & Whittaker, 2005), these results supported the visual imagery hypothesis for haptically evoked activation of visual cortex. The effect of verbal interference on unfamiliar but not familiar objects was surprising and in contrast to the prediction made by Johnson et al. (1989), although these investigators did not actually use unfamiliar objects. This was interpreted as reflecting a strategy of covert verbal description for unfamiliar objects that facilitates forming a representation. This strategy may not have been necessary for familiar objects – indeed, memory for these was unaffected by any interference task, perhaps because their representations are well established in several formats (visual, verbal, and haptic), either in a network of associated representations or a single multisensory representation.
The findings of Lacey and Campbell (2006a) do not establish whether haptic input triggers visual imagery in order to generate a modality-specific visual representation or a modality-independent spatial representation containing information available to both vision and touch. In the preceding study, the interference tasks did not explicitly demand spatial processing. A subsequent study contrasted the effects of spatial and nonspatial interference tasks, presented in both visual and haptic versions (Lacey & Campbell, 2006b). The results clearly showed that spatial interference disrupted both encoding and retrieval, independent of the modality in which the interference task was presented, whereas nonspatial interference in either modality had no effect. It remains uncertain whether this reflects a single, multisensory spatial representation or separately derived visual and haptic spatial representations that are compared cross-modally.
Some neurophysiologic and neuroanatomic observations in macaque monkeys are pertinent to the potential mechanisms underlying cross-modal recruitment of visual cortex. A highly interesting but underappreciated study (Haenny, Maunsell, & Schiller, 1988) revealed that neurons in visual area V4, but not V1, were selective for the orientation of a tactile grating when it served as a cue to be matched to a subsequently presented visual stimulus but not when the tactile grating was task-irrelevant, indicating that the tactile responses depended on top-down inputs. Conversely, S1 neurons in monkeys were visually responsive in a visuo-haptic matching task involving grating orientation (Zhou & Fuster, 1997). Multisensory inputs have been demonstrated in early sensory cortical areas, including V1 (Falchier, Clavagnier, Barone, & Kennedy, 2002; Rockland & Ojima, 2003) and auditory association cortex (Schroeder et al., 2001; Schroeder et al., 2003; Schroeder & Foxe, 2002); their laminar profile is consistent with the existence of both top-down (Falchier et al., 2002; Rockland & Ojima, 2003; Schroeder & Foxe, 2002; Schroeder et al., 2003) and bottom-up (Schroeder & Foxe, 2002; Schroeder et al., 2003) inputs. This is consistent with the connectivity study based on human fMRI data cited earlier (Peltier et al., 2007). Overall, then, the net evidence suggests modality-independent representations that can be accessed through bottom-up sensory inputs as well as top-down processes such as visual imagery.
Cross-Modal Processing of Somatosensory Input in the Blind
It is commonly held that the blind have superior somatosensory perception compared to the sighted. Although this has been researched for more than a century (Griesbach, 1899; Hollins, 1989), the results have not been uniform. Studies of haptic shape perception have yielded results ranging from better performance in the blind (Heller, 1989a), through equal performance in sighted and blind individuals (Morrongiello, Humphrey, Timney, Choi, & Rocca, 1994), to better performance in the sighted (Bailes & Lambert, 1986; Hollins, 1985; Lederman, Klatzky, Chataway, & Summers, 1990). Apparent perceptual advantages of the blind on haptic tasks might ensue from practiced attention to cues that the sighted ignore (Hollins, 1989), or from efficient sensorimotor strategies (D’Angiulli, Kennedy, & Heller, 1998; Davidson, 1972; Shimizu, Saida, & Shimura, 1993). However, the blind do appear to consistently outperform the sighted on tactile perception of two-dimensional patterns: Recognition of Braille-like dot patterns (Foulke & Warm, 1967); detection of gaps in bars (Stevens, Foulke, & Patterson, 1996); judging bar orientation (Stevens et al., 1996); discrimination of grating orientation (Goldreich & Kanics, 2003; Van Boven, Hamilton, Kauffman, Keenan, & Pascual-Leone, 2000); and distinguishing whether or not the central dot in a linear three-dot array is offset laterally (Grant, Thiagarajah, & Sathian, 2000). The blind, though, do not differ significantly from the sighted on some tasks, such as discriminating bar length (Stevens et al., 1996) or textures (Grant et al., 2000; Heller, 1989b). Superior tactile spatial performance in the blind may result from their specific experience. Thus, practiced sighted subjects can match the blind on detection of a dot offset (Grant et al., 2000) and on tasks with the Optacon (Craig, 1988), a vibrotactile reading aid for the blind.
They can also achieve parity with the deaf-blind in decoding speech by the Tadoma method, which involves feeling the speaker’s face and neck (Reed, Doherty, Braida, & Durlach, 1982; Reed, Rubin, Braida, & Durlach, 1978).
An early report of cross-modal plasticity in blind humans used PET scanning to show increased metabolic activity in occipital (visual) cortical areas of early blind subjects, relative to late blind or sighted subjects, suggesting more synaptic activity in the early blind, perhaps due to incomplete synaptic pruning during development (Veraart et al., 1990). Many functional neuroimaging studies have since demonstrated recruitment of occipital cortical regions of blind subjects during Braille reading (Amedi, Raz, Pianka, Malach, & Zohary, 2003; Büchel, Price, Frackowiak, & Friston, 1998; Burton et al., 2002a; Melzer et al., 2001; Sadato, Okada, Honda, & Yonekura, 2002; Sadato et al., 1996; Sadato et al., 1998). Such activation of medial occipital cortex is specific to early blind subjects (Cohen et al., 1999; Sadato et al., 2002), as compared to the late blind and sighted who deactivate these regions (Sadato et al., 2002). Parallel studies established the functional involvement of visual cortex in Braille reading by the blind. An early blind person developed alexia for Braille, despite intact basic somatosensory perception, after a bilateral occipital infarct (Hamilton, Keenan, Catala, & Pascual-Leone, 2000). TMS over medial occipital cortex impaired the ability of blind subjects to identify Braille or Roman letters, but had no effect on tactile identification of Roman letters by sighted subjects, who conversely were more affected than the blind by TMS over sensorimotor cortex (Cohen et al., 1997). The occipital TMS effects were also specific to the early blind (Cohen et al., 1999). These findings imply that visual cortical involvement in Braille reading depends on cross-modal plasticity during a critical period of visual development.
These studies left open whether visual cortical involvement in Braille reading by the blind reflects sensory or linguistic processing. Subsequently, evidence has accumulated for the idea that language tasks recruit visual cortex in the blind. Extensive regions of occipital and occipito-temporal visual cortical areas are active in blind subjects during covert verb generation in response to nouns presented via Braille (Burton, Snyder, Conturo, et al., 2002) or hearing (Burton, Snyder, Diamond, & Raichle, 2002), with more activity in early blind individuals (Burton, 2003; Burton, Diamond, & McDermott, 2003; Burton, Snyder, Conturo, et al., 2002; Burton, Snyder, Diamond, et al., 2002). Visual cortical activity is stronger during semantic than phonological processing (Burton et al., 2003) and increases with both semantic and syntactic complexity (Röder, Stock, Bien, & Rösler, 2002). Medial occipital cortex of the congenitally blind activates during a verbal memory task, the strength of activation correlating with verbal memory performance (Amedi et al., 2003). There may be some segregation of language function within reorganized visual cortex, with verbal memory and verb generation (in response to heard nouns) showing a preference for posterior (hierarchically lower-order) occipital regions, and Braille reading, for (higher-order) LOC more anteriorly (Amedi et al., 2003).
Thus, it is clear that visual cortex is active during language processing in the blind; the nature of its involvement in somatosensory processes is less certain. There have been comparatively few investigations focusing on tactile perception itself, and these have been limited by the use of rest controls, so that the contributions of sensory and linguistic processes could not be disentangled. One notable study of Braille reading did attempt to control for linguistic processes, using an auditory word control: This study, contrary to the studies discussed above, reported medial occipital cortical activity in the late blind but not the early blind (Büchel et al., 1998). Category-selectivity during haptic perception of man-made objects, similar to that in sighted subjects, was found in inferotemporal cortex of blind subjects, although the category-selective voxels were located more ventrally in the blind compared to sighted subjects (Pietrini et al., 2004). We used the microspatial 3-dot array task described earlier in this review to compare activations in age-matched early blind, late blind, and sighted subjects. As outlined earlier, contrasting the condition in which subjects discriminated whether the central dot was offset to the left or right with one requiring a temporal discrimination resulted in little activity in the LOC in the sighted. Blind subjects showed greater activation on this contrast in the LOC, with the early blind in addition demonstrating activity in medial occipital cortex (Hanna, Stilla, Mariola, Flueckiger, Jegadeesh, & Sathian, submitted for publication). These results indicate that tactile spatial tasks do recruit visual cortical activity in the blind, with early blind subjects showing greater penetration into early visual cortex. The basis on which such exuberant activity in visual cortex is functionally organized is still mysterious.
Effect of Short-Term Visual Deprivation on Tactile Processing
In an effort to understand the mechanisms by which blindness exerts its effects on tactile processing, there has been some interest in studying short-term visual deprivation. Blindfolding sighted subjects for only 90 minutes improves performance on discrimination of grating orientation (Facchini & Aglioti, 2003), by a magnitude similar to that associated with blindness (Van Boven et al., 2000). Blindfolding also increases visual cortical excitability in just an hour, as tested using TMS and fMRI (Boroojerdi et al., 2000), and after two hours, causes both significant deactivation during tactile form discrimination and gap detection in regions intermediate in the hierarchy of visual shape processing (V3A and vIPS) and task-specific increases in activation along the IPS and in regions of frontal and temporal cortex (Weisser, Stilla, Peltier, Hu, & Sathian, 2005). Over five days of blindfolding, Braille character discrimination improves (Kauffman, Theoret, & Pascual-Leone, 2002), occipital cortex develops novel responsiveness to tactile discrimination of Braille characters and auditory tone discrimination, and occipital TMS becomes able to disrupt Braille reading (Pascual-Leone & Hamilton, 2001). These striking findings suggest that cross-modal plasticity may not require the formation of new connections, but could take advantage of pre-existing connections between areas representing individual sensory modalities. Visual deprivation may serve merely to magnify the normal range of cross-modal recruitment. A crucial goal of future work will be to further characterize the effects of short-term, long-term, and congenital visual deprivation with respect to specific perceptual and cognitive domains, and relate these effects to findings in those privileged with normal vision.
The work reviewed here has firmly established that visual cortical areas in normally sighted individuals are intimately involved in processing somatosensory information, whether acquired through stimulation of passive body parts or via active exploration of objects. There is controversy over the role of visual imagery in mediating such cross-modal recruitment, and increasing evidence favouring the existence of modality-independent spatial representations. These two potential explanations for cross-modal engagement of visual cortex in touch are not necessarily in conflict, if one considers that a modality-independent representation might be accessible not only bottom-up through sensory inputs but also top-down via visual imagery. Long-term visual deprivation enhances the extent of cross-modal recruitment of visual cortex, especially if such deprivation is present during the critical period of development of the visual system. The relationship between behavioural improvements in the blind and these neural changes is uncertain, since practiced sighted people can match the blind in performance. Even short-term visual deprivation for hours to days can trigger substantive changes in visual cortical responsiveness to nonvisual inputs. Taken together, these studies indicate that the boundaries between processing of inputs derived from different modalities, rather than being sharp and immutable, are fluid and flexible.
Alivisatos, B., & Petrides, M. (1997). Functional activation of the human brain during mental rotation. Neuropsychologia, 36, 111-118.
Amedi, A., Jacobson, G., Hendler, T., Malach, R., & Zohary, E. (2002). Convergence of visual and tactile shape processing in the human lateral occipital complex. Cerebral Cortex, 12, 1202-1212.
Amedi, A., Malach, R., Hendler, T., Peled, S., & Zohary, E. (2001). Visuo-haptic object-related activation in the ventral visual pathway. Nature Neuroscience, 4, 324-330.
Amedi, A., Raz, N., Pianka, P., Malach, R., & Zohary, E. (2003). Early ‘visual’ cortex activation correlates with superior verbal memory performance in the blind. Nature Neuroscience, 6, 758-766.
Bailes, S. M., & Lambert, R. M. (1986). Cognitive aspects of haptic form recognition by blind and sighted subjects. British Journal of Psychology, 77, 451-458.
Blake, R., Sobel, K. V., & James, T. W. (2004). Neural synergy between kinetic vision and touch. Psychological Science, 15, 397-402.
Boroojerdi, B., Bushara, K. O., Corwell, B., Immisch, I., Battaglia, F., Muellbacher, W., et al. (2000). Enhanced excitability of the human visual cortex induced by short-term light deprivation. Cerebral Cortex, 10, 529-534.
Büchel, C., Price, C., Frackowiak, R. S. J., & Friston, K. (1998). Different activation patterns in the visual cortex of late and congenitally blind subjects. Brain, 121, 409-419.
Burton, H. (2003). Visual cortex activity in early and late blind people. Journal of Neuroscience, 23, 4005-4011.
Burton, H., Diamond, J. B., & McDermott, K. B. (2003). Dissociating cortical regions activated by semantic and phonological tasks: A FMRI study in blind and sighted people. Journal of Neurophysiology, 90, 1965-1982.
Burton, H., Snyder, A. Z., Conturo, T. E., Akbudak, E., Ollinger, J. M., & Raichle, M. E. (2002). Adaptive changes in early and late blind: A fMRI study of Braille reading. Journal of Neurophysiology, 87, 589-607.
Burton, H., Snyder, A. Z., Diamond, J. B., & Raichle, M. E. (2002). Adaptive changes in early and late blind: A fMRI study of verb generation to heard nouns. Journal of Neurophysiology, 88, 3359-3371.
Carpenter, P. A., & Eisenberg, P. (1978). Mental rotation and the frame of reference in blind and sighted individuals. Perception and Psychophysics, 23, 117-124.
Cohen, L. G., Celnik, P., Pascual-Leone, A., Corwell, B., Faiz, L., Dambrosia, J., et al. (1997). Functional relevance of cross-modal plasticity in blind humans. Nature, 389, 180-183.
Cohen, L. G., Weeks, R. A., Sadato, N., Celnik, P., Ishii, K., & Hallett, M. (1999). Period of susceptibility for cross-modal plasticity in the blind. Annals of Neurology, 45, 451-460.
Craig, J. C. (1988). The role of experience in tactual pattern perception: A preliminary report. International Journal of Rehabilitation Research, 11, 167-183.
Craig, J. C. (2006). Visual motion interferes with tactile motion perception. Perception, 35, 351-367.
D’Angiulli, A., Kennedy, J. M., & Heller, M. A. (1998). Blind children recognizing tactile pictures respond like sighted children given guidance in exploration. Scandinavian Journal of Psychology, 39, 187-190.
Davidson, P. W. (1972). Haptic judgments of curvature by blind and sighted humans. Journal of Experimental Psychology, 93, 43-55.
Dean, G. M., Dewhurst, S. A., Morris, P. E., & Whittaker, A. (2005). Selective interference with the use of visual images in the symbolic distance paradigm. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31, 1043-1068.
Dellantonio, A., & Spagnolo, F. (1990). Mental rotation of tactual stimuli. Acta Psychologica, 73, 245-257.
De Volder, A. G., Toyama, H., Kimura, Y., Kiyosawa, M., Nakano, H., Vanlierde, A., et al. (2001). Auditory triggered mental imagery of shape involves visual association areas in early blind humans. NeuroImage, 14, 129-139.
Easton, R. D., Greene, A. J., & Srinivas, K. (1997). Transfer between vision and haptics: Memory for 2-D patterns and 3-D objects. Psychonomic Bulletin and Review, 4, 403-410.
Easton, R. D., Srinivas, K., & Greene, A. J. (1997). Do vision and haptics share common representations? Implicit and explicit memory within and between modalities. Journal of Experimental Psychology: Learning, Memory, and Cognition, 23, 153-163.
Facchini, S., & Aglioti, S. M. (2003). Short term light deprivation increases tactile spatial acuity in humans. Neurology, 60, 1998-1999.
Falchier, A., Clavagnier, S., Barone, P., & Kennedy, H. (2002). Anatomical evidence of multimodal integration in primate striate cortex. Journal of Neuroscience, 22, 5749-5759.
Feinberg, T. E., Rothi, L. J., & Heilman, K. M. (1986). Multimodal agnosia after unilateral left hemisphere lesion. Neurology, 36, 864-867.
Foulke, E., & Warm, J. S. (1967). Effects of complexity and redundancy on the tactual recognition of metric figures. Perceptual and Motor Skills, 25, 177-187.
Freides, D. (1974). Human information processing and sensory modality: Cross-modal functions, information complexity, memory and deficit. Psychological Bulletin, 81, 284-310.
Galletti, C., Battaglini, P. P., & Fattori, P. (1991). Functional properties of neurons in the anterior bank of the parietooccipital sulcus of the macaque monkey. European Journal of Neuroscience, 3, 452-461.
Goldreich, D., & Kanics, I. M. (2003). Tactile acuity is enhanced in blindness. Journal of Neuroscience, 23, 3439-3445.
Grant, A. C., Thiagarajah, M. C., & Sathian, K. (2000). Tactile perception in blind Braille readers: A psychophysical study of acuity and hyperacuity using gratings and dot patterns. Perception and Psychophysics, 62, 301-312.
Grefkes, C., Geyer, S., Schormann, T., Roland, P., & Zilles, K. (2001). Human somatosensory area 2: Observer-independent cytoarchitectonic mapping, interindividual variability, and population map. NeuroImage, 14, 617-631.
Grefkes, C., Weiss, P. H., Zilles, K., & Fink, G. R. (2002). Crossmodal processing of object features in human anterior intraparietal cortex: An fMRI study implies equivalencies between humans and monkeys. Neuron, 35, 173-184.
Griesbach, H. (1899). Vergleichende Untersuchungen über die Sinnesschärfe Blinder und Sehender. [Comparative studies of perceptual acuity in the blind and sighted.] Pflugers Archiv. European Journal of Physiology, 74, 577-638.
Grill-Spector, K., Kushnir, T., Hendler, T., Edelman, S., Itzchak, Y., & Malach, R. (1998). A sequence of object-processing stages revealed by fMRI in the human occipital lobe. Human Brain Mapping, 6, 316-328.
Haenny, P. E., Maunsell, J. H. R., & Schiller, P. H. (1988). State dependent activity in monkey visual cortex. II. Retinal and extraretinal factors in V4. Experimental Brain Research, 69, 245-259.
Hagen, M. C., Franzen, O., McGlone, F., Essick, G., Dancer, C., & Pardo, J. V. (2002). Tactile motion activates the human middle temporal/V5 (MT/V5) complex. European Journal of Neuroscience, 16, 957-964.
Hamilton, R., Keenan, J. P., Catala, M., & Pascual-Leone, A. (2000). Alexia for Braille following bilateral occipital stroke in an early blind woman. NeuroReport, 11, 237-240.
Heller, M. A. (1989a). Picture and pattern perception in the sighted and the blind: The advantage of the late blind. Perception, 18, 379-389.
Heller, M. A. (1989b). Texture perception in sighted and blind observers. Perception and Psychophysics, 45, 49-54.
Hollins, M. (1985). Styles of mental imagery in blind adults. Neuropsychologia, 23, 561-566.
Hollins, M. (1986). Haptic mental rotation: More consistent in blind subjects? Journal of Visual Impairment and Blindness, 80, 950-952.
Hollins, M. (1989). Understanding blindness: An integrative approach. Hillsdale, NJ: Lawrence Erlbaum Associates.
Iriki, A., Tanaka, M., & Iwamura, Y. (1996). Coding of modified body schema during tool use by macaque postcentral neurones. NeuroReport, 7, 2325-2330.
James, T. W., & Blake, R. (2004). Perceiving object motion using vision and touch. Cognitive, Affective and Behavioral Neuroscience, 4, 201-207.
James, T. W., Humphrey, G. K., Gati, J. S., Servos, P., Menon, R. S., & Goodale, M. A. (2002). Haptic study of three-dimensional objects activates extrastriate visual areas. Neuropsychologia, 40, 1706-1714.
James, T. W., James, K. H., Humphrey, G. K., & Goodale, M. A. (2006). Do visual and tactile object representations share the same neural substrate? In M. A. Heller & S. Ballesteros (Eds.), Touch and blindness: Psychology and neuroscience (pp. 139-155). Mahwah, NJ: Lawrence Erlbaum Associates.
Johnson, C. L., Paivio, A. U., & Clark, J. M. (1989). Spatial and verbal abilities in children’s cross-modal recognition: A dual-coding approach. Canadian Journal of Psychology, 43, 397-412.
Kauffman, T., Theoret, H., & Pascual-Leone, A. (2002). Braille character discrimination in blindfolded human subjects. NeuroReport, 13, 571-574.
Kitada, R., Kito, T., Saito, D. N., Kochiyama, T., Matsumura, M., Sadato, N., et al. (2006). Multisensory activation of the intraparietal area when classifying grating orientation: A functional magnetic resonance imaging study. Journal of Neuroscience, 26, 7491-7501.
Klatzky, R. L., Lederman, S. J., & Reed, C. (1987). There’s more to touch than meets the eye: The salience of object attributes for haptics with and without vision. Journal of Experimental Psychology: General, 116, 356-369.
Lacey, S., & Campbell, C. (2006a). Mental representation in visual/haptic crossmodal memory: Evidence from interference effects. Quarterly Journal of Experimental Psychology, 59, 361-376.
Lacey, S., & Campbell, C. (2006b, June). Object representation in visual/haptic cross-modal memory. Abstract. 7th International Multisensory Research Forum, Dublin.
Lederman, S. J., Klatzky, R. L., Chataway, C., & Summers, C. D. (1990). Visual mediation and the haptic recognition of two-dimensional pictures of common objects. Perception and Psychophysics, 47, 54-64.
Malach, R., Reppas, J. B., Benson, R. R., Kwong, K. K., Jiang, H., Kennedy, W. A., et al. (1995). Object-related activity revealed by functional magnetic resonance imaging in human occipital cortex. Proceedings of the National Academy of Sciences of the USA, 92, 8135-8139.
Marks, D. F. (1973). Visual imagery differences in the recall of pictures. British Journal of Psychology, 64, 17-24.
Marmor, G. S., & Zaback, L. A. (1976). Mental rotation by the blind: Does mental rotation depend on visual imagery? Journal of Experimental Psychology: Human Perception and Performance, 2, 515-521.
Mellet, E., Tzourio, N., Crivello, F., Joliot, M., Denis, M., & Mazoyer, B. (1996). Functional anatomy of spatial mental imagery generated from verbal instructions. Journal of Neuroscience, 16, 6504-6512.
Melzer, P., Morgan, V. L., Pickens, D. R., Price, R. R., Wall, R. S., & Ebner, F. F. (2001). Cortical activation during Braille reading is influenced by early visual experience in subjects with severe visual disability: A correlational fMRI study. Human Brain Mapping, 14, 186-195.
Merabet, L., Thut, G., Murray, B., Andrews, J., Hsiao, S., & Pascual-Leone, A. (2004). Feeling by sight or seeing by touch? Neuron, 42, 173-179.
Morrongiello, B. A., Humphrey, K., Timney, B., Choi, J., & Rocca, P. T. (1994). Tactual object exploration and recognition in blind and sighted children. Perception, 23, 833-848.
Négyessy, L., Nepusz, T., Kocsis, L., & Bazso, F. (2006). Prediction of the main cortical areas and connections involved in the tactile function of the visual cortex by network analysis. European Journal of Neuroscience, 23, 1919-1930.
Newell, F. N., Woods, A. T., Mernagh, M., & Bülthoff, H. H. (2005). Visual, haptic and crossmodal recognition of scenes. Experimental Brain Research, 161, 233-242.
Newman, S. D., Klatzky, R. L., Lederman, S. J., & Just, M. A. (2005). Imagining material versus geometric properties of objects: An fMRI study. Cognitive Brain Research, 23, 235-246.
Pascual-Leone, A., & Hamilton, R. (2001). The metamodal organization of the brain. Progress in Brain Research, 134, 427-445.
Peltier, S., Stilla, R., Mariola, E., LaConte, S., Hu, X., & Sathian, K. (2007). Activity and effective connectivity of parietal and occipital cortical regions during haptic shape perception. Neuropsychologia, 45, 476-483.
Pietrini, P., Furey, M. L., Ricciardi, E., Gobbini, M. I., Wu, W.-H. C., Cohen, L., et al. (2004). Beyond sensory images: Object-based representation in the human ventral pathway. Proceedings of the National Academy of Sciences of the USA, 101, 5658-5663.
Pitzalis, S., Galletti, C., Huang, R. S., Patria, F., Committeri, G., Galati, G., et al. (2006). Wide-field retinotopy defines human cortical visual area V6. Journal of Neuroscience, 26, 7962-7973.
Prather, S. C., & Sathian, K. (2002). Mental rotation of tactile stimuli. Cognitive Brain Research, 14, 91-98.
Prather, S. C., Votaw, J. R., & Sathian, K. (2004). Task-specific recruitment of dorsal and ventral visual areas during tactile perception. Neuropsychologia, 42, 1079-1087.
Quinn, J. G., & McConnell, J. (1996). Indications of the functional distinction between the components of visual working memory. Psychologische Beiträge, 53, 355-367.
Reales, J. M., & Ballesteros, S. (1999). Implicit and explicit memory for visual and haptic objects: Cross-modal priming depends on structural descriptions. Journal of Experimental Psychology: Learning, Memory, and Cognition, 25, 644-663.
Reed, C. L., Shoham, S., & Halgren, E. (2004). Neural substrates of tactile object recognition: An fMRI study. Human Brain Mapping, 21, 236-246.
Reed, C. M., Doherty, M. J., Braida, L. D., & Durlach, N. I. (1982). Analytic study of the Tadoma method: Further experiments with inexperienced observers. Journal of Speech and Hearing Research, 25, 216-223.
Reed, C. M., Rubin, S. I., Braida, L. D., & Durlach, N. I. (1978). Analytic study of the Tadoma method: Discrimination ability of untrained observers. Journal of Speech and Hearing Research, 21, 625-637.
Rockland, K. S., & Ojima, H. (2003). Multisensory convergence in calcarine visual areas in macaque monkey. International Journal of Psychophysiology, 50, 19-26.
Röder, B., & Rösler, F. (1998). Visual input does not facilitate the scanning of spatial images. Journal of Mental Imagery, 22, 165-181.
Röder, B., Stock, O., Bien, S., Neville, H., & Rösler, F. (2002). Speech processing activates visual cortex in congenitally blind humans. European Journal of Neuroscience, 16, 930-936.
Rose, S. A. (1994). From hand to eye: Findings and issues in infant cross-modal transfer. In D. J. Lewkowicz & R. Lickliter (Eds.) The Development of intersensory perception: Comparative perspectives (pp. 265-284). Hove, UK: Lawrence Erlbaum Associates.
Sadato, N., Okada, T., Honda, M., & Yonekura, Y. (2002). Critical period for cross-modal plasticity in blind humans: A functional MRI study. NeuroImage, 16, 389-400.
Sadato, N., Pascual-Leone, A., Grafman, J., Deiber, M.-P., Ibanez, V., & Hallett, M. (1998). Neural networks for Braille reading by the blind. Brain, 121, 1213-1229.
Sadato, N., Pascual-Leone, A., Grafman, J., Ibanez, V., Deiber, M.-P., Dold, G., et al. (1996). Activation of the primary visual cortex by Braille reading in blind subjects. Nature, 380, 526-528.
Saito, D. N., Okada, T., Morita, Y., Yonekura, Y., & Sadato, N. (2003). Tactile-visual cross-modal shape matching: A functional MRI study. Cognitive Brain Research, 17, 14-25.
Sathian, K., Zangaladze, A., Hoffman, J. M., & Grafton, S. T. (1997). Feeling with the mind’s eye. NeuroReport, 8, 3877-3881.
Schroeder, C. E., & Foxe, J. J. (2002). The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Cognitive Brain Research, 14, 187-198.
Schroeder, C. E., Lindsley, R. W., Specht, C., Marcovici, A., Smiley, J. F., & Javitt, D. C. (2001). Somatosensory input to auditory association cortex in the macaque monkey. Journal of Neurophysiology, 85, 1322-1327.
Schroeder, C. E., Smiley, J., Fu, K. G., McGinnis, T., O’Connell, M. N., & Hackett, T. A. (2003). Anatomical mechanisms and functional implications of multisensory convergence in early cortical processing. International Journal of Psychophysiology, 50, 5-17.
Sergent, J., Ohta, S., & MacDonald, B. (1992). Functional neuroanatomy of face and object processing. A positron emission tomography study. Brain, 115, 15-36.
Shepard, R. N., & Metzler, J. (1971). Mental rotation of three-dimensional objects. Science, 171, 701-703.
Shimizu, Y., Saida, S., & Shimura, H. (1993). Tactile pattern recognition by graphic display: Importance of 3-D information for haptic perception of familiar objects. Perception and Psychophysics, 53, 43-48.
Stevens, J. C., Foulke, E., & Patterson, M. Q. (1996). Tactile acuity, aging and Braille reading in long-term blindness. Journal of Experimental Psychology: Applied, 2, 91-106.
Stilla, R., Deshpande, G., LaConte, S., Hu, X., & Sathian, K. (in press). Posteromedial parietal cortical activity and inputs predict tactile spatial acuity. Journal of Neuroscience.
Stilla, R., & Sathian, K. (in press). Selective visuo-haptic processing of shape and texture. Human Brain Mapping.
Stoeckel, M. C., Weder, B., Binkofski, F., Buccino, G., Shah, N. J., & Seitz, R. J. (2003). A fronto-parietal circuit for tactile object discrimination: An event-related fMRI study. NeuroImage, 19, 1103-1114.
Stoesz, M., Zhang, M., Weisser, V. D., Prather, S. C., Mao, H., & Sathian, K. (2003). Neural networks active during tactile form perception: Common and differential activity during macrospatial and microspatial tasks. International Journal of Psychophysiology, 50, 41-49.
Van Boven, R. W., Hamilton, R. H., Kauffman, T., Keenan, J. P., & Pascual-Leone, A. (2000). Tactile spatial resolution in blind Braille readers. Neurology, 54, 2230-2236.
Van Boven, R. W., Ingeholm, J. E., Beauchamp, M. S., Bikle, P. C., & Ungerleider, L. G. (2005). Tactile form and location processing in the human brain. Proceedings of the National Academy of Sciences of the USA, 102, 12601-12605.
Veraart, C., De Volder, A. G., Wanet-Defalque, M.-C., Bol, A., Michel, C., & Goffinet, A. M. (1990). Glucose utilization in human visual cortex is abnormally elevated in blindness of early onset but decreased in blindness of late onset. Brain Research, 510, 115-121.
Weisser, V., Stilla, R., Peltier, S., Hu, X., & Sathian, K. (2005). Short-term visual deprivation alters neural processing of tactile form. Experimental Brain Research, 166, 572-582.
Zangaladze, A., Epstein, C. M., Grafton, S. T., & Sathian, K. (1999). Involvement of visual cortex in tactile discrimination of orientation. Nature, 401, 587-590.
Zhang, M., Mariola, E., Stilla, R., Stoesz, M., Mao, H., Hu, X., et al. (2005). Tactile discrimination of grating orientation: fMRI activation patterns. Human Brain Mapping, 25, 370-377.
Zhang, M., Weisser, V. D., Stilla, R., Prather, S. C., & Sathian, K. (2004). Multisensory cortical processing of object shape and its relation to mental imagery. Cognitive, Affective and Behavioral Neuroscience, 4, 251-259.
Zhou, Y.-D., & Fuster, J. M. (1997) Neuronal activity of somatosensory cortex in a cross-modal (visuo-haptic) memory task. Experimental Brain Research, 116, 551-555.
K. Sathian, Emory University and Atlanta VAMC Rehabilitation R&D Center of Excellence
Simon Lacey, Emory University
Correspondence should be addressed to K. Sathian, MD, PhD, Department of Neurology, Emory University School of Medicine, 101 Woodruff Circle, WMB-6000, Atlanta, GA 30322 (E-mail: email@example.com). Current research support to KS from the National Eye Institute, National Science Foundation, and the Veterans Administration is gratefully acknowledged.
Copyright Canadian Psychological Association Sep 2007