Jamie Ward (2009), Scholarpedia, 4(3):8251. doi:10.4249/scholarpedia.8251
Observed touch refers simply to the process of watching touch to an animate or inanimate object, for example, watching another person, or an inanimate object, being touched. In many situations this is a purely visual event, but just because the stimulus is visual in nature does not mean that the brain processes it in purely visual terms. There is now convincing evidence that observed touch, at least in humans, engages cognitive and neural processes that are traditionally considered part of the somatosensory system. In other situations, observed touch consists of both a visual and a tactile event: for example, when observing our own body being touched, or when observing touch to somebody or something else whilst our own body is being touched. Studies of observed touch can therefore provide important insights into how these two senses (vision and touch) interact with each other.
Observed Touch Influences Felt Touch
Observing touch can enhance the ability to detect real tactile stimuli. For example, watching a movie of a right hand being stroked increases subsequent tactile sensitivity in the participant's own (unseen) right hand but not their left hand. Similarly, watching a mirror reflection of one's own right hand being stroked (so that it looks as though the left hand is stroked) can increase subsequent tactile sensitivity in the participant's left hand, although merely looking at the left hand (without the mirror reflection of the touched right hand) does not. In some cases, merely observing a body part, without observing any touch, can increase tactile acuity. This effect is enhanced by making the body part appear larger (using a magnifying glass) and reduced by placing an inanimate object in the same apparent location.
The studies mentioned above involve observing touch (or observing the body part to be touched) followed later in time by unseen tactile stimuli. An alternative approach is to apply real touch at exactly the same time as touch is observed, to see whether the two interact. In such experiments it is important that the tactile stimulator cannot be seen; otherwise it could provide trivial information about when, how and where the participant was touched. One can circumvent this problem by touching the participant's hands or face (or another body part) whilst they observe touch to a different person, to dummy body parts, or to images of bodies or objects on a computer screen [4-8]. For example, one study applied weak tactile stimuli to the participant's left or right cheek whilst they observed touch to an image of themselves, an image of another person, or an image of a house. The participants' task was to report the location of the felt touch (left, right or both) whilst ignoring the location of the observed touch (also left, right or both). Detection accuracy was enhanced when the observed touch matched the felt touch (e.g. observed touch to both cheeks and felt touch to both cheeks), and this enhancement was greatest for the self-face relative to the other-face, and for the other-face relative to the house images. This suggests that the interplay between observed touch and felt touch depends on self-other similarity, which extends beyond physical similarity to include similarity in political attitudes. Thus, the system linking observed touch with felt touch is sensitive to top-down biases.
In some cases, observed touch that is incongruent with felt touch can give rise to bizarre illusions. Perhaps the most commonly studied illusion involving observed touch is the so-called ‘rubber hand illusion’. In this illusion, the participant's own hand is hidden from view whilst a rubber hand is placed on the table near the real hand. The experimenter then strokes both the rubber hand and the real hand in the same place at the same time, so the participant observes touch to the rubber hand but feels touch on their own hand. Over time, participants report the following statements to be true: “I felt as if the rubber hand were my hand” and “It seemed as though the touch I felt was caused by the paintbrush touching the rubber hand”. A more objective correlate of the illusion is to ask participants to report the felt location of the stroked finger by referring to a ruler placed on the seen surface where the rubber hand lies. Participants experiencing the illusion are less accurate at doing this: the felt position of the finger gravitates towards the rubber hand. The presence of the illusion, measured both subjectively and objectively, depends on a number of factors. It disappears when the observed and actual stroking are out of synchrony; when the observed hand is the ‘wrong’ hand (for example, a left hand instead of a right hand); when the hand is replaced by a stick; and when the observed hand is in the ‘wrong’ orientation (for example, at 90 degrees to the real hand). The explanation for this illusion is that the sense of proprioception is fooled. Proprioception conveys information about the current position of the muscles and joints: it tells us where the hand is in space (e.g. relative to the body) at a given moment in time. (Note that the sense of touch carries the information ‘my finger is being stroked’ but not where the finger is located in space.)
In the rubber hand illusion, there is a spatial conflict between information from proprioception and vision but the discrepancy is resolved in vision’s favour (so-called visual capture) because vision tends to be spatially more accurate. This illusion has recently been extended using virtual reality to create something akin to an out-of-body experience [13, 14].
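Visual capture of this kind is commonly modelled as reliability-weighted cue integration, in which each sense contributes in proportion to its reliability (the inverse of its variance). The article does not spell out this model, so the following is an illustrative sketch, with purely hypothetical numbers, of why the spatially more precise visual estimate dominates the fused percept of hand position:

```python
# Reliability-weighted (inverse-variance) cue integration: a standard
# textbook model of visual capture, offered as an illustrative sketch
# rather than as the mechanism proposed in this article.

def integrate(mu_v, var_v, mu_p, var_p):
    """Fuse a visual and a proprioceptive estimate of hand position.

    Each cue is weighted by its reliability (1 / variance), so the
    less variable (more precise) cue dominates the combined estimate.
    """
    w_v = (1 / var_v) / (1 / var_v + 1 / var_p)   # weight given to vision
    mu = w_v * mu_v + (1 - w_v) * mu_p            # fused position estimate
    var = 1 / (1 / var_v + 1 / var_p)             # fused variance (smaller than either cue alone)
    return mu, var

# Hypothetical rubber-hand scenario: vision locates the hand at 0 cm
# (the rubber hand), proprioception at 15 cm (the real hand), and
# vision is assumed to be nine times more reliable.
mu, var = integrate(mu_v=0.0, var_v=1.0, mu_p=15.0, var_p=9.0)
print(mu)   # fused estimate is pulled strongly towards the seen (rubber) hand
```

With these assumed variances, vision receives 90% of the weight, so the fused estimate lands at 1.5 cm, far closer to the rubber hand than to the real hand, which mirrors the drift in reported finger position described above.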
Observed Touch Activates the Somatosensory System
Several functional imaging studies, using fMRI, have measured brain responses to watching humans being touched. These studies show activation of the primary and/or secondary somatosensory cortex, either when touch is compared to no touch or when touch to a human is compared to touch to an object. As such, a purely visual stimulus can, under some circumstances, reliably activate the somatosensory system. These same regions respond when real touch to the corresponding body part is felt, which suggests that there is a mirror system for touch (i.e. a system that responds both to our own touch and to that of other people, and possibly objects). This is perhaps remarkable given the general belief that our tactile experiences are private rather than shared.
An interesting question is why somatosensory activity evoked by observing touch is not accompanied by a conscious experience of touch. It may reflect sub-threshold activity and/or the absence of activity in parts of the brain outside the primary and secondary somatosensory cortex. Nonetheless, this modulation of somatosensory processing by observed touch may be sufficient to alter behaviour (e.g. improve acuity) or to produce certain illusions. There is evidence that activity in the somatosensory cortex is reduced during the rubber hand illusion, presumably as a result of the mismatch between the location of the real touch and the illusory feeling of touch on the rubber hand.
It is unlikely that these behavioural and neurophysiological effects are mediated by direct connections between early visual areas and primary/secondary somatosensory regions, because the effect of observed touch depends on far more than vision per se: it depends, to some degree, on whether the touch is to humans or to objects, and on the perspective and location of the observed body part relative to the observer's own body. There are neurons within the primate parietal lobe and frontal lobe (premotor region) that respond to both touch and vision [17, 18]. Observed touch may affect the somatosensory regions via feedback from these multisensory regions. The visual and tactile receptive fields of these neurons tend to be aligned, such that the visual receptive field responds to visual events on or near the arm, irrespective of where the arm is. In some situations the visual receptive fields can extend beyond the body, for example when using tools or when watching one's own body projected onto a computer screen. A class of neurons termed 'body-matching neurons', discovered in the macaque parietal lobe, responds both to felt touch and to observed touch on the same body part of another person.
Observed Touch in Atypical Populations
After loss of a limb, the majority of patients experience (at some point) a vivid sense that the missing limb is still present. It may itch, gesticulate, or be painful. Several studies report that these phantom limb experiences can be modified by visual feedback, for example by placing a mirror down the midline so that the intact limb is seen reflected into the space where the phantom is felt [22, 23]. Touching the normal hand whilst watching its mirror reflection in the space where the phantom is felt can create a sensation of touch in the phantom hand as well as the real hand. It has subsequently been found that some amputees experience touch when observing touch, particularly painful touch, on another person.
Some brain-damaged patients with an impaired ability to detect touch (whilst blindfolded) nevertheless report tactile sensations when observing touch applied to their arm, either live or in a recorded movie [25, 26]. Vision of an appropriate body part (e.g. watching an arm whilst the arm is touched) can enhance tactile acuity in brain-damaged patients with poor somatosensation even when the act of touching itself is occluded from view. The same is not found if an inanimate object or the wrong body part (e.g. a foot) is observed.
Synaesthesia (also spelled synesthesia) is the involuntary experience of a perceptual quality triggered by a stimulus that does not normally evoke such a quality (for example, letters evoking colours, or sounds evoking tastes). Observed touch can sometimes trigger involuntary experiences of felt touch, and this constitutes a form of synaesthesia. It can be acquired after amputation or brain damage [24, 25], or it can be developmental in origin with no known external cause [28, 29]. In an experimental study involving congruent or incongruent observed and felt touch, synaesthetic touch (induced by observed touch) tended to be confused with real touch. The study revealed other intriguing findings: there appear to be two spatial mechanisms for linking observed touch with one's own body. In most synaesthetes, observed touch on the left of someone's face triggers a tactile experience on the synaesthete's right cheek (i.e. like a reflection), but in others it is felt on their left cheek (i.e. like a rotation). A functional imaging study, using fMRI, of developmental mirror-touch synaesthesia showed hyper-activity within certain regions of the same network that responds when non-synaesthetic people observe touch, including the primary and secondary somatosensory cortex. The same study found, using structural MRI, increased grey matter density in a region of secondary somatosensory cortex (amongst other regions). Activity beyond some threshold may be linked to conscious experiences of touch (in synaesthetes), whereas activity below the threshold may have unconscious behavioural correlates (in controls), such as those documented previously. Mirror-touch synaesthesia was assumed to reflect a difficulty in a self-other gating mechanism that enables one's own body to be processed as separate from other bodies.
References
1. Schaefer, M., H.-J. Heinze, and M. Rotte, Viewing touch improves tactile sensory threshold. NeuroReport, 2005. 16: p. 367-370.
3. Kennett, S., M. Taylor-Clarke, and P. Haggard, Noninformative vision improves the spatial resolution of touch in humans. Current Biology, 2001. 11: p. 1188-1191.
4. Maravita, A., et al., Seeing your own touched hands in a mirror modulates cross-modal interactions. Psychological Science, 2002. 13: p. 350-355.
5. Pavani, F., C. Spence, and J. Driver, Visual capture of touch: Out-of-the-body experiences with rubber gloves. Psychological Science, 2000. 11: p. 353-359.
6. Spence, C., F. Pavani, and J. Driver, Spatial constraints on visual-tactile cross-modal distractor congruency effects. Cognitive, Affective and Behavioral Neuroscience, 2004. 4: p. 148-169.
7. Tipper, S.P., et al., Vision influences tactile perception without proprioceptive orienting. NeuroReport, 1998. 9: p. 1741-1744.
8. Tipper, S.P., et al., Vision influences tactile perception at body sites that cannot be viewed directly. Experimental Brain Research, 2001. 139: p. 160-167.
9. Serino, A., F. Pizzoferrato, and E. Ladavas, Viewing a face (especially one's own face) being touched enhances tactile perception on the face. Psychological Science, 2008. 19: p. 434-438.
10. Serino, A., et al., I Feel what You Feel if You Are Similar to Me. PLoS One, 2009. 4(3): 7.
11. Botvinick, M. and J. Cohen, Rubber hands ‘feel’ touch that eyes see. Nature, 1998. 391: p. 756.
12. Tsakiris, M. and P. Haggard, The Rubber Hand Illusion revisited: Visuotactile integration and self-attribution. Journal of Experimental Psychology: Human Perception and Performance, 2005. 31: p. 80-91.
13. Ehrsson, H.H., The experimental induction of out-of-body experiences. Science, 2007. 317: p. 1048-1048.
14. Lenggenhager, B., et al., Video ergo sum: Manipulating bodily self-consciousness. Science, 2007. 317: p. 1096-1099.
15. Keysers, C., et al., Somatosensation in social perception. Nature Reviews Neuroscience, 2010. 11: p. 417-428.
16. Tsakiris, M., et al., Neural signatures of body ownership: A sensory network for bodily self-consciousness. Cerebral Cortex, 2007. 17: p. 2235-2244.
17. Graziano, M.S., D.F. Cooke, and C.S.R. Taylor, Coding the location of the arm by sight. Science, 2000. 290: p. 1782-1786.
18. Graziano, M.S.A., Where is my arm? The relative role of vision and proprioception in the neuronal representation of limb position. Proceedings of the National Academy of Sciences of the United States of America, 1999. 96: p. 10418-10421.
19. Iriki, A., M. Tanaka, and Y. Iwamura, Coding of modified body schema during tool use by macaque postcentral neurons. NeuroReport, 1996. 7: p. 2325-2330.
20. Iriki, A., et al., Self-images in the video monitor coded by monkey intraparietal neurons. Neuroscience Research, 2001. 40: p. 163-173.
21. Ishida, H., et al., Shared Mapping of Own and Others' Bodies in Visuotactile Bimodal Area of Monkey Parietal Cortex. Journal of Cognitive Neuroscience, 2009. 22(1): p. 83-96.
22. Ramachandran, V.S. and D. Rogers-Ramachandran, Synaesthesia in phantom limbs induced with mirrors. Proceedings of the Royal Society of London B, 1996. 263: p. 377-386.
23. Ramachandran, V.S., D. Rogers-Ramachandran, and S. Cobb, Touching the phantom limb. Nature, 1995. 377: p. 489-490.
24. Goller, A.I., et al., Mirror-touch synaesthesia in the phantom limb of amputees. Cortex, 2013. 49(1): p. 243-251.
25. Halligan, P.W., et al., When seeing is feeling: Acquired synaesthesia or phantom touch? Neurocase, 1996. 2: p. 21-29.
26. Halligan, P.W., et al., Somatosensory assessment: Can seeing produce feeling? Journal of Neurology, 1997. 244: p. 199-203.
27. Serino, A., et al., Can vision of the body ameliorate impaired somatosensory function? Neuropsychologia, 2007. 45: p. 1101-1107.
28. Banissy, M. and J. Ward, Mirror touch synaesthesia is linked with empathy. Nature Neuroscience, 2007. 10: p. 815-816.
29. Holle, H., et al., Functional and structural brain correlates of mirror-touch synaesthesia. NeuroImage, 2013. 83: p. 1041-1050.
See also
- William D. Penny and Karl J. Friston (2007) Functional imaging. Scholarpedia, 2(5):1478.
- Seiji Ogawa and Yul-Wan Sung (2007) Functional magnetic resonance imaging. Scholarpedia, 2(10):3105.
- Giacomo Rizzolatti and Maddalena Fabbri-Destro (2008) Mirror neurons. Scholarpedia, 3(1):2055.
- Rodolfo Llinas (2008) Neuron. Scholarpedia, 3(8):1490.
- Jose-Manuel Alonso and Yao Chen (2009) Receptive field. Scholarpedia, 4(1):5393.
- Arkady Pikovsky and Michael Rosenblum (2007) Synchronization. Scholarpedia, 2(12):1459.
- Vilayanur S. Ramachandran and David Brang (2008) Synesthesia. Scholarpedia, 3(6):3981.