Haptic exploration refers to the purposive action patterns that perceivers execute in order to encode properties of surfaces and objects.
Specialized patterns of exploration, called exploratory procedures (EPs), have been described. These exploratory patterns are linked to specific object properties, in two respects: The EP associated with an object property is (a) executed spontaneously when information about that property is desired, and (b) appears to optimize information uptake about that property (Lederman & Klatzky, 1987). A basic set of EPs, along with their associated properties and behavioral invariants, is as follows:
- Exploratory Procedure: Lateral motion
- Associated Property: Surface texture
- Behavior: The skin is passed laterally across a surface, producing shear force.
- Exploratory Procedure: Pressure
- Associated Property: Compliance or hardness
- Behavior: Force is exerted on the object against a resisting force; for example, by pressing into the surface, bending the object, or twisting.
- Exploratory Procedure: Static contact
- Associated Property: Apparent temperature
- Behavior: The skin surface is held in contact with the object surface, without motion; typically a large surface (like the whole hand) is applied. This EP gives rise to heat flow between the skin and the object.
- Exploratory Procedure: Unsupported holding
- Associated Property: Weight
- Behavior: The object is held while the hand is not externally supported; typically this EP involves lifting, hefting or wielding the object.
- Exploratory Procedure: Enclosure
- Associated Property: Volume; Global shape
- Behavior: The fingers (or other exploring effector) are molded closely to the object surface.
- Exploratory Procedure: Contour following
- Associated Property: Exact shape
- Behavior: Skin contact follows the gradient of the object's surface or is maintained along edges when they are present.
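The EP-property associations listed above amount to a simple lookup from a desired object property to the procedure that optimizes its extraction. A minimal sketch of that mapping (property and EP names are paraphrased from the list; purely illustrative, not a model from the cited work):

```python
# Lookup from desired object property to the exploratory procedure (EP)
# that optimizes its extraction, summarizing the list above.
OPTIMAL_EP = {
    "surface texture": "lateral motion",
    "compliance": "pressure",
    "apparent temperature": "static contact",
    "weight": "unsupported holding",
    "volume": "enclosure",
    "global shape": "enclosure",
    "exact shape": "contour following",
}

def optimal_ep(prop: str) -> str:
    """Return the EP spontaneously executed when information about `prop` is desired."""
    return OPTIMAL_EP[prop]
```

Note that the mapping is many-to-one for enclosure, which serves both volume and global shape, consistent with the list above.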
Use of exploratory procedures
For purposes of haptic perception, people tend to explore objects using EPs that optimize information uptake (Lederman & Klatzky, 1987). An EP that is optimal for one property may also deliver information about another; for example, contour following on a surface is informative about its texture, since contour following, like lateral motion, produces shear forces. Conversely, although a specialized EP maximizes information intake about its associated property, it also limits access to some other properties. For example, use of the static hand to perceive temperature is incompatible with executing lateral motion to perceive texture. Some EPs can be executed together, allowing simultaneous access to multiple object properties (Klatzky, Lederman, & Reed, 1987). For example, people tend to exert pressure while rubbing an object, obtaining compliance and texture information simultaneously. Likewise, when people grasp and lift an object, they obtain information about its shape, volume, and weight.
Specialized EPs have been documented to occur when perceivers focus on extracting particular properties of objects by touch. However, when vision is present, EPs tend to be executed only when people wish to perceive material properties (e.g., compliance, texture), and then only when relatively precise discrimination is demanded (Klatzky, Lederman & Matula, 1993). For example, the rough texture of coarse sandpaper is salient to vision, and hence the object is unlikely to elicit haptic exploration. In contrast, a person attempting to determine whether a surface is free from grit (a fine discrimination) will be likely to explore using a specialized EP, lateral motion.
In addition to using EPs, people can acquire accurate information about the environment, as well as about tool length and weight, from dynamic touch during manipulation. Turvey and colleagues focus on information that can be obtained from grasping and wielding (e.g., raising, lowering, pushing, turning, transporting; Turvey & Carello, 1995). Information obtained from wielding includes length, weight, width, shape of the object tip, and the orientation of the hand in relation to the object. This information derives from the sensitivity of body tissues to the object's rotational dynamics, that is, its resistance to rotation under applied torques and motions. For example, wielding can enable people to make highly accurate judgments of a rod's length.
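The length-from-wielding finding can be made concrete with textbook rotational mechanics: for a uniform rod wielded about one end, the moment of inertia is I = mL²/3, so the rod's resistance to rotation specifies its length. This is a simplified sketch under the assumptions of a uniform rod and known mass, not the specific perceptual model developed in Turvey & Carello (1995):

```python
import math

def rod_moment_of_inertia(mass_kg: float, length_m: float) -> float:
    """Moment of inertia of a uniform rod rotated about one end: I = m * L**2 / 3."""
    return mass_kg * length_m ** 2 / 3.0

def length_from_inertia(mass_kg: float, inertia: float) -> float:
    """Invert I = m * L**2 / 3 to recover the rod's length from its inertia."""
    return math.sqrt(3.0 * inertia / mass_kg)

# A 0.2 kg, 0.9 m rod: the torque-motion relation felt while wielding
# carries enough information (via I) to specify the unseen length.
I = rod_moment_of_inertia(0.2, 0.9)
recovered_length = length_from_inertia(0.2, I)
```

The point of the sketch is only that rotational dynamics lawfully constrain length; the perceptual system is held to exploit such invariants rather than compute them explicitly.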
Haptic exploration in nonhumans and young humans
A variety of species exhibit specialized patterns of exploration similar to those of humans, including squirrel and capuchin monkeys (Hille, Becker-Carus, Dücker, & Dehnhardt, 2001; Lacreuse & Fragaszy, 1997). The whisker sweeps of seals during shape discrimination may serve a function similar to humans' contour following (Dehnhardt & Dücker, 1996). In humans, however, the way in which an EP is executed depends further on the object being explored and the specific intentions of the perceiver (Riley et al., 2002).
Systematic exploration is observed in young children as well as adults and follows a developmental progression. Early grasping and fingering may be precursors of the EPs of enclosure and lateral motion, for example (Ruff, 1989). The occurrence of these precursor EPs depends on which properties are most salient; for example, textured objects promote fingering. The fact that infants tend to explore with the mouth may reflect early motor control of the oral musculature as well as the density of sensory input there. Preschoolers spontaneously execute adult-like EPs and use them appropriately to test the desirable properties of tools (Klatzky, Lederman, & Mankinen, 2005).
Haptic exploration from a neurophysiological perspective
To understand why exploration is specialized during haptic perception, one must consider two perspectives: the physical interaction between the perceiver and the object, and the neural consequences of that interaction. In general, EPs put into place physical interactions that optimize the signal to neural receptors. For example, static contact, whereby a large skin surface is held without motion against a surface in order to perceive its temperature, provides an opportunity for heat flow between the skin and the surface. The resulting change of temperature at the skin surface is sensed by neurons within the skin that specialize in the detection of coolness and warmth and that initiate signals of thermal change to the brain (Ho & Jones, 2008).
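The heat-flow account of static contact can be illustrated with the standard contact-temperature approximation for two semi-infinite bodies, which weights each body's temperature by its thermal effusivity e = sqrt(k * rho * c). The material constants below are rough textbook values, and the formula is a first-order sketch, not the model of Ho & Jones (2008):

```python
import math

def effusivity(k, rho, c):
    """Thermal effusivity e = sqrt(k * rho * c), which governs contact heat flow.
    k: conductivity (W/m.K), rho: density (kg/m^3), c: specific heat (J/kg.K)."""
    return math.sqrt(k * rho * c)

def contact_temperature(t1, e1, t2, e2):
    """Interface temperature when two semi-infinite bodies are brought into contact."""
    return (e1 * t1 + e2 * t2) / (e1 + e2)

# Rough, illustrative material constants.
e_skin   = effusivity(0.37, 1000.0, 3500.0)
e_copper = effusivity(401.0, 8960.0, 385.0)
e_wood   = effusivity(0.16, 700.0, 2000.0)

skin_t, room_t = 33.0, 20.0  # degrees C
t_on_copper = contact_temperature(skin_t, e_skin, room_t, e_copper)
t_on_wood   = contact_temperature(skin_t, e_skin, room_t, e_wood)
# Copper's high effusivity pulls the interface toward room temperature,
# which is why it feels cooler than wood at the same physical temperature.
```

This captures why static contact yields an "apparent" temperature: the thermal signal at the skin reflects material properties as much as object temperature.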
Specialized exploration also has implications for the brain's ability to recognize objects by touch. For tactile object recognition, perceptual input must be transformed through a series of processing stages before object identification. The first stage involves the extraction and processing of the properties and features of the felt object. For example, to recognize a quarter one takes in its small size, cool temperature, round shape, and rough edges. The perceptual information provided by EPs is combined and integrated into a common, modality-specific object description, which is then used to access information about object identity and function. When people are allowed either to use specialized EPs or to use a more general grasp to recognize real, multidimensional objects, brain regions specific to touch are activated in addition to brain regions activated during visual object recognition. The touch-specific regions include primary somatosensory cortex (SI), secondary somatosensory cortex (SII), the parietal operculum, and the insula. Other activated regions are shared with visual object recognition, such as the lateral occipitotemporal cortex, medial temporal lobe, and prefrontal areas, all of which support cross-modal information integration leading to object recognition (Reed, Shoham, & Halgren, 2004). However, when the touched stimuli are primarily two-dimensional spatial patterns that do not permit EPs, a greater reliance on visual cortical regions (e.g., V1 and V2) is observed (Zangaladze, Epstein, Grafton, & Sathian, 1999).
To what extent are EPs necessary for object recognition? Patients with brain damage provide some insight into the relative contributions of sensory and motor inputs. Patients with "tactile agnosia" have a deficit in recognizing common objects by touch following brain damage, usually in the inferior parietal lobe (Reed, Caselli, & Farah, 1996), despite relatively intact tactile sensation, memory, and general intellectual function. Bearing on the extent to which somatosensory and motoric inputs influence tactile object recognition, patients who have damage to their hands or peripheral nervous system, as well as patients with lesions in SI, do not always show tactile object recognition deficits. Likewise, patients with hand paralysis do not necessarily have significant deficits in tactile object recognition (Caselli, 1991). Because the hand normally uses both tactile inputs and hand movements, usually in the form of EPs, to extract somatosensory information, these patients may be using the motions of object parts or hand-movement cues to extract relevant object information that they cannot obtain through purely tactile perception. Nonetheless, other patients with right-hemisphere damage and disorganized or random EPs do show tactile object recognition deficits (Valenza, Ptak, Zimine, Badan, Lazeyras, & Schnider, 2001). In sum, the inability to execute EPs contributes to tactile agnosia and makes object recognition more difficult, but prior knowledge of object parts and functions can partially compensate for sensory and motoric limitations.
Active haptic exploration versus passive touch
Loomis and Lederman (1986) distinguished between (a) active haptic perception, which is controlled by the observer and encompasses both cutaneous (skin) and kinesthetic (muscle, tendon, joint) inputs, and (b) various passive tactual modes of perceiving, in which one or more of these components is missing. In a classic experiment, Gibson (1962) found that active haptic exploration of object contours (specifically cookie cutters) resulted in better subsequent recognition performance than when the same objects were passively presented, either as stationary forms or by being moved against the hand. In texture perception, however, passive motion of a surface across the skin can lead to a percept of surface roughness similar to that resulting from active exploration of the same surface (e.g. Lederman, 1981); the important condition here is that the surface moves tangentially relative to the skin. Thus one should not conclude that active control of haptic exploration is invariably superior.
Haptic exploration and tool use
Finally, people explore objects not only with their hands or other parts of the body, but also with tools. Although tools limit what can be perceived, they can nevertheless provide at least coarse information about some object properties. For example, when a rigid tool is used to explore a rigid object, the resulting vibrations can enable people to perceive its surface texture (Katz, 1925). Any dentist knows that the deformation and resistance of an object under the force of a probing tool can allow the perception of compliance. The same EP that is executed with the bare hand to extract an object property may not be observed when a tool is used, given the changes in the physical interaction that arise from tool use.
In conclusion, haptic exploration involves exploratory procedures, active touch patterns that are specific to the demands of the task in that they optimize the extraction of information the perceiver needs to obtain. Haptic exploration allows people and animals to extract specific types of tactile information from the environment and provides information about the material or substance properties of objects that cannot be achieved by vision alone. Further, exploratory procedures expand the basic sensory functions of animate bodies by allowing them to perceive the world through tools.
- Caselli, R.J. (1991). Rediscovering tactile agnosia. Mayo Clinic Proceedings, 66, 129-142.
- Dehnhardt, G., & Dücker, G. (1996). Tactual discrimination of size and shape by a California sea lion (Zalophus californianus). Animal Learning & Behavior, 24, 366-374.
- Gibson, J. J. (1962). Observations on active touch. Psychological Review, 69, 477-491.
- Hille, P., Becker-Carus, C., Dücker, G., & Dehnhardt, G. (2001). Haptic discrimination of size and texture in squirrel monkeys (Saimiri sciureus). Somatosensory & Motor Research, 18, 50-61.
- Ho, H. & Jones, L.A. (2008). Warm or cool, large or small? The challenge of thermal displays. IEEE Transactions on Haptics, 1, 53-70.
- Katz, D. (1925). Der Aufbau der Tastwelt (The world of touch). Translated by L. Krueger (1989). Mahwah, NJ: Erlbaum.
- Klatzky, R.L., Lederman, S.J., & Mankinen, J.M. (2005). Visual and haptic exploratory procedures in children’s judgments about tool function. Infant Behavior and Development, 28, 240-249.
- Klatzky, R. L., Lederman, S. J., & Matula, D. E. (1993). Haptic exploration in the presence of vision. Journal of Experimental Psychology: Human Perception and Performance, 19, 726-743.
- Klatzky, R.L., Lederman, S.J., & Reed, C.L. (1987). There's more to touch than meets the eye: The salience of object attributes for haptics with and without vision. Journal of Experimental Psychology: General, 116, 356-369.
- Lacreuse, A., & Fragaszy, D. M. (1997). Manual exploratory procedures and asymmetries for a haptic search task: A comparison between capuchins (Cebus apella) and humans. Laterality, 2, 247-266.
- Lederman, S.J. (1981). The perception of surface roughness by active and passive touch. Bulletin of the Psychonomic Society, 18, 253-255.
- Lederman, S.J., & Klatzky, R. L. (1987). Hand movements: a window into haptic object recognition. Cognitive Psychology, 19, 342-368.
- Loomis, J. M., & Lederman, S. J. (1986). Tactual perception. In K. Boff, L. Kaufman, & J. Thomas (Eds.), Handbook of perception and human performance (pp. 31-1-31-41). New York: Wiley.
- Reed, C.L., Caselli, R.J., & Farah, M.J. (1996). Tactile agnosia: Underlying impairment and implications for normal tactile object recognition. Brain, 119, 875-888.
- Reed, C.L., Shoham, S., & Halgren, E. (2004). Neural substrates of tactile object recognition: an fMRI study. Human Brain Mapping, 21, 236-246.
- Riley, M. A., Wagman, J. B., Santana, M.-V., Carello, C., & Turvey, M. T. (2002). Perceptual behavior: Recurrence analysis of a haptic exploratory procedure. Perception, 31, 481-510.
- Ruff, H. A. (1989). The infant's use of visual and haptic information in the perception and recognition of objects. Canadian Journal of Psychology/Revue Canadienne De Psychologie, 43, 302-319.
- Turvey, M.T., & Carello, C. (1995). Dynamic touch. In W. Epstein & S. Rogers (Eds.), Handbook of perception and cognition, Vol. 5, Perception of space and motion (pp. 401-490). San Diego: Academic Press.
- Valenza, N., Ptak, R., Zimine, I., Badan, M., Lazeyras, F., & Schnider, A. (2001). Dissociated active and passive tactile shape recognition: A case study of pure tactile apraxia. Brain, 124, 2287-2298.
- Zangaladze, A., Epstein, C.M., Grafton, S.T., & Sathian, K. (1999). Involvement of visual cortex in tactile discrimination of orientation. Nature, 401, 587-590.