Active tactile perception

From Scholarpedia
Nathan Lepora (2015), Scholarpedia, 10(3):32364. doi:10.4249/scholarpedia.32364

Curator: Nathan Lepora

We do not just touch, we feel (Bajcsy 1987). Our tactile sense is not merely a passive receiver of information, but actively selects and refines sensations according to our present goals and perceptions (Gibson, 1962). Our fingers, hands and bodies are not external to the world, but direct actions within it to access the information that we need. Thus, tactile sensation, perception and action cannot be considered simply as a forward process, but instead form a closed ‘active perception’ loop.


Active perception versus active sensing

A potential confusion is that some researchers refer to active sensing and others to active perception when describing the same process. There is also a distinction between teleceptive and contact active sensory systems.

Sensation and perception are considered distinct stages in the processing of the senses in humans and animals. Tactile sensation refers to the first stages in the functioning of the senses, related to the effect of a physical stimulus on touch receptors in the skin and their transduction and transmittal from the peripheral nervous system to the sensory areas of the brain; tactile perception refers to later stages where the sensation is processed, organized and interpreted so that the organism may use the information to guide its behaviour based on understanding its environment.

Therefore active sensing could refer to controlling the movements of the sensory apparatus while contacting a stimulus; for example, brushing our fingertips across a surface to feel texture. However, those movements are themselves guided in response to perceiving other sensory information; for example, we may control the force exerted by our fingertips against the surface to best feel the texture. Therefore the active process can refer to the entire sensation-perception-action loop, rather than just the sensory or perceptual parts of it. A further subtlety is that some of the movements involved in active sensing may be purely reflexive, in that they have been hardwired by evolution as successful strategies to take during sensing; for example, reflexive orienting movements are mediated by the superior colliculus in the midbrain without involvement of higher brain areas such as sensory cortex. It is then debatable whether the movement truly arose from ‘perceiving’ the stimulus, so the process could then just be described as active sensing, which is the term commonly used in the biological literature.

Another source of confusion is that the term active sensing (and active sensory system) is commonly used to refer to sensory receptors that are activated by probing the environment with self-generated energy. Examples include the echolocation of bats and dolphins, and the electrosensory detection of electric fish. Thus the term active perception can remove ambiguity by emphasizing that the process referred to is the control of the sensor to select and refine perceptions according to the present goals and perceptions of the organism. Thus even if the movement was reflexive, if the movement was to aid perception, then the process could be referred to as active perception. For this reason, in this article we will refer to active tactile perception, but we acknowledge that some authors prefer the term active tactile sensing and in many ways the terms can be treated synonymously.

Active sensory systems, in the sense of using self-generated energy, are also further distinguished into teleceptive systems that propagate energy (e.g. acoustic or electromagnetic) and contact sensory systems that use physical contact between the stimulus and the sensor. One can therefore view tactile sensing as mediated by contact active sensory systems, which can use active perception to select, refine and interpret the sensory information to understand the tactile environment.

Views of active perception

Concepts related to active tactile perception and active touch have been defined by several scientists over the last 50 years (Prescott et al 2011). Their terminology varies, however: Gibson (1962) refers to active touch, Bajcsy (1988) sometimes refers to active sensing, and O'Regan and Noë (2001) refer to sensorimotor contingencies. Nevertheless, all definitions can be seen as applying to active tactile perception, albeit possibly in a more limited manner than originally intended by the authors. This is particularly the case with touch, which encompasses all tactile phenomena, not just perception; there are also differences in the meaning of perception versus exploration, and the sensorimotor contingency theory of O'Regan and Noë is intended as a broader treatment of conscious phenomena rather than just active perception.

Active touching

In his 1962 article on ‘Observations on Active Touch’, the Psychologist James J. Gibson makes a number of observations about the relation between bodily movements and the sense of touch that would underlie later work on active tactile perception.

He begins by defining that: ‘Active touch refers to what is ordinarily called touching. This ought to be distinguished from passive touch, or being touched. In one case the impression on the skin is brought about by the perceiver himself and in the other case by some outside agency.’ (Gibson, 1962). Here he is making a distinction between whether the agent is itself controlling its body to sense, or instead whether a tactile sensation is exerted on the agent. It would therefore be wrong, in Gibson’s view, to think of passive touch as simply the absence of movement during tactile perception; immobile touch may in some circumstances be appropriate to best sense a tactile percept, such as holding a fingertip against a vibrating surface to best perceive the strength of vibration. Instead, passive touch is when a tactile event occurs that is unanticipated or not of concern, which could then become active touch if the organism responds to that passive sensation to further select or refine the sensory information.

Gibson clarifies his position further by criticizing a view of active touch as merely a combination of kinesthesis (the feeling of bodily movement) and touch proper (the feeling of contact), in that ‘it fails to take account of the purposive character of touching.’ He emphasizes further that movements (or lack thereof) underlying active touch are purposive: ‘the act of touching or feeling is a search for stimulation or, more exactly, an effort to obtain the kind of stimulation which yields a perception of what is being touched. When one explores anything with his hand, the movements of the fingers are purposive. An organ of the body is being adjusted for the registering of information.’ Here he is using the purposiveness of movements in active touch, or for active tactile perception, to emphasize that organisms use perception for a reason: to achieve their goals and needs. Active tactile perception seeks to help achieve those goals by actively selecting and refining the sensations to give the appropriate perceptual information.

Active perception

In her 1988 paper on ‘Active Perception’, the Engineering Scientist Ruzena Bajcsy used the term active to denote ‘purposefully changing the sensor’s state parameters according to sensing strategies,’ such that these controlling strategies are ‘applied to the data acquisition process which will depend on the current state of the data interpretation and the goal or the task of the process’ (Bajcsy 1988). As such, her definition of active perception is in essence equivalent to Gibson’s definition of active touch, although her terminology and nomenclature are adapted to Engineering. In particular, her phrase ‘changing the sensor’s state parameters’ is equivalent to ‘moving’ or ‘adjusting’ the sensor, assuming all state parameters correspond to physical changes of the sensor brought about by movement; ‘data acquisition’ means the same as ‘sensing’; and ‘data interpretation’ the same as ‘perceiving’.

Since Bajcsy brought her definition within Engineering terminology, one could perhaps interpret active perception as an application of Control Theory, by viewing active perception as a closed-loop system that uses sensory feedback to control the state parameters of the sensor. However, Bajcsy emphasizes that this interpretation is too simplistic, because: ‘the feedback is performed not only on sensory data but on complex processed sensory data’ and ‘the feedback is dependent on a priori knowledge.’ Instead she views active perception as ‘an application of intelligent control theory which includes reasoning, decision making and control.’ Here she is indicating the limitations of classical control theory for addressing active perception, which conventionally applies to systems with a simple relation between the sensory data and the variables to be controlled. Modern techniques in Intelligent Control have helped in utilizing complex processed sensory data, although there remain significant challenges, as discussed below in the section on active tactile perception in robots.
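Bajcsy's distinction from classical feedback control can be made concrete with a short sketch. The class below is purely illustrative (none of its names come from her paper): the feedback signal is an interpreted belief built from processed data and a priori knowledge, and the chosen action is a sensing strategy rather than a simple corrective command.

```python
class ActivePerceiver:
    """Illustrative sketch of a Bajcsy-style active perception loop:
    sense -> interpret -> act, where feedback is the *interpretation*
    of processed sensory data, not the raw data itself."""

    def __init__(self, prior_knowledge):
        # A priori knowledge enters as an initial belief over percepts.
        self.belief = dict(prior_knowledge)

    def interpret(self, processed_data):
        """Data interpretation: fold new evidence into the belief,
        then renormalize so the belief stays a distribution."""
        for percept, evidence in processed_data.items():
            self.belief[percept] = self.belief.get(percept, 0.0) + evidence
        total = sum(self.belief.values())
        self.belief = {p: v / total for p, v in self.belief.items()}

    def choose_action(self):
        """One possible sensing strategy: probe the percept the current
        interpretation is least certain about."""
        return min(self.belief, key=self.belief.get)
```

In this reading, ‘changing the sensor's state parameters’ corresponds to executing the returned action, which closes the loop by producing the next batch of processed data.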

Similar views were also put forward by other authors around the same time, focussing on the modality of vision. Dana H. Ballard in his work on ‘Animate Vision’ argues that vision is understood ‘in the context of the visual behaviours that the system is engaged in, and that these behaviours may not require elaborate categorical representations of the 3-D world’ (Ballard, 1991). Meanwhile, John Aloimonos and colleagues, in their work on active vision, take a similar approach to Bajcsy in defining that ‘an observer is called active when engaged in some kind of activity whose purpose is to control the geometric parameters of the sensory apparatus’ (Aloimonos et al 1990), where ‘The purpose of the activity is to manipulate the constraints underlying the observed phenomena in order to improve the quality of the perceptual results.’

Haptic exploration

Closely related to active tactile perception, haptic exploration was introduced in Psychologists Susan J. Lederman and Roberta L. Klatzky’s 1987 article ‘Hand Movements: A Window into Haptic Object Recognition’ (Lederman and Klatzky, 1987). They propose that ‘the hand (more accurately, the hand and brain) is an intelligent device, in that it uses motor capabilities to greatly extend its sensory functions,’ consistent with observations on active touch by Gibson and others. They then extend these ideas by offering a ‘taxonomy for purposive hand movements that achieve object apprehension. Specific exploratory procedures appear to be linked with specific object dimensions.’ Thus, like Gibson, they emphasise the purposive nature of the movements guiding the hand and fingers to recognize objects, but they go further in proposing the motor mechanisms – a set of exploratory procedures – that constitute those purposive movements.

Although Lederman and Klatzky’s proposals on Haptic exploration and Bajcsy’s definition of Active Perception were presented independently, there are various ways they can be related. For example, Active Haptic Exploration can also be contrasted with passive touch (Loomis and Lederman, 1986), as discussed in the article (Klatzky and Reed, 2009). Moreover, Haptic Exploration can be active in the sense of Bajcsy’s formulation as feedback control, as is apparent from considering specific exploratory procedures; for example, contour following requires that the orientation of the contour be perceived at each instance and used to guide the motor commands to keep the finger in contact with the object. Another relation between their proposals concerns the optimality of the purposive movements. Lederman and Klatzky observe that in most cases the ‘exploratory procedures… optimize the speed or accuracy with which the readings of the object along the named dimensions are obtained.’ Meanwhile, Bajcsy also discusses optimality, but within the context that ‘A control sequence is to be evaluated relative to its expected utility… of two parts: the performance of the estimation procedure for that choice of strategy and the cost of implementing that strategy.’ If ‘strategy’ is read as ‘exploratory procedure’ then there is confluence between the two bodies of work.
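Bajcsy's two-part expected utility can be written as a one-line decision rule over candidate strategies. How the two parts should be combined is not specified in the quotation; the subtraction below, and the scores in the example, are illustrative assumptions only.

```python
def best_strategy(strategies):
    """Pick the sensing strategy (e.g. an exploratory procedure) with the
    highest expected utility, here taken as estimation performance minus
    implementation cost -- one simple way to combine Bajcsy's two parts."""
    return max(strategies, key=lambda s: s["performance"] - s["cost"])

# Hypothetical strategies with made-up scores for illustration:
candidates = [
    {"name": "contour following", "performance": 0.9, "cost": 0.5},
    {"name": "static touch", "performance": 0.4, "cost": 0.1},
]
```

Under these invented numbers, contour following (net utility 0.4) would be preferred over static touch (net utility 0.3), mirroring how an exploratory procedure is selected for the object dimension it reads best.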

Sensorimotor contingencies

Although formulated in the modality of vision, the concept of sensorimotor contingencies also formalises aspects of active perception and active exploration, and has general applicability across all sensory modalities. Psychologist Kevin O'Regan and Philosopher Alva Noë in ‘A sensorimotor account of vision and visual consciousness’ propose that ‘seeing is a way of acting’ so that ‘vision is a mode of exploration of the world that is mediated by knowledge of … sensorimotor contingencies’ (O'Regan and Noë, 2001), with sensorimotor contingencies being the ‘rules governing the sensory changes produced by various motor actions.’ Their sensorimotor contingency theory is intended to go further than active perception by offering an explanation of visual consciousness: ‘the basic thing people do when they see is that they exercise mastery of the sensorimotor contingencies governing visual exploration. Thus visual sensation and visual perception are different aspects of a person’s skillful exploratory activity… Visual awareness depends further on the person’s integration of these patterns of skillful exercise into ongoing planning, reasoning, decision making, and linguistic activities.’ Hence they conclude that ‘these ingredients are sufficient to explain the otherwise elusive character of visual consciousness.’

Active tactile perception in humans and animals

A large body of literature has documented aspects of active tactile sensing and perception with the human hand and fingertips. Notable early works include observations that the sensation of the roughness of a surface is altered by the changes in the manner of feeling that surface (Lederman and Taylor 1972), a comparison of active and passive touch on the perception of surface roughness (Lederman 1981), and a psychophysical comparison of active and passive touch in perceiving texture (Lamb 1982). A comprehensive review of foundational research into touch and haptics is documented in the article ‘tactual perception’ (Loomis and Lederman 1986).

Complementary observations on active tactile perception have been made from studies of how rodents deploy their whisker system, which involves modulating how they rhythmically ‘whisk’ their facial vibrissae during perception and exploration of their environment. An early study observed that better performers appeared to optimize whisking frequency bandwidth and the extent to which the vibrissae were bent by contact with the object (Carvell and Simons 1995). A large number of studies have since been made on active tactile perception in the rodent vibrissal system, including on the neural encoding of vibrissal active touch (Szwed et al 2003) and the role of feedback in active tactile sensing (Mitchinson et al 2007; Grant et al 2009). A review of the earlier work in this area is documented in the article ‘Active sensation: insights from the rodent vibrissa sensorimotor system’ (Kleinfeld et al 2006). A more recent review article (Diamond and Arabzadeh 2013) further clarifies whisker-mediated active perception by arguing for two distinct modes of operation: generative and receptive. In the generative mode, the rodent whisks to actively seek and explore objects; in the receptive mode, the rodent immobilizes its whiskers to optimize information gathering from a self-moving object. Both are strategies for active perception, with the task and stimulus determining which mode is better suited to a given situation.

Studies of active touch in other whiskered mammals and with insect antennae have also been made, including cockroach antennae, stick insect antennae, and tactile sensing in the naked mole rat, Etruscan shrew and seals. Sensorimotor contingencies were also found to be essential for coding vibrissal active touch; where sensory cues alone convey ambiguous information about object location, the relationships between motor and sensory cues convey unique information (Bagdasarian et al., 2013).

Active tactile perception in robots

Around the time of the work on active perception (Bajcsy 1988) and haptic exploration (Lederman and Klatzky 1987), there were several initial proposals that robot touch could be based around related principles (Stansfield 1986; Bajcsy, Lederman and Klatzky 1987; Roberts 1990; Allen 1990). Early examples of active perception with tactile fingertips came soon after, including motion control of a tactile fingertip for profile delineation of an unseen object (Maekawa et al 1992) and controlling the speed, and hence spatial filtering, of a tactile fingertip to measure surface roughness (Shimojo and Ishikawa 1993). Implementations of haptic exploration of unknown objects with dexterous robot hands followed a few years later, both using exploratory procedures in which some fingers grasp and manipulate the object while others feel the object surface (Okamura, Turner and Cutkosky 1997) and using active perception to compute contact localization for grasping (Kaneko and Tanie 1994). A survey of early work in these areas can be found in (Cutkosky, Howe and Provancher 2008); the review ‘Tactile sensing – from humans to humanoids’ also usefully contrasts perception for action, such as grasp control and dexterous manipulation, with action for perception, such as haptic exploration and active perception (Dahiya et al 2010).

Active perception has also been demonstrated on whiskered robots. Early work included using an active antenna to detect contact distance (Kaneko et al 1998) and an active artificial whisker array for texture discrimination (Fend et al 2003). These early studies used ‘active’ to mean that the sensor is moving, but did not consider a feedback loop to modulate the whisking. Later whiskered robots implemented sensorimotor feedback between the tactile sensing and motor control (Pearson et al 2007), enabling demonstrations that texture perception with a whiskered robot depends on the type of contact (Fox et al 2009) and that texture perception improves when the whisking motion is controlled using a sensory feedback mechanism (Sullivan et al 2012).
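The kind of contact-modulated whisking loop used in these later robots can be caricatured in a few lines. The controller below is a toy model only; the constants and the update rule are invented for illustration, not taken from the cited implementations.

```python
def next_whisk_amplitude(amplitude, contact,
                         rest=1.0, suppression=0.5, recovery=0.1):
    """One cycle of a contact-modulated whisking controller (toy model).

    On contact, suppress the next whisk's amplitude so the whisker
    touches lightly rather than bending hard against the object; with
    no contact, relax back toward the unperturbed rest amplitude."""
    if contact:
        return amplitude * suppression
    return amplitude + recovery * (rest - amplitude)
```

Iterating this rule over whisk cycles gives the qualitative behaviour described above: amplitude drops sharply while an object is being palpated, and recovers gradually once contact is lost.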

Recent work on active touch with robot fingertips has used probabilistic approaches for active perception, in which the statistical hypotheses correspond to discrete percepts. In one body of work, active Bayesian perception is implemented with a control policy that uses intermediate estimates of the percept probabilities to guide movements of the sensor during perception (Lepora et al 2013). This approach can improve the quality of perception, for example attaining tactile superresolution sensing (Lepora et al 2015), giving robust sensing under positional uncertainty (Lepora et al 2013), and implementing exploratory procedures such as contour following (Martinez-Hernandez et al 2013). A related method of Bayesian exploration selects movements that best disambiguate a percept from a set of plausible candidates; on a large database of textures, with control of contact force and movement speed, it ‘was found to exceed human capabilities’ (Fishel and Loeb 2012).
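The shared structure of these probabilistic approaches can be sketched as a recursive Bayesian update with a belief threshold, where intermediate beliefs both guide sensor movement and decide when to stop. The function signatures and the threshold value below are illustrative assumptions, not the published implementations.

```python
def bayes_update(belief, likelihoods):
    """One recursive Bayesian update over discrete percept hypotheses."""
    post = {h: belief[h] * likelihoods[h] for h in belief}
    total = sum(post.values())
    return {h: p / total for h in post.items().__iter__.__self__} if False else {h: p / total for h, p in post.items()}

def active_bayesian_perception(hypotheses, observe, act,
                               threshold=0.9, max_steps=50):
    """Decide among percepts by accumulating evidence between sensor
    movements. `observe()` returns a likelihood per hypothesis for the
    latest tactile contact; `act(belief)` moves the sensor using the
    intermediate belief (both are placeholders for the robot's
    measurement model and movement policy)."""
    belief = {h: 1.0 / len(hypotheses) for h in hypotheses}
    for _ in range(max_steps):
        belief = bayes_update(belief, observe())
        best = max(belief, key=belief.get)
        if belief[best] >= threshold:
            return best, belief          # confident enough to decide
        act(belief)                      # otherwise move sensor and resample
    return max(belief, key=belief.get), belief
```

For example, simulated contacts whose likelihoods consistently favour one percept drive its posterior above the threshold within a few updates, at which point the loop terminates with a decision.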


Although there has been an evolving definition of active perception and active touch, an underlying theme is that active tactile perception utilizes sensor movement control to aid interpretation of the tactile data. A large body of research has documented that active tactile perception is central to understanding the function of the human hand and fingertips, and also of vibrissal sensing systems in rodents and other animals. This psychological and neuroscientific research has inspired the implementation of active tactile perception with robot fingertips and whiskers, with the potential to lead to artificial tactile devices with performance comparable or superior to human and animal touch.


References

  • Allen, P.K. (1990). Mapping haptic exploratory procedures to multiple shape representations ICRA 1990 : 1679-1684.
  • Aloimonos, J. (1990). Purposive and qualitative active vision Pattern Recognition, 1990. Proceedings., 10th International Conference on. 1: 346-360.
  • Bagdasarian, K.; Szwed, M.; Knutsen, P.M.; Deutsch, D.; Derdikman, D.; Pietr, M.; Simony, E. and Ahissar, E. (2013). Pre-neuronal morphological processing of object location by individual whiskers Nature Neuroscience 16: 622-631.
  • Bajcsy, R. (1988). Active perception Proceedings of the IEEE 76(8): 966-1005.
  • Bajcsy, R.; Lederman, S.J. and Klatzky, R. (1987). Object exploration in one and two fingered robots Proceedings of the 1987 IEEE International Conference on Robotics and Automation 3: 1806-1810.
  • Ballard, D.H. (1991). Animate vision Artificial intelligence 48(1): 57-86.
  • Bajcsy, R. and Campos, M. (1992). Active and exploratory perception CVGIP: Image Understanding 56.1: 31-40.
  • Carvell, G.E. and Simons, D.J. (1995). Task- and subject-related differences in sensorimotor behavior during active touch Somatosensory and Motor Research 12(1): 1-9.
  • Cutkosky, M.R.; Howe, R.D. and Provancher, W.R. (2008). Force and Tactile Sensors Springer Handbook of Robotics: 455-476.
  • Dahiya, R.S.; Metta, G.; Valle, M. and Sandini, G. (2010). Tactile sensing—from humans to humanoids Robotics, IEEE Transactions on 26(1): 1-20.
  • Diamond, M. and Arabzadeh, E. (2013). Whisker sensory system—from receptor to decision Progress in Neurobiology 103: 28-40.
  • Fend, M.; Bovet, S.; Yokoi, H. and Pfeifer, R. (2003). An active artificial whisker array for texture discrimination In Intelligent Robots and Systems (IROS 2003), Proceedings. 2003 IEEE/RSJ International Conference on 2: 1044-1049.
  • Fishel, J.A. and Loeb, G.E. (2012). Bayesian exploration for intelligent identification of textures Frontiers in Neurorobotics 6.
  • Fox, C.W.; Mitchinson, B.; Pearson, M.J.; Pipe, A.G. and Prescott, T.J. (2009). Contact type dependency of texture classification in a whiskered mobile robot Autonomous Robots 26(4): 223-239.
  • Gibson, J.J. (1962). Observations on active touch Psychological Review 69(6).
  • Grant, R.A.; Mitchinson, B.; Fox, C.W. and Prescott, T.J. (2009). Active Touch Sensing in the Rat: Anticipatory and Regulatory Control of Whisker Movements During Surface Exploration Journal of Neurophysiology 101(2): 862-874.
  • Kaneko, M.; Kanayama, N. and Tsuji, T (1998). Active antenna for contact sensing IEEE Transactions on Robotics and Automation 14(2): 278-291.
  • Kleinfeld, D.; Ahissar, E. and Diamond, M.E. (2006). Active sensation: insights from the rodent vibrissa sensorimotor system Current opinion in Neurobiology 16: 435-444.
  • Lederman, S.J. (1983). Tactual roughness perception: Spatial and temporal determinants Canadian Journal of Psychology/Revue canadienne de psychologie 37(4).
  • Lederman, S.J. and Klatzky, R.L. (1987). Hand movements: A window into haptic object recognition Cognitive psychology 19(3): 342-368.
  • Lederman, S.J. and Taylor, M.M. (1972). Fingertip force, surface geometry, and the perception of roughness by active touch Perception & Psychophysics 12(5): 401-408.
  • Lepora, N.F.; Martinez-Hernandez, U. and Prescott, T.J. (2013). Active Bayesian Perception for Simultaneous Object Localization and Identification In Robotics: Science and Systems. : .
  • Lepora, N.F.; Martinez-Hernandez, U. and Prescott, T.J. (2013). Active touch for robust perception under position uncertainty In Robotics and Automation (ICRA), 2013 IEEE International Conference on : 3020-3025.
  • Lepora, N.F.; Martinez-Hernandez, U. and Prescott, T.J. (2015). Tactile superresolution and biomimetic hyperacuity.
  • Loomis, J.M. and Lederman, S.J. (1986). Tactual perception Handbook of Perception and Human Performance 2(2).
  • Maekawa, H.; Tanie, K.; Komoriya, K.; Kaneko, M.; Horiguchi, C. and Sugawara, T. (1992). Development of a finger-shaped tactile sensor and its evaluation by active touch In Robotics and Automation, 1992. Proceedings., 1992 IEEE International Conference on: 1327-1334.
  • Martinez-Hernandez, U.; Metta, G.; Dodd, T.J.; Prescott, T.J. and Lepora, N.F. (2013). Active contour following to explore object shape with robot touch In World Haptics Conference (WHC): 341-346.
  • Mitchinson, B.; Martin, C.J.; Grant, R.A and Prescott, T.J (2007). Feedback control in active sensing: rat exploratory whisking is modulated by environmental contact Proceedings of the Royal Society, B 274: 1035-1041.
  • Okamura, A.M.; Turner, M.L. and Cutkosky, M.R. (1997). Haptic exploration of objects with rolling and sliding In Robotics and Automation, 1997. Proceedings., 1997 IEEE International Conference on : 2485-2490.
  • O'Regan, J.K. and Noë, A. (2001). A sensorimotor account of vision and visual consciousness Behavioral and Brain Sciences 24(5): 939-973.
  • Pearson, M.J.; Pipe, A.G.; Melhuish, C.; Mitchinson, B. and Prescott, T.J. (2007). Whiskerbot: A Robotic Active Touch System Modelled on the Rat Whisker Sensory system Adaptive Behavior 15(3): 223-240.
  • Prescott, T.J.; Diamond, M.E. and Wing, A.M. (2011). Active touch sensing Philosophical Transactions of the Royal Society B: Biological Sciences 366(1581): 2989-2995.
  • Roberts, K. S. (1990). Robot active touch exploration: Constraints and strategies In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on : 980-985.
  • Shimojo, M. and Ishikawa, M. (1993). An active touch sensing method using a spatial filtering tactile sensor In Robotics and Automation, 1993. Proceedings., 1993 IEEE International Conference on  : 948-954.
  • Sullivan, J.C.; Mitchinson, B.; Pearson, M.J.; Evans, M.; Lepora, N.F. and Prescott, T.J. (2012). Tactile Discrimination Using Active Whisker Sensors IEEE Sensors 12(2): 350-362.
  • Szwed, M.; Bagdasarian, K. and Ahissar, E. (2003). Encoding of vibrissal active touch Neuron 40(3): 621-630.