Tactile Substitution for Vision

Yael Zilbershtain-Kra et al. (2015), Scholarpedia, 10(4):32457. doi:10.4249/scholarpedia.32457

Curator: Ehud Ahissar

Sensory Substitution (SenSub) is an approach that allows environmental information normally received via one sense (e.g., vision) to be perceived via another sense (e.g., touch or audition). A typical SenSub system includes three major components: (a) a sensor that acquires the information typically received by the substituted modality (e.g., visual), (b) a coupling system that processes the sensor’s output and drives the actuator, and (c) an actuator that activates receptors of the substituting modality (e.g., skin mechanoreceptors or auditory hair cells) (Bach-y-Rita, 2002; Bach-y-Rita and Kercel, 2003; Lenay et al., 2003; Renier and De Volder, 2005; Ziat et al., 2005). Vision, the predominant sense in sighted humans, is typically the substituted modality in SenSub; the substituting modalities are touch and hearing (Renier and De Volder, 2005). This article focuses on visual-to-touch SenSub (VTSenSub).
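To make this three-component decomposition concrete, here is a minimal sketch in Python. All class and method names (Sensor, Coupler, Actuator, read_frame, frame_to_commands, drive) are illustrative assumptions and do not correspond to any real device API; the coupling stage simply block-averages a camera frame onto a coarse tactor grid.

```python
# Minimal sketch of the three SenSub components described above.
# All names are illustrative, not from any real device API.
import numpy as np

class Sensor:
    """(a) Senses information normally received by the substituted modality."""
    def read_frame(self) -> np.ndarray:
        # Placeholder: a real device would return a camera image here.
        return np.random.rand(480, 640)  # grayscale frame in [0, 1]

class Coupler:
    """(b) Processes the sensor's output into actuator commands."""
    def __init__(self, rows: int, cols: int):
        self.rows, self.cols = rows, cols

    def frame_to_commands(self, frame: np.ndarray) -> np.ndarray:
        # Block-average the image down to the coarse actuator grid.
        h, w = frame.shape
        bh, bw = h // self.rows, w // self.cols
        blocks = frame[:bh * self.rows, :bw * self.cols]
        blocks = blocks.reshape(self.rows, bh, self.cols, bw)
        return blocks.mean(axis=(1, 3))  # one intensity per tactor

class Actuator:
    """(c) Activates receptors of the substituting modality (e.g., skin)."""
    def drive(self, commands: np.ndarray) -> None:
        # Placeholder: a real device would set tactor amplitudes here.
        print(commands.shape)

# One pass through the loop: sense -> couple -> actuate.
sensor, coupler, actuator = Sensor(), Coupler(20, 20), Actuator()
actuator.drive(coupler.frame_to_commands(sensor.read_frame()))
```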

VTSenSub feasibility

The ability to use a novel sensory modality, such as that offered by a SenSub system, builds on the plasticity and flexibility of the perceptual system (Bach-y-Rita, 2004). Yet, as the range of plastic changes is limited, not every SenSub system is feasible. There is an advantage to SenSub that bridges modalities operating on similar low-level principles and sharing high-level classifications. Vision and touch share several major perceptual principles. Both modalities are based on two-dimensional (2D) arrays of receptors that actively scan the environment (Bach-y-Rita, 1972; Geldard, 1960), encoding its spatial aspects via spatial and temporal cues (Collins and Saunders, 1970). In comparison, acoustic signals activate a one-dimensional array of cochlear receptors in a relatively passive ear, coding spectral and temporal cues. Furthermore, vision and touch exhibit similar strategies of active sensing (Ahissar and Arieli, 2001; Bach-y-Rita, 2002). These similarities enable tactile substitutions for vision to exploit existing natural mechanisms, an advantage that does not extend to hearing. This, together with other disadvantages of hearing as a substituting sense for vision (Hanneton et al., 2010; Kim and Zatorre, 2008; Tang and Beebe, 1998; Visell, 2009), makes VTSenSub a natural choice. Indeed, there is evidence for perception in visual terms (e.g., shadow or luminance) while using VTSenSub systems (Bach-y-Rita, 1969; Bach-y-Rita, 1972; Bach-y-Rita, 1995; Bach-y-Rita, 1998; Bach-y-Rita, 1987; White, 1970). Interestingly, attempts to use combined tactile-hearing SenSub did not succeed, possibly due to cognitive overload (Chekhchoukh et al., 2011; Jansson and Pedersen, 2005).

In spite of these similarities between vision and touch, the nature of the qualia perceived via tactile substitution is not yet clear: is it visual-like (Hurley and Noë, 2003; Noë, 2004; O'Regan and Noë, 2001), tactile-like (Block, 2003; Prinz, 2006), or does it represent a completely new modality (Lenay et al., 2003)? The latter is consistent with the limited resolution of VTSenSub (Bach-y-Rita, 2002; Lenay et al., 2003). Yet, VTSenSub experiences have been reported to be rich enough to evoke emotional excitement, for example when watching a moving candle flame (Guarniero, 1974) or when a blind child found a toy (Bach-y-Rita, 2002).

[Figure: a tongue-placed VTSenSub device]

Fundamental limitations of VTSenSub

Most existing VTSenSub devices convert visual space to tactile space and visual luminance to tactile vibrations (Bach-y-Rita et al., 1969; Krishna et al., 2010; Ziat et al., 2005). The idea of conveying information through a vibrating array of pins on the skin has been known since the 1920s, and the first attempts to convey visual information via touch were made in the 1960s by Starkiewicz and later by Bach-y-Rita, who coined the term sensory substitution (Starkiewicz, 1963; Bach-y-Rita, 1969). With such active conversion, conveyed to a passive sensory organ, the crucial factor determining perceptual resolution is the resolution of the actuator array. This poses a significant limitation on VTSenSub, as actuator arrays usually contain only tens of actuators, with the densest device containing 1000 actuators (Visell, 2009). The amount of information that can be conveyed via such arrays is several orders of magnitude lower than what can be conveyed by an intact retina, which severely limits the functioning of VTSenSub as a visual substitute.
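The scale of this gap can be illustrated with a back-of-the-envelope calculation. All numbers below are illustrative assumptions rather than measured values: the densest reported array (1000 actuators), each resolving a few intensity levels, is compared with roughly a million retinal output channels.

```python
# Back-of-the-envelope comparison; all numbers are illustrative assumptions.
import math

def bits_per_frame(n_channels: int, levels: int) -> float:
    """Upper bound: each channel conveys log2(levels) bits per frame."""
    return n_channels * math.log2(levels)

# Assumed: the densest tactor array (1000 actuators), ~4 usable levels each.
tactile = bits_per_frame(1000, 4)
# Assumed: ~1e6 retinal ganglion cell outputs, ~16 distinguishable levels each.
retinal = bits_per_frame(1_000_000, 16)

print(f"tactile ~{tactile:.0f} bits/frame, retinal ~{retinal:.0f} bits/frame, "
      f"ratio ~1:{retinal / tactile:.0f}")  # ~1:2000, several orders of magnitude
```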

Another fundamental limitation of VTSenSub is its competition with other essential functions of the blind person. In order to maximize perceptual resolution, a VTSenSub system should be attached to a sensitive skin surface, such as that of the fingertips or the tongue. This, of course, comes at the cost of losing the use of these organs for other essential functions (Bach-y-Rita, 1972). Finally, the energy consumption of VTSenSub systems is typically high (Auvray et al., 2007; Lenay et al., 2003).

[Figure: a forehead-placed VTSenSub device]

Active SenSub (ASenSub)

Visual and tactile sensory organs are attached to muscles whose activation enables information acquisition. Although muscle-driven active sensing can be bypassed by flashing stimuli on passive sensory organs, active sensing typically outperforms passive sensing (Gamzu and Ahissar, 2001; Heller, 1980; Heller and Myers, 1983; LaMotte and Whitehouse, 1986; Lederman and Klatzky, 1993; Loomis and Lederman, 1986; Saida and Wake, 1982; Saig et al., 2012; Yanagida et al., 2004). With active sensing, motor-sensory relations, and not sensory signals per se, are the relevant cues for the perception of external objects (Held, 1973; O'Regan, 2001; Ahissar, 2001; Gibson, 1962; Katz, 1989; Ahissar, 1990; Ahissar, 2014; Saig, 2012; Bagdasarian, 2013).

Active haptic exploration via a SenSub system enables the development of a unique scanning strategy for each participant (Tang and Beebe, 1998), which is instrumental for accurate perception (Jansson, 1998; Rovira et al., 2010). Sensor movement can facilitate perception even when dissociated from the sensory organ. For example, with a head-mounted video camera and finger-attached actuators, recognition and learning improve when participants are allowed to move their heads (Siegle and Warren, 2010). This improvement is often associated with “externalization” of the sensed object, i.e., feeling it in a remote location rather than on the skin, a phenomenon crucially dependent on active control over sensor motion (Harman, 1990; Lenay et al., 1999; Siegle and Warren, 2010).

Bach-y-Rita and others demonstrated that in order to achieve externalization, the user must be trained, the sensor must be placed on one of the user’s motor systems, and motor-sensory control must be established (Bach-y-Rita, 2005; Bach-y-Rita, 2002; Loomis, 1992; White, 1970; Epstein, 1986). Bach-y-Rita also suggested that sensor movement can be replaced with virtual movement, which can likewise result in externalization, and assumed that the positions of the sensor and the actuator are unimportant (Bach-y-Rita and Kercel, 2003). Consistent with this, no significant difference was found between placing the actuator on the hand that moved the sensor and placing it on the other hand, although some participants reported a disruptive experience in the split-hand condition (Pietrzak et al., 2009). This assumption, however, appears to be contradicted by experiments in humans and animals demonstrating the dependency of active sensing on natural sensory-motor loops (Visell, 2009; Saig, 2010; Ahissar, 2008). Indeed, SenSub devices in which the sensor and actuator are attached to the same organ show improved performance (Chan et al., 2007; Zilbershtain-Kra et al., 2014).

Having the sensor and actuator on the same organ is not sufficient for driving active sensing. Since a perceptual system naturally acquires its sensations through movement, the SenSub device must not generate any actuation by itself; actuation should result only from sensor motion. Indeed, devices in which no tactile vibrations are used, and actuation depends solely on sensor motion, show improved performance (Chan et al., 2007; Zilbershtain-Kra et al., 2014). Such devices, in which the sensor and actuator are attached to the same organ and sensations are generated only via sensor motion, are termed Active SenSub (ASenSub) devices.
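The ASenSub principle can be summarized in a minimal sketch, assuming a virtual 2D scene and a fingertip that carries both the (virtual) sensor and a small tactor patch; all names are hypothetical. The key point is that the device produces no vibration or other self-generated actuation: tactile levels change only when the sensor itself moves.

```python
# Hypothetical ASenSub loop: the fingertip carries both the (virtual) sensor
# and the actuator, and sensation is generated only by sensor motion.
from typing import Optional, Tuple
import numpy as np

class ASenSubDisplay:
    def __init__(self, scene: np.ndarray, patch: int = 3):
        self.scene = scene   # the "visual" scene being explored
        self.patch = patch   # side of the tactor grid riding on the sensor
        self.last_pos: Optional[Tuple[int, int]] = None

    def update(self, pos: Tuple[int, int]) -> Optional[np.ndarray]:
        """Return new (static) tactor levels only if the sensor has moved;
        the device never vibrates or actuates on its own."""
        if pos == self.last_pos:
            return None      # stationary sensor -> no new sensation
        self.last_pos = pos
        r, c = pos
        return self.scene[r:r + self.patch, c:c + self.patch]

# Sensations arise solely from the user's own scanning movements.
scene = np.zeros((32, 32)); scene[10:20, 15] = 1.0   # a vertical bar
display = ASenSubDisplay(scene)
for pos in [(12, 14), (12, 14), (12, 15)]:           # repeated pos -> None
    print(pos, display.update(pos))
```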

Evaluating the perceptual power of VTSenSub devices

The relative perceptual power of SenSub systems can be evaluated by comparing behavioral performance in different tasks (Visell, 2009). Tasks that have been used for this purpose include: distance measurement (Bach-y-Rita et al., 1969), line orientation detection (Bach-y-Rita et al., 1969; Chekhchoukh et al., 2011; Ziat et al., 2005), shape identification (Bach-y-Rita et al., 1969; Chan et al., 2007; Shinohara et al., 1998; Tang and Beebe, 1998; Ziat et al., 2005; Zilbershtain-Kra et al., 2014), object identification (Bach-y-Rita et al., 1969; Shinohara et al., 1998; Zilbershtain-Kra et al., 2014), face recognition (Bach-y-Rita et al., 1969), letter reading (Bliss et al., 1970; Chan et al., 2007; Linvill and Bliss, 1966; Loomis, 1974; Ziat et al., 2005), movement detection (Chekhchoukh et al., 2011), body movement recognition (Miletic, 1988; Jansson, 1983; Mandik, 1999), hand–“eye” coordination (Guarniero, 1974; Miletic, 1988), navigation (Segond et al., 2005) and assembly tasks (Bach-y-Rita, 1995).

Comparisons across tasks and research groups still lack a commonly accepted metric. Importantly, in most cases performance depends crucially on learning, so learning time and effort should be taken into account when comparing different approaches.
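As stated, no commonly accepted metric exists; purely as an illustration of how learning effort might be folded into such comparisons, the hypothetical score below normalizes above-chance accuracy by training time (all numbers are made up).

```python
# No commonly accepted metric exists; this score is hypothetical.
def learning_adjusted_score(accuracy: float, chance: float,
                            training_hours: float) -> float:
    """Above-chance accuracy gained per hour of training (illustration only)."""
    return (accuracy - chance) / max(training_hours, 1e-9)

# Made-up numbers: device A reaches 80% after 10 h on a 4-alternative shape
# task (chance 25%); device B reaches 70% after only 2 h of training.
print(learning_adjusted_score(0.80, 0.25, 10))  # A: 0.055 per hour
print(learning_adjusted_score(0.70, 0.25, 2))   # B: 0.225 per hour
```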

Exploration strategy with VTSenSub

Differences in performance levels between subjects performing the same task can often be accounted for by differences in motion strategies (Gamzu and Ahissar, 2001; Rovira et al., 2010). To date, information about these dependencies is sparse. Existing data are based on both subjective reports and objective measurements of sensor motion. Strategies reported so far include: contour following, orthogonal (horizontal and vertical) scanning, random scanning and feature-oriented scanning (Chan et al., 2007; Guarniero, 1974; Hsu et al., 2013; Rovira et al., 2010; Ziat et al., 2005; Zilbershtain-Kra et al., 2014).
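In principle, such strategies can be read out of recorded sensor trajectories. The toy heuristic below is not taken from the cited studies; it merely flags orthogonal scanning by the fraction of axis-aligned movement steps.

```python
# Toy strategy read-out: orthogonal scanning yields mostly axis-aligned steps.
import numpy as np

def fraction_axis_aligned(trajectory: np.ndarray, tol_deg: float = 15) -> float:
    """Fraction of movement steps within tol_deg of horizontal or vertical."""
    steps = np.diff(trajectory, axis=0)
    angles = np.degrees(np.arctan2(steps[:, 1], steps[:, 0])) % 90
    aligned = (angles < tol_deg) | (angles > 90 - tol_deg)
    return aligned.mean()

# A raster-like scan (horizontal sweeps) scores near 1; a diagonal scan near 0.
raster = np.array([(x, y) for y in range(5) for x in range(10)], dtype=float)
diag = np.array([(i, i) for i in range(20)], dtype=float)
print(fraction_axis_aligned(raster), fraction_axis_aligned(diag))
```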

Challenges for VTSenSub

Although the first study with a VTSenSub system was conducted five decades ago, such systems are still not used by the blind community. The two major technical challenges of these systems remain the limited resolution of “foveal” sensation and the lack of “peripheral” sensation. Unlike in vision, with currently available VTSenSub devices subjects not only receive impoverished “foveal” information, but are also left in the dark with respect to the rest of the visual field. While foveal acuity can be facilitated in ASenSub devices using appropriate motor-sensory strategies, peripheral sensation appears to require a specific design of the actuator array and of its coupling with the sensor.
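One hypothetical coupling design along these lines is sketched below, assuming a full-resolution “foveal” patch at the sensor’s current position plus a coarse block-averaged “peripheral” grid covering the whole frame; the parameters and the function name are invented for illustration and do not describe an existing device.

```python
# Hypothetical foveated coupling: fine center patch + coarse peripheral grid.
import numpy as np

def foveated_coupling(frame: np.ndarray, center: tuple, fovea: int = 8,
                      periph_grid: int = 4) -> tuple:
    """Return (fovea_patch, peripheral_levels) for one assumed actuator layout."""
    r, c = center
    h = fovea // 2
    fovea_patch = frame[r - h:r + h, c - h:c + h]        # full resolution
    # Coarse periphery: block-average the whole frame onto a small grid.
    H, W = frame.shape
    bh, bw = H // periph_grid, W // periph_grid
    coarse = frame[:bh * periph_grid, :bw * periph_grid]
    coarse = coarse.reshape(periph_grid, bh, periph_grid, bw).mean(axis=(1, 3))
    return fovea_patch, coarse

frame = np.random.rand(64, 64)
fov, per = foveated_coupling(frame, center=(32, 32))
print(fov.shape, per.shape)   # (8, 8) fovea, (4, 4) periphery
```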

Additional challenges include financial and ergonomic considerations. The system should be comfortable and easy to use, enable free movement, and be esthetically acceptable, while keeping costs affordable for the end user (Lenay et al., 2003).

Research interests

The basic practical aim of SenSub is to provide a tool that can help the visually impaired in everyday tasks such as navigation, object localization and identification, and communication. In addition, SenSub serves as a research tool for neuroscientists addressing perceptual mechanisms, learning and plasticity (Bach-y-Rita, 1995; Hanneton et al., 2010; Lenay et al., 2003; Sampaio et al., 2001; Visell, 2009). Importantly, SenSub offers a unique opportunity to study the emergence of perception via a novel modality, addressing the development of specific perceptual aspects such as distal awareness (externalization), environmental structure and object familiarity (Loomis et al., 2012; Siegle and Warren, 2010).

References

  • Ahissar, E and Arieli, A (2001). Figuring space by time. Neuron 32: 185-201.
  • Auvray, M; Hanneton, S and O'Regan, J K (2007). Learning to perceive with a visuo-auditory substitution system: Localisation and object recognition with 'The vOICe'. Perception 36: 416-430.
  • Bach-y-Rita, P (1972). Brain Mechanisms in Sensory Substitution. New York: Academic Press.
  • Bach-y-Rita, P (1995). Nonsynaptic Diffusion Neurotransmission and Late Brain Reorganization. New York: Demos.
  • Bach-y-Rita, P (2002). Sensory substitution and qualia. In: A Noë and E Thompson (Eds.), Vision and Mind: Selected Readings in the Philosophy of Perception (pp. 497-514). Cambridge, MA: The MIT Press.
  • Bach-y-Rita, P (2004). Tactile sensory substitution studies. Annals of the New York Academy of Sciences 1013: 83-91.
  • Bach-y-Rita, P; Collins, C C; Saunders, F A; White, B and Scadden, L (1969). Vision substitution by tactile image projection. Nature 221: 963-964.
  • Bach-y-Rita, P and Kercel, S (2003). Sensory substitution and the human–machine interface. Trends in Cognitive Sciences 7: 541-546.
  • Bliss, J C; Katcher, M H; Rogers, C H and Shepard, R P (1970). Optical-to-tactile image conversion for the blind. IEEE Transactions on Man-Machine Systems 11: 58-65.
  • Block, N (2003). Tactile sensation via spatial perception. Trends in Cognitive Sciences 7: 285.
  • Chan, J S et al. (2007). The virtual haptic display: A device for exploring 2-D virtual shapes in the tactile modality. Behavior Research Methods 39: 802-810.
  • Chekhchoukh, A; Vuillerme, N and Glade, N (2011). Vision substitution and moving objects tracking in 2 and 3 dimensions via vectorial electro-stimulation of the tongue. In: Actes de ASSISTH 2011, 2ème Confèrence internationale sur l'Accessibilitè et les Systèmes de Supplèance aux personnes en situaTions de Handicaps.
  • Collins, C and Saunders, F (1970). Pictorial display by direct electrical stimulation of the skin. Journal of Biomedical Syst 1: 3-16.
  • Gamzu, E and Ahissar, E (2001). Importance of temporal cues for tactile spatial-frequency discrimination. The Journal of Neuroscience 21: 7416-7427.
  • Geldard, F A (1960). Some neglected possibilities of communication. Science 131(3413): 1583-1588.
  • Guarniero, G (1974). Experience of tactile vision. Perception 3: 101-104.
  • Hanneton, S; Auvray, M and Durette, B (2010). The Vibe: A versatile vision-to-audition sensory substitution device. Applied Bionics and Biomechanics 7: 269-276.
  • Harman, G (1990). The intrinsic quality of experience. Philosophical Perspectives 4: 31-52.
  • Heller, M A (1980). Reproduction of tactually perceived forms. Perceptual and Motor Skills 50: 943-946.
  • Heller, M A and Myers, D S (1983). Active and passive tactual recognition of form. The Journal of General Psychology 108: 225-229.
  • Hsu, B et al. (2013). A tactile vision substitution system for the study of active sensing. In: Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE (IEEE) (pp. 3206-3209).
  • Hurley, S and Noë, A (2003). Neural plasticity and consciousness. Biology and Philosophy 18: 131-168.
  • Jansson, G (1983). Tactile guidance of movement. International Journal of Neuroscience 19: 37-46.
  • Jansson, G (1998). Haptic perception of outline 2D shape: The contributions of information via the skin, the joints and the muscles. In: B Bril, A Ledebt, G Dietrich and A Roby-Brami (Eds.), Advances in Perception-Action Coupling (pp. 25-30). Paris: Editions EDK.
  • Jansson, G and Pedersen, P (2005). Obtaining geographical information from a virtual map with a haptic mouse. In: XXII International Cartographic Conference (ICC2005).
  • Kim, J-K and Zatorre, R J (2008). Generalized learning of visual-to-auditory substitution in sighted individuals. Brain Research 1242: 263-275.
  • Krishna, S; Bala, S; McDaniel, T; McGuire, S and Panchanathan, S (2010). VibroGlove: An assistive technology aid for conveying facial expressions. In CHI'10 Extended Abstracts on Human Factors in Computing Systems (ACM) (pp. 3637-3642).
  • LaMotte, R H and Whitehouse, J (1986). Tactile detection of a dot on a smooth surface: Peripheral neural events. Journal of Neurophysiology 56: 1109-1128.
  • Lederman, S J and Klatzky, R L (1993). Extracting object properties through haptic exploration. Acta Psychologica 84: 29-40.
  • Lenay, C; Gapenne, O; Hanneton, S; Marque, C and Genouëlle, C (2003). Sensory substitution: Limits and perspectives. In: Y Hatwell, A Streri and E Gentaz (Eds.), Touching for Knowing: Cognitive Psychology of Haptic Manual Perception (pp. 275-292). Amsterdam: John Benjamins Publishing.
  • Lenay, C; Gapenne, O; Hanneton, S and Stewart, J (1999). Perception et couplage sensori-moteur: Expériences et discussion épistémologique. Intelligence Artificielle Située 99: 71-86.
  • Linvill, J G and Bliss, J C (1966). A direct translation reading aid for the blind. Proceedings of the IEEE 54: 40-51.
  • Loomis, J M (1974). Tactile letter recognition under different modes of stimulus presentation. Perception & Psychophysics 16: 401-408.
  • Loomis, J M; Klatzky, R L and Giudice, N A (2012). Sensory substitution of vision: Importance of perceptual and cognitive processing. In: R Manduchi and S Kurniawan (Eds.), Assistive Technology for Blindness and Low Vision (pp. 161-193). Boca Raton: CRC Press.
  • Loomis, J M and Lederman, S J (1986). Tactual perception. In: K R Boff, L Kaufman and J P Thomas (Eds.), Handbook of Perception and Human Performances, Vol. 2 (pp. 31/1-31/41). New York: Wiley.
  • Mandik, P (1999). Qualia, space, and control. Philosophical Psychology 12: 47-60.
  • Noë, A (2004). Action in Perception. Cambridge, MA: The MIT Press.
  • O'Regan, J K and Noë, A (2001). A sensorimotor account of vision and visual consciousness. Behavioral and Brain Sciences 24: 939-972.
  • Pietrzak, T; Crossan, A; Brewster, S A; Martin, B and Pecci, I (2009). Exploring geometric shapes with touch. In: Human-Computer Interaction–INTERACT 2009 (pp. 145-148). Springer.
  • Prinz, J (2006). Putting the brakes on enactive perception. Psyche 12: 1-19.
  • Renier, L and De Volder, A G (2005). Cognitive and brain mechanisms in sensory substitution of vision: A contribution to the study of human perception. Journal of Integrative Neuroscience 4: 489-503.
  • Rovira, K; Gapenne, O and Ammar, A A (2010). Learning to recognize shapes with a sensory substitution system: A longitudinal study with 4 non-sighted adolescents. In: Development and Learning (ICDL), 2010 IEEE 9th International Conference on (IEEE) (pp. 1-6).
  • Saida, S and Wake, T (1982). Computer-controlled TVSS and some characteristics of vibrotactile letter recognition. Perceptual and Motor Skills 55: 651-653.
  • Saig, A; Gordon, G; Assa, E; Arieli, A and Ahissar, E (2012). Motor-sensory confluence in tactile perception. The Journal of Neuroscience 32: 14022-14032.
  • Sampaio, E; Maris, S and Bach-y-Rita, P (2001). Brain plasticity: "Visual" acuity of blind persons via the tongue. Brain Research 908: 204-207.
  • Segond, H; Weiss, D and Sampaio, E (2005). Human spatial navigation via a visuo-tactile sensory substitution system. Perception 34: 1231-1249.
  • Shinohara, M; Shimizu, Y and Mochizuki, A (1998). Three-dimensional tactile display for the blind. IEEE Transactions on Rehabilitation Engineering 6: 249-256.
  • Siegle, J H and Warren, W H (2010). Distal attribution and distance perception in sensory substitution. Perception 39: 208.
  • Tang, H and Beebe, D J (1998). A microfabricated electrostatic haptic display for persons with visual impairments. IEEE Transactions on Rehabilitation Engineering 6: 241-248.
  • Visell, Y (2009). Tactile sensory substitution: Models for enaction in HCI. Interacting with Computers 21: 38-53.
  • Yanagida, Y; Kakita, M; Lindeman, R W; Kume, Y and Tetsutani, N (2004). Vibrotactile letter reading using a low-resolution tactor array. In: Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, HAPTICS '04 (IEEE) (pp. 400-406).
  • Ziat, M; Gapenne, O; Stewart, J and Lenay, C (2005). A comparison of two methods of scaling on form perception via a haptic interface. In: Proceedings of the 7th International Conference on Multimodal Interfaces (ICMI '05) (pp. 236-243). ACM.
  • Zilbershtain-Kra, Y; Ahissar, E and Arieli, A (2014). Speeded performance with active-sensing based vision-to-touch substitution. FENS Abstract D036.