Active touch sensing

Tony J. Prescott, Mathew E. Diamond, Alan M. Wing

Abstract

Active sensing systems are purposive and information-seeking sensory systems. Active sensing usually entails sensor movement, but more fundamentally, it involves control of the sensor apparatus, in whatever manner best suits the task, so as to maximize information gain. In animals, active sensing is perhaps most evident in the modality of touch. In this theme issue, we look at active touch across a broad range of species, from insects, through terrestrial and marine mammals, to humans. In addition to analysing natural touch, we also consider how engineering is beginning to exploit physical analogues of these biological systems so as to endow robots with rich tactile sensing capabilities. The different contributions show not only the varieties of active touch—antennae, whiskers and fingertips—but also their commonalities. They explore how active touch sensing has evolved in different animal lineages, how it serves to provide rapid and reliable cues for controlling ongoing behaviour, and even how it can disintegrate when our brains begin to fail. They demonstrate that research on active touch offers a means both to understand this essential and primary sensory modality, and to investigate how animals, including man, combine movement with sensing so as to make sense of, and act effectively in, the world.

1. Introduction

[…] the knowing touch projects us outside our body through movement. […] There are tactile phenomena, alleged tactile qualities, like roughness and smoothness, which disappear completely if the exploratory movement is eliminated. Movement and time are not only an objective condition of knowing touch, but a phenomenal component of tactile data. They bring about the patterning of tactile phenomena, just as light shows up the configuration of a visible surface. (Maurice Merleau-Ponty [1, p. 367])

It is increasingly recognized that the separation of perception from action in theoretical analyses of intelligent behaviour is misleading. Sensing of most kinds is best considered as an active process rather than as a passive one. In touch, however, the integration of movement with sensing is a critical feature: to discover the world through touch, we must act upon it [1,2]. For humans, the movement of the hands and fingers is particularly important to our tactile sense—we stroke a surface to detect texture, palpate gently or trace edges to judge shape, press to determine hardness and so on [3,4]. In the animal kingdom, there are many other fascinating examples of active tactile sensing organs, such as the antennae of insects, the whiskers (vibrissae) of rodents and other mammals and the unique facial tentacles of the star-nosed mole. Each of these systems exhibits impressive sensory capacities. For instance, rats are able to discriminate texture using their whiskers with similar accuracy to the human fingertip [5]; seals can use their vibrissae to follow the trails of water turbulence left by fish [6]; and the tentacles of the star-nosed mole allow for the identification and capture of prey animals [7].

The peculiarities and commonalities of different species' sense of touch will become apparent through the course of this theme issue. However, an important characteristic of all of these tactile sensing capacities is that precision is coupled with speed [7]. Thus, biological active touch sensing has all the hallmarks of a process that can inform and guide initiatives in artificial sensing and is therefore of growing interest to engineers seeking to build intelligent machines such as robots. Our belief is that progress in the biological sciences, and in the engineering of intelligent adaptive systems, can be accelerated by comparative analyses and through the convergence of experimental and synthetic approaches. Such interdisciplinary progress in understanding active touch sensing will have impacts in the fields of animal behaviour, neuroethology, neuroscience, computational neuroscience and robotics (including humanoid and bioinspired robots, micro-electronics and material science). By merging contributions from life scientists and engineers, this theme issue seeks to advance our general understanding of what information can be derived through tactile sensing, and how that information can be best obtained, transduced, conveyed and analysed. Further, although the theme issue is focused on touch (including tactile and proprioceptive inputs), the different contributions also identify more general principles, concerning, for instance, how the behaviour of animals (and artefacts) can be effectively controlled by closed sensorimotor loops where there is no strong demarcation between sensation and action. Such insights could have a significant impact on current theories of brain function, and on the design of efficient control architectures for autonomous robots.

2. Defining active touch

In order to draw parallels across the different contributions to this theme issue, it is useful to distinguish a number of different ways in which sensing can be said to be active and to provide a definition that could promote convergence between biological and synthetic approaches to active touch.

First, in engineering terminology, a sensor may be described as ‘active’ if it operates by emitting energy and then measuring the effect of the emitted signal on the environment [8]; thus a sonar or laser sensor is active, a camera (using natural light) or a microphone is not. This distinction can also be applied to biological systems; echolocation in bats and dolphins is clearly ‘active’ in this sense, as is electrolocation in weakly electric fish. This definition might even be stretched to include the expenditure of mechanical energy in touch sensing, as in rat whisker movement, although in this case, energy is not emitted outside the boundary of the animal. The use of self-generated energy is of interest from the point of view of understanding the cost to the animal in terms of energy metabolism and increased conspicuousness (e.g. exposure to predation) relative to the information gained; however, it is not especially pertinent from the point of view of understanding the sensory guidance of behaviour. We therefore mention this meaning of active sensing in order to be clear that we are not using the term in this way.

Second, and somewhat closer to our intended meaning, active sensing can also be understood to refer to the situation where sensation arises through the movement of the sensor rather than through the movement of the stimulus (or any general passive exposure of the sensor to a stimulus). In the context of tactile sensing, Gibson [9] famously made the distinction as follows: ‘active touch refers to what is ordinarily called touching. This ought to be distinguished from passive touch, or being touched’ (p. 477). Gibson further claimed to find a substantial advantage when active touch was compared with passive touch in a tactile shape recognition task (95% success rate compared with 49%). However, as critics have pointed out (e.g. [10]), Gibson's experiment was not well designed in terms of distinguishing the contributions of movement, proprioception and intention to the observed differences between active and passive touch. Gibson's article can be read as implying that a crucial difference does lie, simply, in whether the hand moved against the object or the object moved against the hand; however, subsequent studies have failed to support the idea that movement alone provides any substantial advantage when making tactile discriminations. Specifically, whenever psychophysical studies have been performed in which the excitation pattern of the receptors has been constrained to be the same under the two conditions (sensor movement or object movement), the results have shown perceptual equivalence between active and passive touch [10–15]. We conclude, then, that the presence of sensor movement may not in itself confer advantages to the perceiver, and thus ‘active sensing’ may need to mean more than this if it is to be a helpful concept for understanding the efficient performance of biological sensing systems or for designing synthetic ones.

Indeed, to describe Gibson's position as being concerned primarily with sensor movement is to misrepresent it. For Gibson, as for many others since, the advantages of active sensing in touch are attributable, in main part, to the intentional nature of the movements of the hand and digits. For instance, Gibson wrote that ‘when one explores anything with his hand the movements of the fingers are purposive. An organ of the body is being adjusted for the registering of information’ and further that ‘the purpose of the exploratory movements of the hand is to isolate and enhance [our italics] the component of stimulation which specifies the shape and other characteristics of the object being touched’ (p. 478). A more general proposal, aligned with this view, is that ‘active sensing’ systems should be defined as purposive and information-seeking. Where appropriate, active sensing will involve sensor movement, but more fundamentally, we should think of it as involving control of the sensor apparatus, in whatever manner best suits the task. In some cases, this may even mean that the sensor is held stationary where that is the best strategy for gaining useful information at a given moment. This general thesis is supported by considerable work in haptic touch [4,16,17] and vibrissal touch [18–21], as well as research in other modalities such as human vision [22–25], and audition [26]. However, an important seed for this idea comes, not only from biology, but also from computer science and robotics. Specifically, in a landmark paper, Bajcsy [27] defined active sensing as ‘purposefully changing the sensor's state parameters according to sensing strategies … [that] depend on the current state of the data interpretation and the goal or the task’ (p. 996).
Bajcsy also highlighted the importance of feedback in regulating ongoing movement of the sensor, emphasizing the need for a sequence of decisions that determine, based on that feedback, where and how the sensors should move next so as to maximize task-related data acquisition. Her proposal was also presented within a mathematical (Bayesian) framework intended to maximally reduce uncertainty about the required data while minimizing the costs arising from active control.
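Bajcsy's definition lends itself to a simple computational reading. The sketch below is a minimal toy example (in Python, with invented likelihood numbers for a hypothetical ‘smooth versus rough’ surface judgement, not Bajcsy's own formulation): the current state of data interpretation is a Bayesian posterior over hypotheses, and the next sensor setting is the one that maximizes expected information gain.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def posterior(prior, likelihoods, obs):
    """Bayes' rule: posterior proportional to prior x P(obs | hypothesis)."""
    post = [pr * lk[obs] for pr, lk in zip(prior, likelihoods)]
    z = sum(post)
    return [p / z for p in post]

def expected_info_gain(prior, likelihoods):
    """Expected entropy reduction from one binary observation."""
    h0 = entropy(prior)
    gain = 0.0
    for obs in (0, 1):
        p_obs = sum(pr * lk[obs] for pr, lk in zip(prior, likelihoods))
        if p_obs > 0:
            gain += p_obs * (h0 - entropy(posterior(prior, likelihoods, obs)))
    return gain

# Two hypotheses (smooth, rough) and two candidate sensing actions;
# likelihoods[h][obs] = P(obs | hypothesis h) under that action.
# The numbers are illustrative assumptions only.
prior = [0.5, 0.5]
stroke = [[0.9, 0.1],   # smooth: vibration unlikely
          [0.1, 0.9]]   # rough:  vibration likely
press = [[0.6, 0.4],    # pressing is a weaker texture probe
         [0.4, 0.6]]

# Choose the action with the highest expected information gain.
best = max([('stroke', stroke), ('press', press)],
           key=lambda a: expected_info_gain(prior, a[1]))[0]
```

Under these toy numbers, stroking is selected over pressing because its observations discriminate the hypotheses more sharply, which is the essence of ‘purposefully changing the sensor's state parameters’ to serve the task.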

Whereas Bajcsy emphasized online optimization of sensing control, we might also wish to allow that some useful sensor control strategies can be pre-defined (or, in biological systems, selected by evolution or pre-configured through learning) such that, when provided with appropriate feedback signals, they operate to boost the acquisition of useful information without the need to explicitly compute the optimal sensing strategy on a moment-by-moment basis. Learning might also be used to improve control strategies for future use based on the evaluation of its effectiveness in the current (just performed) task. Thus, we have a range of possibilities: (i) that sensing strategies are selected by evolution and fixed for life; (ii) that strategies are at least partly learned and remain modifiable based on experience; and (iii) that strategies are computed in real time using online mechanisms for optimizing sensor control. Note that a fusion of these possibilities is also conceivable—that a set of active sensing ‘primitives’ are provided by evolution and/or long-term adaptation and that the decision-making mechanism selects sequences of these that are appropriate to the current task setting.

While this notion of active touch sensing is somewhat pragmatically conceived—our intention is to assist the experimentalist in hypothesis generation, and the engineer in the design of control systems for sensing artefacts—it should be noted that it has its roots in ideas from the philosophical tradition of phenomenology [1,28], as illustrated, for instance, by Merleau-Ponty's description of ‘knowing touch’ with which we began this article. This should remind us that the appeal of touch sensing, as a realm for scientific endeavour, lies not only in the possibility of uncovering the principles underlying perceptual experience in this particular modality, or of building artificial systems that exploit these principles, but also as a means for understanding the ‘perceptual ground’ [1] through which we, as human beings, come to know our world.

3. Contents of theme issue

The comparative approach in the biological sciences is an effective strategy for identifying general principles in the organization of behaviour and for the design of effective control systems that might be usefully transferred to intelligent machines (see also [29–31]). In this theme issue, we therefore look at active touch across a broad range of species, from insects, through terrestrial (both scansorial and fossorial) and marine mammals, to humans.

Our first two articles consider active touch behaviour in insects. Dürr & Schütz [32] used high-speed videography methods to investigate the control of antennal movements in the Indian stick insect Carausius morosus at the level of single joint kinematics. They show that active touch sensing in this species is important for initiating climbing behaviour when the animal encounters an obstacle. Further, the control of the antennae switches mode, following such an encounter, from a strategy suited to searching for obstacles to one suited to sampling along a vertical edge—an elegant example of how a change in the behavioural task directly modifies the control of the sensor apparatus. Comer & Baba [33] combine behavioural monitoring with a range of neurophysiological and anatomical techniques to look at the role of active touch sensing in the escape behaviour of orthopteroid insects, focusing on the cockroach Periplaneta americana. Although escape behaviour can be triggered passively, for instance, when an object unexpectedly contacts the antennae, active sensing can play an important role too. Comer and Baba describe how, when an object is introduced into the visual periphery, a cockroach will often orient to explore the object with its antennae. By palpating the novel object, tactile surface properties, such as texture, can be discerned that allow the animal to distinguish a non-threatening object, such as a conspecific, from a dangerous one such as a spider. By examining active touch in a multi-sensory context, and at both the behavioural and network architecture level, this work is beginning to determine how the Orthoptera have evolved to use both visual and tactile cues to boost the effectiveness of their escape strategies.

Moving to Mammalia, the article by Catania [34] considers an unusual but fascinating tactile specialist, the star-nosed mole, Condylura cristata. The study of highly specialized systems often allows better insight into more generalized ones, as a striking feature that is highly noticeable in the specialist may reflect a general but less obvious trend in its less remarkable relatives. The star-nosed mole represents an extreme in the evolution of the Eimer organ, a tiny, dome-like tactile sensory structure found on nearly all moles. In the star-nosed mole, up to 25 000 of these organs are found on 22 highly mobile appendages surrounding the nostrils. The mole is known for the speed of its foraging behaviour that allows it to identify and consume small prey in as little as 120 ms per item. Catania shows that the process by which the mole locates and recognizes its prey has interesting similarities to the saccadic eye movements of primates, with the 11th appendage (of the 22) acting as a tactile ‘fovea’. When a potential food object is detected on any other appendage, the animal moves the star with great rapidity and accuracy and touches the target item with the 11th appendage in order to decide whether to eat or move on.

If the star-nosed mole represents one extreme of the mammal class then our next animal, the Etruscan shrew, Suncus etruscus, considered in the article by Brecht et al. [35], certainly represents another. The smallest terrestrial mammal and possessing the smallest mammalian brain, this animal again shows a remarkable facility for high-speed prey capture based on the sense of touch. This shrew is an insectivore that can capture insects, almost as large as itself, in darkness and in the absence of olfactory cues. It does so on the basis of a highly honed vibrissal (whisker) sense to which much of its tiny brain is tuned. In their article, Brecht et al. show that the shrew can distinguish prey from non-prey in a single touch, and is able to use this information to initiate a fast and precisely targeted attack. The tactile representation of the prey animal in the brain of the shrew involves the cooperation of multiple brain areas including at least four distinct somatosensory regions of cortex.

The article on the shrew introduces a series of articles concerned with the topic of vibrissal touch in a range of species. The strong focus on this general topic is for a good reason. Research on vibrissal tactile sensing in rodents (the subject of much of the research in this field) is reaching a critical phase where a preliminary plan of its underlying neural substrates is now available [36], and data are beginning to emerge that will allow precise mappings to be shown between brain activity and vibrissal movements and deflections in awake behaving animals [37]. There is thus the real prospect of ‘cracking’ the rodent vibrissal system; that is, it could be the first mammalian sensorimotor system for which we achieve an overall understanding of its neural bases, computational architecture and functionality.

The next six articles look at vibrissal sensing in a range of species, from marsupials, through the vibrissal specialists—rodents and pinnipeds—to whiskered robots, and finally consider how, by asking humans to mimic rodent vibrissal behaviour, we can gain insight into principles common to vibrissal and haptic touch.

In the article by Mitchinson et al. [38], a comparative study of vibrissal touch is presented, showing that active vibrissal sensing, similar to that seen in rodents, is present in the marsupial Monodelphis domestica, an animal which is thought to share many features with early mammals. A tentative but intriguing conclusion is that the evolution of the vibrissal touch system may have been a critical milestone in the evolution of early mammals, and one of the triggers for an enlarged forebrain and the rich capacity for somatosensory representation seen in modern mammals, including man. Research on vibrissal sensing is now increasingly focusing on mouse models owing to the ease with which genetic manipulations can be performed in these animals. In this context, Mitchinson et al. also present a new analysis showing that the whisking (whisker movement) patterns of mice may be more complex than those of rats, involving both a base frequency and a second harmonic modulation. This result could have significant implications for future studies of rodent active touch sensing, and should make us mindful that interesting differences in active sensing behaviour may exist between closely related species.

When a rat or mouse contacts an unexpected object with its whiskers, a frequently observed consequence is that it will turn and investigate the object more closely both with its longer, movable macrovibrissae and with the shorter, non-actuated microvibrissae on the upper lip and chin (the ‘foveal’ area of the rodent vibrissal system). The ability to accurately orient to detected targets depends on being able to compute the radial distance to the contact point (i.e. the distance from the base of the whisker shaft to where it was deflected against the surface). In their contribution, Solomon & Hartmann [39] survey a number of different methods that could be applied to compute radial distance based on whisker bending as detected at the base of the whisker within the follicle. A general principle in psychophysics, attributed to the experimental psychologist Ernst Weber, is that the ‘just noticeable difference’ in the change in the magnitude of a stimulus is proportional to the size of the initial stimulus. Solomon and Hartmann argue from Weber's Law to a prediction that the tapered whisker of a rat could provide for good resolution of radial distance close to the whisker tip despite the much reduced axial force that can be expected to arise within the follicle owing to a contact near the tip. This result could be helpful in understanding the active sensing strategies employed by the rat that often appear to favour contact near to the tip of the whisker (e.g. [21]).
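The logic of this prediction can be illustrated with a toy model (the force profile and Weber fraction below are our own illustrative assumptions, not Solomon and Hartmann's mechanics): if the just-noticeable force difference shrinks in proportion to the baseline force, the resolvable change in radial distance can remain small, or even improve, near the tip despite the reduced axial force there.

```python
WEBER_FRACTION = 0.1  # JND is ~10% of the current signal (Weber's law)

def axial_force(d):
    """Toy model (assumption): axial force in the follicle for a contact at
    normalized radial distance d (0 = base, 1 = tip) along a tapered whisker;
    force falls off sharply towards the tip."""
    return (1.0 - d) ** 2

def jnd(d):
    """Just-noticeable force difference at distance d, per Weber's law."""
    return WEBER_FRACTION * axial_force(d)

def radial_resolution(d, eps=1e-6):
    """Smallest discriminable change in contact distance:
    delta_d ~ JND(d) / |dF/dd|, using a numerical derivative."""
    slope = abs(axial_force(d + eps) - axial_force(d - eps)) / (2 * eps)
    return jnd(d) / slope
```

In this toy model the discriminable change in radial distance is smaller for contacts near the tip (d = 0.9) than near the base (d = 0.2): the JND shrinks faster than the force gradient, so resolution is preserved where the absolute signal is weakest.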

Another tactile discrimination that rats are known to be good at is to distinguish, using their whiskers, between surfaces that differ in texture. As for radial distance, a variety of different mechanisms have been proposed that could account for this ability (e.g. [40]) and the ability to make effective discriminations is known to depend on how the whiskers are controlled [41]. The article by Zuo et al. [42] approaches the texture discrimination task from an active sensing perspective by comparing the whisker control strategies used by individual rats. They find that animals differ in some of the parameters of whisker control, and further that these differences can predict some of the errors observed. Moreover, following clipping of some of the whiskers, there is evidence of change in the whisker control strategy to compensate for the change in the morphology of the vibrissal array. Earlier, we raised the question: are sensing strategies selected by evolution and fixed for life, or are they at least partly learned and modifiable based on experience? The evidence from the study of Zuo et al. is more consistent with the notion of adaptable sensorimotor strategies, at least in rat texture sensing.

A further discrimination task in which rats can successfully employ vibrissal touch is to judge the relative horizontal positions of two poles, positioned to the left and right of the snout, learning to turn to the left or right to obtain reward depending on which of the two stimuli is more proximal [43]. In this task, rats typically use an active strategy that involves a short bout of palpating the whiskers against both poles simultaneously. In their article, Horev et al. [44] directly compare the behaviour of rats on this task with blindfolded humans asked to make a similar judgement using plastic rods attached to their fingertips. They find both similarities and differences between the active control strategies used by rats and humans—rats rely more on spatial cues derived from vibrissal control and humans on temporal cues—however, notably, both species make use of an iterative and active sensing process to converge upon a stable percept.

The next article on vibrissal touch by Miersch et al. [45] switches the focus to the pinnipeds—seals and sea lions—which, if mechanoreceptor density is a good indicator, have the most sensitive vibrissae of all mammals. In addition to using their whiskers to investigate objects directly, several pinniped species have been shown to use their whiskers to detect the hydrodynamic trails left by swimming fish. Comparisons with terrestrial animals show some remarkable adaptations for this new form of vibrissal touch (flow sensing), including changes in the shape and morphology of the hair shafts, with flow tank experiments suggesting that signal-to-noise ratio is near optimal in harbour seals that possess non-tapered, oval and undulating whiskers. While pinnipeds do not whisk (the much higher density of water compared with air presumably makes repetitive whisker motion too energetically costly), seals protract their whiskers into their most forward position while searching for and tracking hydrodynamic trails. Head and body movements are also critical for appropriately positioning the whiskers within the flow field that the animal is seeking to follow.

Advances in engineering materials, transduction and actuation mechanisms, and microelectronics offer the prospect of building artificial tactile sensor systems that match the capabilities of their biological counterparts within the next decade. Robot guidance systems based on touch sensing and tactile object recognition systems can come into their own when vision sensors fail (e.g. in turgid water, or in dust- or smoke-filled rooms), or where touch is simply more effective (e.g. in measuring surface properties such as compliance). Our final article on vibrissal touch discusses the prospect of developing vibrissal-like sensing systems for robots that could match those of rodents for discrimination of tactile surface properties and for the control of behaviour. Pearson et al. [46] first review the history of artificial whiskers, beginning with first attempts using steel wires in the 1980s, through to recent biomimetic robot models of rat vibrissae. The article then describes a new whiskered robot platform, Shrewbot (inspired, naturally, by the Etruscan shrew), that captures, better than previous attempts, important aspects of mammalian vibrissal morphology and control. A key result, in the current context, is that they demonstrate that whisking pattern generation using a mixture of excitatory and inhibitory feedback control—similar to that thought to underlie rodent whisking [38]—results in a greater number of whisker contacts with target objects whilst usefully constraining the dynamic range of these contacts (promoting sensitivity).
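The effect of such inhibitory contact feedback can be illustrated with a minimal sketch (a toy one-whisker simulation of our own devising, not the Shrewbot controller): halting protraction shortly after contact means the whisker still touches the object on each cycle, but its penetration past the object, and hence the range of contact forces, is tightly constrained.

```python
import math

def protraction(amplitude, obj_angle, feedback, dt=0.001, overshoot_t=0.02):
    """Simulate the protraction half of one whisk cycle against an object at
    obj_angle (toy model). With feedback, protraction halts shortly after
    contact ('minimal impingement'); without, the full amplitude is swept
    regardless. Returns (contacted, peak_penetration)."""
    peak_pen, contact_t, t = 0.0, None, 0.0
    while t <= 0.5:  # protraction occupies the first half of the cycle
        angle = amplitude * math.sin(math.pi * t)
        if angle > obj_angle:
            if contact_t is None:
                contact_t = t  # first contact with the object
            peak_pen = max(peak_pen, angle - obj_angle)
            # Inhibitory feedback: rapid cessation of protraction on contact
            if feedback and t - contact_t >= overshoot_t:
                break
        t += dt
    return (contact_t is not None, peak_pen)

# Compare one cycle with and without contact feedback (toy parameters).
contact_fb, pen_fb = protraction(1.0, 0.5, feedback=True)
contact_free, pen_free = protraction(1.0, 0.5, feedback=False)
```

In both conditions the whisker contacts the object, but with feedback the peak penetration is an order of magnitude smaller, which is the sense in which feedback control constrains the dynamic range of contacts while preserving their number.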

Several contrasting approaches are represented in the various articles relating to human active touch in this theme issue. In these contributions, the integration of proprioceptive (including muscular and vestibular systems) and tactile inputs with control of exploratory movement (or the stabilization of movement), commonly termed haptics, comes to the fore in articles that span the perception of objects and their properties, the space in which they are situated, and the body and its states. Research on human active touch is important for understanding mechanisms for such integration, but also for its potential to contribute to the design of improved displays for those who are especially dependent on touch (e.g. the visually impaired), and the development of systems or strategies designed to substitute for the sense of touch in those with peripheral (e.g. through nerve lesion) or central (e.g. through stroke) impairments of touch, or their combination (e.g. in the elderly).

Although our environment is three-dimensional, vision works remarkably well with two-dimensional displays. In their contribution, Klatzky & Lederman [47] point out that this is not true of touch—a finding that would seem to limit the potential of two-dimensional tactile displays. In two-dimensional touch, the need for sequential contour following makes object recognition slow and memory demanding. By contrast, studies of haptic face perception show relatively little effect of reduction from three dimensions to two dimensions. These authors suggest that these findings may reflect the existence of fundamentally different modes of haptic processing for faces and objects; this is borne out by empirical evidence showing the re-emergence of reductions in performance in haptic tests with inverted two-dimensional and three-dimensional faces.

A fundamental component of the perception of object shape is surface curvature. Kappers [48] reviews a series of her studies with colleagues demonstrating that first-order (local attitude, angle) rather than the zero-order (displacement, height) or second-order (curvature) information at each finger is the primary factor in curvature perception. Standard psychophysical methods require comparison of one stimulus with a second presented after a short interval. It might be thought that memory decay would result in deterioration of the perceived curvature and that simultaneous touching using two hands might be better. Kappers summarizes results showing, surprisingly, that this is not the case and suggests that this reflects limitations on transfer of curvature information between the hemispheres. Brain mechanisms are also invoked as a possible account of the way that touching one curved surface influences the way the curvature of the next surface is perceived, even when it is felt with a different finger (ruling out a purely peripheral basis for the after-effect).

The dependence of shape discrimination on first-order (local attitude) cues described by Kappers was determined with an apparatus in which the finger was held against a flat plate whose inclination was coupled to left–right displacement by the hand (sweeping out a rectilinear scan path for the finger). The percept was of a curved surface even though finger height was constant throughout the trajectory. Interestingly, a similar curvature percept was obtained with this apparatus whether the motor strategy kept the finger orientation constant in space (so that the point of contact with the plate moved round the finger) or whether the contact point was held constant (requiring the finger to rotate). According to the article by Hayward [49], this finding reflects the nervous system's ability to apply simplifying assumptions in order to interpret the different attitude cues (tactile, proprioceptive) in terms of surface curvature. Haptics is thus considered by Hayward to be the problem of identifying the assumptions that the nervous system is making to cope with the complexity of the ‘plenhaptic’ function (intended to describe all that can potentially be felt through parameters such as position and deformation of the digits) in the recovery of desired object attributes during actual behaviour.

Active touch involves tactile and proprioceptive sensing under the control of movement. This is related to Turvey & Carello's [50] use of the term dynamic or effortful touching. In their paper, they describe exploratory movements for dynamic touching that span space and time scales ranging from briefly hefting an object to walking through the environment. Their review includes studies of hand-selective perception of attributes of hand-held or appended objects which also relate to perception of segments of the body. In their empirical contribution, they broaden this focus to consider distance covered over the ground in walking as an attribute accessible through dynamic touch.

An important aspect of the control of movement is the stabilization of posture. Wing et al. [51] review the contributions of light contact touch to maintaining stable standing balance. Body sway is reduced by light touch contact with a stable support. If, by contrast, the support moves, there is a tendency for the movement to entrain the sway. These authors describe a new paradigm, ‘light tight touch’, in which a robotic manipulandum is coupled to the index finger and the effects on balance of different imposed finger movement patterns are investigated. This approach promises to provide insight into the effects on balance of contact between people, for example, in holding hands. When the object that is contacted is another person, ‘touching’ and ‘being touched’ form a complex interaction, and an intriguing domain for future research.

There are a host of neurological pathologies that involve distortions or loss of touch, or disturbances of sensorimotor integration. For instance, although the most common consequence of stroke is hemiparesis (weakness of one side of the body), somatosensory deficits are also common and range from low-level (detection), mid-level (object perception), through to high-level (cognitive body representation). Touch disorders after stroke and their interactions are the focus of the article by Van Stralen et al. [52]. After a review of the components of active touch that support object recognition, the authors provide new data showing the effects of self-touch on body representation. If supported by further research, their findings may lead to a better understanding of the maintenance of body image and contribute to therapies for cognitive level somatosensory disorders.

Advances in technologies for touch sensing will have important uses in a variety of areas including prosthetic hands with fingertip-like sensors, and systems for minimally invasive surgery where there is a pressing need to provide surgeons with haptic signals similar to those they would obtain if operating on bodily tissues with their hands [53]. In the article by Bicchi et al. [54], the possibility of developing biomimetic hands, for prostheses or for humanoid robots, is examined from the perspective of control of the hand in grasp and exploratory movement. By showing that human grasp patterns can be understood as residing in an attractor space of ‘soft synergies’, Bicchi et al. are able to define a tractable method for controlling an artificial hand so as to take hold of objects in a human-like way. As Katz wrote, ‘on the day that the child uses its hand as a unique instrument of prehension, it becomes equally a unique instrument of touch’ [28]. Thus, understanding how control of the hand is used to grip a complex shape is an important step towards developing systems that can combine object manipulation with feature extraction, efficiently exploit motor control for tactile exploration and extract tactile affordances for holding and using objects such as a pen, a surgeon's scalpel or a hammer.

As the above survey indicates, touch sensing is itself something of a ‘scientific hammer’ with which to investigate how animals, including humans, make sense of, and become attuned to, their surroundings. We therefore commend this theme issue to the reader and hope that you will find within it useful insights, and new inspiration, from the vibrant world of active touch.

Acknowledgements

This theme issue arose from a Theo Murphy Discussion Meeting sponsored by the Royal Society that took place at the Kavli Royal Society International Centre, Chicheley Hall, Buckinghamshire, from 31 January to 2 February 2011. This event received additional support from the European Union Framework 7 projects BIOTACT (BIOmimetic Technology for vibrissal Active Touch), EFAA (Experimental Functional Android Assistant) and CSN (Convergent Science Network for biomimetic and biohybrid systems). We are grateful to the organizers of that event, Kirstie Eaton and Lynne Boshier from the Kavli Centre and Gill Ryder from the University of Sheffield; to our host for that event, Sir Peter Knight FRS; and to all the participants of the meeting, whose ideas and feedback have influenced and improved the content of the articles presented here. We are particularly indebted to Prof. John Nicholls FRS, who helped initiate the plans for the Active Touch Sensing meeting at the Kavli Centre. The articles in this theme issue benefited from insightful and thorough reviewing by a panel of international reviewers. We are also grateful to Joanna Bolesworth of the Royal Society for her editorial contributions. Finally, the authors would like to express their thanks to members of their research laboratories—the Active Touch Laboratory at the University of Sheffield, the Tactile Perception and Learning Laboratory at the International School of Advanced Studies in Trieste, and the Sensory Motor Neuroscience Centre at the University of Birmingham.
