Closing the gap between human tactile sensation and tactile sensor design

Workshop at the IEEE World Haptics Conference 2023
Delft, The Netherlands

July 10th, 2023


The sense of touch is of great interest to both the neuroscience and engineering communities. While the neuroscience community rightly focuses on fundamental questions about how humans and primates sense, perceive, and act in the physical world using touch, the engineering community, strongly motivated by rapid recent advancements in sensing, robotics, and artificial intelligence technologies, aims to design and manufacture artificial tactile sensors and associated algorithms that facilitate dexterous manipulation using robotic grippers/hands. Closer collaboration between these communities will accelerate the progress of tactile sensor design and application.

This workshop aims to bridge the information gap between the neuroscience and engineering communities. Exchanging knowledge between these communities may help move us closer to achieving the most efficient tactile sensor designs that measure the most relevant tactile information, implemented using the most pragmatic engineering approaches. Reaching this goal could have a profound impact on the field of robotic manipulation.


Organized by:
Ingvars BIRZNIEKS (University of New South Wales, Australia),
David CORDOVA BULENS (University College Dublin, Ireland),
Benoit DELHAYE (University of Louvain, Belgium),
and Stephen REDMOND (University College Dublin, Ireland).

Image source: © Colin Anderson / Blend Images / Getty Images

Call for posters!

If you want to share your work by presenting a poster at the workshop, please e-mail David CORDOVA BULENS.


Program at a glance


9:00 - 9:05 - Welcome and introduction


9:05 - 10:05 - Session 1 (4 x 15min)


Dr Ingvars Birznieks (University of New South Wales, Australia)

Dr David Gueorguiev (Sorbonne University, France)

Dr Laurent Opsomer (UCLouvain, Belgium)

Dr Hannes Saal (University of Sheffield, UK)


10:05 - 10:35 - Plenary talk


Dr Rochelle Ackerley (Aix-Marseille University, France)


10:35 - 11:30 - Long coffee break and poster session


11:30 - 12:45 - Session 2 (5 x 15min)


Dr Michaël Wiertlewski (Delft University of Technology, The Netherlands)

Dr Ben Ward-Cherrier (University of Bristol, UK)

Dr Roberto Calandra (Meta)

Dr Wenzhen Yuan (Carnegie Mellon University, USA)

Dr Stephen Redmond (University College Dublin, Ireland)


12:45 - 13:00 - Discussion and closing



Talk abstracts


The complexity of touch encoding in humans and applying it in tactile sensors

Dr Rochelle Ackerley (Aix-Marseille University, France)

How real can artificial touch be? How closely does it need to resemble the biology to be useful or convey complex messages? These questions are at the heart of understanding how touch is encoded and applying that knowledge to aid dexterous manipulation. I will present insights into touch encoding in humans and how these can be measured. The technique of microneurography allows recordings from inside peripheral nerves in humans, and over the past 50 years we have learned a great deal about the human tactile system, especially about encoding in the glabrous skin of the hands. New developments in microneurography and in data analysis mean that insights from human afferent recordings can be better applied to tactile sensor design; for example, the specific roles of different afferents in the skin and what excites them most. As well as giving a neurophysiological overview, the talk will focus on different aspects of naturalistic touch, where often very subtle signals can produce clear sensations, an effect that can be exploited in sensor design.


Perception of subtle tactile cues through friction

Dr David Gueorguiev (Sorbonne University, France)

Humans are able to sense tiny tactile cues such as nanoscale wrinkles on chemically identical materials, or chemical differences between materials with identical topography. This ability is surprising considering that corneocytes, the cells that constitute the outer layer of the stratum corneum, are larger than the tactile features that the sense of touch is able to perceive. Friction has been linked with the human ability to detect these tiny cues, which is astonishing since finger-surface friction is inherently variable and affected by a large number of phenomena: the occlusion that develops within a few seconds of touching flat surfaces, stick-slip events that perturb smooth tactile exploration, and the orientation of the finger with respect to the direction of the exploratory motion, which changes the resulting friction. By recording the acceleration and forces during tactile exploration of surfaces while simultaneously administering psychophysical tasks, it is possible to study the processing mechanisms that may enable the tactile sense to achieve such an accurate level of perception. Promising metrics have been observed when exploration is constrained, but it is not yet clear how the sense of touch isolates meaningful tactile cues within the apparent chaos of natural exploration.


Deformation of fingerprint ridges during tactile interactions

Dr Hannes Saal (University of Sheffield, UK)

The human fingertip can detect small tactile features with a spatial acuity roughly the width of a fingerprint ridge. However, how individual ridges deform under contact to support accurate and high-precision tactile feedback is debated. In this talk I will present recent work, where we imaged the sub-surface deformations of hundreds of individual fingerprint ridges during contact events. We quantified the deformations in multiple skin layers resulting from static indentation, stick-to-slip events, sliding of a flat surface in different directions, and interaction with small tactile features, such as edges and grooves. Our findings highlight the profound role of the skin in shaping tactile feedback and suggest that the mechanics of the substrate around the sensor should be considered in robotic applications.


Robots grasping like humans

Dr Michaël Wiertlewski (Delft University of Technology, The Netherlands)

Industrial robots are constantly improving their capabilities, yet there is still one challenging task they struggle with: object manipulation. In order to manipulate objects accurately, robots must have an understanding of the friction between their fingers and the objects. I will demonstrate how we observe and interpret the state of friction through human somatosensory touch. Through our investigation of skin deformation under different frictional loads, we discovered consistent patterns of deformation. This valuable knowledge guided us in developing novel robotic sensors and processing systems that replicate the way humans regulate their grip. The resulting tactile gripper is robust against disturbances and adjusts seamlessly to the object being held.


Neuromorphic optical tactile sensing - bridging artificial and biological touch

Dr Ben Ward-Cherrier (University of Bristol, UK)

My long-term research aim is to build upper-limb prosthetics with tactile sensing that seamlessly integrates with the human body. I believe neuromorphic tactile sensing technologies will help to achieve this goal. Our research group builds neuromorphic optical tactile sensors and develops spike-based algorithms to solve tactile perception problems such as texture identification, slip detection or contour following.


Title to be announced

Dr Roberto Calandra (Meta)

Abstract to be announced


What have we learned from using a high-resolution robot tactile sensor?

Dr Wenzhen Yuan (Carnegie Mellon University, USA)

Making robots able to intelligently perceive the physical world through touch has been a long-standing challenge, and lessons from the design of the human tactile sensing system have been a major inspiration. In the past decade, the introduction of the high-resolution tactile sensor GelSight has opened up new possibilities for robotic tactile perception, because it provides very rich information mapping the geometry of the contact surface. In this talk, I will revisit the design and working principle of the GelSight sensor and the new perception capabilities it brings to robots. Through comparison with human touch sensing, I will discuss how the physical design of the sensor is connected to its sensing capabilities.


Gravity biases the haptic estimation of forces

Dr Laurent Opsomer (UCLouvain, Belgium)

Successful interaction with an artificial haptic device requires very fine control of fingertip forces, which relies on the integration of cutaneous and proprioceptive feedback. Yet, haptic devices can sometimes be used in environments experiencing changing or altered gravito-inertial accelerations, such as airplanes or space stations, which modulates the proprioceptive feedback and motor commands associated with a specific force output. Here, we examined the influence of gravito-inertial forces on finger force control under isometric conditions, conditions that are frequent when using a haptic interface. We trained participants to reproduce isometric vertical forces on a dynamometer held between the thumb and the index finger in normal gravity, and tested them during parabolic flight creating phases of micro- and hypergravity, thereby strongly influencing the motor commands and the proprioceptive feedback associated with the prescribed vertical forces.

We found that gravity creates the illusion that upward forces are larger than downward forces of the same magnitude. The illusion increased under hypergravity and was abolished under microgravity. Gravity also affected the control of the grip force employed to secure the grasp. These findings suggest that gravity biases the haptic estimation of forces, which has implications for the design of haptic devices to be used during flight or space activities.


The secret of tiny hand movements to feel and manipulate objects

Dr Ingvars Birznieks (University of New South Wales, Australia)

This presentation will give insight into the sophisticated strategies by which the motor control and sensory systems reciprocally cooperate to achieve fine control of hand movements. Frictional information is crucial for fine grip force adjustments during manipulation tasks, preventing objects from slipping out of the hand. Our research has discovered a surprising physiological mechanism: small, submillimetre-range hand movements, of which we are not ourselves aware, significantly enhance the human ability to perceive friction differences when an object is gripped. Evidence will be presented from psychophysics experiments, biomechanical analyses, and object manipulation tasks. By observing hand movement strategies during object manipulation, we noticed that yet another mechanism must be used, especially once an object is already lifted and held in the hand. Rotational forces (torques) unavoidably arise when the mass of the object is not perfectly distributed relative to the grip axis, and when vertical movement is achieved by rotation at the wrist or elbow joints. We suggest that the advantage of torque-induced object rotation is that it may induce localised slips, depending on the distance from the rotational centre, helping to measure friction without a vertical translation along gravity that would endanger grip safety and risk the object falling from the hand.

We suggest that implementing such relatively simple movement strategies to acquire sensory information may substantially enhance the capabilities of robotic and prosthetic devices.


Development of optical tactile sensors for robotic applications

Dr Stephen Redmond (University College Dublin, Ireland)

Prosthetic and robotic hands demonstrate poor dexterity during object manipulation, often dropping objects. Humans rarely allow objects to slip because, among other strategies, we can sense if an object is slippery and adjust our grip. In recent years, while we have learned more about the biomechanics and neuroscience underpinning our ability to sense friction, there is still much to learn. Perhaps unsurprisingly, given how poorly we understand human friction sensing, very little research has been directed at replicating this ability to sense friction or slipperiness using artificial sensors. In this talk, I will present some friction-based tactile sensor prototypes which were inspired by our various hypotheses on how humans sense friction, and I will demonstrate how these novel optical sensors might enable improved dexterity in robotic manipulation tasks. It is hoped that the outcomes of this research, which would endow artificial hands with the ability to feel the slipperiness and/or impending loss of grip of a grasped object, will help advance the fields of prosthetics, telesurgery, and service, agricultural, and manufacturing robotics.



Posters


Tactile intensity perception is shaped by temporal integration: evidence from single afferent recording and stimulation in humans

Roger H. Watkins (Aix-Marseille University, France)

In glabrous hand skin, four types of fast-conducting mechanoreceptive afferent encode different aspects of touch. Temporal firing patterns in these afferents reflect stimulus quality, whereas the overall firing rate relates to perceived intensity. We used broadband vibration reflecting real-world tactile interactions and found that intensity ratings dropped markedly for stimulus durations below 400 ms. We used microneurography to relate these perceptual equivalence judgements of intensity at different durations to coding in single mechanoreceptive afferents. Afferent firing rate gave a poor prediction of perceived intensity, which was instead best predicted by binned firing rate across 100 ms windows. We also stimulated single afferents with intraneural microstimulation to evoke precise spike trains and associated tactile percepts. For type I afferents, intensity was perceived as lower at durations below 100 ms, but delivering the same number of impulses produced equivalence. In type II afferents, intensity judgements were reduced at longer stimulus durations (greater than 200 ms). These data suggest that tactile intensity perception is integrated over short time windows for type I afferents, but this might differ for type II afferents. This has implications for the design of tactile feedback, where temporal aspects may be utilized to efficiently modulate vibrotactile and prosthetic neurostimulation intensity feedback.


BodySense: a collaborative research platform for development of simple, open source and replicable somatosensory research tools

Rochelle Ackerley (Aix-Marseille University, France)

The BodySense research platform is a technical resource that provides somatosensory researchers with simple tools for use in touch experiments, as well as for examining other senses, such as in proprioceptive, vestibular, visual, and auditory research. The platform provides collaborative support for the development of tools that deliver stimulation and quantify behavior across a variety of sensory modalities. The focus of the platform is the production of simple, affordable, and open-source tools. The development of several projects in tactile and vestibular research will be presented, incorporating the design of 3D-printed components that can be fitted with simple integrated sensors and stimulators. These devices can be readily duplicated and incorporated into a system that allows precise synchronization of the stimulators and recordings, with particular expertise in compatibility with physiological measures that have critical timing requirements (EEG, EMG, and neural activity, such as in microneurography), as well as coordinated behavioral video capture.


Developing a Platform for Able-Bodied Users in Haptic-Enabled Prosthesis Research

Finn Heath (University of Bristol, UK)

Research using advanced prosthetic devices can be limited by a lack of access to prosthesis users as participants. This project looks at developing a 3D-printed socket for able-bodied users to control and receive haptic feedback from prosthetic hands. This platform was designed to decrease torque on the participant's arm muscles while allowing EMG control and vibrotactile haptic feedback via the same arm. The aim is to use this platform in a controlled study investigating user perception of urgent haptic signals under cognitive load. We expect this research to help direct future development in tactile sensors and haptic interfaces for prosthetics, in particular indicating optimal applications for vibrotactile sensors compared to other haptic feedback modalities.


Human Exploratory Strategies for Texture Detection - Can We Apply Them to Robots?

Ao Li (University of Bristol, UK)

This research poster will present our novel study exploring human tactile exploratory procedures in texture identification, examining their potential application to robotic systems. The crux of our investigation involves a hands-on experiment, where participants will interact with ten distinct textures. We will record hand motions through a vision system and applied pressure with a force sensor. Our goal is to shed light on the intuitive and efficient methods humans employ in texture exploration and perception. We will test the hypothesis that humans employ an adaptive strategy, wherein a broad, systematic exploration shifts to focused, targeted investigation for texture identification. Using the gathered data, we will develop an anonymized and open-source dataset that will provide a robust basis for further research in the field, helping to identify common patterns and unique approaches in human texture exploration. The conclusions drawn from our study have the potential to guide the design of robotic perception systems, enhancing their tactile acuity and interaction with their environment.


A Neuromorphic System for Real-time Tactile Texture Classification

George Brayshaw (University of Bristol, UK)

Tactile exploration of surfaces is a key component of everyday life, allowing us to make complex inferences about our environments even when vision is occluded. The emergence of biomimetic neuromorphic hardware in recent years has furthered our ability to create biologically plausible sensing solutions. While these platforms continue to improve with regard to latency and power consumption, recent literature on tactile texture classification emphasizes accuracy at the expense of real-time processing. For these tactile sensing systems to find use outside of experimental laboratory environments, it is key to design systems capable of capturing and processing data in real time. In this paper we present work on a system for the real-time, parallel capture and classification of texture data using a neuromorphic tactile sensor and a spiking neural network. Our real-time system has achieved classification accuracies of 100% on datasets of both natural textile surfaces and artificial 3D-printed textures. Furthermore, our system is capable of identifying textures at human-level performance in as little as 388 ms. This system outperformed previous work by the authors in terms of both accuracy and classification speed.


3-D Reconstruction of the Fingertip During Tactile Interactions

Donatien Doumont (Université catholique de Louvain, Belgium)

Tactile signals underlying our perception of touch originate from the mechanotransduction of skin deformation. Quantification of such deformation can provide new insights into the tactile information provided to the brain during interactions between the fingertips and objects. Measuring fingertip skin deformation without affecting its behavior is challenging; studies have successfully used optical imaging and digital image correlation (DIC) to reconstruct planar deformations at the contact interface through a transparent surface. Moreover, recent work has shown the possibility of reconstructing the 3-D surface deformation of the fingerpad resulting from interactions with von Frey hairs using a multiple-camera setup and stereo vision. In this poster, we extend this work and develop a method to reconstruct the 3-D deformation of the skin inside (planar) and outside (non-planar) the contact, resulting from the normal and tangential loading of the fingertip against a flat transparent surface.


Tactile feedback in the size-weight illusion

David Cordova Bulens (University College Dublin, Ireland)

Tactile feedback is crucial for dexterous manipulation, but the underlying mechanisms are not well understood due to the complex interplay between predictive and feedback control in setting our grip force (GF) in response to varying load forces (LF). The GF-LF relationship has been investigated for decades using instrumented objects that measure forces, but these instrumented objects do not measure tactile feedback. Recent developments in skin strain imaging have provided a proxy measurement of tactile feedback during the manipulation of such instrumented objects. However, these instrumented objects only image a single finger, providing an incomplete picture of cutaneous tactile feedback. To address this limitation, we present a novel instrumented object capable of imaging both the index finger and thumb in a pincer grasp with high contrast. Our pilot study aims to investigate the beginning of grip and lift phases using a size-weight illusion paradigm which renders visual cues and anticipatory scaling of GF unreliable. Without the ability to reliably predict the GF, we expect any subsequent GF adjustments to be informed by tactile feedback, including cutaneous sensation from the finger and thumb. We anticipate that our device will provide a more comprehensive understanding of the role of tactile feedback in GF adaptation. Our pilot study’s findings may help establish a causal link between partial slips and grip force adaptation, shedding light on the underlying mechanisms of dexterous manipulation.


An Optical 3-Axis Force and Displacement Sensing Array

Olivia Leslie (University College Dublin, Ireland)

Providing robotic grippers with tactile sensation is essential to improve robotic dexterity. The human fingertip has a dense array of mechanoreceptors providing tactile feedback about deformations and slips happening on the skin. To mimic such tactile feedback, tactile sensors require a distributed array of sensing elements. Here, we present a small optical tactile sensing array that can estimate a distributed array of 3D forces and 3D displacements using our previously designed and implemented novel Light-Vector sensing principle. The sensing array is designed to have ten measurement points with relatively free movement between the sensing elements to allow for slip detection. Preliminary results demonstrate successful sensing of 3-axis force and displacement from all ten individual elements, in the range of tens of mN and tens of microns, making the array useful for improving robotic manipulation.


Developing a Silicone Elastomer Neuromorphic Tactile Sensor Applied to Edge-Orientation Classification Tasks

Fraser LA Macdonald & Oliver Girling (University of Bristol, UK)

Advancements in novel tactile robotic systems enable the development of faster and more adaptable tools for investigating touch. This research develops a tactile sensor to facilitate a streamlined spiking neural network that utilises multiple learning principles. The combination of a robust silicone gel exterior with a neuromorphic camera interior offers a lightweight, compact, and biologically inspired novel tactile sensor. These features work together towards a greater understanding of how human-like touch principles can be emulated in robotics.



Organizers


Ingvars BIRZNIEKS (University of New South Wales, Australia)



Benoit DELHAYE (Université catholique de Louvain, Belgium)

Dr Delhaye received a BS degree in engineering (2007), an MS degree in electro-mechanical engineering (2009), and a PhD degree in applied science (2014) from the Université catholique de Louvain, Belgium. He was a postdoctoral scholar at the University of Chicago from 2015 to 2018 and is now a research associate at UCLouvain. His research interests include neuroprosthetics, haptics, and neuroscience in the context of touch and sensorimotor control.


David CORDOVA BULENS (University College Dublin, Ireland)

David Córdova Bulens is a postdoctoral researcher at the Biomedical Signals and Sensors group at University College Dublin. He received his MS degree in 2013 in electro-mechanical engineering and his PhD degree in 2017 at UCLouvain in Belgium. His work focuses on two aspects: on the one hand, he is interested in how tactile feedback enables humans to regulate grasping forces, looking for relationships between skin deformations and grip force adjustments. On the other hand, he works on developing tactile sensors for use in robotics and prosthetics applications.


Stephen REDMOND (University College Dublin, Ireland)

Associate Professor Stephen Redmond holds Bachelor's and PhD degrees in Electronic Engineering from University College Dublin (UCD). His PhD was in the area of biological signal processing, focusing on pattern classification, in particular the application of machine learning techniques to physiological signals. He has extensive experience in monitoring human movement using wearable sensors, predictive modelling of health events, and robust measurement of physiological signals in the home. He spent ten years at The University of New South Wales (Sydney, Australia), where he held an Australian Research Council Future Fellowship, prior to his return to UCD in 2019. He returned to Ireland to take up a prestigious Science Foundation Ireland President of Ireland Future Research Leaders' Award (2019-2023) to develop next-generation tactile sensors for robotic and prosthetic gripping that can feel slipperiness. He has a strong track record in the exploitation and commercialisation of research. He has worked in an advisory/consultancy capacity for several biomedical engineering companies, served as Data & Insights Director at Irish medtech start-up FIRE1, which is developing a technology to manage heart failure, and co-founded the Australian tactile sensor company Contactile.