If you’ve ever sat on your foot or slept on your arm and then tried to use it as the pins and needles of returning sensation washed over it, you’ve noticed that you can still stand and walk, hold a glass to your lips, or type an email, however strange and difficult it may feel. That’s because even without full touch sensation where our extremities meet the surfaces they contact, we can still sense minute vibrations that give us an idea of how we are interacting with our environment.
This is especially true of our hands, according to Yon Visell, an assistant professor in the Department of Electrical and Computer Engineering, the Department of Mechanical Engineering, and UCSB’s Media Arts and Technology graduate program. Distributed throughout our hands, he explained, are specialized sensory end organs that can capture different types of mechanical vibrations.
“We can liken this to the different ways that a bell will sound if it is struck by a metal hammer or a rubber mallet,” said Visell.
To explore the subtler ways our hands feel the world around us, Visell and colleagues devised a study employing a specialized array of accelerometers, or vibration sensors, that subjects wore around the sides and backs of their fingers and hands while they performed certain tasks. The tasks included tapping or sliding one or more fingers on a surface, using an item such as a pencil to tap on a surface, and grasping objects.
Each action yielded different signals; the intensity of the vibration also depended on how many and which digits were used, as well as on the object being manipulated. Tapping with a single finger, for instance, produced stronger, more localized vibrations than did sliding, grasping, or gripping. Tapping with just the middle and index fingers was enough to produce vibrations that covered most of the hand. Holding a shot glass produced different vibration signals than holding a large cup.
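To make the idea of comparing vibration intensity across a wearable sensor array concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption — the simulated signals, sampling rate, and decay model are invented and do not come from the study; the point is only that a tap produces stronger vibrations at sensors near the contact point than at sensors farther along the hand.

```python
import numpy as np

def rms_intensity(signal):
    """Root-mean-square amplitude of one accelerometer channel."""
    return np.sqrt(np.mean(np.square(signal)))

fs = 1000  # assumed sampling rate, samples per second
t = np.arange(0, 0.5, 1 / fs)

# Simulated channels (purely illustrative): a sharp tap rings at some
# frequency and decays over time; a sensor far from the contact point
# records the same waveform at a much smaller amplitude.
tap_near_contact = 1.0 * np.sin(2 * np.pi * 200 * t) * np.exp(-30 * t)
tap_far_from_contact = 0.2 * np.sin(2 * np.pi * 200 * t) * np.exp(-30 * t)

# The vibration is more intense near where the finger touched the surface.
print(rms_intensity(tap_near_contact) > rms_intensity(tap_far_from_contact))
```

In a real analysis, comparing such per-sensor intensities across the array is one simple way to see how localized or widespread a vibration pattern is for a given action.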
The knowledge gained by exploring these mechanical touch patterns may have multiple applications. Besides adding to the foundational understanding of touch, the data could be used in virtual reality, for instance, where users could interact with different objects — a feather, a brick — as though they were the real thing. Robots would be better able to navigate and interact with elements in changing and uncertain environments, including those where humans are present. Sophisticated prosthetic devices able to relay this subtler information might enhance functionality for users.
Visell and his students are currently collaborating with medical researchers at the William Sansum Diabetes Center in Santa Barbara to investigate a new method for early detection of sensory neuropathy in those affected by diabetes.