18 Jul 2016 |
Research article |
Intelligent and Autonomous Systems
The Importance of Touch Sensing in Robotics
What would become of us without our sense of touch? One unfortunate man found out when he lost his sense of touch in an accident. He was no longer able to feel either his body or the ground; he needed months to relearn how to walk and manipulate objects. He had to surround himself with very sturdy objects as he could no longer control his strength. He also had to constantly watch what he was doing, but still could not make precise movements. This man’s limitations are similar to those of robotic arms without tactile sensors. But robotic arms could perform much more complex tasks if they had a sense of touch, which raises the question: why not develop tactile sensors that give robots a sense of touch?
Tactile sensors provide fairly basic information: amount and variation of force. As humans, we never say to ourselves, “OK, I’m applying three Newtons of force, which means the object won’t slip from my hand.” Our natural reflexes enable us to perform the movements we need to make unconsciously. The aim of our research was to develop a sensor capable of “recognizing” the texture and degree of rugosity of an object through touch. Future research could teach robots to apply this information in order to better manipulate objects.
Our study focused on recreating touch sensing in robotics, analogous to what humans experience when touching different textures. We used a high-frequency (~1,000 Hz) capacitive touch sensor to characterize textures. Many studies have been conducted in this area, achieving recognition rates of over 90%.
To provide a solution applicable to any situation in the day-to-day usage of a robot, our work included three innovations:
- Two information-acquisition movements, modeled on human movements and adaptable to a range of situations, were developed (see Figure 1). The first movement is linear, the second circular
- Genetic algorithms were used to optimize the parameters of our algorithm
- A voting system was adopted to optimize results
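The article does not detail which variables the genetic algorithm tuned or how fitness was measured. As an illustration only, here is a minimal genetic algorithm where the tuned parameters (swipe speed and applied force) and the fitness function are hypothetical stand-ins:

```python
import random

def fitness(individual):
    """Hypothetical stand-in for 'recognition rate obtained with these
    settings'. Here we pretend the best settings are 30 mm/s and 2 N."""
    speed, force = individual
    return -((speed - 30.0) ** 2 + (force - 2.0) ** 2)

def evolve(pop_size=20, generations=50, mutation_rate=0.2, seed=0):
    rng = random.Random(seed)
    # Random initial population: (speed in mm/s, force in N)
    pop = [(rng.uniform(0, 100), rng.uniform(0, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # crossover: average the two parents' parameters
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
            if rng.random() < mutation_rate:  # mutation: small Gaussian jitter
                child = (child[0] + rng.gauss(0, 1.0),
                         child[1] + rng.gauss(0, 0.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

After a few dozen generations the population converges near the settings that maximize the (hypothetical) recognition rate.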
Our algorithm has four steps:
1. Acquisition
2. Pre-processing
3. Extraction of discriminants
4. Learning
The first step, acquisition, involves rubbing the sensor over various textures. Even this simple action required setting multiple parameters: movement path, acquisition speed, and applied force. To achieve learning, we conducted several acquisitions to form two sets: the learning set (with which the algorithm “learns” the textures) and the test set (which serves to verify the success of learning).
Pre-processing involves a series of steps designed to standardize the signal to keep processing errors to a minimum. A learning algorithm requires an input of acquisition information to allow the differentiation of textures. These inputs, called discriminants, are selected with care to enable the sensor to differentiate between textures using minimal information. We then used a neural network learning algorithm, which enables the sensor to recognize textures.
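As a minimal sketch, pre-processing and discriminant extraction might look like the following. The study's actual discriminants are not listed here, so the two features below (dominant vibration frequency and waveform roughness) are generic stand-ins:

```python
import math

def preprocess(signal):
    """Standardize an acquisition to zero mean and unit variance so that
    amplitude differences between runs do not dominate learning."""
    n = len(signal)
    mean = sum(signal) / n
    std = math.sqrt(sum((s - mean) ** 2 for s in signal) / n) or 1.0
    return [(s - mean) / std for s in signal]

def extract_discriminants(signal, rate_hz=1000):
    """Hypothetical discriminants for one acquisition segment."""
    x = preprocess(signal)
    duration = len(x) / rate_hz
    # Zero crossings give a crude estimate of the dominant vibration
    # frequency: a coarse texture vibrates the sensor more slowly.
    crossings = sum(1 for a, b in zip(x, x[1:]) if (a < 0) != (b < 0))
    dominant_hz = crossings / 2 / duration
    # Mean absolute successive difference: how "jagged" the waveform is.
    roughness = sum(abs(b - a) for a, b in zip(x, x[1:])) / (len(x) - 1)
    return [dominant_hz, roughness]

# A 50 Hz vibration (coarser texture) vs. a 200 Hz one (finer texture),
# both sampled at 1000 Hz for one second:
coarse = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(1000)]
fine = [math.sin(2 * math.pi * 200 * t / 1000) for t in range(1000)]
```

Feature vectors like these, one per acquisition, are what the neural network receives as input during learning.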
To optimize results, we used a “majority vote” system: prior to pre-processing, each acquisition is divided into five equal parts. Each of these five parts then undergoes pre-processing, extraction of discriminants, and learning. Once the entire process is complete, we obtain five hypotheses on the correct texture. Each is counted as a vote, and the majority wins.
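The voting step can be sketched as follows; `classify_segment` is a hypothetical stand-in for the trained neural network:

```python
from collections import Counter

def majority_vote(acquisition, classify_segment, n_parts=5):
    """Split one acquisition into n_parts equal segments, classify each
    segment independently, and return the most frequent label."""
    length = len(acquisition) // n_parts
    votes = [classify_segment(acquisition[i * length:(i + 1) * length])
             for i in range(n_parts)]
    return Counter(votes).most_common(1)[0][0]

# Usage: even if two of the five segments are misclassified, the vote
# still returns the correct texture.
noisy = iter(["wood", "metal", "wood", "fabric", "wood"])
label = majority_vote(list(range(100)), lambda seg: next(noisy))
# label == "wood"
```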
Short linear swipe
We tested the short linear swipe described above, at different speeds and levels of force, on the four textures illustrated in Figure 3.
A total of 1,800 swipes were made (in both the learning and test phases) on these four textures. Using the algorithm alone, a recognition rate of 85% was achieved; when the voting algorithm was incorporated, this rose to 98.2%. Textures (b) and (c), marked by striations that differ in depth only, proved more difficult to differentiate.
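The gain from voting can be roughly sanity-checked with the binomial distribution, under the simplifying assumptions that each of the five segment classifications is independent and correct 85% of the time, and that the vote is treated as binary correct/incorrect:

```python
from math import comb

def majority_correct(p, n=5):
    """Probability that more than half of n independent votes are correct,
    when each vote is correct with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

print(round(majority_correct(0.85), 3))  # → 0.973
```

This back-of-the-envelope figure (97.3%) is in the same range as the 98.2% observed; real segments of the same swipe are correlated, so the assumptions above are only approximations.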
The rotating movement was tested on the five textures illustrated in Figure 4. Our aim was to test whether the orientation of the movement (with or against the striations) impacted the recognition rate.
Using these new textures, a total of 2,250 tests were made (in both the learning and test phases) on the five different textures. The recognition rate was 93%. When the experiment was repeated on the same textures with a different movement, the linear swipe, the recognition rate dropped to 87%.
Preliminary results analysis
Our results show that by using effective algorithms, we can now detect texture with a fairly high degree of accuracy. Further research and development is needed, however, to successfully integrate this process into the “normal” usage of a robot. Another considerable problem is the fact that textures must be learned before they can be recognized by a sensor. To address this, we conducted a second phase of experimentation designed to estimate the level of rugosity.
To the best of our knowledge, there are currently no studies on estimating texture using a tactile sensor. Our objective is to use a simple Rugosity Index (RI) to improve robots’ ability to manipulate objects. A benefit of this process is that it would require pre-learning only a limited set of textures, and other textures would then be learned automatically.
Of the 25 chosen textures (Figure 5), ten were used for learning and ten others for testing (the remaining five fell too far outside the range to be assigned a value). The algorithm was quite similar to that used in the first phase of experimentation and required the same steps to recognize textures.
In the two figures below, the rectangles span the interquartile range (50% of acquisition results fall within the rectangle), while the line depicts the full range of values for each texture (200 acquisitions were performed on each texture). The average error during learning was over 0.5 on the Rugosity Index (Figure 6). During experimentation, the error was 1.65 on the Rugosity Index, with a large standard deviation (Figure 7). This means a Rugosity Index score of 5 could easily be perceived as an 8, or vice versa. The main reason for this large average error was an inadequate number of textures: because a single texture was used for each rugosity value during learning, only the texture, and not its degree of rugosity, was learned.
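To make the evaluation concrete, here is a small sketch of how a rugosity estimator might be scored: mean absolute error against the true RI, plus the spread of the predictions. The prediction values below are illustrative, not the study's measurements:

```python
import statistics

def score(predictions, true_ri):
    """Return (mean absolute error, sample standard deviation) for a set
    of RI predictions on a texture whose true index is true_ri."""
    errors = [abs(p - true_ri) for p in predictions]
    return statistics.mean(errors), statistics.stdev(predictions)

# 200 hypothetical acquisitions of a texture whose true RI is 5,
# with predictions scattered around it:
preds = [5 + (-1) ** i * (i % 4) * 0.8 for i in range(200)]
mae, spread = score(preds, 5.0)
```

A large `spread` relative to the 10-point scale is what makes an RI of 5 indistinguishable from an 8 in practice.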
Our research led to the creation of an algorithm, the selection of two simple motions for texture acquisition, and the development of a voting system to optimize texture recognition. Experiments using capacitive tactile sensors achieved recognition rates ranging from 87% to 98.2%. In this method, however, textures must be pre-learned to enable recognition. We then conducted experiments on recognition of different degrees of rugosity. A 10-point rugosity scale was developed, using 10 different textures. The results did not allow us to estimate texture rugosity with an acceptable degree of precision. The lessons learned will guide future research:
- We believe a 5-point rugosity scale would be preferable to the 10-point scale used, for both humans and sensors
- The number of textures used in learning must be increased, in order to learn to estimate rugosity rather than texture. We believe future experiments could use a set of 40 textures (25 for learning and 15 for test) to improve learning on degrees of rugosity
We believe that in the next few years we may see robots capable of estimating the rugosity when touching a material, and using this information to better manipulate the object.
For further information on this subject, we recommend the following master’s thesis:
Discrimination de textures et quantification de rugosité par algorithme d’apprentissage
Rispal, Samuel, and Vincent Duchaine, École de technologie supérieure, October 2, 2014.
Samuel Rispal is project manager at the École de l’innovation citoyenne. He completed an M.A.Sc. in Electrical Engineering at ÉTS. His areas of specialization are artificial intelligence and tactile sensors.
Program : Electrical Engineering
Vincent Duchaine is a professor in the Department of Automated Manufacturing Engineering at ÉTS. He specializes in robotics, mechatronics and touch sensors, and directs two innovation programs: one with McGill and Concordia, and one with ESG UQAM.
Program : Automated Manufacturing Engineering
Research laboratories : CoRo – Control and Robotics Laboratory