Article

Tactile Sensor Data Interpretation for Estimation of Wire Features

Dipartimento di Ingegneria, Università Degli Studi Della Campania “Luigi Vanvitelli”, Via Roma 29, 81031 Aversa, CE, Italy
* Author to whom correspondence should be addressed.
Electronics 2021, 10(12), 1458; https://doi.org/10.3390/electronics10121458
Submission received: 21 May 2021 / Revised: 8 June 2021 / Accepted: 15 June 2021 / Published: 18 June 2021
(This article belongs to the Special Issue Force and Tactile Sensing for Robots)

Abstract

At present, tactile perception is essential for robotic applications that perform complex manipulation tasks, e.g., grasping objects of different shapes and sizes, distinguishing between different textures, and avoiding slips by grasping an object with minimal force. Considering Deformable Linear Object manipulation applications, this paper presents an efficient and straightforward method that allows robots to autonomously work with thin objects, e.g., wires, and to recognize their features, i.e., the diameter, by relying on tactile sensors developed by the authors. The method, based on machine learning algorithms, is described in depth in the paper to make it easily reproducible by the readers. Experimental tests show the effectiveness of the approach, which is able to properly recognize the considered object features with a recognition rate of up to 99.9%. Moreover, a pick and place task, which uses the method to classify and organize a set of wires by diameter, is presented.

1. Introduction

While robotics scientists are moving toward the creation of humanoid robots, conceived to work alongside humans in human-centric and unstructured environments, industries are faced with increasingly complex tasks, which require important technological advances and, in particular, human-like features in the new generation of robotic systems. Whether they are designed to work with humans, e.g., in homes, schools and hospitals, performing high-level tasks, or designed to replace them in repetitive manufacturing tasks or hazardous, high-risk environments, robots should be able to perceive the external world to perform such challenging applications. To operate in complex and cluttered environments by mimicking human beings, robots need to be able to dexterously manipulate objects. Humans can perform complex tasks largely thanks to their sophisticated tactile perception, e.g., grasping objects of different shapes and sizes, distinguishing between different textures, and avoiding slip by grasping objects with minimal force [1,2].
Tactile sensing not only provides essential information about the manipulated object, but also represents a chance to solve problems (e.g., object occlusion) that affect purely vision-based systems. Most research in tactile sensing has focused on building and characterising transducers [3,4,5,6,7]. Fewer works have presented significant results obtained by applying touch sensing solutions to daily-life applications. Jamali et al. [8] presented a soft, silicone finger with embedded strain gauge receptors for quality assurance and friction detection applications, which is able to distinguish object materials. The material textures are recognized by analyzing, using a Naive Bayes classifier, the vibrations generated when the finger slides along the object’s surface. Bandyopadhyaya et al. [9] designed a piezoresistive flexible tactile sensor, installed on two fingers of a robotic gripper, which is able to distinguish objects of different softness using a machine learning approach (Decision Tree and Naive Bayes methods) and is suitable for the fruit- and vegetable-grading industry. Liu et al. [10] have recently presented an effective way to discriminate object curvature using a sparse tactile sensor array. The authors overcome the limitations of vision-based systems, e.g., poor light or limited visibility during grasping, and obtain a good recognition performance (93.1%) by combining a tactile sensor and neural network-based learning algorithms to estimate the local object curvature during grasping maneuvers. Da Fonseca et al. [11] presented an interesting solution, which can enhance grasp stability in an under-actuated hand through tactile sensor data. The proposed system uses machine learning methods to estimate the orientation of in-hand objects from the data gathered by tactile sensors mounted on the phalanges of under-actuated fingers. Moreover, a dual fuzzy logic controller was presented to autonomously achieve stable grasping conditions while forces were applied to in-hand objects. Several recognition strategies based on the use of tactile sensors have been developed. In Reference [12], the authors reconstruct object models from discrete tactile point clouds. A spatial probabilistic approach to unknown objects has been proposed in [13]. A solution based on a bag-of-words approach has been presented in [14] to classify an object among a known set. The authors of [15] use a Bayesian approach to recognize object poses. An interesting aspect of tactile sensors is that the same sensor used for the recognition phase can be exploited for control purposes, such as grasping and manipulation [16,17]. The previous solutions represent only a few examples, which demonstrate how fundamental tactile perception is to complex manipulation tasks. There are many papers where tactile and vision data are combined to tackle very challenging tasks. Many researchers have been working on the integration of vision and tactile data for object recognition for more than thirty years [18]. The interested reader can deepen the topic of tactile sensing for dexterous manipulation by referring to the review papers [19,20,21], and references therein.
In this paper, the authors present a specific application for Deformable Linear Object manipulation through tactile sensors, which represents one of the main objectives of the H2020 REMODEL project (https://remodel-project.eu/, accessed on 16 June 2021). In particular, during switchgear assembly, the recognition of specific features of the manipulated wires proved fundamental for the successful execution of the task. In practice, knowledge of the wire pose between the gripper fingers can be used to correct the end effector pose and successfully complete the insertion and/or the cabling of the wire. The wire diameter information, instead, could be used to: determine if the grasped wire is the correct one; evaluate if the grasping force is adequate; explore a priori unknown features. The paper is organized as follows. Section 2 briefly recalls the main features of the tactile sensor used in this paper. The interested reader can find a detailed characterization of the technology in [22]. Section 3 proposes a new optimized algorithm to estimate the shape of the grasped wire, improving the estimation quality with respect to the solution presented by the authors in [23]. First, a suitable normalization and linearization procedure for the sensor signals is proposed, and then an estimation algorithm, based on an appropriate choice of the normalized signals, is presented. In Section 4, a machine-learning-based approach to wire diameter estimation is proposed for the first time. Section 5 presents experiments for both the wire shape estimation and the wire diameter recognition, with the aim of evaluating the effectiveness of the proposed approaches. A video has been prepared and attached to this paper (see the Supplementary Materials) to show the use of the wire diameter estimation during a sorting task.

2. Sensor Technology

The sensor used in this paper consists of an optoelectronic Printed Circuit Board (PCB) with a discrete number of photo-reflectors, suitably assembled with a deformable layer. The contact information is obtained from the deformation of this layer, measured by the optical sensing points, here called “taxels”. The main characteristics of the sensor, detailed in [22], are summarized in Table 1.

2.1. Hardware

The electronic part of the tactile sensor consists of two PCBs: one with optoelectronic-based taxels and one for power supply management and external communication. The first PCB integrates 25 taxels organized in a 5 × 5 matrix, corresponding to a sensitive area of about 21 × 21 mm² with a spatial resolution of 3.55 mm. The sensitive area corresponds to the contact area of the deformable pad (described below), whose maximum dimensions have been fixed at 3.55 mm beyond the outermost taxels. Each taxel consists of a photo-reflector, i.e., the combination of an infrared LED and an optically matched phototransistor. Mounted on this PCB, there is a low-power microcontroller by Microchip (code PIC16F19175), which integrates more than 25 low-noise A/D channels with 12-bit resolution, allowing a simple interrogation firmware (explained in the next section) and a good signal-to-noise ratio. Finally, there are two adjustable current sources driving the LEDs, analogue buffers to decouple the phototransistors’ voltage signals from the A/D channels, and additional electronic components (resistors and diodes) used to improve the stability of the emitted light with respect to temperature variations. The second PCB is mechanically connected to the first through standard connectors, and it has been developed to adapt the sensor interface to different needs without changing the sensing part. In fact, by modifying only this second PCB, it is possible to easily use the sensor with different electric interfaces or voltage supplies. For the configuration used in this paper, this PCB hosts the microcontroller programming interface and the DC/DC converters used to generate the needed power supplies from the available 5 V.
The mechanical part is composed of a deformable pad, a rigid grid between the pad and the PCB, and a two-part case. The deformable pad transduces the external contacts into deformations that can be sensed by the taxels and is made of silicone due to its good elastic properties and low hysteresis. The rigid grid is used to achieve a precise alignment between the taxels and the deformable pad. Moreover, the grid thickness guarantees that the photo-reflectors work in a monotonic range. Finally, the external case is composed of a bottom part, designed for the precise housing of the PCBs, and a top part to cover the electronic boards and lock the silicone pad along its edges. Figure 1a,b shows the sensorized finger mounted on a commercial gripper and the taxel disposition in the 5 × 5 grid, respectively. The location of each taxel is denoted as t_ij by exploiting its row and column indices within the matrix. The interested reader can find all details of the electronic and mechanical design in [22].

2.2. Software

The software is divided into two parts: the microcontroller (MCU) side and the elaboration system side (a PC-based elaboration unit). Starting from the former, the firmware was designed to be as efficient and simple as possible. Therefore, the MCU only performs the voltage conversion and sends the digitized data over the serial bus. In addition, it accepts simple commands to provide useful information about the tactile sensor, i.e., the number of taxels and the sensor ID.
The software on the elaboration system side is based on the Robot Operating System (ROS). The application, consisting of an ROS node, mainly takes care of the communication with the MCU and makes the tactile data available to the user. Basically, the node continuously asks the MCU for new data, reconverts the received digitized data into voltage values, indicated as v_ij, and publishes the latter in an ROS topic. In addition, it removes the offset of each taxel, computed as the mean value of the first 50 received v_ij samples, and publishes the resulting voltage variations Δv_ij in another ROS topic. Details of the software can be found in [22].
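As a rough illustration of this data flow, a minimal Python/rospy sketch is reported below. The topic names, the driver object wrapping the serial link and the ADC reference voltage are hypothetical choices made for the example, not the authors’ actual implementation.

    # Minimal sketch of the PC-side elaboration node (hypothetical names and topics).
    import rospy
    import numpy as np
    from std_msgs.msg import Float32MultiArray

    N_TAXELS = 25          # 5 x 5 taxel matrix
    N_OFFSET_SAMPLES = 50  # samples used to estimate the per-taxel offset

    class TactileNode:
        def __init__(self, driver):
            self.driver = driver            # hypothetical object wrapping the serial link to the MCU
            self.offset_buffer = []
            self.offset = np.zeros(N_TAXELS)
            self.pub_v = rospy.Publisher("tactile/voltages", Float32MultiArray, queue_size=10)
            self.pub_dv = rospy.Publisher("tactile/voltage_variations", Float32MultiArray, queue_size=10)

        def spin(self):
            rate = rospy.Rate(500)          # sampling frequency from Table 1
            while not rospy.is_shutdown():
                raw = self.driver.read_samples()      # 25 digitized values from the MCU
                v = np.asarray(raw) * 3.3 / 4095.0    # 12-bit ADC counts -> volts (assumed 3.3 V reference)
                self.pub_v.publish(Float32MultiArray(data=v.tolist()))
                if len(self.offset_buffer) < N_OFFSET_SAMPLES:
                    self.offset_buffer.append(v)      # accumulate the first 50 samples
                    self.offset = np.mean(self.offset_buffer, axis=0)
                else:
                    self.pub_dv.publish(Float32MultiArray(data=(v - self.offset).tolist()))
                rate.sleep()

    if __name__ == "__main__":
        rospy.init_node("tactile_sensor_node")
        # TactileNode(SerialDriver("/dev/ttyUSB0")).spin()   # SerialDriver is a hypothetical helper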

3. Normalization with Linearization and Wire Shape Estimation

In object manipulation tasks, it is very important to know how the object is positioned between the fingers. In switchgear assembly, for example, knowing how the wire has been grasped allows the end effector pose to be corrected, ensuring that tasks such as insertion and/or cabling are completed successfully. For this reason, an algorithm for wire shape estimation is presented here. The proposed method exploits and improves the one formulated in [23], where the wire shape in the contact area was approximated with a quadratic function. In that paper, the estimation consisted of three steps: detection of the main grasping direction of the wire, computation of the centroid coordinates depending on the main direction, and computation of the parameters of the quadratic function using the centroids. The shape estimation errors were mainly due to the different sensitivities of the taxels and to the use of all voltage variations in the wire shape computation, even in instances where only a subset of taxels was actually involved in the contact. The improvements to the method proposed here mainly consist of: using normalized voltage values; extending the estimation to a 5 × 5 taxel matrix; using only the taxels that are useful for the estimation.

3.1. Normalization with Linearization

This section explains a normalization procedure for the voltage signals acquired from the tactile sensor. The normalization can improve the sensor performance and, in particular, the wire shape estimation. The need to normalize the outputs of the tactile sensor (i.e., the voltage variations) is due to the differences among the responses of the phototransistors. These differences depend on several production aspects, such as: the soldering of the optoelectronic components, imperfections of the handmade silicone reflective surfaces, the bonding of the grid on the PCB and the bonding of the deformable pad on the grid. As an example, Figure 2a shows how the Δv_ij responses of some taxels to the same stimulus can differ. It is possible to see that the differences are not only in terms of the maximum reached value, but also in the shape of the response curve. For this reason, it is not possible to normalize by simply dividing the Δv_ij values by their maximums; it is necessary to use all the response values and apply a linearization.
The whole procedure can be divided into two parts: a calibration phase and a value conversion phase. During the first one, the tactile fingers are slowly closed until the maximum closure is reached, and the Δv_ij responses of the 25 taxels are captured and stored. In this way, it is not necessary to repeat the calibration every time. The data obtained during the calibration consist of 25 different arrays, one per taxel, containing all the Δv_ij voltage variations acquired during the finger closure. Once the responses are known, a conversion phase (normalization with linearization) is applied to each taxel.
The conversion phase consists of the implementation of 25 standard lookup tables, one for each taxel, starting from the data captured during the calibration. The lookup tables are two-column matrices, where the first column contains the Δv_ij voltage variations and the second one contains the corresponding normalized values Δṽ_ij. These tables are then used to convert the voltage variation of each taxel into the corresponding normalized value during the subsequent data elaboration.
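As an illustration, the sketch below shows how such per-taxel lookup tables could be built and queried in Python. The choice of mapping each calibration sample to the fraction of the closure trajectory at which it was recorded is an assumption made here for the example, since the paper only specifies the two-column table structure.

    # Sketch of the normalization-with-linearization step (illustrative assumptions, see text above).
    import numpy as np

    def build_lookup_table(calib_dv):
        """calib_dv: 1-D array with the voltage variations of one taxel acquired during the slow closure."""
        dv = np.asarray(calib_dv, dtype=float)
        norm = np.linspace(0.0, 1.0, dv.size)   # assumed normalized targets along the closure trajectory
        order = np.argsort(dv)                  # np.interp requires increasing abscissae
        return dv[order], norm[order]

    def normalize(dv_value, table):
        """Convert a measured voltage variation into its normalized value via the taxel lookup table."""
        dv_cal, norm_cal = table
        return float(np.interp(dv_value, dv_cal, norm_cal))

    # One table per taxel, built once from the calibration data:
    # tables = [build_lookup_table(calib[k]) for k in range(25)]
    # dv_norm = [normalize(dv[k], tables[k]) for k in range(25)]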
Figure 2b shows the normalized signals obtained by applying the lookup tables to the signals in Figure 2a, which are a subset of the tactile signals acquired during the calibration phase.

3.2. Wire Shape Estimation

An improved version of the algorithm for wire shape estimation presented in [23] is proposed here. The first step consists of determining the main direction of the grasped wire, i.e., horizontal (aligned with the x-axis) or vertical (aligned with the y-axis), with respect to the sensor pad reference frame shown in Figure 1b. This step can be achieved in the same way as explained in [23]. Hence, in the following, it is assumed that the main direction of the grasped wire is known and is the horizontal one, without loss of generality. Considering the sensing area of the current tactile sensor (i.e., 21 × 21 mm²) and the fact that standard wires are flexible enough to present simple curvatures in such an area, the shape of the grasped wire can be modelled by a quadratic function. Hence, the considered model for the horizontal grasp can be formalized as in [23]:
y = a x² + b x + c        (1)
where the parameters a, b and c have to be estimated to define the wire shape. For a vertically grasped wire, the variables x and y are simply exchanged.
Differently from [23], where all the taxels were used for the wire shape estimation, in this paper the second step of the algorithm determines the rows of the sensing grid that are actually involved in the wire grasp. Only these rows are used for the subsequent estimation of the wire shape model parameters in Equation (1). In order to identify the taxels involved in the contact, it is possible to localize the taxel with the maximum Δṽ_ij value in each column and then use only the rows containing these taxels and the adjacent ones. Three simple examples are reported in Figure 3, where the red dots correspond to the taxels with the maximum value in each column, while the rows with red and green dots are used for the estimation of the model parameters. The rows in black are excluded from the computation. Excluding the taxels far from the region of interest (i.e., the area where the wire is actually grasped) avoids considering the mechanical coupling among taxels due to the silicone elasticity, which could be different for two points on the sensing pad and would only introduce spurious information about the contact.
From the implementation perspective, the vector containing the indices i_max,j of the taxels with the maximum value in each column can be defined as
i_max = [ i_max,1, i_max,2, i_max,3, i_max,4, i_max,5 ]        (2)
On the basis of the i_max indices, the centroid y-coordinate y_j^c of each column can be computed as
y_j^c = ( Σ_{i = i_start}^{i_end} y_i Δṽ_ij ) / ( Σ_{i = i_start}^{i_end} Δṽ_ij ),   j = 1, …, 5        (3)
with
i_start = max(1, min(i_max) − 1),   i_end = min(5, max(i_max) + 1)
In Equation (3), the term y_i is the mechanical y-coordinate of the i-th row and the term Δṽ_ij is the normalized value corresponding to the voltage variation of the taxel t_ij. The complete coordinates of the centroids are (x_j, y_j^c), where x_j is the mechanical x-coordinate of the j-th column. Once the five centroids have been obtained, the three parameters a, b and c in Equation (1) are computed using a least squares method. The whole procedure for a horizontally grasped wire is summarized in the pseudo-code of Algorithm 1. In case the wire is vertically oriented, the procedure remains the same, with row and column indices exchanged.
Algorithm 1: Pseudo-code for wire shape estimation
         Input: Taxel normalized values Δṽ_ij and taxel mechanical coordinates (x_j, y_i)
         Output: a, b and c parameters in Equation (1)
         1: Construct the vector in Equation (2);
         2: Compute the y-coordinate of the 5 centroids by using Equation (3);
         3: Use the centroids to compute a, b and c in Equation (1) via a least squares method;
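For readers who prefer executable code, a compact Python rendering of Algorithm 1 is reported below (a sketch only, assuming 0-based indexing, a 5 × 5 array of normalized values and taxel coordinates spaced by the 3.55 mm pitch).

    # Sketch of Algorithm 1 for a horizontally grasped wire (0-based indices).
    import numpy as np

    PITCH = 3.55                        # spatial resolution in mm
    X = np.arange(5) * PITCH            # mechanical x-coordinates of the columns
    Y = np.arange(5) * PITCH            # mechanical y-coordinates of the rows

    def estimate_wire_shape(dv_norm):
        """dv_norm: 5x5 array of normalized values; returns (a, b, c) of y = a x^2 + b x + c."""
        i_max = np.argmax(dv_norm, axis=0)        # Equation (2): row of the maximum in each column
        i_start = max(0, i_max.min() - 1)         # row selection, clamped to the grid
        i_end = min(4, i_max.max() + 1)
        rows = slice(i_start, i_end + 1)
        # Equation (3): centroid y-coordinate of each column, using only the selected rows
        y_c = (Y[rows, None] * dv_norm[rows, :]).sum(axis=0) / dv_norm[rows, :].sum(axis=0)
        a, b, c = np.polyfit(X, y_c, deg=2)       # least squares fit of the quadratic model
        return a, b, c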

4. Wire Diameter Recognition

In wire manipulation applications, besides knowing the shape and pose of the wire between the fingers, it could be interesting to have information about the diameter of the grasped wire. For example: in switchgear assembly, this could be important to ensure that the wire which is about to be inserted has the correct diameter; in wire harness manipulation, this information could be used to confirm that, among all the wires, the correct one has been grasped; moreover, the wire diameter can be used to evaluate whether the grasping force will guarantee a stable grasp during a cabling task. Hence, in this section, a procedure for wire diameter recognition is proposed.
The first aspect to consider is what kind of information is available and can be used. Using only data from the tactile sensor would be possible. In this case, the diameter could be inferred from the force exerted by the wire on the sensor, or from the width of the contact area. However, with the first method, the force depends on how much the wire is squeezed, and the gripper fingers would have to be closed to the same distance for all wires. Considering that wires can have very different diameters (the ones considered in this paper range from 1.5 mm to 4.0 mm, in steps of 0.5 mm), it is not easy to find a unique finger distance that fits every wire. For the second method, some data post-processing would be necessary and, more importantly, a spatial resolution of 3.55 mm could be a problem when distinguishing diameters that differ by only 0.5 mm. For these reasons, the distance between the fingers, typically available from a standard gripper, is used in addition to the tactile data.
The second aspect is how to use the available information to recognize the wire diameter. In this case, since the diameter can assume a finite number of values, a classifier is used to accomplish the task. In order to create the classifier, a dataset composed of several wire observations is necessary. Each observation contains data from the tactile sensor and the gripper (the finger distance), which in this case is a Robotiq Hand-E, as well as the actual wire diameter. Therefore, an observation can be seen as an array where the first element is the finger distance, the elements from 2 to 26 are the normalized values Δṽ_ij of the 25 taxels, and the last element is the wire diameter. The dataset is obtained by grasping the wires a sufficient number of times by means of a robotic arm (a UR5e by Universal Robots), varying the finger distance and the grasping pose at each grasp (see Figure 4), so that the classifier can recognize the diameter independently of the pose of the grasped wire. The different grasping poses (such as the ones reported in Figure 4) were obtained by randomly changing the angle and the offset of the wire grasping point with respect to the center of the tactile sensor pad. The whole dataset used for this paper is composed of about 20,000 observations and was separated into two subsets: a training set (containing 80% of the data) and a testing set (containing the remaining 20% of the data).
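For illustration only, the following sketch shows how a single observation could be assembled and how the dataset could be split; the helper names and the “no-wire” encoding are hypothetical.

    # Sketch of the dataset construction (hypothetical helpers; 27 elements per observation).
    import numpy as np

    def make_observation(finger_distance, dv_norm, wire_diameter):
        """finger_distance: gripper reading; dv_norm: 5x5 normalized taxel values;
        wire_diameter: label in mm (e.g., 0.0 could encode the 'no-wire' class)."""
        return np.concatenate(([finger_distance], np.asarray(dv_norm).flatten(), [wire_diameter]))

    # dataset = np.vstack([make_observation(d, v, lab) for d, v, lab in grasps])  # ~20,000 rows
    # rng = np.random.default_rng(0)
    # idx = rng.permutation(len(dataset))
    # n_train = int(0.8 * len(dataset))
    # train, test = dataset[idx[:n_train]], dataset[idx[n_train:]]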
Three classification methods were considered: the first one, based on a standard lookup table, was used as a reference and compared with two solutions based on machine learning techniques.

4.1. Lookup Table

The first method of classification consists of using a 2-D lookup table (partially reported in Table 2), which is built from the training dataset as follows:
1.
Computing the sum of the normalized values Δṽ_ij of the 25 taxels for each observation;
2.
Separating the observations according to the finger distance (the gripper used maps the distance to the range [0, 255], with 0 corresponding to fully open and 255 to fully closed);
3.
Computing the mean value of the sums obtained in step 1 for each group found in step 2.
Therefore, the input for the classifier based on the lookup table is composed of the sum of the Δṽ_ij of the 25 taxels and the finger distance. The output, i.e., the wire diameter, is obtained by searching for the entry in the table with the same finger distance and the value nearest to the input sum.
As a clarifying example, when the inputs of the classifier are a finger distance of 218 and a sum of Δṽ_ij over the 25 taxels equal to 3.2, looking at the fourth row of Table 2, the column with the value nearest to the given input sum is the one corresponding to the wire with a diameter of 2.5 mm; hence, the predicted diameter will be 2.5 mm.
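The nearest-value search can be summarized with the short sketch below (a simplified illustration with hypothetical variable names).

    # Sketch of the lookup-table classifier: pick the diameter whose stored mean sum
    # (for the given finger distance) is nearest to the measured sum of normalized values.
    import numpy as np

    DIAMETERS = np.array([1.5, 2.0, 2.5, 3.0, 3.5, 4.0])   # mm

    def classify_lookup(finger_distance, dv_norm, table):
        """table: dict mapping a finger distance to the 6 mean sums of Table 2 (one per diameter)."""
        s = np.asarray(dv_norm).sum()                        # sum of the 25 normalized values
        row = table[finger_distance]
        return DIAMETERS[np.argmin(np.abs(row - s))]

    # Example with the row for distance 218 taken from Table 2:
    # table = {218: np.array([0.9518, 2.2716, 3.0802, 3.5735, 6.2054, 7.9888])}
    # classify_lookup(218, dv_norm, table)  # with sum(dv_norm) = 3.2 -> returns 2.5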

4.2. Machine Learning

The other two classification methods are based on machine learning. In particular, two techniques were considered: Support Vector Machines (SVMs) and Neural Networks (NNs).
Starting from the former, since SVMs can only be used for binary classification, while in the considered case there are seven different classes (one for each wire plus the “no-wire” case), an Error-Correcting Output Codes (ECOC) classifier was used [24]. This consists of a binary SVM for each possible combination of two different classes, i.e., 21 in the case of seven classes. The SVMs use second-order polynomial kernel functions, and the ECOC classifier was trained in Matlab using the command “fitcecoc” on the training dataset. The resulting accuracy is close to 100%.
The second considered classifier is based on a classical NN. This was obtained using the Matlab “fitcnet” command on the training dataset, leaving all the parameters at their default values [25]. The network structure is pictured in Figure 5. The resulting accuracy for this classifier is slightly lower than that of the first one, but still above 99%.
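For readers not working in Matlab, an approximate scikit-learn analogue of the two classifiers could look as follows. This is a sketch of equivalent tools, not the authors’ code; in particular, the feature scaling and the hidden layer size of the NN are assumptions.

    # Approximate scikit-learn analogue of the ECOC-SVM and NN classifiers (sketch only).
    from sklearn.svm import SVC
    from sklearn.multiclass import OneVsOneClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # X: finger distance + 25 normalized taxel values (26 features); y: diameter class labels
    svm_clf = make_pipeline(
        StandardScaler(),
        OneVsOneClassifier(SVC(kernel="poly", degree=2)),       # one binary SVM per class pair (21 for 7 classes)
    )
    nn_clf = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000),  # small fully connected network (assumed size)
    )
    # svm_clf.fit(X_train, y_train)
    # nn_clf.fit(X_train, y_train)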

4.3. Validation of Classifiers

This section presents the validation of the three proposed classification methods, using the testing part of the dataset, which was not used during the training phase. The results are presented in the form of confusion matrices in Figure 6. Here, it is easy to see how the method based on the lookup table performs poorly compared to the other two. The reason for this different performance could be that, by using the sum of the 25 taxel values instead of all the separate values, some information is lost. However, the errors mainly consist of confusing the class of a wire with an adjacent one, i.e., a difference in diameter of 0.5 mm.
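The validation itself amounts to computing the accuracy and the confusion matrix on the held-out test set, e.g., with standard scikit-learn utilities (a sketch, assuming the classifiers of the previous sketch).

    # Sketch of the validation step on the held-out test set.
    from sklearn.metrics import accuracy_score, confusion_matrix

    # y_pred = svm_clf.predict(X_test)
    # print("accuracy:", accuracy_score(y_test, y_pred))
    # print(confusion_matrix(y_test, y_pred))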

5. Experiments

This section presents some experiments for the three topics presented in the previous sections, i.e., normalization, shape estimation and wire diameter recognition, with the aim of demonstrating their effectiveness.

5.1. Normalization

In order to see the effect of applying the normalization to the voltage variations, a flat surface was placed on the silicone pad of the tactile sensor. A comparison of the two tactile maps is shown in Figure 7, where the size of the blue dots is proportional to the non-normalized voltage variations Δv_ij output by the sensor, and the size of the red ones is proportional to the normalized values Δṽ_ij. It is clear that the red dots all have the same size, or their size changes only slightly, as expected when grasping a flat surface, while the sizes of the blue dots can be very different, even for two adjacent taxels. For example, in Figure 7, the second and third blue dots in the last row differ considerably in size, even though they are equally stimulated. This behaviour can lead to problems in the subsequent tactile data elaborations, e.g., in the shape estimation algorithm, as shown in the next experiment.

5.2. Shape Estimation

The wire shape estimation obtained by directly using the Δv_ij voltage variations coming from the tactile sensor can lead to strange behaviour for some grasped wire poses. For example, for the tactile sensor used in this paper, this occurs when the wire is positioned near the bottom of the silicone pad, where the taxel responses are very different. In fact, as highlighted in the previous experiment, the voltage variations of the taxels in the bottom row are very different, even when stimulated in the same way using a flat surface. For this reason, when the centroids are computed, the one related to the second column is slightly shifted upwards, and therefore the shape estimation is wrong. This problem does not occur if normalized values are used, since all taxels have almost the same values when they are equally stimulated, as shown in the previous section.
In order to fully highlight the problem, the situation explained above was reproduced in an experiment, and the results are presented in Figure 8. It is easy to see that the shape estimated using the Δv_ij voltage variations is wrong (red line) because of the centroids (green triangles) of the second and fourth columns. Looking at the green line, which is the shape estimated using the normalized values Δṽ_ij, it is clear that the estimation is much closer to the real shape (as shown in Figure 8b). In fact, since in this case there are no big differences among the taxel values in the last row, all the centroids (orange stars) have almost the same y-coordinate, and so the interpolating curve is quasi-linear, as it should be, given the actual wire shape.

5.3. Wire Diameter Recognition

In order to test the “online” diameter recognition, an experiment was carried out, consisting of a real-time task in which the robot sorts some electrical wires depending on their diameters. For the experiment, the twelve wires shown in Figure 9 were used. During the task, a user places one of the wires between the fingers of the gripper with a random pose; the fingers are then closed, and the robot places the wire in the correct position on the desk, depending on the diameter estimated by the trained classifier (in this case, the one based on the SVMs, since it is the best-performing one, as shown in Section 4.3). The robot also handles the case where no wire is grasped and, in this case, it remains in the initial position, ready to perform a new wire grasp.
Some frames from the video of the experiment (see the Supplementary Materials) are reported in Figure 10, where it is possible to see that the robot is given a wire that is correctly recognized as having a diameter of 4.0 mm, which is then placed by the robot in the corresponding space on the desk.
Figure 11 contains a frame showing how the wires were sorted by the robot on the basis of the estimated diameters. Comparing this with Figure 9, it is clear that the robot correctly sorted all the wires and that the classifier performed quite well. The attached video shows the complete sorting sequence of the wires on the basis of their estimated diameters, repeated twice. At the beginning of the video, the correct distribution of the wires among the different diameter areas is shown. Then, the operator presents the wires in a random order, with random poses between the gripper fingers. The robot sorts the wires on the table on the basis of an on-line diameter classification. At the end of the task, all wires are correctly sorted. The task is then repeated by placing the wires between the fingers in a different random sequence. The second sorting is also completed correctly. Finally, a comparison of the initial wire distribution and the two obtained sortings is reported, to help the reader evaluate the effectiveness of the proposed approach.

6. Conclusions and Future Works

This paper presented an extended and improved version of the algorithm for wire shape reconstruction presented in [23]. The aforementioned algorithm was adapted to a tactile sensor with a 5 × 5 taxel grid and does not directly use the voltage variations to estimate the wire shape, but the corresponding suitably normalized values, obtained through a normalization procedure also presented in this paper. Experiments showed that the improved method performs better. In addition, this paper presented a method to recognize the diameter of a grasped wire by using a suitably trained classifier, which exploits the normalized data. The proposed approach was experimentally validated by showing how the classifier performs in a task in which an industrial robotic arm sorts wires by their diameters, with a success rate close to 100%.
The limitations of the proposed approach are mainly related to the use of a specific commercial gripper and to the flexibility of the material used for the finger realization. In particular, the normalization with linearization also depends on the finger deformation during the gripper closure. The current finger cases, made of nylon, undergo a mechanical flexion during the gripper closure, which increases the differences between the taxel signals. This effect is compensated for by the proposed approach but, in future activities, the mechanical stiffness of the finger cases should be increased. Regarding the wire diameter estimation, the proposed classifiers also use the finger distance, whose representation changes with different commercial grippers. A generalization of this representation should be tackled in future works.

Supplementary Materials

Author Contributions

Hardware design, S.P.; software design, A.C. and G.L.; methodology and experiments definition, G.L. and S.P.; writing, all authors. All authors have read and agreed to the submitted version of the manuscript.

Funding

This work was partially supported by the European Commission within H2020 REMODEL Project (no. 870133).

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available since they are strictly related to the specific hardware used.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Howe, R.D. Tactile sensing and control of robotic manipulation. Adv. Robot. 1994, 8, 245–261.
  2. De Oliveira, T.E.A.; Cretu, A.M.; da Fonseca, V.P.; Petriu, E.M. Touch sensing for humanoid robots. IEEE Instrum. Meas. Mag. 2015, 18, 13–19.
  3. Wang, Y.; Chen, J.; Mei, D. Flexible Tactile Sensor Array for Slippage and Grooved Surface Recognition in Sliding Movement. Micromachines 2019, 10, 579.
  4. Liu, C.; Zhuang, C.; Nasrollahi, A.; Lu, L.; Haider, M.F.; Chang, F. Static Tactile Sensing for a Robotic Electronic Skin via an Electromechanical Impedance-Based Approach. Sensors 2020, 20, 2830.
  5. Rosle, M.H.; Wang, Z.; Hirai, S. Geometry Optimisation of a Hall-Effect-Based Soft Fingertip for Estimating Orientation of Thin Rectangular Objects. Sensors 2019, 19, 4056.
  6. Jones, D.; Wang, L.; Ghanbari, A.; Vardakastani, V.; Kedgley, A.E.; Gardiner, M.D.; Vincent, T.L.; Culmer, P.R.; Alazmani, A. Design and Evaluation of Magnetic Hall Effect Tactile Sensors for Use in Sensorized Splints. Sensors 2020, 20, 1123.
  7. Makihata, M.; Muroyama, M.; Tanaka, S.; Nakayama, T.; Nonomura, T.; Esashi, M. Design and Fabrication Technology of Low Profile Tactile Sensor with Digital Interface for Whole Body Robot Skin. Sensors 2018, 18, 2374.
  8. Jamali, N.; Sammut, C. Material classification by tactile sensing using surface textures. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 2336–2341.
  9. Bandyopadhyaya, I.; Babu, D.; Kumar, A.; Roychowdhury, J. Tactile sensing based softness classification using machine learning. In Proceedings of the 2014 IEEE International Advance Computing Conference (IACC), Gurgaon, India, 21–22 February 2014; pp. 1231–1236.
  10. Liu, W.; Zhan, B.; Gu, C.; Yu, P.; Zhang, G.; Fu, X.; Cipriani, C.; Hu, L. Discrimination of Object Curvature Based on a Sparse Tactile Sensor Array. Micromachines 2020, 11, 583.
  11. Prado da Fonseca, V.; Alves de Oliveira, T.E.; Petriu, E.M. Estimating the Orientation of Objects from Tactile Sensing Data Using Machine Learning Methods and Visual Frames of Reference. Sensors 2019, 19, 2285.
  12. Gu, H.; Zhang, Y.; Fan, S.; Jin, M.; Zong, H.; Liu, H. Model recovery of unknown objects from discrete tactile points. In Proceedings of the 2016 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Banff, AB, Canada, 12–15 July 2016; pp. 1121–1126.
  13. Meier, M.; Schopfer, M.; Haschke, R.; Ritter, H. A Probabilistic Approach to Tactile Shape Reconstruction. IEEE Trans. Robot. 2011, 27, 630–635.
  14. Schneider, A.; Sturm, J.; Stachniss, C.; Reisert, M.; Burkhardt, H.; Burgard, W. Object identification with tactile sensors using bag-of-features. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 243–248.
  15. Petrovskaya, A.; Khatib, O.; Thrun, S.; Ng, A.Y. Bayesian estimation for autonomous object manipulation based on tactile sensors. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, FL, USA, 15–19 May 2006; pp. 707–714.
  16. Damian, D.D.; Newton, T.H.; Pfeifer, R.; Okamura, A.M. Artificial Tactile Sensing of Position and Slip Speed by Exploiting Geometrical Features. IEEE/ASME Trans. Mechatron. 2015, 20, 263–274.
  17. Stachowsky, M.; Hummel, T.; Moussa, M.; Abdullah, H.A. A Slip Detection and Correction Strategy for Precision Robot Grasping. IEEE/ASME Trans. Mechatron. 2016, 21, 2214–2226.
  18. Allen, P. Integrating vision and touch for object recognition tasks. Int. J. Robot. Res. 1988, 7, 15–33.
  19. Francomano, M.T.; Accoto, D.; Guglielmelli, E. Artificial Sense of Slip—A Review. IEEE Sens. J. 2013, 13, 2489–2498.
  20. Yousef, H.; Boukallel, M.; Althoefer, K. Tactile sensing for dexterous in-hand manipulation in robotics—A review. Sens. Actuators A Phys. 2011, 167, 171–187.
  21. Kappassov, Z.; Corrales, J.A.; Perdereau, V. Tactile sensing in dexterous robot hands—Review. Robot. Auton. Syst. 2015, 74, 195–220.
  22. Cirillo, A.; Costanzo, M.; Laudante, G.; Pirozzi, S. Tactile Sensors for Parallel Grippers: Design and Characterization. Sensors 2021, 21, 1915.
  23. Pirozzi, S.; Natale, C. Tactile-Based Manipulation of Wires for Switchgear Assembly. IEEE/ASME Trans. Mechatron. 2018, 23, 2650–2661.
  24. Escalera, S.; Pujol, O.; Radeva, P. On the decoding process in ternary error-correcting output codes. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 120–134.
  25. Nocedal, J.; Wright, S.J. Numerical Optimization, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2006.
Figure 1. Sensorized finger (a) and taxels disposition scheme (b).
Figure 2. Responses of some taxels to the same stimulus (finger closure): voltage variation values directly acquired from the sensor (a) and the corresponding normalized values (b).
Figure 3. Examples for rows selection: wire on one row (a); wire on two rows (b); wire on the edge (c).
Figure 4. Different grasping poses for the acquisition of the training and testing data.
Figure 5. Structure of the neural network used for the classifier.
Figure 6. Confusion matrices resulting from the validation of the testing dataset of lookup table (a), SVM (b) and NN (c) methods.
Figure 7. Tactile map of normalized values (red dots) vs. not-normalized ones (blue dots), during the grasping of a rigid flat object.
Figure 8. Comparison between shape estimations using not-normalized and normalized values: tactile maps with shape estimations (a) and estimations superimposed on the actual finger with the grasped wire (b).
Figure 9. Wires used for the sorting task.
Figure 10. Some frames taken from the video of the sorting task.
Figure 11. Wires at the end of the sorting task.
Table 1. Characteristics of the tactile sensor.
Number of Taxels       25               Response Time          <0.01 s
Sensing Area           21 × 21 mm²      Hysteresis Error       ≈5%
Spatial Resolution     3.55 mm          Repeatability Error    ≈3%
Sampling Frequency     500 Hz           Sensitivity            0.018 V/N
Table 2. Some rows of the lookup table for the classifier.
Finger       Output (Sum of Δṽ_ij for the 25 Taxels)
Distance     1.5 mm     2.0 mm     2.5 mm     3.0 mm     3.5 mm     4.0 mm
215          0.6168     1.0318     1.5376     2.0584     4.1569     5.7366
216          0.7366     1.3155     2.0428     2.5440     4.8277     6.4678
217          0.8844     1.7548     2.5433     3.0283     5.5081     7.2301
218          0.9518     2.2716     3.0802     3.5735     6.2054     7.9888
219          1.1271     2.7982     3.6801     4.1312     6.8996     8.6168
220          1.4070     3.4291     4.3323     4.7635     7.6136     9.3378
221          1.8602     4.0289     5.0258     5.4277     8.2748     9.9498
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

