Article

Tactile Perception Object Recognition Based on an Improved Support Vector Machine

State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, China
* Author to whom correspondence should be addressed.
Micromachines 2022, 13(9), 1538; https://doi.org/10.3390/mi13091538
Submission received: 24 August 2022 / Revised: 13 September 2022 / Accepted: 15 September 2022 / Published: 17 September 2022
(This article belongs to the Special Issue Embedded System for Smart Sensors/Actuators and IoT Applications)

Abstract

Tactile perception is an irreplaceable source of information for humans exploring the surrounding environment and has advantages over sight and hearing in processing the material properties and detailed shapes of objects. However, as tactile perception features grow more uncertain and complex, it is often difficult to collect highly usable, purely tactile datasets for research in this field. Here, we propose a method for object recognition on a purely tactile dataset and release the original tactile dataset. First, we improved the differential evolution (DE) algorithm, and then we used it to optimize the key parameter of the Gaussian kernel function of the support vector machine (SVM), improving the accuracy of purely tactile target recognition. Comparative experiments show that our method recognizes targets better than classical machine learning algorithms. We hope to further improve the generalizability of this method and to provide an important reference for research in the field of tactile perception and recognition.

1. Introduction

Tactile perception is an important part of the autonomous operation of a robot’s dexterous hand. It provides information on the force and surface characteristics at the contact point between a robot’s finger and the target [1,2]. Giving robots the ability to accurately recognize objects through tactile perception is an urgent scientific challenge in the field of intelligent robot research [3]. Machine learning algorithms have powerful data training capabilities and can effectively classify different data [4]. Classic machine learning algorithms (such as the k-nearest neighbors algorithm (KNN) [5,6], decision tree (DT) [7], Gaussian naive Bayes (GNB) [8], support vector machine (SVM) [9,10], random forest (RF) [11], etc.) can be applied to the object recognition problem for tactile sensing data. The latest advances in robotic tactile sensing have promoted the development of many scientific computing technologies. Applying these technologies to the important sensory channel of touch has great practical significance for a leap forward in the development of intelligent robots.
In recent years, tactile sensor technology has developed rapidly, with many advances in performance and applications [12,13,14]. Tactile sensors can detect the force exerted on a target in real time, and the detected tactile pressure data can be applied to the object recognition problem [15]. Alin Drimus, Gert Kootstra, et al. [16] demonstrated the application of tactile sensors in active target recognition systems. Their approach was based on a k-nearest neighbor classifier that uses dynamic time warping to compute the distance between different time series, and it could successfully identify the target. Subramanian Sundaram et al. [17] designed a stretchable tactile glove with 548 sensors covering the entire hand. They built a deep convolutional neural network model to process and analyze the tactile data maps, finally realizing the classification of 26 types of objects.
Recent studies have shown that tactile target recognition can also be addressed with the spatio-temporal hierarchical matching pursuit (ST-HMP) unsupervised feature learning method [18], deep convolutional neural network models [19], and the linear dynamical system-based fuzzy C-means clustering method (LDS-FCM) [20]. Brayan S. Zapata-Impata et al. [21] used a multi-array tactile sensor grasping platform to carry out target-grasping experiments, obtain tactile data, and identify targets. The platform was built by installing three BioTac SP tactile sensors, developed by SynTouch, on the tips of the index finger, middle finger, and thumb of the Shadow Dexterous Hand developed by Shadow Robot. Satoshi Funabashi et al. [22] studied tactile target recognition from relatively densely distributed force vector measurements. First, uSkin tactile sensors were embedded in the Allegro hand, providing a total of 240 three-axis force vector measurements across all fingers to obtain time-series training and test data. Object recognition was then realized by simple feedforward, recurrent, and convolutional neural networks, achieving a recognition rate as high as 95% over 20 targets. Chunfang Liu et al. [20] proposed LDS-FCM, a new and efficient codebook-based clustering method for spatio-temporal tactile target recognition; the VLAD method was used to obtain the final feature description of the tactile data, and the approach was verified on five public databases.
The above methods, although validated on different datasets, used processed public datasets. These datasets are insufficient to guarantee that the methods are equally effective at solving more primitive and diverse recognition problems. Verifying a method experimentally on a self-collected dataset would better demonstrate its generalization ability.
In response to the above problems, we have provided the original tactile dataset for target recognition and proposed a method to optimize the support vector machine. We have also verified the effectiveness of the method through experiments. We have achieved target recognition through data collection, data processing, data analysis, and other processes.
The main contributions of this paper lie in the following aspects:
(1) We have realized the effective application of different machine learning algorithms in the target recognition of pure tactile perception. We have proposed an improved support vector machine algorithm for tactile perception recognition and verified the effectiveness of the method.
(2) We used grips instead of single touch to record tactile pressure data, as grip is more useful than touch for the operation of intelligent robots and external factors have very little effect on such an experiment.
(3) A new, real, pure tactile grasping perception dataset is released in this study, which provides a useful data resource for researchers in related fields to further study the methods of learning tactile perception.
The rest of this paper is organized as follows: Section 2 introduces the classic machine learning algorithms and proposes an improved SVM algorithm. Section 3 describes the design and construction of the tactile dexterous-hand grasping platform for collecting tactile perception data. Section 4 explains the design and production of the tactile dataset and compares and analyzes the experimental results. The final section summarizes the research content.

2. An SVM Optimization Method Based on Tactile Perception Target Recognition

The differential evolution algorithm is simple in structure and easy to execute. It has the advantages of high optimization efficiency, simple parameter setting, and good robustness [23,24]. Therefore, we selected the DE algorithm in the intelligent optimization algorithm to optimize the machine learning algorithm. We proposed a method to optimize the SVM using a modified DE algorithm. This method is used to solve the object recognition problem. We optimized the parameter γ of the Gaussian kernel function of the SVM (see Equation (9) for the calculation of the Gaussian kernel function) in order to improve the training accuracy of the SVM classification model.

2.1. Improved Differential Evolution Algorithm

2.1.1. The Original Differential Evolution Algorithm

(1) Generate the initial population. The initial population $\{ x_i(0) \mid x_{j,i}^L \le x_{j,i}(0) \le x_{j,i}^U,\ i = 1, 2, \dots, NP;\ j = 1, 2, \dots, D \}$ is generated randomly:

$$x_{j,i}(0) = x_{j,i}^L + \mathrm{rand}(0,1)\,\big(x_{j,i}^U - x_{j,i}^L\big) \tag{1}$$

where $x_i(0)$ represents the $i$-th individual of the 0-th generation in the population, $x_{j,i}(0)$ represents the $j$-th “gene” of the $i$-th individual in the 0-th generation, $NP$ represents the population size, and $\mathrm{rand}(0,1)$ represents a random number uniformly distributed in the interval (0, 1).
(2) Mutation. DE achieves individual variation through a differential strategy; we chose the strategy shown in (2):

$$v_i(g+1) = x_{r_1}(g) + F \big( x_{r_2}(g) - x_{r_3}(g) \big) \tag{2}$$

where $i \ne r_1 \ne r_2 \ne r_3$, $F$ is the mutation (scaling) factor, generally chosen in the range (0.2, 1.5), and $x_i(g)$ represents the $i$-th individual in the $g$-th generation population.
(3) Crossover. Perform inter-individual crossover between the $g$-th generation population $\{x_i(g)\}$ and its mutant intermediate $\{v_i(g+1)\}$:

$$u_{j,i}(g+1) = \begin{cases} v_{j,i}(g+1), & \text{if } \mathrm{rand}(0,1) \le CR \ \text{or}\ j = j_{\mathrm{rand}} \\ x_{j,i}(g), & \text{otherwise} \end{cases} \tag{3}$$

where $CR$ is the crossover probability, generally taking a value in (0, 1), and $j_{\mathrm{rand}}$ is a random integer from $\{1, 2, \dots, D\}$.
(4) Selection. DE uses a greedy strategy to select the individuals that enter the next-generation population:

$$x_i(g+1) = \begin{cases} u_i(g+1), & \text{if } f(u_i(g+1)) \le f(x_i(g)) \\ x_i(g), & \text{otherwise} \end{cases} \tag{4}$$
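Taken together, steps (1) to (4) can be sketched as a compact DE/rand/1/bin minimizer. This is an illustrative reconstruction, not the authors' code; the function name, the bounds format, and the default parameter values are our own choices.

```python
import numpy as np

def differential_evolution(f, bounds, NP=50, G=100, F=0.5, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin minimizer following steps (1)-(4)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T   # lower/upper bounds, shape (D,)
    D = lo.size
    # (1) Initialization: x_{j,i}(0) = x^L + rand(0,1) * (x^U - x^L)
    pop = lo + rng.random((NP, D)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for g in range(G):
        for i in range(NP):
            # (2) Mutation: v = x_r1 + F * (x_r2 - x_r3), with i != r1 != r2 != r3
            r1, r2, r3 = rng.choice([k for k in range(NP) if k != i], 3, replace=False)
            v = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # (3) Binomial crossover: take at least one gene from the mutant
            j_rand = rng.integers(D)
            mask = rng.random(D) <= CR
            mask[j_rand] = True
            u = np.where(mask, v, pop[i])
            # (4) Greedy selection: keep the trial if it is at least as good
            fu = f(u)
            if fu <= fit[i]:
                pop[i], fit[i] = u, fu
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

For example, minimizing the 4-dimensional sphere function `sum(x**2)` over [-5, 5] drives the best fitness close to zero within 100 generations.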

2.1.2. Improved Differential Evolution Algorithm (DE)

In the optimization process, the effects that the DE algorithm needs to achieve differ between the initial, middle, and late stages. If the mutation factor F is set to a constant, the algorithm can easily fall into a local optimum, reducing the convergence speed and search accuracy. Therefore, we adjusted F linearly, as shown in Formula (5), so that F changes as the number of iterations increases:

$$F = (F_1 - F_2)\,\frac{G - g}{G} + F_2 \tag{5}$$
where F1 and F2 are the maximum and minimum mutation factors, G is the maximum number of iterations, and g is the current number of iterations.
As the crossover factor CR has a large impact on the selection of next-generation individuals, a constant CR makes it difficult to improve the optimization efficiency of the DE algorithm. Therefore, we adaptively adjusted CR according to Formula (6):

$$CR = CR_2 + \frac{CR_1 - CR_2}{G}\, g \tag{6}$$
Here, CR1 and CR2 are the maximum and minimum crossover factors, respectively. G is the maximum number of iterations. g is the current number of iterations.
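The linear schedules in Formulas (5) and (6) translate directly into code. This is a minimal sketch; the default endpoint values for F1, F2, CR1, and CR2 are illustrative assumptions, since the paper does not state them.

```python
def adaptive_factors(g, G, F1=1.2, F2=0.2, CR1=0.9, CR2=0.1):
    """Mutation factor F (Formula (5)) and crossover factor CR (Formula (6))
    at generation g out of G total generations."""
    F = (F1 - F2) * (G - g) / G + F2   # decays linearly from F1 down to F2
    CR = CR2 + (CR1 - CR2) / G * g     # grows linearly from CR2 up to CR1
    return F, CR
```

At g = 0 the schedules start at (F1, CR2), favoring exploration; by g = G they reach (F2, CR1), favoring exploitation.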

2.2. Improved Differential Evolution Algorithm (DE) to Optimize Support Vector Machine (SVM)

2.2.1. The Original Support Vector Machine (SVM)

The SVM model represents training samples as points in space, mapped so that the samples of the two categories are separated by a clear gap that is as wide as possible [25]. New samples are then mapped into the same space, and the category they belong to is predicted based on which side of the gap they fall on.
The SVM found a separating hyperplane in the distributed data as the decision boundary so that the classification error of the model on the data was as small as possible [26]. First, the support vector was found, then the distance between the support vector and the separation plane was maximized, and finally, the most suitable hyperplane was found.
The classification decision function is:
$$f(x) = \mathrm{sign}\big(w^T x + b\big) \tag{7}$$

The optimization objective function is:

$$\min_{w,b} \|w\| \quad \text{s.t.} \quad y_i\big(w^T X_i + b\big) \ge \delta, \quad i = 1, \dots, m \tag{8}$$

where $w$ is the vector perpendicular to the hyperplane, $x$ is the support vector, and $b$ is the intercept.

2.2.2. Improved Differential Evolution Algorithm to Optimize Support Vector Machine (DESVM)

Because the sample feature dimension was not high and the number was not large, we adopted the Gaussian kernel function as the kernel function of the support vector machine. We used an improved DE algorithm to optimize the parameter γ of the SVM kernel function so that the recognition effect of the SVM reached the best result.
The Gaussian kernel function formula is as follows:
$$k(x, y) = \exp\big(-\gamma \|x - y\|^2\big) \tag{9}$$

Here, $\gamma$ is the kernel parameter.
The process of the DE algorithm optimization SVM is shown in Figure 1.
The relevant parameters of the DE algorithm were set as follows: the maximum number of generations G = 50; the population size Size = 50; the dimension of the search space D = 4; and the mutation factor F and crossover factor CR, obtained from the linear schedules in Formulas (5) and (6). The purpose of optimizing the SVM parameters was to maximize classification accuracy, so the fitness value of the DE algorithm was set to fit = accuracy (classification accuracy). After DE optimization, the optimal value of the objective function and the optimal SVM parameters were finally obtained.
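As a sketch of the optimization loop described above, an improved DE search can drive an SVM's kernel parameter γ toward maximal cross-validated accuracy. This is an illustrative reconstruction under stated assumptions: scikit-learn's `SVC` stands in for the SVM, the iris dataset stands in for the tactile dataset, the search is one-dimensional (γ only, so crossover degenerates to a coin flip), and the F1/F2/CR1/CR2 endpoints are our own choices.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def desvm_fitness(gamma, X, y):
    """Fitness = mean cross-validated accuracy of an RBF-kernel SVM."""
    return cross_val_score(SVC(kernel="rbf", gamma=gamma), X, y, cv=5).mean()

def optimize_gamma(X, y, NP=10, G=20, bounds=(1e-3, 10.0), seed=0):
    """DE over the single kernel parameter gamma, maximizing accuracy."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = lo + rng.random(NP) * (hi - lo)            # (1) initial population
    fit = np.array([desvm_fitness(g_, X, y) for g_ in pop])
    for g in range(G):
        F = (1.2 - 0.2) * (G - g) / G + 0.2          # linear mutation factor, Formula (5)
        CR = 0.1 + (0.9 - 0.1) / G * g               # adaptive crossover factor, Formula (6)
        for i in range(NP):
            r1, r2, r3 = rng.choice([k for k in range(NP) if k != i], 3, replace=False)
            v = float(np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi))  # (2) mutation
            u = v if rng.random() <= CR else pop[i]  # (3) crossover (1-D case)
            fu = desvm_fitness(u, X, y)
            if fu >= fit[i]:                         # (4) greedy selection (maximize)
                pop[i], fit[i] = u, fu
    best = int(np.argmax(fit))
    return pop[best], fit[best]
```

Running `optimize_gamma(X, y)` on a small dataset returns the best γ found and its cross-validated accuracy, mirroring the fit = accuracy objective described above.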

3. Tactile Data Collection Platform for the Dexterous Hand

3.1. Construction of the Data Collection Platform

Because effective tactile perception data requires tactile sensors with high sensitivity and stability, there are few relevant, usable datasets in the field of tactile perception research. In order to collect purely tactile datasets for object recognition, we built a dexterous-hand grasping experiment platform, as shown in Figure 2. The multi-array tactile sensor grasping platform consisted of a robotic arm developed by the ROKAE Robot Company, a linkage-driven dexterous hand developed by the Qingrui Boyuan Company, and five tactile sensors.
The linkage-driven dexterous hand adopted a five-finger design and a connecting-rod transmission method, in which the thumb had 2 degrees of freedom and each other finger had 1 degree of freedom, giving the 5 fingers a total of 6 degrees of freedom to closely imitate a human hand. Each fingertip was equipped with a multi-array tactile sensor to sense the grip strength in real time, and the hand can also perform a variety of grasping operations with high precision.
The main materials of the tactile sensors included: a flexible pressure-sensitive material with high sensitivity and a low relaxation time, flexible interface transition materials with high stability, flexible cladding materials with good elasticity and high mechanical strength, and a flexible interconnect material between the nodes. At the same time, combined with CMOS process technology, a composite sensing structure was realized by depositing polymer materials on the substrate of the CMOS tactile sensor readout chip to form a monolithic tactile sensing unit (i.e., node). The flexible technology is used to realize the communication between interconnected sensing units and the signal readout.
First, a polydimethylsiloxane polymer with good elasticity and strong plasticity was used as the base material. Carbon nanotubes were used as the conductive filler to synthesize a new type of pressure-sensitive composite with piezoresistive properties. The composite material was formed into an array-type flexible tactile sensor with good flexibility, strong fatigue resistance, a large dynamic range, and many array units. Then, the tactile sensor, a readout circuit with an ultra-wide dynamic range, a sensor interface circuit with stepped resolution, and the main signal processing chip were integrated to form a complete sensing system.
The upper and lower electrodes of the fingertip tactile sensor were designed in the form of “five horizontal and five vertical” so that the whole piece of pressure-sensitive material is automatically divided into a 5 × 5 matrix, that is, 25 piezoresistive units. The schematic diagram of their unit positions and numbers is shown in Figure 3.
The linkage-driven dexterous hand, as a human robot or manipulator end-manipulation tool, can adaptively grasp objects of complex shapes and operate complex tasks in the special environment of industrial production.
This dexterous hand is able to perform a variety of grasping operations with high precision owing to the multi-array pressure sensor at the tip of each finger. The linkage-driven dexterous hand is powered by 8.4 V DC and is suitable for mobile applications.
The ROKAE robot can ensure flexible movement in six degrees of freedom in space. It performs with high precision and at high speed. Its working range is 1206 mm. The repeated positioning accuracy can reach 0.05 mm, which can ensure the smoothness and robustness of the robot trajectory during the experiment and improve the reliability of the data.

3.2. Tactile Data Collection Experiment

We used the teach pendant to control the movement of the ROKAE robot. The ROKAE robot moved to the grasping position with the tactile dexterous hand, and then it performed grasping experiments. The back end of the system recorded the pressure data from the tactile sensor in real time during the grasping process. An example of our grasping experiment is shown in Figure 4. We carried out 100 repeated grasping experiments for each type of object. A total of 500 repeated grasping experiments were carried out. Finally, we obtained the original dataset for tactile perception of five types of objects.
Figure 5 shows the variation trend in the pressure data within 100 ms for five fingers of the dexterous hand when grasping the irregular-surface object.

3.3. Data Preprocessing

In order to improve tactile perception data mining and analysis while reducing time and cost and improving quality, we needed to preprocess the tactile data collected by the tactile sensors.
First, we replaced the mutated data in Figure 5 with the mean value of the tactile data for the fingers within 100 ms and used manual interpolation to preprocess the data. Second, we performed min-max normalization and z-score normalization on the collected tactile perception raw data [27]; the specific methods are as follows:
(1) Min-max normalization
The following calculation was performed for each attribute. Let minA and maxA be the minimum and maximum values of attribute A, respectively. An original value x of A was mapped to a value x′ in the interval [0, 1] through min-max normalization, as shown in (10):

$$x_i' = \frac{x_i - x_{\min}}{x_{\max} - x_{\min}} \tag{10}$$
(2) The z-score normalization
$$\mu = \frac{1}{N}\sum_{i=1}^{N} x_i \tag{11}$$

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (x_i - \mu)^2} \tag{12}$$

$$x_i' = \frac{x_i - \mu}{\sigma} \tag{13}$$

Data standardization was performed based on the mean and standard deviation of the original data, defined in (11) and (12), where μ represents the mean and σ represents the standard deviation. The original value x of A was normalized to x′ using the z-score, as shown in (13). The z-score normalization method is suitable for situations where the maximum and minimum values of attribute A are unknown, or where outliers lie beyond the value range.
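The two normalizations in Formulas (10)-(13) can be sketched as small NumPy helpers; this is a generic illustration, not the authors' preprocessing code.

```python
import numpy as np

def min_max_normalize(x):
    """Map each value into [0, 1] via (x - min) / (max - min), Formula (10)."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def z_score_normalize(x):
    """Standardize to zero mean and unit standard deviation, Formulas (11)-(13)."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()
```

For example, `min_max_normalize([0, 5, 10])` yields `[0.0, 0.5, 1.0]`, while the z-score version of the same data has mean 0 and standard deviation 1.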

4. Experimental Results and Analysis

In our experiments, all calculations were performed on a computer with 64 GB of memory, an NVIDIA GeForce RTX 3090 GPU, and the Windows 10 operating system. The entire experiment was completed in Anaconda, and we used Python to implement the machine-learning-based tactile recognition.

4.1. Division of the Tactile Perception Dataset

The dataset includes five types of objects, shown in Figure 4: (1) a cube wood block, (2) a cuboid plastic block, (3) a cylindrical curved box, (4) an elliptical-surface object, and (5) an irregular-surface object.
We divided the dataset into training and test samples at a ratio of 7:3. Then, we fed the training and test samples of the tactile perception data for the five types of objects into KNN, SVM, RF, GNB, DT, and our proposed DESVM, respectively. The six algorithm models were trained and tested, and the accuracy of object recognition was finally obtained. The specific division of the dataset is shown in Table 1.
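A 7:3 stratified split of this kind can be sketched with scikit-learn's `train_test_split`. The dataset shape below is a hypothetical stand-in inferred from the text (500 grasps, 100 per class for 5 classes, 125 piezoresistive readings per grasp); the real dataset layout may differ.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the tactile dataset:
# 500 grasps x 125 pressure readings (25 units x 5 fingertips),
# 100 grasps for each of the 5 object classes.
rng = np.random.default_rng(0)
X = rng.random((500, 125))
y = np.repeat(np.arange(5), 100)

# 7:3 split, stratified so each class keeps the same train/test ratio.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
```

With 500 samples this yields 350 training and 150 test samples, 70/30 per class.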
Figure 6 shows the tactile map of a moment in the grabbing of a cylindrical curved box experiment. The tactile map illustrates the comparison of the amount of pressure data collected by five tactile sensors (a total of 25 × 5 = 125 piezoresistive units). The value 0~350 represents the pressure range of the dexterous hand when grasping the object. The unit is N.

4.2. Comprehensive Evaluation Method

Machine learning algorithms are continuously derived and optimized on the basis of traditional algorithms. Various new algorithms have appeared, but the comprehensive performance evaluation indicators of these algorithms are almost the same [28,29]. We use four performance evaluation indicators (i.e., accuracy, precision, recall, and F1) to evaluate the model performance of the classification model.
Accuracy measures the proportion of all samples that are classified correctly and expresses the probability of accurate classification; its calculation is shown in Formula (14). Precision measures the proportion of samples predicted as positive that are truly positive; its calculation is shown in Formula (15). Recall measures the probability that a positive sample is correctly predicted among all positive samples; its calculation is shown in Formula (16). The F1-score is the harmonic mean of precision and recall; its calculation is shown in Formula (17).
$$\mathrm{Acc} = \frac{TP + TN}{TP + FN + FP + TN} \tag{14}$$

$$\mathrm{Precision} = \frac{TP}{TP + FP} \tag{15}$$

$$\mathrm{Recall} = \frac{TP}{TP + FN} \tag{16}$$

$$F1\text{-}score = \frac{2 \cdot \mathrm{Recall} \cdot \mathrm{Precision}}{\mathrm{Recall} + \mathrm{Precision}} \tag{17}$$
Theoretically, when the accuracy, precision, and recall are all close to 1, the machine learning classifier predicts well [27]. However, this is difficult to achieve in practical applications, especially in multi-object classification problems with complex tactile features, where it is almost impossible. Moreover, precision and recall often trade off against each other: when one is higher, the other tends to be lower. In practice, an appropriate balance should therefore be struck according to specific needs.
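Formulas (14)-(17) can be computed directly from the four confusion counts; the following is an illustrative helper for a single (binary or one-vs-rest) class, not code from the paper.

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, precision, recall, and F1-score from confusion counts,
    following Formulas (14)-(17)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)   # Formula (14)
    precision = tp / (tp + fp)                   # Formula (15)
    recall = tp / (tp + fn)                      # Formula (16)
    f1 = 2 * recall * precision / (recall + precision)  # Formula (17)
    return accuracy, precision, recall, f1
```

For example, with TP = 8, TN = 85, FP = 2, FN = 5, the accuracy is 0.93, precision 0.80, recall 8/13, and F1 16/23, which also shows the precision/recall trade-off discussed above.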

4.3. Object Recognition Result Analysis

The six machine learning algorithms, KNN, SVM, RF, GNB, DT, and DESVM, differ in how well they classify the five types of objects with different shapes. In order to obtain the best classification effect for each algorithm, ten groups of experiments were conducted, and the averaged experimental results are shown in Table 2.
As can be seen from Table 2, the recall and F1 values for GNB are 12.96% and 14.11% lower than for SVM, and 12.86% and 14.45% lower than for RF, respectively. This shows that compared with DT and GNB, the three algorithms KNN, RF, and SVM are more stable. The reason is that GNB and DT assume that the properties are independent of each other, but this assumption is often not true for real haptic data. This has a certain impact on the correct recognition and classification of GNB and DT models. Therefore, the performance of GNB and DT in the object recognition task of tactile perception is not ideal. In addition, the four performance evaluation indicators after training and testing on tactile data samples by SVM and RF are more uniform. All four performance evaluation indicators are better than for KNN. The four performance evaluation indicators, accuracy, precision, recall, and F1, after training with five types of objects by SVM, reached 98.50%, 98.21%, 98.56%, and 98.22%, respectively. With the exception of the F1-score, the other three indicators were a little higher than for RF.
In this paper, the improved DE algorithm was used to optimize the kernel function parameters of SVM, which improves the classification effect of the object.
The tactile recognition accuracy for the five types of objects we collected reached 99.60%, the precision reached 99.61%, the recall rate reached 99.60%, and the F1-score reached 99.50%. During training on our collected haptic dataset, DESVM performed better than SVM. The accuracy, precision, recall, and F1-score for object recognition were improved by 1.10%, 1.40%, 1.16%, and 1.32%, respectively. All four indicators improved, which shows that the method proposed in this paper can effectively improve the effect of object classification. The comparison of five machine learning algorithm evaluation index values is shown in Figure 7. The changing trend in the line graph clearly shows the classification effect of the five machine learning algorithms on the five types of objects based on the tactile perception data.
In order to verify the correctness of the machine learning algorithms in tactile perception recognition, ten-fold stratified cross-validation was used to test and evaluate the accuracy of the six classification algorithm models. The experimental results are shown in Table 3.
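A ten-fold stratified cross-validation of this kind can be sketched with scikit-learn. This is an illustrative setup, with the iris dataset standing in for the tactile dataset and a fixed γ standing in for the DE-optimized value.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

# Stand-in data; the paper uses the self-collected tactile dataset here.
X, y = load_iris(return_X_y=True)

# Ten stratified folds: each fold preserves the per-class sample ratio.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(SVC(kernel="rbf", gamma=0.1), X, y, cv=cv)
mean_accuracy = scores.mean()   # averaged accuracy over the ten folds
```

Reporting the mean of the ten fold accuracies gives the model-level score compared in Table 3.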
As shown in Figure 8, the evaluation results show that the improved support vector machine had the highest accuracy for tactile perception object recognition, reaching 99.60%, while the evaluation results of other machine learning algorithms were not very satisfactory, indicating that the optimized support vector machine’s use for tactile perception research has practical significance.
The confusion matrix of the recognition results for each machine learning algorithm is shown in Figure 9. The vertical axis represents the truth, and the horizontal axis represents the output of the classification. The labels correspond to the test objects.

5. Conclusions

We have proposed an efficient object recognition algorithm (DESVM) based on purely tactile sensory data. The algorithm optimizes the parameters of the Gaussian kernel function of the support vector machine by taking advantage of the high search accuracy of the improved differential evolution algorithm. We increased the accuracy of tactile perception object recognition to 99.6%. The experimental results show that the method effectively improves the object recognition accuracy of tactile perception data. We provide an important reference for research in the field of tactile perception. In future studies, we will further increase the available datasets in the tactile field. In addition, we will further apply the proposed method to more tactile data with complex features to improve the generalizability of the method.

Author Contributions

Conceptualization, X.Z. and S.L.; data curation, X.Z.; formal analysis, X.Z.; funding acquisition, S.L.; Investigation, X.Z.; methodology, X.Z., J.Y., Y.W. and Z.H.; project administration, S.L.; resources, X.Z.; software, X.Z. and J.Y.; supervision, S.L.; validation, X.Z. and J.Z.; visualization, X.Z.; writing—original draft, X.Z.; writing—review and editing, X.Z. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the National Important Project under grant No. 2020YFB1713300 and No. 2019YFB1312704 and by the Guizhou Province Higher Education Project under grant No. [2020]005 and No. [2020]009, and by the Natural Science Foundation of Guizhou Provincial Basic Research Program under grant No. QKHYB-ZK [2022]130.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cretu, A.; de Oliveira, T.E.A.; da Fonseca, V.P.; Tawbe, B.; Petriu, E.M.; Groza, V.Z. Computational intelligence and mechatronics solutions for robotic tactile object recognition. In Proceedings of the 2015 IEEE 9th International Symposium on Intelligent Signal Processing (WISP), Siena, Italy, 15–17 May 2015; pp. 1–6. [Google Scholar] [CrossRef]
  2. Kim, J.; Kim, S.P.; Kim, J.; Hwang, H.; Kim, J.; Park, D.; Jeong, U. Object shape recognition using tactile sensor arrays by a spiking neural network with unsupervised learning. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 178–183. [Google Scholar] [CrossRef]
  3. Abderrahmane, Z.; Ganesh, G.; Crosnier, A.; Cherubini, A. A Deep Learning Framework for Tactile Recognition of Known as Well as Novel Objects. IEEE Trans. Ind. Inform. 2020, 16, 423–432. [Google Scholar] [CrossRef]
  4. Gandarias, J.M.; Gómez-de-Gabriel, J.M.; García-Cerezo, A. Human and object recognition with a high-resolution tactile sensor. In 2017 IEEE Sensors; IEEE: Piscataway, NJ, USA, 2017; pp. 1–3. [Google Scholar] [CrossRef]
  5. Luo, S.; Mou, W.; Althoefer, K.; Liu, H. Novel Tactile-SIFT Descriptor for Object Shape Recognition. IEEE Sens. J. 2015, 15, 5001–5009. [Google Scholar] [CrossRef]
  6. Wang, Y.; Zhibin, P.; Yiwei, P. A Training Data Set Cleaning Method by Classification Ability Ranking for the k-Nearest Neighbor Classifier. IEEE Trans. Neural Netw. Learn. Syst. 2020, 31, 1544–1556. [Google Scholar] [CrossRef] [PubMed]
  7. Owens, E.A.; Griffiths, R.E.; Ratnatunga, K.U. Using oblique decision trees for the morphological classification of galaxies. Mon. Not. R. Astron. Soc. 1996, 281, 153–157. [Google Scholar] [CrossRef]
  8. Yang, J.; Diego, K. Bayesian Active Learning for Choice Models With Deep Gaussian Processes. IEEE Trans. Intell. Transp. Syst. 2021, 22, 1080–1092. [Google Scholar] [CrossRef]
  9. Gandarias, J.M.; García-Cerezo, A.J.; Gómez-de-Gabriel, J.M. CNN-Based Methods for Object Recognition With High-Resolution Tactile Sensors. IEEE Sens. J. 2019, 19, 6872–6882. [Google Scholar] [CrossRef]
Figure 1. Flow chart of SVM parameter optimization using the DE algorithm.
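The pipeline in Figure 1 (DE searching the Gaussian-kernel parameter of the SVM to maximize cross-validated accuracy) can be sketched as below. This is a minimal illustration on a stand-in dataset: the search bounds, DE settings, and use of SciPy's standard `differential_evolution` are assumptions, not the paper's exact improved-DE variant.

```python
# Sketch: differential evolution tunes the RBF-kernel gamma of an SVM
# by minimizing the negative mean cross-validated accuracy.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # stand-in for the tactile dataset

def neg_cv_accuracy(params):
    # DE minimizes, so return the negative mean 5-fold CV accuracy.
    log_gamma = params[0]
    clf = SVC(kernel="rbf", gamma=10.0 ** log_gamma)
    return -cross_val_score(clf, X, y, cv=5).mean()

# Search log10(gamma) in [-4, 1] to cover several orders of magnitude.
result = differential_evolution(neg_cv_accuracy, bounds=[(-4, 1)],
                                maxiter=20, popsize=10, seed=0)
best_gamma = 10.0 ** result.x[0]
print(f"best gamma = {best_gamma:.4g}, CV accuracy = {-result.fun:.4f}")
```

Optimizing over log10(gamma) rather than gamma itself keeps the search space well scaled, which matters for population-based optimizers such as DE.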
Figure 2. Tactile data collection platform.
Figure 3. Distribution of each piezoresistive unit of the tactile sensor.
Figure 4. Dataset collection experiment.
Figure 5. Pressure data curves of the five fingers when grasping an irregular-surface object. (a) The pressure data map of the thumb. (b) The pressure data map of the index finger. (c) The pressure data map of the middle finger. (d) The pressure data map of the ring finger. (e) The pressure data map of the little finger.
Figure 6. Tactile data map of five fingers.
Figure 6. Tactile data map of five fingers.
Micromachines 13 01538 g006
Figure 7. Comparison of the comprehensive evaluation metrics of the six algorithms.
Figure 8. Comparison of algorithm accuracy evaluated by ten-fold stratified cross-validation.
Figure 9. Confusion matrices of the recognition results for the GNB, DT, KNN, RF, SVM, and DESVM machine learning algorithms.
Table 1. Dataset division.
Objects      | Cube Wood Block | Cuboid Plastic Block | Cylindrical Curved Box | Elliptical Surface | Irregular Surface Objects
Training set | 70              | 70                   | 70                     | 70                 | 70
Testing set  | 30              | 30                   | 30                     | 30                 | 30
Total: training set 350, testing set 150
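The split in Table 1 (100 grasps per class divided 70/30, five classes) can be reproduced with a stratified train/test split. The feature matrix below is synthetic and its dimensionality is an assumption; only the class counts match the paper.

```python
# Stratified 70/30 split: every class keeps the same train/test ratio.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))     # 5 classes x 100 grasps, 8 features (assumed)
y = np.repeat(np.arange(5), 100)  # labels 0..4, 100 samples per class

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

print(len(X_train), len(X_test))  # 350 150, matching Table 1
```

Passing `stratify=y` is what guarantees exactly 70 training and 30 testing samples per class rather than merely 70/30 on average.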
Table 2. Comprehensive performance of each machine learning model.
Algorithm | Accuracy | Precision | Recall | F1
GNB       | 0.8560   | 0.9148    | 0.8560 | 0.8411
KNN       | 0.9667   | 0.9687    | 0.9667 | 0.9669
DT        | 0.9733   | 0.9735    | 0.8670 | 0.8456
RF        | 0.9860   | 0.9832    | 0.9846 | 0.9859
SVM       | 0.9850   | 0.9821    | 0.9856 | 0.9822
DESVM     | 0.9960   | 0.9961    | 0.9960 | 0.9950
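The four scores in Table 2 are the standard multi-class classification metrics. A sketch of how they are typically computed with scikit-learn, using support-weighted averaging over the five classes; the predictions below are synthetic stand-ins (three injected errors), not the paper's outputs.

```python
# Accuracy plus weighted-average precision, recall, and F1 for a
# 5-class, 150-sample test set (30 samples per class).
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

y_true = np.repeat(np.arange(5), 30)   # 150 test labels, 30 per class
y_pred = y_true.copy()
y_pred[:3] = (y_pred[:3] + 1) % 5      # inject three errors for illustration

acc  = accuracy_score(y_true, y_pred)
prec = precision_score(y_true, y_pred, average="weighted")
rec  = recall_score(y_true, y_pred, average="weighted")
f1   = f1_score(y_true, y_pred, average="weighted")
print(f"Accuracy {acc:.4f}  Precision {prec:.4f}  Recall {rec:.4f}  F1 {f1:.4f}")
```

With equal class supports, weighted recall equals accuracy, which is consistent with most rows of Table 2.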
Table 3. The average (Avg) and standard deviation (Std) of the accuracy for ten-fold cross-validation.
Algorithm | Avg. Accuracy | Std. Accuracy
GNB       | 0.882         | 0.032
KNN       | 0.964         | 0.022
DT        | 0.978         | 0.025
RF        | 0.988         | 0.008
SVM       | 0.986         | 0.017
DESVM     | 0.996         | 0.006
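The statistics in Table 3 come from ten-fold stratified cross-validation: train on nine folds, score on the held-out fold, and report the mean and standard deviation of the ten fold accuracies. A minimal sketch on a stand-in dataset, since the tactile dataset itself is not reproduced here; the untuned RBF SVM is also an illustrative choice.

```python
# Ten-fold stratified cross-validation: mean and std of fold accuracies.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)           # stand-in for the tactile dataset
clf = SVC(kernel="rbf", gamma="scale")

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)  # one accuracy per fold
print(f"Avg. accuracy {scores.mean():.3f}, Std. {scores.std():.3f}")
```

A low standard deviation across folds, as reported for DESVM, indicates that the accuracy is stable with respect to how the data are partitioned.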
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Zhang, X.; Li, S.; Yang, J.; Wang, Y.; Huang, Z.; Zhang, J. Tactile Perception Object Recognition Based on an Improved Support Vector Machine. Micromachines 2022, 13, 1538. https://doi.org/10.3390/mi13091538

