**8. Conclusions**

In this work, we proposed a novel framework for robotic tactile object recognition inspired by human tactile object exploration. The two sensing modalities of human tactile perception, kinesthetic and cutaneous cues, are simulated for a set of object models, and an enhanced model of visual attention guides the touching process. The originality of this work is that the visual data do not take part directly in object recognition; instead, they are used to select the most informative contours of the object, which improves classifier accuracy. Two approaches based on convolutional neural networks, together with two conventional classifiers (kNN and SVM), are employed to classify the tactile data for object recognition. The results are compared with the case where visual data are not used, confirming that visual attention can improve the process of tactile data acquisition. The highest performance achieved in this study is an accuracy of 98.97%, obtained with a CNN. A future application of this work is the integration of the proposed intelligent algorithm into the decision system of a robot that uses its vision to select contours and then touches and recognizes objects with a hand equipped with FSR tactile sensors of different sizes on its finger phalanges, fingertips, and palm. Since noise may degrade performance in a real implementation of the proposed framework, we plan to employ a denoising autoencoder to reconstruct corrupted inputs.
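To illustrate the denoising strategy envisioned for future work, the following is a minimal sketch of a denoising autoencoder in NumPy. The architecture, layer sizes, noise level, and synthetic "tactile" data are all hypothetical assumptions for illustration; the paper does not specify these details. The network is trained to map noise-corrupted inputs back to their clean versions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for tactile feature vectors: 200 samples lying on a
# 4-dimensional latent manifold embedded in 16 dimensions (hypothetical sizes).
Z = rng.random((200, 4))
M = rng.normal(0.0, 1.0, (4, 16))
X = Z @ M

# Single-hidden-layer denoising autoencoder: sigmoid encoder, linear decoder.
n_in, n_hid = 16, 8
W1 = rng.normal(0.0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def reconstruct(Xn):
    H = sigmoid(Xn @ W1 + b1)  # encode the corrupted input
    return H, H @ W2 + b2      # decode toward the clean input

# Fixed noisy evaluation set, scored before and after training.
X_eval = X + rng.normal(0.0, 0.1, X.shape)
mse_before = np.mean((reconstruct(X_eval)[1] - X) ** 2)

lr = 0.05
for _ in range(300):
    Xn = X + rng.normal(0.0, 0.1, X.shape)  # corrupt the input each epoch
    H, Xr = reconstruct(Xn)
    err = Xr - X                            # error against the CLEAN target
    # Gradient descent on the mean squared reconstruction error.
    dH = (err @ W2.T) * H * (1.0 - H)       # backprop through the sigmoid
    W2 -= lr * (H.T @ err) / len(X)
    b2 -= lr * err.mean(axis=0)
    W1 -= lr * (Xn.T @ dH) / len(X)
    b1 -= lr * dH.mean(axis=0)

mse_after = np.mean((reconstruct(X_eval)[1] - X) ** 2)
print(mse_after < mse_before)  # → True: training reduces reconstruction error
```

The key design point is that the loss is computed against the clean signal `X` while the network only ever sees the corrupted input `Xn`, which is what forces the autoencoder to learn a denoising mapping rather than the identity.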

**Author Contributions:** G.R. and A.-M.C. conceived and designed the experiments; G.R. performed the experiments; G.R. and A.-M.C. analyzed the data and wrote the paper.

**Funding:** This work is supported in part by the Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant Program and by the Ontario Graduate Scholarship (OGS) program.

**Conflicts of Interest:** The authors declare no conflicts of interest.
