Vision-Based Tactile Sensor Mechanism for the Estimation of Contact Position and Force Distribution Using Deep Learning
Abstract
1. Introduction
1.1. Background
1.2. Problem Statement
- This study aims to estimate the force distribution and contact position parameters with a deep learning network trained on data acquired from the vision-based tactile sensor setup.
1.3. Purpose of Study
- employing transfer learning on the VGG16 classification pre-trained network model (a minimal sketch follows this list); and
- validating the vision-based tactile sensor system to examine the estimation of contact position, contact area, and force distribution using thick and thin tactile sensors of various shapes.
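The training code itself is not reproduced in this outline; the following is a minimal sketch of the VGG16 transfer-learning setup named above, assuming PyTorch/torchvision and a hypothetical four-parameter regression head (force, X, Y, angle) in place of the original 1000-way classification layer.

```python
# Minimal transfer-learning sketch (assumptions: PyTorch/torchvision; the
# paper's actual framework, head size, and loss are not stated in this outline).
import torch
import torch.nn as nn
from torchvision import models

# Load VGG16 pre-trained on ImageNet classification.
net = models.vgg16(weights="IMAGENET1K_V1")

# Freeze the convolutional feature extractor for transfer learning.
for p in net.features.parameters():
    p.requires_grad = False

# Replace the 1000-way classification head with a regression head predicting
# the tactile parameters (hypothetical 4-vector: force, x, y, rotation angle).
net.classifier[6] = nn.Linear(4096, 4)

criterion = nn.MSELoss()
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, net.parameters()), lr=1e-4)

# One illustrative training step on a dummy batch of 224x224 3-channel images;
# the paper's input modes (Section 3.4.1) use 1- or 2-channel tactile images.
images = torch.randn(8, 3, 224, 224)
targets = torch.randn(8, 4)           # ground truth from the test-bench log
optimizer.zero_grad()
loss = criterion(net(images), targets)
loss.backward()
optimizer.step()
```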
2. Literature Review
2.1. Vision-Based Tactile Sensor Technology
2.2. Previous Works
3. Materials and Methods
3.1. System Installation and Flow Schematic
- Motion actuator with vision tactile sensor bench: The motion actuators in the test bench facilitate motion along the linear (X, Y, Z) and rotational axes. The shaped contact tool is driven by the actuators to make contact with the elastic tactile tip, which has a camera fixed inside it.
- Motion controllers: The motors are driven by motion controllers, which act as a bridge between the motion actuators and the control PC. The motion controller takes account of all the parameters, such as force, contact position, and angle, so that the motion produces the desired outcome.
- Control PC: The control PC is a general-purpose personal computer running a LabVIEW GUI, which serves as the activity log for motions and controls and as the data acquisition/processing center for the whole installation. Training/testing data are collected from the test-bench stereo camera via a USB port, and LabVIEW pairs the images with the corresponding tactile control parameters for network training/testing (a data-pairing sketch follows this list).
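In the paper, acquisition and logging are handled by the LabVIEW GUI; the Python sketch below only illustrates the pairing of stereo frames with commanded control parameters (the device indices, file names, and parameter values are hypothetical).

```python
# Hedged stand-in for the LabVIEW acquisition loop: grab a stereo frame pair
# over USB and log it against the commanded tactile parameters.
import csv
import cv2

left_cam, right_cam = cv2.VideoCapture(0), cv2.VideoCapture(1)
steps = [(0.1 * k, 0.0, 0.0, 0.0) for k in range(1, 11)]  # 0.1 N to 1 N steps

with open("tactile_log.csv", "w", newline="") as f:
    log = csv.writer(f)
    log.writerow(["frame", "force_N", "x_mm", "y_mm", "angle_deg"])
    for i, (force, x, y, angle) in enumerate(steps):
        ok_l, left = left_cam.read()
        ok_r, right = right_cam.read()
        if not (ok_l and ok_r):
            break
        cv2.imwrite(f"left_{i:05d}.png", left)    # hypothetical file layout
        cv2.imwrite(f"right_{i:05d}.png", right)
        log.writerow([i, force, x, y, angle])

left_cam.release()
right_cam.release()
```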
3.2. Development of Tactile Fingertips
3.2.1. Process of Making Tactile Fingertips
- Physical: The tactile tip must sustain repetitive stress and exhibit the same tactility throughout sensor data acquisition. However, the inner portion of the tactile tip often suffers severely from air bubbles. This problem was encountered in this study and was successfully resolved by vacuum degassing the tactile tip during manufacturing. This process, shown in Figure 4f, efficiently reduced the air bubbles and gave the tactile tips better endurance.
- Visual: The visual reliability of the tactile tip was improved by the marker painting process shown in Figure 4g, which helped in visually recognizing deformation patterns. Initially, white paint was used to draw the markers on the surface of the sensor, but during the durability test these markers proved incompatible with the tactile sensor's rubber material. Therefore, the markers are painted with the same rubber material, colored white for easy recognition.
3.2.2. Tactile Fingertip Sensor Design Aspects
- Shore hardness (surface hardness): Tactile materials with a standard thickness t = 1 mm were compared across shore hardness values of 40, 60, 70, and 80. The force-displacement characteristics are plotted in Figure 5: as the force increases, the tip with shore hardness 40 is displaced easily and loses its elastic linearity, i.e., it is too weak to serve as an elastic body at a force of 1 N. Similarly, the material with shore hardness 60 shows displacement characteristics like the shore hardness 40 material, though somewhat more linear. The comparison between shore hardness 70 and 80 led to choosing 70 as the optimal hardness for the experiments, because shore hardness 80 is too insensitive to act as an elastic material with linearity across the force steps.
- Thickness (elastic stability): Materials with the optimal shore hardness of 70 were chosen. Thicknesses t = 1 mm, 1.15 mm, 1.25 mm, and 1.50 mm were investigated under an applied force of 1 N, as shown in Figure 6a, and t = 2.0 mm and 2.5 mm under an applied force of 10 N, as shown in Figure 6b. At 1 N, the t = 1 mm material is suitable for a deformation of 4 mm, whereas t = 1.15 mm, 1.25 mm, and 1.50 mm cannot be used if the expected deformation is 4 mm or higher. At 10 N, the t = 2 mm material collapsed when the force reached 7 N, while the t = 2.5 mm material remained stable at 10 N. This ablation study guides the choice of tactile fingertips for the experiments (a linearity-check sketch follows this list).
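The shore-hardness and thickness choices above hinge on how linear each force-displacement curve remains; below is a minimal sketch of such a linearity check, assuming NumPy and using synthetic stand-in measurements (not the Figure 5 data).

```python
# Quantify elastic linearity of a force-displacement curve via a least-squares
# line fit and its R^2. The arrays are hypothetical stand-ins for the
# Figure 5 measurements, not the paper's actual data.
import numpy as np

def linearity_r2(force_n: np.ndarray, displacement_mm: np.ndarray) -> float:
    """Return R^2 of the best-fit line displacement = a*force + b."""
    a, b = np.polyfit(force_n, displacement_mm, 1)
    predicted = a * force_n + b
    ss_res = np.sum((displacement_mm - predicted) ** 2)
    ss_tot = np.sum((displacement_mm - displacement_mm.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

force = np.linspace(0.1, 1.0, 10)                        # 0.1 N steps up to 1 N
disp_shore70 = 4.0 * force + 0.02 * np.random.randn(10)  # near-linear example
print(f"Shore 70 linearity R^2 = {linearity_r2(force, disp_shore70):.4f}")
```

A material whose R² stays close to 1 across the force range behaves as a linear elastic body, which is the selection criterion applied above.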
3.3. Stereo Camera System
3.4. Deep Learning Methodology
3.4.1. Region-of-Interest (ROI) and Mode Selection
- Mode-0: only the left image of the stereo pair is input to the neural network.
- Mode-1: the left and right gray images are concatenated per channel and input to the neural network.
- Mode-2: the left image is binarized to mitigate lighting effects and fed to the neural network.
- Mode-3: the left and right images are binarized, concatenated per channel to mitigate lighting effects, and fed to the neural network (a preprocessing sketch for all four modes follows this list).
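A minimal sketch of the four input modes, assuming OpenCV/NumPy; the threshold value and file names are hypothetical and serve only to illustrate the channel layout fed to the network.

```python
# Sketch of the four input modes (assumptions: OpenCV/NumPy; threshold and
# image paths are hypothetical, chosen only to show the channel stacking).
import cv2
import numpy as np

def binarize(gray: np.ndarray, thresh: int = 128) -> np.ndarray:
    _, out = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    return out

def make_input(left_gray: np.ndarray, right_gray: np.ndarray, mode: int) -> np.ndarray:
    if mode == 0:                       # left image only
        return left_gray[..., np.newaxis]
    if mode == 1:                       # left + right stacked per channel
        return np.stack([left_gray, right_gray], axis=-1)
    if mode == 2:                       # binarized left image
        return binarize(left_gray)[..., np.newaxis]
    if mode == 3:                       # binarized left + right per channel
        return np.stack([binarize(left_gray), binarize(right_gray)], axis=-1)
    raise ValueError(f"unknown mode {mode}")

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical ROI crops
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
x = make_input(left, right, mode=3)
```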
3.4.2. Zero Centering and Scaling
3.4.3. Network Architecture
3.5. Contact Area Estimation
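The body of this section is not reproduced in this outline; given the BLOB-analysis reference cited at the end of the paper (Moeslund), one plausible minimal sketch of 2D contact-area estimation from a difference image is the following (the threshold and the pixel-to-mm scale are hypothetical, not the paper's values).

```python
# Hedged sketch of 2D contact-area estimation via blob (connected-component)
# analysis, in the spirit of the cited BLOB-analysis reference; not the
# paper's actual pipeline.
import cv2
import numpy as np

def contact_area_mm2(ref_gray: np.ndarray, touch_gray: np.ndarray,
                     mm_per_px: float = 0.1, thresh: int = 25) -> float:
    # Difference between the no-contact reference frame and the contact frame.
    diff = cv2.absdiff(ref_gray, touch_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # Keep the largest blob (label 0 is background) as the contact region.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return 0.0
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    return float(stats[largest, cv2.CC_STAT_AREA]) * mm_per_px ** 2
```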
4. Experiments and Evaluations
4.1. Dataset Used
4.2. Training Details
4.3. Testing Evaluations
4.3.1. Testing Scenario-1: Force Distribution Estimation
4.3.2. Testing Scenario-2: Contact Point (Displacement) Estimation along Linear X-axis, Y-axis, and Z-axis
- Along Z-axis: Force is applied in the Z-direction from 0.1 N to 1 N in 0.1 N steps, so that a total of 10 tests were conducted. The difference between the original position along the Z-axis and the estimated one is recorded as the error, and the average error over the 10 tests is calculated to evaluate the prediction performance.
- Along X-axis: For evaluating the displacement along the X-axis, force is applied in 0.1 N steps from 0.1 N to 1 N at 1 mm displacement steps along the X-axis, keeping the Y-axis displacement at 0. The testing therefore covers −6 mm ≤ X ≤ 6 mm with a 1 mm step interval (13 points in total, constant Y = 0). The difference between the original position along the X-axis and the estimated one is recorded as the error, and the average error over the 13 points is calculated to evaluate the prediction performance.
- Along Y-axis: For evaluating the displacement along the Y-axis, force is applied in 0.1 N steps from 0.1 N to 1 N at 1 mm displacement steps along the Y-axis, keeping the X-axis displacement at 0. The testing therefore covers −6 mm ≤ Y ≤ 6 mm with a 1 mm step interval (13 points in total, constant X = 0). The difference between the original position along the Y-axis and the estimated one is recorded as the error, and the average error over the 13 points is calculated to evaluate the prediction performance (an error-evaluation sketch follows this list).
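A minimal sketch of the per-axis error evaluation described above, assuming NumPy; the estimated positions are synthetic stand-ins for network outputs.

```python
# Mean absolute error between commanded and network-estimated positions,
# matching the per-axis evaluation described above (arrays are hypothetical).
import numpy as np

def average_abs_error(original_mm: np.ndarray, estimated_mm: np.ndarray) -> float:
    """Average error over all commanded test points."""
    return float(np.mean(np.abs(original_mm - estimated_mm)))

x_orig = np.arange(-6.0, 7.0)                 # 13 commanded points, 1 mm apart
x_est = x_orig + 0.3 * np.random.randn(13)    # stand-in network outputs
print(f"average X error = {average_abs_error(x_orig, x_est):.3f} mm")
```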
4.3.3. Testing Scenario-3: Contact Angle Estimation along the Rotational Axis
4.3.4. Testing Scenario-4: 2D Contact Area Estimation
5. Results and Discussions
5.1. Force Distribution Estimation
5.2. Contact Position Estimation w.r.t. X, Y, Z Axes
5.3. Contact Angle Estimation w.r.t. Rotational Axis
5.4. Contact Area Estimation
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Supplementary Test Results
Test-1 | | | Test-2 | | | Test-3 | |
---|---|---|---|---|---|---|---|---|
Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) |
0.12 | 0.2009 | 0.0809 | 0.12 | 0.2014 | 0.0814 | 0.12 | 0.2039 | 0.0839 |
0.22 | 0.2776 | 0.0576 | 0.21 | 0.2610 | 0.0510 | 0.22 | 0.2857 | 0.0657 |
0.32 | 0.3423 | 0.0223 | 0.32 | 0.3336 | 0.0036 | 0.33 | 0.3447 | 0.0147 |
0.42 | 0.3983 | 0.0217 | 0.42 | 0.3931 | 0.0169 | 0.42 | 0.4113 | 0.0087 |
0.51 | 0.4772 | 0.0328 | 0.51 | 0.4828 | 0.0272 | 0.50 | 0.4751 | 0.0249 |
0.60 | 0.5726 | 0.0274 | 0.60 | 0.5838 | 0.0162 | 0.60 | 0.5844 | 0.0156 |
0.70 | 0.6846 | 0.0154 | 0.70 | 0.6929 | 0.0071 | 0.70 | 0.6926 | 0.0074 |
0.78 | 0.7696 | 0.0104 | 0.78 | 0.7685 | 0.0115 | 0.78 | 0.7736 | 0.0064 |
0.87 | 0.8648 | 0.0052 | 0.87 | 0.8715 | 0.0085 | 0.88 | 0.8745 | 0.0055 |
0.97 | 0.9748 | 0.0048 | 0.97 | 0.9819 | 0.0119 | 0.97 | 0.9769 | 0.0069 |
FSO (%) | 8.34 | | FSO (%) | 8.39 | | FSO (%) | 8.65 |
Test-4 | | | Test-5 | | | Test-6 | |
Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) |
0.13 | 0.2120 | 0.0820 | 0.12 | 0.2057 | 0.0857 | 0.13 | 0.2241 | 0.0941 |
0.22 | 0.2742 | 0.0542 | 0.22 | 0.2812 | 0.0612 | 0.21 | 0.2655 | 0.0555 |
0.33 | 0.3468 | 0.0168 | 0.32 | 0.3473 | 0.0273 | 0.33 | 0.3508 | 0.0208 |
0.42 | 0.3987 | 0.0213 | 0.41 | 0.3995 | 0.0105 | 0.41 | 0.4014 | 0.0086 |
0.50 | 0.4721 | 0.0279 | 0.50 | 0.4776 | 0.0224 | 0.50 | 0.4889 | 0.0111 |
0.59 | 0.5803 | 0.0097 | 0.60 | 0.5962 | 0.0038 | 0.60 | 0.5917 | 0.0083 |
0.69 | 0.6903 | 0.0003 | 0.70 | 0.6963 | 0.0037 | 0.70 | 0.7020 | 0.0020 |
0.78 | 0.7720 | 0.0080 | 0.78 | 0.7780 | 0.0020 | 0.78 | 0.7724 | 0.0076 |
0.87 | 0.8693 | 0.0007 | 0.88 | 0.8770 | 0.0030 | 0.88 | 0.8778 | 0.0022 |
0.97 | 0.9803 | 0.0103 | 0.97 | 0.9745 | 0.0045 | 0.97 | 0.9799 | 0.0099 |
FSO (%) | 8.45 | | FSO (%) | 8.84 | | FSO (%) | 9.70 |
Test-7 | | | Test-8 | | | Test-9 | |
Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) |
0.12 | 0.2054 | 0.0854 | 0.11 | 0.2065 | 0.0965 | 0.12 | 0.2074 | 0.0874 |
0.22 | 0.2707 | 0.0507 | 0.23 | 0.2686 | 0.0386 | 0.23 | 0.2871 | 0.0571 |
0.31 | 0.3271 | 0.0171 | 0.33 | 0.3456 | 0.0156 | 0.32 | 0.3438 | 0.0238 |
0.42 | 0.4177 | 0.0023 | 0.41 | 0.4052 | 0.0048 | 0.40 | 0.3959 | 0.0041 |
0.49 | 0.4761 | 0.0139 | 0.50 | 0.4820 | 0.0180 | 0.51 | 0.4784 | 0.0316 |
0.60 | 0.5941 | 0.0059 | 0.60 | 0.5985 | 0.0015 | 0.60 | 0.5926 | 0.0074 |
0.71 | 0.7154 | 0.0054 | 0.69 | 0.6919 | 0.0019 | 0.69 | 0.6958 | 0.0058 |
0.78 | 0.7774 | 0.0026 | 0.78 | 0.7737 | 0.0063 | 0.78 | 0.7799 | 0.0001 |
0.88 | 0.8811 | 0.0011 | 0.87 | 0.8693 | 0.0007 | 0.88 | 0.8801 | 0.0001 |
0.97 | 0.9734 | 0.0034 | 0.98 | 0.9825 | 0.0025 | 0.98 | 0.9831 | 0.0031
FSO (%) | 8.80 | | FSO (%) | 9.85 | | FSO (%) | 8.92 |
Test-10 | | | | | | | |
Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N) | Original Force (N) | Estimated Force (N) | Error (N)
0.12 | 0.2114 | 0.0914 | 0.50 | 0.4868 | 0.0132 | 0.78 | 0.7789 | 0.0011 |
0.21 | 0.2643 | 0.0543 | 0.59 | 0.5997 | 0.0097 | 0.88 | 0.8777 | 0.0023 |
0.32 | 0.3553 | 0.0353 | 0.70 | 0.7088 | 0.0088 | 0.97 | 0.9804 | 0.0104 |
0.40 | 0.4039 | 0.0039 | − | − | − | − | − | − |
FSO (%) | 9.42 | |||||||
Average FSO (%) | 8.936 |
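From the tabulated numbers, FSO (%) is recoverable as the maximum absolute error expressed as a percentage of the full-scale applied force (e.g., Test-1: 0.0809/0.97 ≈ 8.34%); this reading is inferred from the data rather than stated in the text. A minimal check, assuming NumPy:

```python
# FSO (% of full-scale output) as recoverable from the Test-1 column above:
# max absolute error divided by the maximum applied force. This interpretation
# is inferred from the tabulated values, not stated in the paper's text.
import numpy as np

orig = np.array([0.12, 0.22, 0.32, 0.42, 0.51, 0.60, 0.70, 0.78, 0.87, 0.97])
est = np.array([0.2009, 0.2776, 0.3423, 0.3983, 0.4772,
                0.5726, 0.6846, 0.7696, 0.8648, 0.9748])

fso_percent = 100.0 * np.max(np.abs(est - orig)) / np.max(orig)
print(f"FSO = {fso_percent:.2f} %")   # -> 8.34, matching the Test-1 row
```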
Xorig = −6 mm | Xorig = −5 mm | Xorig = −4 mm | Xorig = −3 mm | ||||||||
---|---|---|---|---|---|---|---|---|---|---|---|
F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] |
0.12 | −1.828 | 4.171 | 0.13 | −2.036 | 2.963 | 0.12 | −1.530 | 2.469 | 0.13 | −1.094 | 1.9056 |
0.21 | −2.875 | 3.124 | 0.21 | −2.593 | 2.407 | 0.22 | −2.065 | 1.934 | 0.22 | −1.588 | 1.411 |
0.31 | −3.388 | 2.611 | 0.31 | −2.986 | 2.013 | 0.33 | −2.435 | 1.564 | 0.31 | −1.828 | 1.171 |
0.40 | −3.884 | 2.115 | 0.40 | −3.352 | 1.647 | 0.42 | −2.566 | 1.433 | 0.40 | −1.767 | 1.232 |
0.48 | −3.997 | 2.002 | 0.49 | −3.549 | 1.451 | 0.50 | −2.801 | 1.198 | 0.49 | −1.798 | 1.201 |
0.57 | −4.260 | 1.740 | 0.59 | −3.742 | 1.257 | 0.60 | −2.817 | 1.182 | 0.59 | −1.903 | 1.096 |
0.66 | −4.943 | 1.056 | 0.68 | −3.643 | 1.357 | 0.69 | −2.857 | 1.142 | 0.69 | −1.900 | 1.099 |
0.75 | −5.610 | 0.389 | 0.76 | −4.070 | 0.929 | 0.77 | −3.074 | 0.926 | 0.78 | −2.115 | 0.884 |
0.85 | −5.738 | 0.261 | 0.85 | −4.576 | 0.423 | 0.86 | −3.288 | 0.716 | 0.87 | −2.491 | 0.508 |
0.95 | −5.747 | 0.252 | 0.95 | −4.818 | 0.818 | 0.97 | −3.567 | 0.433 | 0.96 | −2.512 | 0.487 |
Xorig = −2 mm | Xorig = −1 mm | Xorig = 1 mm | Xorig = 2 mm | ||||||||
F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] |
0.12 | −1.088 | 0.911 | 0.14 | −1.095 | 0.095 | 0.13 | −1.026 | 2.026 | 0.12 | −0.490 | 2.490 |
0.22 | −1.478 | 0.521 | 0.22 | −0.949 | 0.050 | 0.21 | −0.952 | 1.952 | 0.22 | −0.208 | 2.208 |
0.33 | −1.409 | 0.591 | 0.33 | −1.018 | 0.018 | 0.33 | −0.764 | 1.764 | 0.33 | −0.608 | 2.608 |
0.41 | −1.228 | 0.771 | 0.41 | −0.997 | 0.002 | 0.41 | −0.817 | 1.817 | 0.40 | −0.356 | 2.356
0.50 | −1.431 | 0.568 | 0.50 | −0.903 | 0.096 | 0.51 | −0.275 | 1.275 | 0.50 | 0.166 | 1.834 |
0.59 | −1.210 | 0.789 | 0.60 | −0.590 | 0.409 | 0.59 | 0.288 | 0.711 | 0.60 | 0.959 | 1.040 |
0.69 | −1.297 | 0.702 | 0.69 | −0.523 | 0.476 | 0.69 | 0.950 | 0.049 | 0.70 | 2.079 | 0.079 |
0.78 | −1.255 | 0.744 | 0.78 | −0.334 | 0.665 | 0.78 | 1.421 | 0.421 | 0.78 | 2.567 | 0.567 |
0.88 | −1.455 | 0.544 | 0.88 | −0.238 | 0.761 | 0.88 | 1.529 | 0.592 | 0.87 | 2.567 | 0.567 |
0.98 | −1.519 | 0.480 | 0.97 | −0.399 | 0.601 | 0.97 | 1.607 | 0.607 | 0.98 | 2.553 | 0.553 |
Xorig = 3 mm | Xorig = 4 mm | Xorig = 5 mm | Xorig = 6 mm | ||||||||
F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] | F (N) | Xest [mm] | Error [mm] |
0.12 | −0.461 | 3.461 | 0.11 | −0.246 | 4.246 | 0.12 | 0.020 | 4.979 | 0.13 | 0.166 | 5.833 |
0.23 | −0.279 | 3.279 | 0.21 | 0.125 | 3.874 | 0.22 | 0.633 | 4.366 | 0.21 | 0.692 | 5.308 |
0.32 | −0.401 | 3.401 | 0.31 | 0.958 | 3.042 | 0.32 | 1.588 | 3.411 | 0.33 | 2.673 | 3.326 |
0.41 | 0.093 | 2.906 | 0.39 | 0.983 | 3.874 | 0.40 | 2.317 | 2.652 | 0.40 | 3.060 | 2.939 |
0.50 | 0.942 | 2.0571 | 0.50 | 1.681 | 2.318 | 0.50 | 2.931 | 2.068 | 0.49 | 3.990 | 2.009 |
0.60 | 1.571 | 1.428 | 0.59 | 2.570 | 1.429 | 0.58 | 3.744 | 1.256 | 0.57 | 4.933 | 1.066 |
0.69 | 3.057 | 0.075 | 0.69 | 4.133 | 0.133 | 0.67 | 4.917 | 0.082 | 0.67 | 5.453 | 0.546 |
0.78 | 3.649 | 0.649 | 0.77 | 4.709 | 0.709 | 0.76 | 5.745 | 0.745 | 0.75 | 6.600 | 0.600 |
0.88 | 3.790 | 0.790 | 0.87 | 4.807 | 0.807 | 0.86 | 5.787 | 0.787 | 0.84 | 6.639 | 0.639 |
0.97 | 3.670 | 0.679 | 0.97 | 4.800 | 0.800 | 0.96 | 5.884 | 0.884 | 0.94 | 6.896 | 0.896 |
Yorig = −6 mm | Yorig = −5 mm | Yorig = −4 mm | Yorig = −3 mm | ||||||||
---|---|---|---|---|---|---|---|---|---|---|---|
F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] |
0.12 | 0.021 | 6.021 | 0.12 | 0.457 | 5.457 | 0.13 | −0.345 | 3.654 | 0.13 | −0.367 | 2.632 |
0.20 | −0.631 | 5.3681 | 0.23 | −1.619 | 3.380 | 0.21 | −0.466 | 3.533 | 0.21 | −0.781 | 2.218 |
0.32 | −3.050 | 2.949 | 0.32 | −2.425 | 2.574 | 0.32 | −1.793 | 2.206 | 0.32 | −1.498 | 1.501 |
0.40 | −4.548 | 1.452 | 0.41 | −3.545 | 1.454 | 0.41 | −2.652 | 1.347 | 0.41 | −1.503 | 1.496 |
0.49 | −5.476 | 0.523 | 0.50 | −4.028 | 0.971 | 0.51 | −3.037 | 0.962 | 0.50 | −1.936 | 1.063 |
0.56 | −5.925 | 0.074 | 0.58 | −4.429 | 0.570 | 0.59 | −3.173 | 0.826 | 0.60 | −2.190 | 0.809 |
0.66 | −5.698 | 0.301 | 0.67 | −4.453 | 0.546 | 0.69 | −3.565 | 0.434 | 0.69 | −2.523 | 0.476 |
0.75 | −5.811 | 0.188 | 0.77 | −4.704 | 0.296 | 0.76 | −3.847 | 0.152 | 0.78 | −2.960 | 0.040 |
0.85 | −6.022 | 0.022 | 0.85 | −4.419 | 0.419 | 0.87 | −4.485 | 0.458 | 0.86 | −3.435 | 0.435 |
0.95 | −6.154 | 0.154 | 0.95 | −4.479 | 0.479 | 0.96 | −4.464 | 0.464 | 0.97 | −3.352 | 0.352
Yorig = −2 mm | Yorig = −1 mm | Yorig = 1 mm | Yorig = 2 mm | ||||||||
F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] |
0.13 | −0.194 | 1.805 | 0.12 | 0.271 | 1.271 | 0.13 | 0.693 | 0.306 | 0.12 | 1.029 | 0.970 |
0.22 | −0.718 | 1.281 | 0.22 | 0.268 | 1.268 | 0.21 | 1.234 | 0.234 | 0.22 | 1.736 | 0.263 |
0.31 | −0.817 | 1.182 | 0.32 | −0.588 | 0.411 | 0.33 | 0.622 | 0.377 | 0.33 | 1.124 | 0.875 |
0.41 | −1.444 | 0.555 | 0.42 | −0.639 | 0.360 | 0.41 | 0.644 | 0.355 | 0.41 | 1.426 | 0.573 |
0.50 | −1.483 | 0.516 | 0.50 | −0.929 | 0.070 | 0.51 | 0.757 | 0.242 | 0.50 | 1.585 | 0.414 |
0.60 | −1.742 | 0.257 | 0.60 | −0.973 | 0.026 | 0.58 | 0.534 | 0.465 | 0.58 | 1.480 | 0.519 |
0.70 | −1.807 | 0.192 | 0.70 | −1.247 | 0.247 | 0.70 | 0.424 | 0.575 | 0.69 | 1.546 | 0.453 |
0.78 | −2.083 | 0.083 | 0.78 | −1.283 | 0.283 | 0.79 | 0.691 | 0.308 | 0.77 | 1.576 | 0.423 |
0.87 | −2.226 | 0.226 | 0.87 | −1.155 | 0.155 | 0.88 | 0.709 | 0.290 | 0.88 | 1.822 | 0.177 |
0.98 | −2.271 | 0.271 | 0.98 | −1.156 | 0.156 | 0.98 | 0.555 | 0.444 | 0.98 | 1.754 | 0.245 |
Yorig = 3 mm | Yorig = 4 mm | Yorig = 5 mm | Yorig = 6 mm | ||||||||
F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm] | F (N) | Yest [mm] | Error [mm]
0.11 | 1.375 | 1.624 | 0.11 | 1.920 | 2.079 | 0.14 | 2.799 | 2.200 | 0.11 | 2.409 | 3.590 |
0.22 | 1.897 | 1.102 | 0.21 | 2.390 | 1.609 | 0.22 | 3.072 | 1.927 | 0.22 | 3.681 | 2.318 |
0.33 | 2.241 | 0.758 | 0.31 | 0.958 | 1.484 | 0.33 | 3.273 | 1.726 | 0.32 | 3.838 | 2.162 |
0.41 | 2.169 | 0.831 | 0.39 | 2.515 | 1.031 | 0.42 | 3.479 | 1.520 | 0.40 | 4.237 | 1.762 |
0.50 | 2.300 | 0.699 | 0.50 | 2.968 | 1.077 | 0.50 | 3.654 | 1.345 | 0.49 | 4.354 | 1.645 |
0.59 | 2.378 | 0.621 | 0.59 | 2.922 | 0.846 | 0.59 | 3.923 | 1.076 | 0.58 | 4.748 | 1.241 |
0.69 | 2.441 | 0.558 | 0.69 | 3.153 | 0.775 | 0.68 | 4.215 | 0.784 | 0.68 | 5.106 | 0.893 |
0.78 | 2.470 | 0.529 | 0.77 | 3.244 | 0.581 | 0.76 | 4.414 | 0.585 | 0.76 | 5.188 | 0.811 |
0.88 | 2.698 | 0.302 | 0.87 | 3.596 | 0.403 | 0.86 | 4.363 | 0.363 | 0.85 | 5.239 | 0.760 |
0.97 | 2.771 | 0.228 | 0.97 | 3.692 | 0.307 | 0.96 | 4.622 | 0.377 | 0.96 | 5.358 | 0.642 |
References
- Umbaugh, S.E. Digital Image Processing and Analysis: Human and Computer Vision Applications with CVIPtools; CRC Press: Boca Raton, FL, USA, 2010.
- Kakani, V.; Nguyen, V.H.; Kumar, B.P.; Kim, H.; Pasupuleti, V.R. A critical review on computer vision and artificial intelligence in food industry. J. Agric. Food Res. 2020, 2, 100033.
- Kakani, V.; Kim, H.; Basivi, P.K.; Pasupuleti, V.R. Surface Thermo-Dynamic Characterization of Poly(Vinylidene Chloride-Co-Acrylonitrile) (P(VDC-co-AN)) Using Inverse-Gas Chromatography and Investigation of Visual Traits Using Computer Vision Image Processing Algorithms. Polymers 2020, 12, 1631.
- Shimonomura, K. Tactile image sensors employing camera: A review. Sensors 2019, 19, 3933.
- Kakani, V.; Kim, H.; Lee, J.; Ryu, C.; Kumbham, M. Automatic Distortion Rectification of Wide-Angle Images Using Outlier Refinement for Streamlining Vision Tasks. Sensors 2020, 20, 894.
- Kakani, V.; Kim, H.; Kumbham, M.; Park, D.; Jin, C.B.; Nguyen, V.H. Feasible Self-Calibration of Larger Field-of-View (FOV) Camera Sensors for the Advanced Driver-Assistance System (ADAS). Sensors 2019, 19, 3369.
- Luo, S.; Bimbo, J.; Dahiya, R.; Liu, H. Robotic tactile perception of object properties: A review. Mechatronics 2017, 48, 54–67.
- Li, W.; Konstantinova, J.; Noh, Y.; Alomainy, A.; Althoefer, K. Camera-based force and tactile sensor. In Proceedings of the Annual Conference Towards Autonomous Robotic Systems, Bristol, UK, 25–27 July 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 438–450.
- Sferrazza, C.; D’Andrea, R. Design, motivation and evaluation of a full-resolution optical tactile sensor. Sensors 2019, 19, 928.
- Yuan, W.; Mo, Y.; Wang, S.; Adelson, E.H. Active clothing material perception using tactile sensing and deep learning. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 1–8.
- Yuan, W.; Li, R.; Srinivasan, M.A.; Adelson, E.H. Measurement of shear and slip with a GelSight tactile sensor. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 304–311.
- Fearing, R.S. Tactile sensing mechanisms. Int. J. Robot. Res. 1990, 9, 3–23.
- Chitta, S.; Sturm, J.; Piccoli, M.; Burgard, W. Tactile sensing for mobile manipulation. IEEE Trans. Robot. 2011, 27, 558–568.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- Yamaguchi, A.; Atkeson, C.G. Recent progress in tactile sensing and sensors for robotic manipulation: Can we turn tactile sensing into vision? Adv. Robot. 2019, 33, 661–673.
- Hosoda, K.; Tada, Y.; Asada, M. Internal representation of slip for a soft finger with vision and tactile sensors. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 30 September–4 October 2002; Volume 1, pp. 111–115.
- Kolker, A.; Jokesch, M.; Thomas, U. An optical tactile sensor for measuring force values and directions for several soft and rigid contacts. In Proceedings of the ISR 2016: 47th International Symposium on Robotics, VDE, Munich, Germany, 21–22 June 2016; pp. 1–6.
- James, J.W.; Pestell, N.; Lepora, N.F. Slip detection with a biomimetic tactile sensor. IEEE Robot. Autom. Lett. 2018, 3, 3340–3346.
- Johnsson, M.; Balkenius, C. Neural network models of haptic shape perception. Robot. Auton. Syst. 2007, 55, 720–727.
- Naeini, F.B.; AlAli, A.M.; Al-Husari, R.; Rigi, A.; Al-Sharman, M.K.; Makris, D.; Zweiri, Y. A novel dynamic-vision-based approach for tactile sensing applications. IEEE Trans. Instrum. Meas. 2019, 69, 1881–1893.
- Ma, D.; Donlon, E.; Dong, S.; Rodriguez, A. Dense tactile force estimation using GelSlim and inverse FEM. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 5418–5424.
- Wilson, A.; Wang, S.; Romero, B.; Adelson, E. Design of a Fully Actuated Robotic Hand With Multiple Gelsight Tactile Sensors. arXiv 2020, arXiv:2002.02474.
- Taunyazov, T.; Sng, W.; See, H.H.; Lim, B.; Kuan, J.; Ansari, A.F.; Tee, B.C.; Soh, H. Event-driven visual-tactile sensing and learning for robots. Perception 2020, 4, 5.
- Pezzementi, Z.; Plaku, E.; Reyda, C.; Hager, G.D. Tactile-object recognition from appearance information. IEEE Trans. Robot. 2011, 27, 473–487.
- Zhang, Y.; Yuan, W.; Kan, Z.; Wang, M.Y. Towards Learning to Detect and Predict Contact Events on Vision-based Tactile Sensors. In Proceedings of the Conference on Robot Learning, Boston, MA, USA, 16–18 November 2020; pp. 1395–1404.
- Begej, S. Planar and finger-shaped optical tactile sensors for robotic applications. IEEE J. Robot. Autom. 1988, 4, 472–484.
- Lepora, N.F.; Ward-Cherrier, B. Superresolution with an optical tactile sensor. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 2686–2691.
- Ito, Y.; Kim, Y.; Obinata, G. Robust slippage degree estimation based on reference update of vision-based tactile sensor. IEEE Sens. J. 2011, 11, 2037–2047.
- Yang, X.D.; Grossman, T.; Wigdor, D.; Fitzmaurice, G. Magic finger: Always-available input through finger instrumentation. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, Cambridge, MA, USA, 7–10 October 2012; pp. 147–156.
- Corradi, T.; Hall, P.; Iravani, P. Object recognition combining vision and touch. Robot. Biomim. 2017, 4, 1–10.
- Luo, S.; Mou, W.; Althoefer, K.; Liu, H. iCLAP: Shape recognition by combining proprioception and touch sensing. Auton. Robot. 2019, 43, 993–1004.
- Piacenza, P.; Dang, W.; Hannigan, E.; Espinal, J.; Hussain, I.; Kymissis, I.; Ciocarlie, M. Accurate contact localization and indentation depth prediction with an optics-based tactile sensor. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 959–965.
- Johnson, M.K.; Adelson, E.H. Retrographic sensing for the measurement of surface texture and shape. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 1070–1077.
- Johnson, M.K.; Cole, F.; Raj, A.; Adelson, E.H. Microgeometry capture using an elastomeric sensor. ACM Trans. Graph. 2011, 30, 1–8.
- Yuan, W.; Srinivasan, M.A.; Adelson, E.H. Estimating object hardness with a GelSight touch sensor. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 208–215.
- Kroemer, O.; Lampert, C.H.; Peters, J. Learning dynamic tactile sensing with robust vision-based training. IEEE Trans. Robot. 2011, 27, 545–557.
- Meier, M.; Patzelt, F.; Haschke, R.; Ritter, H.J. Tactile convolutional networks for online slip and rotation detection. In Proceedings of the International Conference on Artificial Neural Networks, Barcelona, Spain, 6–9 September 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 12–19.
- Chuah, M.Y.; Kim, S. Improved normal and shear tactile force sensor performance via least squares artificial neural network (LSANN). In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 116–122.
- Kaboli, M.; Feng, D.; Cheng, G. Active tactile transfer learning for object discrimination in an unstructured environment using multimodal robotic skin. Int. J. Humanoid Robot. 2018, 15, 1850001.
- Gandarias, J.M.; Garcia-Cerezo, A.J.; Gomez-de Gabriel, J.M. CNN-based methods for object recognition with high-resolution tactile sensors. IEEE Sens. J. 2019, 19, 6872–6882.
- Sferrazza, C.; D’Andrea, R. Transfer learning for vision-based tactile sensing. arXiv 2018, arXiv:1812.03163.
- Sato, K.; Kamiyama, K.; Kawakami, N.; Tachi, S. Finger-shaped GelForce: Sensor for measuring surface traction fields for robotic hand. IEEE Trans. Haptics 2009, 3, 37–47.
- Sferrazza, C.; Wahlsten, A.; Trueeb, C.; D’Andrea, R. Ground truth force distribution for learning-based tactile sensing: A finite element approach. IEEE Access 2019, 7, 173438–173449.
- Qi, H.; Joyce, K.; Boyce, M. Durometer hardness and the stress-strain behavior of elastomeric materials. Rubber Chem. Technol. 2003, 76, 419–435.
- Moeslund, T.B. BLOB analysis. In Introduction to Video and Image Processing; Springer: Berlin/Heidelberg, Germany, 2012; pp. 103–115.
Research Study | Methodology | Tactile Properties | Key Aspects/Limitations
---|---|---|---
Lepora et al. [27] | Bayesian perception | Localization (internal displacement) | 40-fold localization accuracy compared to a traditional tactile sensor
Ito et al. [28] | Adaptive selection and compensation of dot positions | Slippage degree, multidimensional force, object contact | Depends on position measurements of dots (tuning is easy)
Yang et al. [29] | Magic Finger optical touch sensor | Contact location, force, and texture | Can sense the touching finger's XY-footprint (like an optical mouse)
Corradi et al. [30] | Object recognition (vision + touch) | Object shape/texture | Vector concatenation; object label posterior
Piacenza et al. [32] | Elastomer light-transport mechanism | Contact localization and indentation depth prediction | Exhibits submillimeter accuracy over a 20 mm × 20 mm active sensing area
Johnson et al. [33] | Surface reconstruction (photometric stereo) | Texture and shape | Also called a 2.5D texture scanner
Johnson et al. [34] | Microgeometry capture using an elastomeric sensor | Surface geometry | Can only handle shallow relief geometry
Yuan et al. [35] | Object hardness with GelSight touch sensor | Fine texture, contact force, and slip conditions | Infers object hardness without prior knowledge
Kroemer et al. [36] | Dynamic tactile sensing using weak pairing (vision + tactile samples) | Visual shape and surface texture | Machine learning with a lower-dimensional representation of tactile data
Meier et al. [37] | Tactile deep CNN for online slip and rotation detection | Classifies contact state; distinguishes rotational and translational slippage | Final classification rate makes it feasible for adaptive grasp control
Chuah et al. [38] | Least-squares ANN improving shear force estimation with better optimization | Normal and shear tactile force | Better convergence with a multi-input, multi-output function approximator
Kaboli et al. [39] | Probabilistic active tactile transfer learning | Surface texture, stiffness, and thermal conductivity | Discrimination with only one training sample (one-shot tactile learning)
Gandarias et al. [40] | Custom CNN (TactNet) for object recognition with RGB pressure images | Contact objects; identifies tactile pressure | Used 8 transfer-learning networks and 3 TactNets trained from scratch
Sato et al. [42] | Compact finger-shaped GelForce sensor for surface traction fields | Measures the distribution of force vectors (surface traction fields) | Small size with force linearity; refresh rate of 67 Hz
Sferrazza et al. [43] | Commercial force/tactile sensor images matched to ground-truth data for DNN training | Measures contact force and the contact center on the sensor's surface | Refresh rate of 40 Hz; performance depends on reference-axes alignment
Current study | Transfer-learning-based CNN training using tactile sensor images matched with ground truth | Measures contact force, contact position, and contact size in mm | Refresh rate of 30 Hz with a spatial resolution of 2.5 mm; the sensor is larger than those of References [42,43] because of the stereo camera
Design Aspects | Specifications |
---|---|
Sensor surface material (including markers) | Rubber |
Sensor size (width × height) | 44 mm × 72 mm |
Spatial resolution | 2.5 mm |
Refresh rate (sampling frequency) | 30 Hz |
No. of Protrusions | 292 |
Design Aspects | Specifications |
---|---|
Camera resolution | 640 × 480 with 30 fps |
Pixel size | 3 µm × 3 µm
Image sensor size | 1/4” |
Image active area | 3888 µm × 2430 µm
Signal-to-Noise ratio | 39 dB |
Scan mode | Progressive |
Lens module | 3.4 mm/F2.8 |
Power | DC 5 V/150 mA |
Interface | USB 2.0 |
Category | Training (Point) | Validation (Point) | Testing (Point) |
---|---|---|---|
Data01 (Left + Right) | 3380 + 3380 | 1680 + 1680 | 1690 + 1690 |
Data02 (Left + Right) | 2730 + 2730 | 910 + 910 | 910 + 910 |
Total (Data01 + Data02) | 12,220 | 5180 | 5200 |
Number of images for total 10 points (0.1 N interval from 0.1 N to 1 N) | 122,200 | 51,800 | 52,000 |
Test No. (1∼5) | Force Estimation Error (N) | Test No. (6∼10) | Force Estimation Error (N) |
---|---|---|---|
1 | 0.027 | 6 | 0.022 |
2 | 0.026 | 7 | 0.021 |
3 | 0.023 | 8 | 0.018 |
4 | 0.023 | 9 | 0.022 |
5 | 0.022 | 10 | 0.023 |
Average Error on all 10 tests | 0.022 |
Shape (Contact Tool) | Number of Images (Testing) | Error Rate (%)
---|---|---|
Circle | 20 | 0.597 |
Square | 18 | 0.926 |
Hexagon | 18 | 2.857 |
Total Images / Average Error | 56 | 1.429