Communication

Sensing and Rendering Method of 2-Dimensional Haptic Texture

1 Faculty of Advanced Science and Technology, Kumamoto University, Kumamoto 860-8555, Japan
2 Nishi-Nippon Railroad Co., Ltd., Fukuoka 812-0011, Japan
* Author to whom correspondence should be addressed.
Sensors 2021, 21(16), 5523; https://doi.org/10.3390/s21165523
Submission received: 30 June 2021 / Revised: 9 August 2021 / Accepted: 11 August 2021 / Published: 17 August 2021
(This article belongs to the Special Issue Tactile and Force Sensors in Robotics)

Abstract

Touchscreens are now used worldwide, yet most of them lack realistic haptic feedback, and the haptic feedback devices that do exist typically employ one-dimensional vibration only. We aim at a novel rendering method for a direction-controlled, two-dimensional vibration display that presents texture information. This paper proposes a rendering method for texture information that produces lateral-force-based two-dimensional vibration along the X- and Y-axes. Moreover, we propose combining AKAZE image-feature information of the textures to improve the fidelity for textures with large periodic patterns. We conducted experiments to evaluate the fidelity of the proposed method. The results show that the proposed method achieves higher fidelity than the conventional method in presenting randomized textures and large periodic textures.

1. Introduction

Recently, touchscreen interfaces have become popular worldwide, and many consumer electronics devices use a touchscreen as their main interface. However, few touchscreens provide reactive tactile signals. Those that do mostly employ simple vibrotactile feedback, which does not convey convincing texture sensations, and none of them provide realistic texture information.
At the research level, several haptic devices built on liquid crystal panels have been developed. For example, Chubb et al. developed a haptic device employing friction changes induced by the squeeze film effect [1]. Yamamoto et al. [2] developed a tactile display employing electrostatic force. A surface-acoustic-wave-based tactile display was developed by Takasaki et al. [3]. Konyo et al. [4] developed a tactile display employing a controlled vibration frequency and a virtual pointer. Makino et al. [5] developed a tactile device that employs the beat of two different high-frequency vibrations. Wang et al. developed a sliding system using shear force [6]. In recent years, other lateral-force-based tactile feedback devices that employ static electric fields have been developed [7].
Other studies have used recorded vibrations. Visell et al. [8] employed accelerometers to record the vibration of contact between the feet and the ground. Kuchenbecker et al. [9] employed accelerometers to record contact vibration between surgical tools and human organs. Romano and Kuchenbecker [10] employed accelerometers, touchscreens, and force sensors; by recording this information, they display vibration appropriate for the movement of the user’s stylus. Minamizawa et al. [11] employed microphones to record higher-frequency contact information. Saga et al. proposed a simpler recording/playing method that omits the measurement of pressure and applies a compensation method when reproducing vibration [12]. These vibration stimuli achieve high reproducibility, but the direction of the vibration is limited to one dimension. This is because receptors that transmit vibrational stimulation in the skin have been found unable to discriminate the direction of vibration [13]. For this reason, the direction of vibration has not been regarded as important in tactile research so far, and most studies have employed one-dimensional vibration.
However, the receptors are distributed over the skin, so the input signals from multiple receptors may allow discrimination of multi-dimensional vibration. We employed the same device as Saga et al. [12] as the presentation device (Figure 1). This paper proposes a rendering method that reproduces biaxial acceleration information through the lateral-force-displaying device using independent X-axis and Y-axis vibration information, i.e., a novel rendering method to display multi-dimensional vibration. We report the results of experiments on the reproducibility of tactile sensation, comparing the conventional method with the proposed one.
Moreover, we propose combining AKAZE image-feature information of the textures to improve the fidelity for textures with large periodic patterns. KAZE is a 2D feature detection and description method that operates entirely in nonlinear scale space; AKAZE is an accelerated variant of KAZE that speeds up the nonlinear scale-space computation. We conducted experiments to evaluate the fidelity of the proposed method. The results show that the proposed method has higher fidelity in presenting randomized textures and large periodic textures than the conventional method.

2. Presentation of Tactile Texture Information Using Independent Vibration

Many researchers have considered methods of presenting tactile texture information using vibration information from various viewpoints [11,14]. Romano et al. proposed a method for recording texture on a tablet by recording acceleration, position, and contact force over time while touching a texture with a dedicated tool [10]. Saga et al. proposed a simpler recording/playing method that omits the measurement of pressure and applies a compensation method when reproducing vibration [12]. They reproduce the sense of direct touch by recording vibration information with the fingers and reproducing the recorded information using a two-dimensional shearing-force presentation device (Figure 1).

2.1. Recording State

In our research, we used acceleration information recorded along two axes as the tactile information of each texture. In the recording state, a triaxial acceleration sensor (ADXL-335) is fixed to the finger with tape, and acceleration information is recorded from several textures by rubbing them repeatedly with the finger. These measurements are performed on the touchscreen so that the user’s rubbing speed can be measured. Since Saga et al. recorded acceleration information through the audio input, their data were one-dimensional. In our research, to accurately acquire three-dimensional data, the acceleration information is processed by an Arduino microcontroller, which packs the three-axis information into one packet and transmits it to a PC over serial communication. On the PC, the packed vibration information is unpacked and recorded by a Processing application.
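As an illustration of this recording pipeline, the following Python sketch shows how such three-axis packets could be unpacked on the PC side. The packet framing (a 0xAA sync byte followed by three little-endian 16-bit values) and the serial settings are our assumptions for illustration only; the paper used an Arduino sketch and a Processing application whose exact format is not given.

```python
# Hypothetical packet: sync byte 0xAA, then three little-endian int16 values
# (x, y, z acceleration counts from the ADXL-335 read by the Arduino ADC).
import struct
import serial  # pyserial

PORT, BAUD = "/dev/ttyUSB0", 115200   # illustrative serial settings

def record(n_samples, port=PORT, baud=BAUD):
    """Unpack n_samples three-axis acceleration packets into a list of (x, y, z) tuples."""
    samples = []
    with serial.Serial(port, baud, timeout=1) as ser:
        while len(samples) < n_samples:
            if ser.read(1) != b"\xAA":        # wait for the sync byte
                continue
            payload = ser.read(6)              # 3 axes x 2 bytes
            if len(payload) == 6:
                samples.append(struct.unpack("<hhh", payload))
    return samples
```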
The acceleration was sampled at 1 kHz. To accurately present the recorded vibration direction and reproduce faithful vibrations, it is essential to use vibrations that match the user’s movement direction. Therefore, when recording vibration information, we stored the vibration separately for the x- and y-axes rather than in a single direction. This makes it possible to reproduce vibrations more accurately both for textures that give similar vibrations regardless of the direction in which the finger is moved and for textures whose vibrations differ significantly depending on the movement direction.
We recorded the rubbing acceleration twice per texture: once for x-axis movement and once for y-axis movement. Each record contains the acceleration along both the x- and y-axes, i.e., we have four time-series elements for each texture (Equation (1)). The recorded information is defined as follows. The subscript r represents the recording state, and a subscript pair such as xy represents the acceleration direction (x) during a finger movement in the y-direction. Thus, $a_{rxy}$ is the recorded x-direction acceleration during a finger movement in the y-direction.
$$ a_r(X) = \begin{pmatrix} a_{rxx} & a_{rxy} \\ a_{ryx} & a_{ryy} \end{pmatrix} \qquad (1) $$

2.2. Playing State and Its Compensation Method

To play the recorded acceleration, we use the position of the user’s finger. Based on the finger position, the system integrates the moving distance of the finger and chooses the acceleration to play. The absolute position is recorded in the recording state, but this information is not used in the playing state. The user’s movement is not constrained in either the recording or the playing state, so the recorded position history does not match the playing position history; we aim to reproduce the texture information without this history information. Saga et al. also employed integration of the user’s finger position, although their displayed data were limited to one dimension. We extended their method and propose a method that accurately records the vibration information on the X and Y axes and reproduces it on our device.
In Saga et al.’s method, compensating the timeline between the recording and playing states is essential. They aligned the timeline between the recorded data and the played data by the moving distance. This is because the textures themselves do not change, whereas the measured acceleration pattern changes according to the movement speed; thus, the timeline should be driven by the moving distance in the playing state. By employing the rubbing-speed data of the two states ($\dot{x}_r$, $\dot{x}_p$), we acquire the moving distance of each state, $X_r$ and $X_p$.
$$ X_r(t) = \sum |\dot{x}_r(t)| \cdot \Delta t \qquad (2) $$
$$ X_p(t) = \sum |\dot{x}_p(t)| \cdot \Delta t \qquad (3) $$
Here, the absolute-value operator is applied to each element of the vector.
From the moving distance in the playing state, we acquire the appropriate acceleration. To extend the method to the X and Y axes, we control each axis individually. If we used the two-dimensional moving distance as a single parameter, the user might feel wrong texture information: for example, if the recorded data contained only X-axis movement and the user moved the finger along the Y-axis in the playing state, the user would feel the X-axis acceleration data, which would induce a misunderstanding of the texture. The recorded trajectory and the playing trajectory are not related at all; thus, we control each axis individually. In Equations (4) and (5), $X_{\{r,p\}}(t)$ and $a_{\{r,p\}}(t)$ are two-dimensional vectors, but the calculation is performed individually for each axis. By this method, we can acquire the precise acceleration history of each axis.
$$ a_r(X) = a_r(X_r(t)) \qquad (4) $$
$$ a_p(X) = a_r(X_p(t)) \qquad (5) $$
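A minimal sketch of this distance-driven, per-axis lookup is shown below, assuming the recorded rubbing speed and acceleration are stored as NumPy arrays sampled at a fixed interval dt; the variable names are ours, not the authors'.

```python
import numpy as np

def cumulative_distance(speed, dt):
    """X(t) = sum |x_dot(t)| * dt for one axis (Equations (2) and (3))."""
    return np.cumsum(np.abs(speed)) * dt

def acceleration_at_distance(acc_rec, dist_rec, dist_play):
    """Return the recorded acceleration whose travelled distance is closest
    to the current playing-state distance (Equation (5)), one axis at a time."""
    idx = np.searchsorted(dist_rec, dist_play)   # dist_rec is non-decreasing
    idx = np.clip(idx, 0, len(acc_rec) - 1)
    return acc_rec[idx]
```

Running this once per axis with its own recorded trace reproduces the individual X/Y control described above.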
Moreover, there are differences between the sampling frequencies. In the recording state, the acceleration, $a_r$, and the rubbing speed, $\dot{x}_r$, are recorded at the same time; here, $x$ represents the position of the fingertip. The acceleration was recorded at a sampling rate of $f_a = 44.1$ kHz. On the other hand, the rubbing speed was recorded at $f_s \approx 20$ Hz, based on the touchscreen’s sampling rate. Moreover, the control frequency of the vibration motor was $f_v < 10$ kHz.
The difference between the recording- and playing-state rubbing speeds and the differences among the sampling frequencies, $f_a$, $f_s$, and $f_v$, must be compensated; here, the subscripts refer to the accelerometer, the touchscreen, and the vibration motor, respectively. If we used the finger position reported by the touchscreen directly, the slow touchscreen sampling rate, $f_s$, would limit the refresh rate of the display: because that rate is lower than 20 Hz, the displayable acceleration would also be limited to about 20 Hz, which is too slow for displaying texture information. Therefore, we employ an interpolation method between the slow samples. To compensate for the sampling rate of the finger position, we employ the computer’s precise timer and the latest rubbing speed of the finger (sampled at $f_s$), and calculate the interpolated moving distance between the slow samples.
$$ X_p(T) = X_p(t_s) + |\dot{x}_p(t_s)| \cdot (T - t_s) \qquad (6) $$
$$ a_p(T) = a_r(X_p(T)) = a_r\!\left( X_p(t_s) + |\dot{x}_p(t_s)| \cdot (T - t_s) \right) \qquad (7) $$
where $t_s$ is the touchscreen sample time nearest to $T$, and the absolute-value operator is applied to each element.
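The following sketch illustrates one way to realize this timer-based compensation, under the assumption that the latest touchscreen speed sample is held and extrapolated with the computer's high-resolution clock; it is an illustrative reading of Equations (6) and (7), not the authors' implementation.

```python
import time

class PositionInterpolator:
    """Extrapolates the travelled distance X_p between slow touchscreen samples."""

    def __init__(self):
        self.dist_at_sample = 0.0   # X_p(t_s): distance at the last touch sample
        self.speed = 0.0            # |x_dot_p(t_s)|: speed at the last touch sample
        self.t_sample = time.perf_counter()

    def on_touch_sample(self, speed):
        """Called at the touchscreen rate f_s with the latest rubbing speed."""
        now = time.perf_counter()
        self.dist_at_sample += abs(self.speed) * (now - self.t_sample)  # close the last interval
        self.speed = abs(speed)
        self.t_sample = now

    def distance_now(self):
        """X_p(T) = X_p(t_s) + |x_dot_p(t_s)| * (T - t_s), queried at the motor rate f_v."""
        return self.dist_at_sample + self.speed * (time.perf_counter() - self.t_sample)
```

The value returned by distance_now() can then be fed to the distance-indexed lookup of Equation (7) at the motor control rate.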
As the two-dimensional vibration display, we employed the same device for presentation (Figure 1). The device has four DC motors at the corners of a touchscreen, and each motor winds a string. The four strings are connected to a finger pad at the center of the touchscreen. By placing a finger on the pad, the user can feel the torque of the four motors on the touchscreen.

3. Evaluation of the Proposed Method

Here, we compared the proposed displaying method with the previous methods of Saga et al. [12], binormal ($F_B$) and tangential ($F_T$) vibrations (Figure 2). Their method records one-dimensional data only, but they proposed these two vibration methods to make good use of the two-dimensional vibration display.
The procedure was as follows. The participants were eight male and one female graduate students, aged 22 to 23; all were right-handed and rubbed the display with their index finger. We used pink noise to prevent auditory feedback from biasing the participants’ judgments.
We first asked the participants to touch the display for about 1 min to become familiar with the experimental equipment. After that, the participants were asked to rub and compare the real texture with the texture presented by each method for 1 min. Then, using the magnitude estimation method, we asked them to rate how well the texture presented on the display reproduced the real texture for each method.
To eliminate order effects, the order of the vibration presentation methods was randomized for each participant. An overview of the experiment is shown on the left of Figure 3.

3.1. Comparison of Reproducibility between Presentation Stimulus Methods

The results of the experiments conducted using the eight types of textures are shown below. Figure 3 shows the results of comparing the reproducibility of each method for each texture. Tukey’s test was used for the analysis, and an asterisk (*) in the figure indicates a significant result at α < 0.05.
For artificial grass 2, the proposed method has significantly higher texture reproducibility than both the $F_T$ and $F_B$ methods, and for sandpaper 2, the proposed method has significantly higher texture reproducibility than $F_T$. Vibration presentation by the proposed method is therefore suitable for materials that are hard and have irregular spatial frequencies. On the other hand, the proposed method is not suitable for materials with a low spatial frequency (i.e., a large periodic pattern), such as tile. For carpet 1, $F_B$ has significantly higher texture reproducibility than the proposed method. Since it is difficult to reproduce a soft texture with vibration information alone, another approach needs to be considered.

3.2. Discussion

The first experiment revealed that the proposed method can display textures with random spatial frequencies, such as carpet or sandpaper. However, the proposed method has a problem: tactile reproducibility decreases for textures with a distinct spatial frequency (e.g., a tiled floor), and hence the fidelity of the texture decreases. This could be caused by the two-dimensional independent display method: it reproduces a precise acceleration history for each axis, but the accelerometer cannot detect macroscopic features.
In the proposed method, independent vibrations are presented on the X-axis and the Y-axis, and vibrations with random periods are likely to occur depending on the direction in which the finger is moved, since the participant can move the finger freely during the experiment. However, it is difficult to present a periodic vibration, and the larger the period of the real texture, the easier it is for the users to recognize the periodicity. Therefore, the reproducibility of the virtual texture tends to be lower compared to the real texture. To resolve this problem, we propose combining another rendering method that employs image information.

4. Augmenting Information of Image Features

To detect macroscopic features in the two-dimensional planar pattern, we propose employing two-dimensional image features. To combine the image information with the measured acceleration, we extract features from the texture image using the AKAZE detector. Since the development of SIFT features, most feature detectors have been invariant to scale/rotation, brightness, and blur, and the features carry a scale-space index, which is important for detecting macroscopic scale information. Unlike SIFT and SURF, AKAZE is free to use (BSD 2-Clause License), and its computational speed and precision are better than those of the other detectors [15]. Thus, we chose AKAZE features for the detector and employ the scale-space index for macroscopic feature detection.

4.1. Recording of Image Features

We propose a vibration presentation method using image features to solve the problem of displaying textures with large periodic patterns, such as a tiled floor, by employing scale-space indices. The parameters of the feature points, such as size and angle, are considered to represent some of the texture information. Therefore, our method extracts the information contained in the texture image and processes it into a one-dimensional form that can be used to augment the vibration. The procedure for presenting the actual vibration information augmented with the image information is described below (a code sketch of this pipeline follows the list); OpenCV is used for image processing.
  • Acquire feature points from the texture image using AKAZE
  • Extract the size information representing the diameter of the important region around each feature
  • Obtain one-dimensional information by averaging the information along each of the X-axis and Y-axis directions, and then normalize it
  • Augment the vibration information with the size information corresponding to the display position and present it
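A compact sketch of this pipeline using OpenCV's AKAZE detector is shown below; the projection to one dimension (binning feature sizes by position, averaging, and normalizing to obtain $e(x)$ or $e(y)$) reflects our reading of the steps above rather than the authors' exact code.

```python
import cv2
import numpy as np

def feature_profile(image_path, axis="x"):
    """Return a normalized 1D profile of AKAZE feature sizes along one axis."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)   # path is illustrative
    keypoints = cv2.AKAZE_create().detect(img, None)

    length = img.shape[1] if axis == "x" else img.shape[0]
    size_sum = np.zeros(length)
    counts = np.zeros(length)
    for kp in keypoints:
        pos = int(kp.pt[0] if axis == "x" else kp.pt[1])  # feature position along the axis
        size_sum[pos] += kp.size                          # diameter of the feature region
        counts[pos] += 1

    # Average the sizes per position, then normalize to [0, 1]
    profile = np.divide(size_sum, counts, out=np.zeros(length), where=counts > 0)
    return profile / profile.max() if profile.max() > 0 else profile
```

For example, feature_profile("tile.png", axis="x") (with a hypothetical file name) would yield the normalized profile $e(x)$ used in Equation (8).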
As a texture with a large periodic pattern, we used a self-made tile-pattern texture made of polylactic acid (PLA). Figure 4 shows the image features extracted from the image of the self-made texture.
Image features can be acquired in the same way for other textures with distinct spatial frequencies, so the augmentation method can also be applied to the textures that showed low tactile fidelity. Figure 5 shows the one-dimensional image features extracted from the self-made tile texture ($e(x)$).

4.2. Presentation of Vibration Information Using Image Feature

The presented vibration is calculated by the following equation, where $a_{px}$ and $a_{py}$ are the presented vibrations on the x- and y-axes, and $e$ is the size information of the image features to be superimposed.
$$ a_p(X_p) = \begin{pmatrix} a_{px} \cdot e(x) \\ a_{py} \cdot e(y) \end{pmatrix} \qquad (8) $$
Using this presentation method, it is possible to emphasize only the characteristic parts of the texture. The upper part of Figure 6 shows the vibration information before the image features are augmented, and the lower part shows the vibration information after the image features are augmented.
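At render time, Equation (8) amounts to an element-wise modulation of the per-axis playback acceleration by the feature profile at the current finger position; a minimal sketch with illustrative names follows.

```python
def augmented_vibration(a_px, a_py, e_x, e_y, x_idx, y_idx):
    """Scale the per-axis playback acceleration by the image-feature profiles
    e(x) and e(y) sampled at the current finger position (Equation (8))."""
    return a_px * e_x[x_idx], a_py * e_y[y_idx]
```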

5. Experiment

5.1. Experiment Procedure

We used ten textures in the experiment: soft artificial grass 1, which is close to natural grass; artificial grass 2, which is harder than natural grass; stiff carpet 1; soft carpet 2; the self-made tiled texture; #40 coarse sandpaper; three types of placemats with different feels; and a punched plastic plate. Figure 7 shows images of the textures used in the experiment. The previous experiment showed that periodic features affect the result, but the previous dataset contained only one texture with periodic features; thus, we appended additional periodic textures. Furthermore, to evaluate the combined use of the proposed methods more precisely, we also appended the other textures.
The participants were six healthy men aged 22 to 24. During this experiment, they wore eye masks to block visual information and headphones to block external sounds. All were right-handed and used their right index finger for the rubbing movement. We compared several vibration rendering methods for each texture, using a 5-point Likert scale for evaluation. The procedure was as follows:
  • Ask the participant to touch a sample texture placed on a weighing scale and train them for 5 min to keep the pressing force at about 50 gf
  • Have them touch the real texture for 10 s to learn the tactile sensation
  • Ask them to touch the texture presented on the display for 10 s and rate its fidelity on the five-point scale
  • Change the presentation method and have it evaluated in the same way as steps 2 and 3
  • After completing steps 2 to 4 for all ten textures, the experiment is finished
To eliminate order effects, the order of the presented patterns was changed for each participant.

5.2. Result

The proposed two-dimensional vibration rendering method was rated higher than the one-dimensional vibration presentation for artificial grass 2, carpet 2, placemat 1, and placemat 2 (Figure 8). However, Tukey’s test found no significant difference for textures other than artificial grass 2. We consider the reasons as follows.
Among the textures rated highly for the two-dimensional vibration presentation, artificial grass 2, sandpaper, and placemat 2 have random spatial frequencies. Additionally, although no significant difference was obtained for sandpaper and placemat 2, both scores exceeded 3.0. This suggests that our two-dimensional vibration presentation method is good at presenting textures with random spatial frequencies. There was no significant difference between the one-dimensional and two-dimensional vibration presentations for soft textures such as artificial grass 1, carpet, and placemat 3.
On the other hand, the participants did not perceive fidelity for materials with large periodic patterns. Since the tile texture, placemat 1, and the punched plastic sheet have large periodic patterns, they are considered difficult to reproduce with the two-dimensional rendering method.
Figure 8 also shows the result of selecting the higher evaluation of the image-feature augmentation methods. Based on the scale-space indices, we limited the targets of the image-feature-based augmentation to the periodic textures and applied the method to the tile, placemat 1, and the punched plastic sheet. As can be seen, the presentation method that augments the vibration information with image features was rated highly for some of these textures.

5.3. Discussion

Our proposed method records vibrations in the X-axis and Y-axis directions independently and selects the presented vibration according to the direction of finger movement, making it easier to generate random-period vibrations than one-dimensional vibration. In particular, since the display surface is a hard material, the evaluation of hard textures with random spatial frequencies tends to be high. This may be related to the perception of softness arising from the change in the contact area between the finger and the texture.
The results of the image-feature-based rendering method show that the method can display large periodic patterns: using the image features, we can emphasize the spatially characteristic parts of the texture. Since placemat 1 has the longest distance between features among the three textures, the periodicity of the features seems to contribute more to the fidelity than the magnitude of the vibration. Additionally, the punched plastic sheet scored above 3.0 even with the two-dimensional vibration presentation, although the texture has a distinct spatial frequency; this could be because the distance between its feature points is small, making a constant period difficult to recognize. This indicates that the image-feature-based augmentation needs to be controlled based on the scale of the periodic features.

5.4. Evaluation of Image Feature Superposition Method

From these two results, the proposed two-dimensional rendering method is suitable for presenting materials with random spatial frequencies (artificial grass 2 and sandpaper 2); in other words, it is not suitable for presenting materials with distinct spatial frequencies (e.g., tile). We discuss the reasons for this result below.
For these reasons, the users perceived fidelity for materials with random spatial frequencies but not for materials with large periodic patterns. Since the tile texture, placemat 1, and the punched plastic sheet have large periodic patterns, they are considered difficult to reproduce with the two-dimensional rendering method alone.

6. Conclusions

We proposed a method to record the vibration of tracing a texture using a three-axis acceleration sensor and to reproduce it as faithfully as possible in two dimensions. The first experiment showed that the proposed method is suitable for displaying textures with random spatial frequencies, such as carpet or sandpaper. However, the proposed method has a problem: tactile reproducibility decreases for textures with a distinct spatial frequency (e.g., a tiled floor), and hence the fidelity of the texture decreases.
This could be caused by the two-dimensional independent display method: it reproduces a precise acceleration history for each axis, but the macroscopic features cannot be detected by the accelerometer. In the proposed method, independent vibrations are presented on the X-axis and the Y-axis, and vibrations with random periods were likely to occur depending on the direction in which the finger was moved, since the participant could move the finger freely during the experiment. However, it was difficult to present a periodic vibration, and the larger the period of the real texture, the easier it was for the users to recognize the periodicity. Therefore, the fidelity of the virtual texture tended to be lower compared to the real texture.
To resolve this problem, we proposed a presentation method that combines image features, which improved the reproducibility of textures that were difficult to present. The results of the image-feature-based rendering method show that it can display large periodic patterns: using the image features, we can emphasize the characteristic parts of the texture. Figure 8 shows the result of selecting the higher evaluation of the image-feature augmentation methods.
In summary, this paper proposed a rendering method that reproduces biaxial acceleration information through the lateral-force-displaying device using independent X-axis and Y-axis vibration information, and additionally proposed combining AKAZE image-feature information of the textures to improve the fidelity for textures with large periodic patterns. The experiments conducted to evaluate the proposed method show that it achieves higher fidelity in presenting randomized textures and large periodic textures than the conventional method.

Author Contributions

Conceptualization, S.S.; methodology, S.S.; software, J.K.; validation, J.K. and S.S.; formal analysis, J.K.; investigation, S.S.; resources, S.S.; data curation, J.K.; writing—original draft preparation, J.K.; writing—review and editing, S.S.; visualization, J.K.; supervision, S.S.; project administration, S.S.; funding acquisition, S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partly supported by JSPS KAKENHI 18H04104G1 (Grant-in-Aid for Scientific Research (A)), 19K2287900 (Grant-in-Aid for challenging Exploratory Research), and the Asahi Glass Foundation.

Institutional Review Board Statement

The study was approved by the Ethics Committee of Kumamoto University as #H30-1.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data sharing is not applicable because of ethical issues.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chubb, E.C.; Colgate, J.E.; Peshkin, M.A. ShiverPaD: A Glass Haptic Surface That Produces Shear Force on a Bare Finger. IEEE Trans. Haptics 2010, 3, 189–198. [Google Scholar] [CrossRef] [PubMed]
  2. Yamamoto, A.; Ishii, T.; Higuchi, T. Electrostatic tactile display for presenting surface roughness sensation. In Proceedings of the 2003 IEEE International Conference on Industrial Technology, Maribor, Slovenia, 10–12 December 2003; IEEE: Piscataway, NJ, USA, 2004; Volume 2, pp. 680–684. [Google Scholar] [CrossRef]
  3. Takasaki, M.; Tamon, R.; Kotani, H.; Mizuno, T. Pen tablet type surface acoustic wave tactile display integrated with visual information. In Proceedings of the 2008 IEEE International Conference on Mechatronics and Automation, Takamatsu, Japan, 5–8 August 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 357–362. [Google Scholar] [CrossRef]
  4. Konyo, M.; Yamada, H.; Okamoto, S.; Tadokoro, S. Alternative Display of Friction Represented by Tactile Stimulation without Tangential Force. Haptics: Perception, Devices and Scenarios. In Proceedings of the 6th International Conference, EuroHaptics 2008, Madrid, Spain, 11–13 June 2008; pp. 619–629. [Google Scholar] [CrossRef]
  5. Makino, Y.; Saeki, M.; Maeno, T. Tactile Feedback for Handheld Touch Panel Device with Dual Vibratory Stimulus. In Proceedings of the Asian Conference on Design and Digital Engineering, Seoul, Korea, 25–28 August 2010. [Google Scholar]
  6. Wang, D.; Tuer, K.; Rossi, M.; Shu, J. Haptic overlay device for flat panel touch displays. In Proceedings of the 12th International Symposium on HAPTICS’04, Chicago, IL, USA, 27–28 March 2004; IEEE: Piscataway, NJ, USA, 2004; p. 290. [Google Scholar] [CrossRef]
  7. Bau, O.; Poupyrev, I.; Israr, A.; Harrison, C. TeslaTouch: Electrovibration for touch surfaces. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, 3–6 October 2010; ACM: New York, NY, USA, 2010; pp. 283–292. [Google Scholar] [CrossRef]
  8. Visell, Y.; Law, A.; Cooperstock, J.R. Touch Is Everywhere: Floor Surfaces as Ambient Haptic Interfaces. IEEE Trans. Haptics 2009, 2, 148–159. [Google Scholar] [CrossRef] [PubMed]
  9. Kuchenbecker, K.J.; Gewirtz, J.; McMahan, W.; Standish, D.; Martin, P.; Bohren, J.; Mendoza, P.J.; Lee, D.I. VerroTouch: High-frequency acceleration feedback for telerobotic surgery. Haptics Gener. Perceiving Tang. Sensat. 2010, 6191, 189–196. [Google Scholar] [CrossRef] [Green Version]
  10. Romano, J.M.; Kuchenbecker, K.J. Creating Realistic Virtual Textures from Contact Acceleration Data. IEEE Trans. Haptics 2012, 5, 109–119. [Google Scholar] [CrossRef] [PubMed]
  11. Minamizawa, K.; Kakehi, Y.; Nakatani, M.; Mihara, S.; Tachi, S. TECHTILE toolkit—A prototyping tool for design and education of haptic media. In Proceedings of the Virtual Reality International Conference (Laval Virtual) 2012, Laval, France, 28–30 March 2012; p. 22. [Google Scholar] [CrossRef]
  12. Saga, S.; Raskar, R. Simultaneous geometry and texture display based on lateral force for touchscreen. In Proceedings of the World Haptics Conference (WHC), Daejeon, Korea, 14–17 April 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 437–442. [Google Scholar] [CrossRef]
  13. Brisben, A.; Hsiao, S.; Johnson, K. Detection of vibration transmitted through an object grasped in the hand. J. Neurophysiol. 1999, 81, 1548–1558. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Visell, Y.; Law, A.; Cooperstock, J.R. Toward iconic vibrotactile information display using floor surfaces. In Proceedings of the World Haptics 2009-Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Salt Lake City, UT, USA, 18–20 March 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 267–272. [Google Scholar] [CrossRef]
  15. Tareen, S.A.K.; Saleem, Z. A comparative analysis of sift, surf, kaze, akaze, orb, and brisk. In Proceedings of the 2018 International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), Sukkur, Pakistan, 3–4 March 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–10. [Google Scholar] [CrossRef]
Figure 1. Lateral-force-based display.
Figure 2. Binormal and tangential vibration: binormal vibration vibrates binormal to the movement direction, and tangential vibration vibrates tangential to the movement direction.
Figure 3. Left: an overview of the experiment, right: comparison results of reproducibility rating between methods. * in the figure shows a significant result at α < 0.05 in Tukey’s test.
Figure 4. Left: self-made texture, right: image features acquired by AKAZE.
Figure 5. Parameters extracted from self-made tile texture, e ( x ) .
Figure 6. (a) Vibration information of self-made tile texture, (b) Vibration information with the parameter e ( x ) superimposed.
Figure 7. Textures used in the experiment.
Figure 8. Reproducibility rated on a 5-point Likert scale.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Saga, S.; Kurogi, J. Sensing and Rendering Method of 2-Dimensional Haptic Texture. Sensors 2021, 21, 5523. https://doi.org/10.3390/s21165523