Review

A Study of Cutaneous Perception Parameters for Designing Haptic Symbols towards Information Transfer

1 Electrical Engineering Department, Tshwane University of Technology, Pretoria 0001, South Africa
2 Tianjin Key Laboratory for Control Theory & Applications in Complicated Systems, Tianjin University of Technology, Tianjin 300384, China
* Author to whom correspondence should be addressed.
Electronics 2021, 10(17), 2147; https://doi.org/10.3390/electronics10172147
Submission received: 21 June 2021 / Revised: 23 July 2021 / Accepted: 11 August 2021 / Published: 3 September 2021
(This article belongs to the Section Bioelectronics)

Abstract

Vibrotactile displays can substitute for the sensory channels of individuals experiencing temporary or permanent impairments in balance, vision, or hearing, and can enhance the user experience in professional or entertainment settings. This broad range of potential uses necessitates primary research on human vibrotactile perception. One leading consideration when developing such displays is how to design haptic patterns or symbols to represent a concept. In most settings, individual patterns are organized into alphabets of haptic symbols that form tactons. Tactons are structured and perceivable tactile patterns (i.e., messages) that transfer information to users by employing the sense of touch. Hence, haptic patterns are critical when designing vibrotactile displays, as they not only affect the rate of information transfer but also determine the design of the displays (e.g., the number and placement of the tactors engaged) and how the information is encoded to achieve separability. Due to this significance, this paper presents an overview study of the cutaneous perception parameters (i.e., intensity, loci, frequency, duration, illusions, and combinations of these) for designing haptic symbols, in order to identify mutual best practices and knowledge gaps for future work. The study also provides developers from different scientific backgrounds with access to complex notions when engaging with this specialized topic (i.e., the use of cutaneous perception parameters towards information transfer). Finally, it offers recommendations on which parameters to engage for a specific task or pattern.

Graphical Abstract

1. Introduction

The word haptics refers to the sensing of external environments using touch (i.e., the experience of touch sensations from external forces, motions, rumbles, or vibrations when we handle, wear, or operate on objects). It encompasses kinesthesia and tactile sensing [1], with the latter given the most attention in this study. Tactile sensing, also known as cutaneous perception, refers to the response of the body to mechanical stimuli (e.g., vibration or pressure) acting on the skin by engaging the various types of mechanoreceptors [2,3].
Early cutaneous perception research focused on sensory substitution systems (SSSs) that conveyed imagery or speech information to individuals with sensory impairments by stimulating their sense of touch [1,4,5,6,7,8,9]. These systems engage tactors (e.g., piezoelectric actuators) to stimulate the skin. With the development of technology (e.g., miniaturized software and hardware), modern haptics studies are exploring the science and technology of applications related to information transfer (IT) [9,10] and object manipulation [11] using touch, together with all aspects of manual exploration by smart machines and humans, as well as human–machine interaction, in virtual, networked, teleoperated, or real settings [12,13].
In fact, in recent times, the field of haptics has grown beyond the circle of scientists and experts to broader audiences concerned with creating the tactile feedback presently lacking when they interact with smart machines (e.g., tablets) and virtual-reality tools. One of the challenges is to create, in a virtual environment, sensations like those experienced during regular physical interactions [14]. Today, many of our daily interactions and activities (e.g., driving) load the visual and auditory senses heavily. Thus, there is a growing need for sensing dimensions that supplement visual and auditory interaction, especially when these channels are burdened or absent (sensory substitution and addition) [9].
The growth of wearable technologies (i.e., wearable haptic devices) and relevant technological advances, specifically in image processing, natural language translation, microelectromechanical devices, automatic speech processing and recognition, and approaches to training and learning with novel haptic devices, have created renewed interest in engaging touch as a channel of communication to relieve or substitute the overloaded or absent visual and auditory modalities. Unlike the simple vibration signals presently employed as notifications in various devices, the goal of such efforts is to determine how to develop wearable tactual displays that transfer substantial information about the perceived environment by engaging haptic patterns or symbols [14]. However, such complex vibrotactile communication systems are yet to mature [15], even though some functionality exists in some solutions. Nevertheless, with the developments in technology, a new wave of wearable haptic displays appears ready for significant improvements in hearing-to-touch and sight-to-touch translation and reception.
Haptic displays that deliver mechanical stimulation through arrays of tactors, combined with a sophisticated haptic language (tactile vocabulary), are critical to sensory addition and substitution. As the main components of sensory substitution and addition, these displays can substitute malfunctioning modalities or supplement the sensory experiences of individuals with temporary or permanent impairments [16,17,18]. This study concerns itself with the wearable tactual displays designed for SSSs, particularly the various types of cutaneous sensing parameters commonly engaged to design haptic alphabets towards IT. However, the findings of this study can be directly applied to sensory addition systems as well.
One primary design consideration when developing tactile displays for SSSs is determining how to create perceivable haptic patterns [8,19]. This is challenging for several reasons. Firstly, touch has different temporal and spatial resolution compared to audition and sight [14]. Different metrics have been used to quantify these variations between sensory channels, confirming that touch is an intermediate modality. Spatial resolution is the spatial separation between stimulus probes necessary for them to be perceived as distinct [20], for example, the minimum distance required between two tactile probes to realize separability (i.e., the discrimination threshold). Our fingertips can separate tactile probes about 1 mm apart [21], which puts the haptic sense right between sight and audition, with sight having superior spatial acuity.
On the other hand, temporal resolution defines the time separation necessary for two stimulus probes presented to the touch to be perceived as sequential rather than simultaneous [9,20]. The skin can resolve a time separation of 5 ms (i.e., its temporal resolution) [14], which is inferior to audition (0.01 ms) but superior to vision (25 ms). These differences between hearing, vision, and touch make it challenging to encode concepts of hearing and sight to touch [9]. For example, encoding concepts of sight to haptics is difficult because of the gap between the huge number of pixels describing a scene and the limited number of tactors that a wearable tactual display can accommodate given the inferior spatial resolution of the skin. Similarly, the gap between the number of symbols in the English language and the limited number of tactors that can be configured across the skin makes it challenging to encode concepts of audition to touch. Likewise, it is challenging to design uniform tactile patterns across all users and applications, because of the differences in the applications as well as the capacities of users to perceive, memorize, and categorize stimuli.
Secondly, the haptic sense has considerably less bandwidth or throughput than audition and vision [9,20]. In other words, the amount of information it can process over a specific time is less than that processed by the other two sensory systems. The metric bits per second (bps), measuring the capacity of sensory channels to process information, is now commonly used [22]. According to Spence [23], sight, hearing, and touch can transfer information at rates of 10⁶, 10⁴, and 10² bps, respectively. For the sense of touch, this seems like an overestimate, especially when engaging the existing tactual displays; a more realistic estimate of the haptic communication capacity today is 12 bps [10,24]. Fundamentally, the differences in throughput make it challenging to encode concepts of hearing or sight into touch stimuli for high IT. For example, the mismatch between the data rate of the source of stimuli and the rate at which data can practically be encoded into the skin makes it complex to design separable tactile patterns.
Thankfully, our somatic sensory system, through mechanoreceptors with different physical properties, can register subtle variations in the skin deformation induced by vibration [25]. It is hence practicable to overcome the above differences and challenges, and to achieve an acceptable IT rate using sensory substitution displays (SSDs), by creating haptic vocabularies through manipulating different parameters of tactile perception (i.e., time, location, movement, frequency, and magnitude) [10]. Depending on the engaged parameters, tactile alphabets vary in throughput and accuracy. Hence, the levels of accuracy and IT depend on the separability (i.e., perceptual independence) of the encoded parameters as well [20].
Without undermining the other design components of SSSs, the methodology of this paper is to summarize the cutaneous perception parameters of vibrational stimuli and the strategies that encode them into tactile patterns towards IT. The paper views these as critical due to the effect of tactile patterns on the amount of IT (psychophysics), the design of the display (sensory interface), and how the information from the source of stimuli is encoded into the target modality to achieve separability. The work considered in this study is published in peer-reviewed journals, books, or conference proceedings. The work differs in success, scope, number of participants, and the required training time. The scope of the work includes, among others, the design of tactile patterns either with or without illusions. The major databases were consulted, including IEEE Xplore, ScienceDirect, SpringerLink, and Google Scholar.
The rest of the paper is set out as follows: Section 2 presents a summary of the perceivable physical properties to humans and the cutaneous parameters employed in past studies to construct tactile patterns. The basic design model of a generic SSS and how it should interact with users is presented in the section as well. Section 3 and Section 4 discuss some of the existing methods relevant to tactile communication, with a goal of finding better practices in the roles performed by the parameters of the cutaneous sense when transferring information using touch, either by engaging tactile illusions or not. Section 5 concludes the paper with some recommendations and future work.

2. Considerations of SSS Design

Generic SSSs comprise the source of stimuli, a sensory interface, a target modality, and user training software [26] (Figure 1).
The source of stimuli contains the data in its natural form that excites sensory substitution devices and the target modality. In the case of visual data, it is an array of pixel values. In most cases, the properties of the source of stimuli disregard the perceptual limits of the sensory interface and the target modality. Thus, different approaches have been proposed to encode vision or hearing to touch considering their physical differences to keep enough data to “listen”, “read”, or “see” using touch. In general, such techniques configure tactors (electronic actuators) into a wearable haptic display, in order to convey tactile icons which define the perceived scene (i.e., source of stimuli). An icon is a symbol or image representing a concept [27,28]. Haptic icons are displayed to users as tactons, which are vibrotactile patterns representing abstract messages [20]. Tactons must be arranged in an effective and perceivable linguistic structure to convey information to users using the SSDs by manipulating parameters of the cutaneous sense [1,29].
The sensory interface is usually made from various low-cost electronic devices, which include microcontrollers (e.g., Arduino), wireless communication modules (e.g., XBee), actuators or tactors (e.g., vibration motors), and other basic electronic components. Due to the complex relationship between cutaneous perception and mechanical (vibrotactile) stimulation, it is fitting to summarize the tactors available for SSD design. At a minimum, a tactor must create a type of sensation that humans can feel and be able to function under specific computer instructions (e.g., turn on or off). Other key engineering considerations generally include cost, response time, availability, shape, size, input requirements, power consumption, and robustness against potential interference [30]. Even with this summary, the importance of coupling tactors with the skin makes it challenging to determine beforehand how well a specific actuator will perform for the task at hand. We have therefore found that design iteration of prototypes and frequent experiments with subjects help ensure high functionality levels for the designed SSDs.
There are several types of actuators that can be engaged in SSD design. Among others, vibration motors are commonly used, particularly eccentric rotating mass (ERM) motors. ERMs are DC motors whose off-center rotors set the motor housing in motion and, in turn, induce vibration sensations in users. These motors differ in shape and size, and their vibration frequency depends on the input voltage. Nevertheless, the magnitude of vibration is constant [31], which limits the design of tactual patterns separated by intensity. Also, their stimulus–response times are relatively long when compared to similar actuators.
Equally popular are audio exciters or voice coil actuators, which function similarly to the speakers in audio devices. Linear resonance actuators (LRAs) are voice coil actuators miniaturized for use in mobile devices and SSDs. Whilst LRAs have faster stimulus response times than ERMs, their frequency bandwidth is narrow [30]. Piezoelectric tactors are also commonly used in SSD design due to their wide frequency bandwidth. Nevertheless, they generally require high input voltages and are not robust against external impact forces or shocks. Besides vibration devices, electroactive tactors (e.g., polyvinylidene fluoride) and electrostatic actuators can be engaged to encode information to the skin. This paper, however, is concerned with SSSs that engage these types of vibration actuators, owing to the multiple encoding dimensions they afford.
Another important consideration when designing SSSs is how to train users to associate tactons with the environmental data they encode. Effective training achieves good performance (e.g., identification performance) and memorization with minimal training time. For example, in the case of text-to-touch translation (i.e., "skin reading") [8], it should generalize the learned knowledge to identify untrained words. Different perceptual learning techniques for tactile communication have been proposed in the literature [32]. For example, in skin reading, the "bottom-up" method, in which participants are trained using parts of speech (e.g., phonemes) to construct phrases, can be used.
Correspondingly, how to quantitatively measure the relationship between physical stimulation and its perception (i.e., psychophysical evaluation) is a key component of SSS design. Several psychophysical techniques have been employed to investigate human tactile perception [32], although the choice of a particular technique is generally made in an ad hoc manner, instead of analyzing which technique is optimal for the question at hand. Several singular features of the tactile sense affect how these techniques are implemented experimentally. For example, unlike in vision and audition, where stimuli are acquired in specific ways, touch stimuli are either active or passive, and the obtained information might fundamentally differ in these two scenarios. Similarly, unlike the large stimulus sets and correspondingly short interstimulus intervals used in psychophysical evaluations of sight and hearing, the stimulus sets displayed during tactile psychophysical paradigms are comparatively small [10]. This mirrors the characteristic delays in the existing SSDs commonly engaged to convey tactile stimuli.
Furthermore, as subjects physically interact with SSDs, they are sometimes required to explore sequences of stimuli before reaching a judgement. The exploration time, and hence the trial duration, will consequently be longer (3–10 s) than the exhibition (50–100 ms) and response times in auditory or visual psychophysical experiments [33,34]. This creates the challenge of performing a sufficient number of trials: for example, estimating the IT and IT rate of a tactual communication system involves a large number of trials (i.e., 5k², where k is the number of stimulus probes in a set) [22]. This makes it time consuming for researchers to obtain enough trials to achieve a true estimate of IT. To address such challenges, the general additivity law for IT can be utilized [35]. For a detailed treatment of the methods of perceptual learning, the psychophysical evaluation of tactile communication systems, and the available actuators for SSD design, refer to [32], [33], and [30], respectively.
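To make the scale of such an evaluation concrete, the following minimal Python sketch computes the standard maximum-likelihood IT estimate from a stimulus–response confusion matrix, alongside the 5k² trial guideline [22]. The confusion-matrix counts are hypothetical, for demonstration only, not data from any cited study.

```python
import numpy as np

def estimate_it_bits(confusion: np.ndarray) -> float:
    """Maximum-likelihood IT estimate (in bits) from a k x k
    stimulus-response confusion matrix of raw trial counts."""
    p_joint = confusion / confusion.sum()        # P(stimulus, response)
    p_stim = p_joint.sum(axis=1, keepdims=True)  # row marginals P(s)
    p_resp = p_joint.sum(axis=0, keepdims=True)  # column marginals P(r)
    nz = p_joint > 0                             # skip empty cells (log 0)
    return float((p_joint[nz] * np.log2(p_joint[nz] / (p_stim @ p_resp)[nz])).sum())

k = 4                                    # stimuli in the alphabet
print("recommended trials:", 5 * k**2)   # the 5k^2 guideline -> 80 trials
conf = np.array([[18, 1, 1, 0],          # hypothetical counts: rows = presented,
                 [2, 16, 1, 1],          # columns = identified
                 [0, 2, 17, 1],
                 [1, 0, 2, 17]])
it = estimate_it_bits(conf)
print(f"IT ~ {it:.2f} bits, i.e., ~{2**it:.1f} perfectly separable stimuli")
```

The same 2^IT reading of an IT estimate is used later in this paper when interpreting the results of identification experiments (Section 3).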

2.1. Psychophysics of the Cutaneous Sense

Human sensory receptor systems are sets of non-linear, time-varying systems that show dynamic characteristics, for example, adaptation, and they do not exhibit a clean, linear relationship between input (e.g., vibration) and output (i.e., the transduced electrical signal). Also, for a given receptor type, a big difference may exist in how that receptor responds to external stimuli depending on where it is embedded in our bodies; for example, hair cells on the body surface respond to sound quite differently from similar cells in the ear. Somatic receptors show smaller response variation to external stimuli and can be considered linear and time-invariant at coarse time and stimulus-intensity scales [26]. This possibly reduces, to some extent, the difficulty of developing effective SSSs that use touch as the target modality.
Our skin is embedded with different types of mechanoreceptors, which are classified by their rate of adaptation (SA: slow-adapting, responding throughout sustained stimulation; FA: fast-adapting, responding to the onset or offset of stimuli) and by the size of their receptive fields (large or small) [20,25]. Thus, cutaneous perception does not respond in a single uniform manner to the different properties (e.g., time) of mechanical stimulation, which in this study refers to vibrotactile stimulation properties or dimensions. This ability of cutaneous perception is invaluable when designing haptic symbols, as it permits separability and identification.
The four primary types of mechanoreceptors that provide sensory data to the brain about touch and vibration are Ruffini's corpuscles, Meissner's corpuscles, Merkel's discs, and Pacinian corpuscles (Figure 2). This group of receptors is identified as high-sensitivity or low-threshold mechanoreceptors because action potentials can be induced in them by weak mechanical stimulation (i.e., vibration properties) [25]. It is thus appropriate for this study to discuss the cutaneous perception properties in brief detail.

2.2. Cutaneous Parameters Relevant to SSD Design

The experience of touch sensations from physical stimuli is produced by the central nervous system and the cutaneous receptor properties [36]. Notably, for the tactile sense, there is a unique neurological relationship between the sensitivity and acuity of a body site and the size of the cortical area that processes its sensory inputs [37]. Hence, different body sites have different spatial acuity and sensitivity. In other words, for the same stimulus, the magnitude of its perception changes from site to site. Research on the sensation thresholds of vibrotactile stimuli [38,39] reveals that our fingertips are more sensitive to vibrotactile stimuli than other body sites by at least one order of magnitude. For example, the abdomen, at 200 Hz of vibration, is 60 times less sensitive than our fingertips. Nonetheless, SSDs do not only engage fingertips [40,41], but also other body parts (e.g., forearms [42,43], wrists [44,45], feet [46], and abdomen [16]).
Although the tactile sensitivity of a site is critical for perception, it is generally not considered a cutaneous sensing parameter for designing haptic symbols. Instead, the point or site of stimulation on the body can afford effective cues to which users can readily relate [47,48]. Consequently, it is the location of an active motor (i.e., vibrotactor) in a sensory substitution display, or the loci of simultaneously activated tactors in an array of motors, that can encode information about the external environment through tactile patterns. Likewise, the order of stimulation of motors in a tactor configuration can be employed to encode spatial and temporal data from the perceived environment. However, certain features of tactile sensitivity (i.e., the two-point discrimination threshold, point localization, sensitivity thresholds, and the just-noticeable difference) impact the perception of haptic symbols.
The two-point discrimination threshold is the smallest distance necessary between different points of stimulation for them to be perceived as separate stimuli [6]. Point localization is the ability to accurately localize the point of stimulation [47]. It is no coincidence that body sites with larger portions of the cortical area have smaller discrimination thresholds. On the other hand, the just-noticeable difference (JND) is the smallest variation in the magnitude of stimuli that is noticeable to the user [20]. Such features of tactile sensitivity ought to be considered during the design of haptic patterns to allow perception and separability. For example, when involving spatial encoding (simultaneous activation of tactors), two-point discrimination thresholds are vital during the design of SSDs to avoid masking effects (i.e., the hindrance of the perception of one point of stimulation by the others). In the same way, when utilizing various intensity levels to encode data, the JND and magnitude thresholds are fundamental guides during the design of tactile patterns to avoid pain or discomfort to users.
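As a concrete illustration of how these sensitivity features constrain a design, the following Python sketch checks a candidate tactor layout against a two-point discrimination threshold and derives an intensity ladder spaced one JND apart. All threshold values are placeholders for illustration only; a real design must substitute values measured at the target body site.

```python
# Placeholder psychophysical constants -- substitute site-specific measurements.
TWO_POINT_MM = 35.0   # two-point discrimination threshold (hypothetical forearm value)
JND_DB = 1.5          # just-noticeable intensity difference (hypothetical)
MAX_LEVEL_DB = 20.0   # ceiling kept well below discomfort (cf. the 28 dB limit [28])

def layout_separable(positions_mm):
    """True if every pair of simultaneously driven tactors is far enough
    apart to be perceived as separate stimulation points."""
    pos = sorted(positions_mm)
    return all(b - a >= TWO_POINT_MM for a, b in zip(pos, pos[1:]))

def intensity_ladder(max_levels=4):
    """Intensity levels (dB above sensation threshold) spaced >= 1 JND apart;
    at most four levels, per the guideline cited above [20]."""
    return [round((i + 1) * JND_DB, 2)
            for i in range(max_levels) if (i + 1) * JND_DB <= MAX_LEVEL_DB]

print(layout_separable([0, 40, 85, 130]))  # True: all gaps >= 35 mm
print(intensity_ladder())                  # [1.5, 3.0, 4.5, 6.0]
```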
Like the site or point of stimulation on the body, the vibration frequency can be used to design haptic symbols [6]. Mechanoreceptors in the skin can perceive vibration frequencies between 0.4 Hz and 1 kHz, although SSDs generally present tactons between 50 Hz and 300 Hz [48]. It has been proposed [49] that, when designing tactile patterns, at most nine different frequency levels can be utilized while retaining separability. For low frequencies, the discrimination threshold is 5 Hz, and for higher frequencies (i.e., above 320 Hz) the threshold increases [50]. Also, Brewster and Brown [51] suggest that the cutaneous capability to discriminate frequency levels is enhanced by presenting stimuli in a relative, rather than an absolute, way. Nonetheless, frequency perception depends on the amplitude of stimuli; for that reason, others suggest that these can be integrated into a single encoding parameter [28]. Similarly, the magnitude of stimuli can be engaged as a separate haptic encoding dimension [8,9,10]. Its perception, however, weakens beyond 28 dB [28]. The literature suggests that at most four different magnitude levels should be employed when designing haptic symbols for high IT rates and accuracy [20].
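Using only the figures quoted above (a 50–300 Hz display range, at most nine frequency levels, and a 5 Hz discrimination threshold at low frequencies), a candidate frequency alphabet can be generated and sanity-checked as below. The logarithmic spacing is our own assumption for illustration, chosen because frequency discrimination degrades at higher frequencies; it is not a prescription from the cited studies.

```python
import numpy as np

def candidate_frequencies(n_levels=9, lo_hz=50.0, hi_hz=300.0, min_gap_hz=5.0):
    """Log-spaced vibration frequencies within the usual SSD range [48];
    rejects sets whose adjacent levels fall under the 5 Hz threshold [50]."""
    freqs = np.geomspace(lo_hz, hi_hz, n_levels)
    if np.any(np.diff(freqs) < min_gap_hz):
        raise ValueError("adjacent levels closer than the discrimination threshold")
    return np.round(freqs, 1)

print(candidate_frequencies())
# approximately: [ 50.   62.6  78.3  97.9 122.5 153.2 191.7 239.8 300. ]
```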
There are different aspects of the timing of vibrotactile stimuli that can be used to create haptic patterns, for example, duration, repetition rate, the number of displayed pulses per unit time, and rhythm. The haptic sense can identify successive stimuli separated by 5 ms; however, a separation of at least 10 ms is required for tactons to remain separable [52]. Equally, a combination of the spatial distribution and the timing (i.e., temporal features) of stimulation can encode environmental information [1,8]. Hence, a cutaneous sensing parameter can combine various body locations (a distinct tactor layout) with temporal sequencing (e.g., an overlapping spatiotemporal encoding approach for designing tactile patterns).
Undeniably, it is technically easier to stimulate a single location on the skin; however, multilocation haptic stimulation, which engages several tactors configured across various body sites, can significantly improve the human perception and processing of haptic symbols [10]. Importantly, multilocation stimuli can be displayed concurrently or in succession [20]. This greatly enlarges the design space of haptic patterns; for example, a time-based stimulus sequence can induce movement illusions in the skin [53]. Also, by changing the intensity of successive stimuli in a strong-to-weak manner, a downward movement is perceived [54]. Such illusions can equally be employed as separate encoding parameters of the cutaneous sense when designing haptic patterns using vibration signals.
In summary, the literature shows that our cutaneous sense can perceive different features of vibrational stimulation. By manipulating or integrating these, it is conceivable to counter, to some extent, the gap between the sensing resolution of touch and that of the other sensory channels when encoding data using tactile symbols [4,6,28]. The challenge is to design an alphabet of tactual symbols which, like that of letters or numbers, is naturally separable, effortlessly learned, and quickly processed [53].

3. Strategies for Designing Tactile Patterns without Illusions

Our pursuit of methods for designing tactile patterns towards IT led us to many relevant studies. These studies typically include the development of tactile displays and vocabularies to achieve effective human–machine interaction. To this end, we tabulated the vibration parameters engaged to design the tactile vocabularies and the tasks achieved, in order to observe design trends. Classically, separability, learnability, and transferability were the primary criteria when evaluating haptic alphabets in these studies. To design tactile patterns that satisfy these conditions, designers typically utilize different combinations of the skin's perceptual properties. The skin's perceptual properties considered by this study are time, frequency, magnitude, and location, which have definitive meanings, and illusions, which are usually defined by several dimensions, for example, a combination of space and time [9,55].
By using one-element displays (i.e., single stimulation points), some methods encode information through tactual alphabets whose symbols differ in magnitude or frequency, or a fusion of the two. For example, Goldstein Jr. and Proctor [56] and Leder et al. [57] proposed auditory SSDs in which different vibration magnitudes are displayed through a tactor mounted on the chest to mirror the sound intensity captured by microphones. However, most solutions employ multiple tactors to encode environmental data. In 1957, Geldard [58] presented a tactile vocabulary that involved five tactors mounted on the chest to encode 45 symbols of English (i.e., numbers, frequently used words, and letters). Each tactile pattern involved one tactor; however, the patterns varied in magnitude, point of stimulation, and duration. After 65 h of training, one subject was able to receive English at a rate of 38 wpm (words per minute).
Since Geldard's success [58], developers have demonstrated many ways to design tactile symbols by employing multiple tactor arrays. For example, Tan et al. [59] designed a multiple-finger tactile display to fully utilize the skin's information transmission ability. Encouraged by a natural technique of communicating speech through touch, in which listeners acquire multidimensional information related to the articulatory process by putting their hands on the faces and necks of the talkers, Tan et al. [59] created the Tactuator, which transmitted tactual symbols to the distal pads of the thumb, index, and middle fingers. In total, an alphabet of thirty tactile patterns separated by varying frequency and magnitude levels was displayed to one or more stimulation points. In turn, a stimulus set of 120 symbols was proposed with an IT of 6.50 bits, which corresponds to about 90 separable symbols. In similar identification experiments [29,60], an alphabet of eight tactile patterns was designed by varying the frequency, intensity, and pulse duration. Here, an IT of 2.41 bits was achieved, which corresponds to only about five of the eight symbols (2^2.41 ≈ 5.3) being reliably separable.
Nonetheless, encoding information goes beyond the separability of patterns, since it requires users to precisely identify the tactile patterns and map them to the information they encode. This property appears challenging to realize with intensity- and frequency-modulated patterns [20]. In contrast, spatial and temporal patterns are comparatively easy to label [8]. Just like the retina, the skin is a spatially extended receptive organ that can convey spatial patterns which might otherwise be perceived by the sense of sight. The challenge that arises is how to display spatial patterns on the skin so as to maximize its perceptual abilities. Most of us are acquainted with finger spelling as a form of tactile interaction: moving a single stimulation point in space and time allows high identification accuracy of patterns (see Table 2); nevertheless, it loads the memory of users, has low bandwidth, and involves intricate preprocessing, particularly when using video or textual inputs [61].
A more functional but considerably less informative method is to display the spatial symbol all at once on the skin's surface (spatial encoding), as demonstrated by Novich and Eagleman [9], who designed an alphabet of eight tactile symbols for identification tasks. Each pattern used three tactors that were stimulated concurrently, and the separability of patterns was realized through the points of stimulation. Although achieving high bandwidth, the study documented that the identification and labelling of spatially encoded symbols were challenging compared to those encoded using space and time. The study proposed that, when designing tactile patterns, a compromise between the bandwidth and the separability of tactons should be maintained. Similar works confirmed these findings, which can be summarized as follows: for identification, displaying spatial patterns on the skin in a relative way outperforms absolute encoding, due to masking effects (Table 2).
Masking effects occur when several points of stimulation are excited simultaneously and some ‘mask’ or hinder the perception of others. By careful manipulation of the activation times or by increasing the tactors' point-to-point separation, one can reduce this occurrence [20]. Hence, developers tend not to stimulate all the points at once, but rather incorporate temporal delays (i.e., spatiotemporal encoding) (Figure 3). Similarly, displaying stimuli at different frequencies (one below 80 Hz and the other above 100 Hz) can reduce masking effects [70].
Primarily, sequential spatiotemporal encoding, where the tactors displaying a pattern are excited one after the other with only one motor active at a time, is utilized. Notably, the ideal number of tactors defining a sequence was found to be between two and six [10]. However, the total number of tactors used to form cutaneous alphabets using spatiotemporal encoding ranges between 5 and 400 [8,9,20,32,42,61]. While spatiotemporally encoded patterns are more noticeable to users, the realized bandwidth is small relative to that of spatially encoded patterns because of the time span of each tactile symbol (Figure 3). To achieve a tradeoff between identification accuracy and bandwidth, overlapping spatiotemporal encoding techniques were developed [71], where tactors are stimulated in sequence and stay activated until a pattern ends. These studies show that overlapping spatiotemporal encoding creates tactile patterns that allow better recognition accuracy relative to spatial patterns, at the expense of bandwidth.
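Since these three schemes differ only in how tactor onsets and offsets are scheduled, they can be compared side by side. The sketch below is our own Python illustration, not code from the cited studies; the timing constants are arbitrary examples. It emits (tactor, on_ms, off_ms) schedules for spatial, sequential spatiotemporal, and overlapping spatiotemporal renderings of the same three-tactor symbol.

```python
def spatial(tactors, duration_ms=200):
    """All tactors fire at once: highest bandwidth, most prone to masking."""
    return [(t, 0, duration_ms) for t in tactors]

def sequential(tactors, duration_ms=200, gap_ms=10):
    """One tactor at a time; >= 10 ms gaps keep successive stimuli separable [52]."""
    schedule, onset = [], 0
    for t in tactors:
        schedule.append((t, onset, onset + duration_ms))
        onset += duration_ms + gap_ms
    return schedule

def overlapping(tactors, duration_ms=200, soa_ms=100):
    """Tactors start in sequence but stay on until the pattern ends [71]."""
    end = soa_ms * (len(tactors) - 1) + duration_ms
    return [(t, i * soa_ms, end) for i, t in enumerate(tactors)]

symbol = ["T1", "T4", "T5"]  # hypothetical three-tactor pattern
for scheme in (spatial, sequential, overlapping):
    print(f"{scheme.__name__:12s}", scheme(symbol))
```

Printing the three schedules makes the bandwidth trade-off visible: the spatial symbol ends at 200 ms, the overlapping one at 400 ms, and the sequential one at 620 ms.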
Likewise, temporal properties of the cutaneous sense can be utilized to encode information. In experiments by Velazquez and colleagues [62,63], cutaneous words were constructed from vibrations of short and long durations using a tactual display with four motors on the foot sole. After a period of training, in which subjects had to learn the vibration patterns of four to five words, they were tested on various sequences of "haptic sentences" composed of these words to determine the recognition accuracy as percentage-correct scores. An 84% correct score was achieved when labelling a single word from the stimulus set, and a 66% correct score when labelling sentences of four haptic words. Here, the temporal properties of touch proved to be a promising dimension for building tactile languages.

4. Strategies for Designing Tactile Patterns by Engaging Illusions

As described in the previous sections, tactile awareness relies on integrated temporal and spatial mechanisms that are confined to specific stimulation parameters. Thus, haptic symbols that do not comply with such parameters may be perceived in unintended ways. Similarly, interactions between the spatial and temporal features of stimuli may result in perceptual illusions; for example, a time-based sequence of tactual stimuli displayed to different sites of the body can induce a moving sensation (illusion) on the skin. Such illusions, like vibration frequency, time, space, and magnitude, can be engaged as separate encoding parameters of the tactile sense towards IT. As a result, various types of cutaneous illusions exist in the literature. However, this study considers the three types of haptic illusions commonly utilized when designing haptic symbols for SSSs (i.e., apparent motion, sensory saltation, and phantom sensation).

4.1. Apparent Motion (The Illusion of Movement)

Apparent tactile motion, also known as the phi phenomenon, is the illusion of a single stimulus point moving smoothly across the skin's surface [53]. In other words, apparent motion is the illusion of movement that results from separate stimuli being presented successively to the skin. For example, if two tactors are placed in proximity and the times of their actuation overlap, the user perceives not both tactors but a single stimulus point moving between them (Figure 4).
In 1930, Neuhaus [72] defined the variables governing effective apparent movement. These include (a) the duration of stimuli and (b) the time between the onsets of two successive stimulations (stimulus onset asynchrony, SOA). Several subsequent works confirmed these findings [73], and the optimum SOA was empirically determined to be SOA = 0.32d + 47.3 ms, where d is the stimulus duration [53]. By manipulating the duration of stimuli, one can regulate the speed of the illusory motion and, in turn, realize the separability of tactons. This, however, requires adjusting the SOA to keep the illusion; otherwise, the stimuli are perceived merely as successive vibrations. To maximize the IT of tactons, the variations in the speed of motion must be discriminable, and the tactile symbols must be short. Also, the direction (horizontal or longitudinal) and shape of motion, and the number of engaged tactors (distance), can be utilized to achieve separability [73,74,75].
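Because the optimal SOA is an explicit linear function of the stimulus duration, the timing of a motion tacton can be computed directly from the relation in [53]. The helper below is a minimal sketch; the duration values are illustrative only.

```python
def optimal_soa_ms(duration_ms):
    """Empirical optimum for smooth apparent motion: SOA = 0.32d + 47.3 ms [53]."""
    return 0.32 * duration_ms + 47.3

def motion_onsets(n_tactors, duration_ms):
    """Onset time of each tactor in a chain rendering one apparent-motion tacton."""
    soa = optimal_soa_ms(duration_ms)
    return [round(i * soa, 1) for i in range(n_tactors)]

# Longer stimuli -> larger SOA -> slower illusory motion, and vice versa.
print(motion_onsets(4, duration_ms=100))  # [0.0, 79.3, 158.6, 237.9]
print(motion_onsets(4, duration_ms=40))   # [0.0, 60.1, 120.2, 180.3]
```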
Apparent tactile motion has been engaged in various research studies as a cutaneous dimension for designing tactual symbols. Recently, Reed et al. [42] impressed motion sensations across a tactile display (TAPS) using pulsatile stimuli conveyed in a systematic temporal order to the respective tactors. To obtain smooth apparent motion, the SOA and pulse durations were set using the findings of Israr and Poupyrev [53] (i.e., SOA = 0.32d + 47.3 ms). In that study [42], 15 tactile symbols were designed to encode 15 English vowels, with two vowels encoded using the apparent-motion parameter. This pair of patterns was separable by the extent and direction (wrist to elbow, and elbow to the middle of the forearm) of motion.

4.2. Tactile Sensory Saltation (The Illusion of Mislocalization)

Sensory saltation, or "the cutaneous rabbit", involves a systematic distortion in the spatial perception of spatiotemporally encoded tactile symbols [76]. Unlike apparent motion, the sensory saltation illusion is induced by short, non-overlapping stimulus probes from one tactor, followed by the same action from a different tactor, and so on. For example, if two separate stimuli excite two nearby sites with a short delay between them, the apparent point of stimulation of the first stimulus is mislocated in the direction of the following point of stimulation. This mislocalization increases as the delay between the stimuli is reduced (i.e., as the SOA is reduced, the stimulus probes seem to lie closer together spatially). Thus, SOA values can be utilized to achieve the separability of tactons. The optimum delay between pulses lies between 20 and 250 ms [77]: below 20 ms, no spatial separation is perceived at all, while above 300 ms, localization becomes accurate and the illusion is lost.
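These SOA windows translate directly into a small design check. The sketch below simply encodes the boundaries quoted above (<20 ms, roughly 20–250 ms, >300 ms); treating the 250–300 ms band as a transition region is our own simplification.

```python
def saltation_regime(soa_ms):
    """Classify an SOA against the perceptual windows quoted above [77]."""
    if soa_ms < 20:
        return "fused: no spatial separation perceived"
    if soa_ms <= 250:
        return "saltation: stimuli mislocalized toward the following site"
    if soa_ms > 300:
        return "veridical: accurate localization, illusion lost"
    return "transition region (250-300 ms)"

for soa in (10, 60, 200, 275, 350):
    print(f"{soa:3d} ms -> {saltation_regime(soa)}")
```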
The saltation phenomenon can be engaged to enhance the spatial resolution of tactile communication devices. For example, by integrating the spatial and temporal sequence of tactor excitation, it is possible to develop cutaneous communication devices with a perceptually higher spatial resolution than the actual number of employed tactors would suggest [55]. In the case of SSDs designed for navigation, by combining separate inputs with an appropriate selection of time intervals, developers have induced directional cues on the skin, as demonstrated by Lee et al. [75]. In this study, a 3 × 3 tactor configuration was attached to the wrist of users to display four directional patterns (up, down, left, and right) using saltatory motion. Here, the identification accuracy and IT were higher than those achieved with corresponding spatial patterns.

4.3. Funneling Illusion (Localization Errors)

The funneling illusion, also known as the phantom sensation, was discovered in 1957 by Békésy [78]. He found that the simultaneous activation of two tactors situated in proximity creates an illusory point of stimulation sited between the true points of stimulation. In other words, funneling illusions create virtual stimulation points anywhere between the utilized tactors. The effect is also described as an interpolation between two active motors: the user attributes the percept to a point between the motors, essentially because of the poor spatial resolution of the skin [20]. Unlike apparent motion, funneling sensations do not induce movement illusions; in both cases, however, the location and magnitude of the sensations depend on the amplitudes of the engaged tactors. For illustration, in the context of phantom sensations, if the intensities are of equal magnitude, the illusion appears midway between the tactors. Similarly, by changing the amplitudes of both motors in similar proportions, the intensity of the apparent sensation changes as well [53,54]. Hence, the separability of tactons is achieved by varying the magnitudes of the engaged tactors.
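One widely used quantitative account of this interpolation is the energy model employed for the tactile brush in [53]: the phantom's perceived magnitude is the root sum of squares of the two drive amplitudes, and its position divides the inter-tactor span in proportion to the squared amplitudes. The sketch below follows that model; the amplitude and position values are illustrative only.

```python
import math

def phantom_sensation(a1, a2, x1_mm, x2_mm):
    """Energy model of the funneling illusion (cf. [53]): returns the
    perceived location (mm) and magnitude of the virtual stimulation point
    created by two tactors at x1_mm and x2_mm driven at amplitudes a1, a2."""
    beta = a2**2 / (a1**2 + a2**2)        # 0 -> at tactor 1, 1 -> at tactor 2
    location = x1_mm + beta * (x2_mm - x1_mm)
    magnitude = math.hypot(a1, a2)        # sqrt(a1^2 + a2^2)
    return location, magnitude

print(phantom_sensation(1.0, 1.0, 0, 60))  # equal drive -> midpoint (30.0 mm)
print(phantom_sensation(0.5, 1.0, 0, 60))  # stronger far tactor pulls phantom to 48 mm
```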
Funneling illusions can be engaged to design haptic patterns. Luzhnica [20] presents phantom-sensation patterns that transfer inaccuracy-tolerant numerical values. Park and Choi [79] estimated the IT achieved by four tactors at different body locations using phantom loci, realizing an IT rate of 2.53 bits per second.

5. Discussion

Haptic displays have proved to be functional tools for displaying spatial information to individuals seeking perception assistance in visually complex environments or navigation in unfamiliar scenes. In such contexts, the point of stimulation of a single tactor or the sequence of spatial stimulation of a set of tactors encodes the related cues. If tactile devices (SSDs) are to transfer data with some level of complexity (e.g., spoken language), then a tactile vocabulary is necessary, in which a tacton or “haptic word” is related to a specific concept or instruction. The design of tactile alphabets must be founded on conceptual frameworks which stipulate the stimulus dimensions that are easily separated and learnable, and how they can be optimally combined.
Vibrotactile signals are multidimensional and have larger dynamic ranges in several stimulus dimensions than electro-tactile stimuli, making them fit for this application (the design of tactile vocabularies). The five main parameters of vibrotactile stimuli are intensity, duration, loci, frequency, and illusions. Following a period of training, each tacton becomes associated with a definite concept about the perceived environment. In this way, information about the environment is transmitted to users through an SSD. It should be noted that the psychophysical data reported by this study were obtained in laboratory environments with experienced and motivated participants, under ideal perceptual conditions, etc.; conditions which potentially overestimate how much IT is gained by using cutaneous sensing parameters to design tactile vocabularies for everyday perception.
Some general conclusions can be drawn from this study. First, intensity and frequency seem to afford developers limited possibilities for encoding data. This possibly restricts, to a large extent, the use of these parameters to applications where large tactile alphabets are not essential, for example, assistive navigation devices. In contrast, the temporal and spatial encoding possibilities of the cutaneous sense for vibration perception are considerably larger. Second, the identification performance of dynamic patterns is superior to that of static patterns. Third, SSDs that employ many actuators do not outperform displays that employ very few tactors (2–10); hence, an advantage exists in using SSDs with fewer motors, as these are lighter, cheaper, and easier to build. Fourth, it is possible to achieve good performance after only a few hours of effective training. Moreover, there are strong indications that acquired knowledge can be transferred from one body part to another; however, this is founded on just a few studies and needs further attention.

Future Work

Despite the notable successes, some research concerns remain regarding the design of tactile patterns for IT. One primary concern is that most of these studies are start-up student projects lacking sustainable continuation. Nonetheless, most of the successful experiments were performed recently, so additional research may still be on the way. This may yet result in functional SSSs for daily use.
For most of the surveyed research, the next step is to determine how to increase the size of the haptic vocabulary and the IT rate. In the case of text-to-touch encoding, increasing the tactile alphabet to cover the full English vocabulary is valuable but challenging work. In such cases, haptic patterns that allow the reception of full English sentences would be a huge success.
An additional challenge that needs full attention is the mode of input. Again, for consistency, we use text-to-touch encoding as a motivating example. Is it possible to realize text-to-touch translation in real time? How can English text be translated in real time or near-real time into tactile signals or patterns, and if parts of speech (e.g., phonemes, letters) are engaged as building blocks, what type of supplementary displays (electronic devices) are needed? Surely, these challenges cannot be attended to instantly; however, they are paramount when designing future generations of SSSs.

Author Contributions

Conceptualization, S.D., N.S. and T.D.N.; methodology, T.D.N. and S.D.; validation, S.D., N.S. and T.D.N.; formal analysis, T.D.N.; writing—original draft preparation, T.D.N.; writing—review and editing, S.D., N.S. and E.D.; supervision, S.D. and N.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work is based on research supported in part by the National Research Foundation of South Africa (Grant Number 93539).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Tan, H.Z.; Durlach, N.I.; Rabinowitz, W.M.; Reed, C.M.; Santos, J.R. Reception of Morse code through motional, vibrotactile, and auditory stimulation. Percept. Psychophys. 1997, 59, 1004–1017.
2. Giri, G.; Maddahi, Y.; Zareinia, K. An Application-Based Review of Haptics Technology. Robotics 2021, 10, 29.
3. Tai, Y.; Shi, J.; Wei, L.; Huang, X.; Chen, Z.; Li, Q. Real-time visuo-haptic surgical simulator for medical education—A review. In International Conference on Mechatronics and Intelligent Robotics; Springer: Berlin/Heidelberg, Germany, 2017; pp. 531–537.
4. Summers, I.R. Tactile Aids for the Hearing Impaired; Whurr Publishers: London, UK, 1992.
5. Franklin, D. Tactile aids, new help for the profoundly deaf. Hear. J. 1984, 37, 20–23.
6. Kaczmarek, K.A.; Webster, J.G.; Bach-y-Rita, P.; Tompkins, W.J. Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Trans. Biomed. Eng. 1991, 38, 1–16.
7. De Volder, A.G.; Catalan-Ahumada, M.; Robert, A.; Bol, A.; Labar, D.; Coppens, A.; Michel, C.; Veraart, C. Changes in occipital cortex activity in early blind humans using a sensory substitution device. Brain Res. 1999, 826, 128–134.
8. Luzhnica, G.; Veas, E.; Pammer, V. Skin reading: Encoding text in a 6-channel haptic display. In Proceedings of the 2016 ACM International Symposium on Wearable Computers, Heidelberg, Germany, 12–16 September 2016; pp. 148–155.
9. Novich, S.D.; Eagleman, D.M. Using space and time to encode vibrotactile information: Toward an estimate of the skin's achievable throughput. Exp. Brain Res. 2015, 233, 2777–2788.
10. Tan, H.Z.; Choi, S.; Lau, F.W.; Abnousi, F. Methodology for Maximizing Information Transmission of Haptic Devices: A Survey. Proc. IEEE 2020, 108, 945–965.
11. Nahri, S.N.F.; Du, S.; Van Wyk, B. Haptic System Interface Design and Modelling for Bilateral Teleoperation Systems. In Proceedings of the 2020 International SAUPEC/RobMech/PRASA Conference, Cape Town, South Africa, 29–31 January 2020; IEEE: New York, NY, USA, 2020; pp. 1–6.
12. Carvalheiro, C.; Nóbrega, R.; da Silva, H.; Rodrigues, R. User redirection and direct haptics in virtual environments. In Proceedings of the 24th ACM International Conference on Multimedia, Amsterdam, The Netherlands, 15–19 October 2016; pp. 1146–1155.
13. Feng, L.; Ali, A.; Iqbal, M.; Bashir, A.K.; Hussain, S.A.; Pack, S. Optimal haptic communications over nanonetworks for E-health systems. IEEE Trans. Ind. Inform. 2019, 15, 3016–3027.
14. Jones, L.A. Haptics; MIT Press Essential Knowledge Series: Cambridge, MA, USA, 2018.
15. Tan, H.Z.; Reed, C.M.; Jiao, Y.; Perez, Z.D.; Wilson, E.C.; Jung, J.; Martinez, J.S.; Severgnini, F.M. Acquisition of 500 English words through a TActile Phonemic Sleeve (TAPS). IEEE Trans. Haptics 2020, 13, 745–760.
16. Hoffmann, R.; Valgeirsdóttir, V.V.; Jóhannesson, Ó.I.; Unnthorsson, R.; Kristjánsson, Á. Measuring relative vibrotactile spatial acuity: Effects of tactor type, anchor points and tactile anisotropy. Exp. Brain Res. 2018, 236, 3405–3416.
17. Hoffmann, R.; Spagnol, S.; Kristjánsson, Á.; Unnthorsson, R. Evaluation of an audio-haptic sensory substitution device for enhancing spatial awareness for the visually impaired. Optom. Vis. Sci. 2018, 95, 757.
18. Kristjánsson, Á.; Moldoveanu, A.; Jóhannesson, Ó.I.; Balan, O.; Spagnol, S.; Valgeirsdóttir, V.V.; Unnthorsson, R. Designing sensory-substitution devices: Principles, pitfalls and potential. Restor. Neurol. Neurosci. 2016, 34, 769–787.
19. Luzhnica, G.; Veas, E.; Seim, C. Passive haptic learning for vibrotactile skin reading. In Proceedings of the 2018 ACM International Symposium on Wearable Computers, Singapore, 8–12 October 2018; pp. 40–43.
20. Luzhnica, G. Leveraging Optimisations on Spatial Acuity for Conveying Information through Wearable Vibrotactile Displays. 2019. Available online: https://diglib.tugraz.at/leveraging-optimisations-on-spatial-acuity-for-conveying-information-through-wearable-vibrotactile-displays-2019 (accessed on 12 February 2021).
21. Skedung, L.; Arvidsson, M.; Chung, J.Y.; Stafford, C.M.; Berglund, B.; Rutland, M.W. Feeling small: Exploring the tactile perception limits. Sci. Rep. 2013, 3, 1–6.
22. Reed, C.M.; Durlach, N.I. Note on information transfer rates in human communication. Presence 1998, 7, 509–518.
23. Spence, C. In Touch with the Future. In Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces, Funchal, Portugal, 15–18 November 2015; p. 1.
24. Reed, C.M.; Durlach, N.I.; Delhorne, L.A. Tactile Aids for the Hearing Impaired. 1992. Available online: https://www.wiley.com/en-us/Tactile+Aids+for+the+Hearing+Impaired-p-9781870332170 (accessed on 14 January 2021).
25. Purves, D.; Augustine, G.J.; Fitzpatrick, D.; Hall, W.C.; LaMantia, A.-S.; McNamara, J.O.; White, L.E. Neuroscience, 4th ed.; Sinauer: Sunderland, MA, USA, 2008.
26. Novich, S.D. Sound-to-Touch Sensory Substitution and Beyond. 2015. Available online: https://scholarship.rice.edu/handle/1911/88379 (accessed on 10 January 2021).
27. Shneiderman, B.; Plaisant, C.; Cohen, M.S.; Jacobs, S.; Elmqvist, N.; Diakopoulos, N. Designing the User Interface: Strategies for Effective Human-Computer Interaction; Pearson: London, UK, 2016.
28. Brewster, S.A.; Brown, L.M. Non-visual information display using tactons. In Proceedings of the CHI'04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; pp. 787–788.
29. Azadi, M.; Jones, L.A. Evaluating vibrotactile dimensions for the design of tactons. IEEE Trans. Haptics 2014, 7, 14–23.
30. Choi, S.; Kuchenbecker, K.J. Vibrotactile display: Perception, technology, and applications. Proc. IEEE 2012, 101, 2093–2104.
31. Basdogan, C.; Giraud, F.; Levesque, V.; Choi, S. A review of surface haptics: Enabling tactile effects on touch surfaces. IEEE Trans. Haptics 2020, 13, 450–470.
32. Zhao, S.; Israr, A.; Lau, F.; Abnousi, F. Coding tactile symbols for phonemic communication. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–13.
33. Jones, L.A.; Tan, H.Z. Application of psychophysical techniques to haptic research. IEEE Trans. Haptics 2012, 6, 268–284.
34. Plaisier, M.A.; Tiest, W.B.; Kappers, A. Haptic object individuation. IEEE Trans. Haptics 2010, 3, 257–265.
35. Durlach, N.; Tan, H.; Macmillan, N.; Rabinowitz, W.; Braida, L. Resolution in one dimension with random variations in background dimensions. Percept. Psychophys. 1989, 46, 293–296.
36. Vallbo, A.B.; Johansson, R.S. Properties of cutaneous mechanoreceptors in the human hand related to touch sensation. Hum. Neurobiol. 1984, 3, 3–14.
37. Goldstein, E. Sensation and Perception; Wadsworth-Thomson Learning: Pacific Grove, CA, USA, 2002.
38. Craig, J.C. Temporal integration of vibrotactile patterns. Percept. Psychophys. 1982, 32, 219–229.
39. Hoggan, E.; Anwar, S.; Brewster, S.A. Mobile multi-actuator tactile displays. In International Workshop on Haptic and Audio Interaction Design; Springer: Berlin/Heidelberg, Germany, 2007; pp. 22–33.
40. Horvath, S.; Galeotti, J.; Wu, B.; Klatzky, R.; Siegel, M.; Stetten, G. FingerSight: Fingertip haptic sensing of the visual environment. IEEE J. Transl. Eng. Health Med. 2014, 2, 1–9.
41. Sorgini, F.; Caliò, R.; Carrozza, M.C.; Oddo, C.M. Haptic-assistive technologies for audition and vision sensory disabilities. Disabil. Rehabil. Assist. Technol. 2018, 13, 394–421.
42. Reed, C.M.; Tan, H.Z.; Perez, Z.D.; Wilson, E.C.; Severgnini, F.M.; Jung, J.; Martinez, J.S.; Jiao, Y.; Israr, A.; Lau, F.; et al. A phonemic-based tactile display for speech communication. IEEE Trans. Haptics 2018, 12, 2–17.
43. Wong, E.Y.; Israr, A.; O'Malley, M. Discrimination of consonant articulation location by tactile stimulation of the forearm. In Proceedings of the 2010 IEEE Haptics Symposium, Waltham, MA, USA, 25–26 March 2010; pp. 47–54.
44. Chen, H.-Y.; Santos, J.; Graves, M.; Kim, K.; Tan, H.Z. Tactor localization at the wrist. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications; Springer: Berlin/Heidelberg, Germany, 2008; pp. 209–218.
45. Kammoun, S.; Jouffrais, C.; Guerreiro, T.; Nicolau, H.; Jorge, J. Guiding blind people with haptic feedback. Available online: https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=.+Guiding+blind+people+with+haptic+feedback&btnG= (accessed on 14 February 2021).
46. Kim, H.; Seo, C.; Lee, J.; Ryu, J.; Yu, S.-B.; Lee, S. Vibrotactile display for driving safety information. In Proceedings of the 2006 IEEE Intelligent Transportation Systems Conference, Toronto, ON, Canada, 17–20 September 2006; IEEE: New York, NY, USA, 2006; pp. 573–577.
47. Lederman, S.J.; Klatzky, R.L. Haptic perception: A tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459.
48. Hoggan, E.; Brewster, S. New parameters for tacton design. In Proceedings of the CHI'07 Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 28 April–3 May 2007; pp. 2417–2422.
49. Human Factors (HF). Guidelines on the multimodality of icons, symbols, and pictograms. 2002. Available online: https://www.etsi.org/deliver/etsi_eg/202000_202099/202048/01.01.01_60/eg_202048v010101p.pdf (accessed on 5 January 2021).
50. Goff, G.D. Differential discrimination of frequency of cutaneous mechanical vibration. J. Exp. Psychol. 1967, 74, 294.
51. Brewster, S.A.; Brown, L.M. Tactons: Structured Tactile Messages for Non-Visual Information Display. 2004. Available online: https://eprints.gla.ac.uk/3443/ (accessed on 8 March 2021).
52. Gescheider, G.A. Resolving of successive clicks by the ears and skin. J. Exp. Psychol. 1966, 71, 378.
53. Israr, A.; Poupyrev, I. Tactile brush: Drawing on skin with a tactile grid display. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 2019–2028.
54. Hoffmann, R.; Brinkhuis, M.A.; Unnthorsson, R.; Kristjánsson, Á. The intensity order illusion: Temporal order of different vibrotactile intensity causes systematic localization errors. J. Neurophysiol. 2019, 122, 1810–1820.
55. Lederman, S.J.; Jones, L.A. Tactile and haptic illusions. IEEE Trans. Haptics 2011, 4, 273–294.
56. Goldstein, M.H., Jr.; Proctor, A. Tactile aids for profoundly deaf children. J. Acoust. Soc. Am. 1985, 77, 258–265.
57. Leder, S.B.; Spitzer, J.B.; Milner, P.; Flevaris-Phillips, C.; Richardson, F. Vibrotactile stimulation for the adventitiously deaf: An alternative to cochlear implantation. Arch. Phys. Med. Rehabil. 1986, 67, 754–758.
58. Geldard, F.A. Adventures in tactile literacy. Am. Psychol. 1957, 12, 115.
59. Tan, H.Z.; Durlach, N.I.; Reed, C.M.; Rabinowitz, W.M. Information transmission with a multifinger tactual display. Percept. Psychophys. 1999, 61, 993–1008.
60. Azadi, M.; Jones, L. Identification of vibrotactile patterns: Building blocks for tactons. In Proceedings of the 2013 World Haptics Conference (WHC), Daejeon, Korea, 14–17 April 2013; IEEE: New York, NY, USA, 2013; pp. 347–352.
61. Loomis, J.M. Tactile letter recognition under different modes of stimulus presentation. Percept. Psychophys. 1974, 16, 401–408.
62. Velázquez, R.; Bazán, O.; Alonso, C.; Delgado-Mata, C. Vibrating insoles for tactile communication with the feet. In Proceedings of the 2011 15th International Conference on Advanced Robotics (ICAR), Tallinn, Estonia, 20–23 June 2011; IEEE: New York, NY, USA, 2011; pp. 118–123.
63. Velázquez, R.; Pissaloux, E. Constructing tactile languages for situational awareness assistance of visually impaired people. In Mobility of Visually Impaired People; Springer: Berlin/Heidelberg, Germany, 2018; pp. 597–616.
64. Saida, S.; Shimizu, Y.; Wake, T. Computer-controlled TVSS and some characteristics of vibrotactile letter recognition. Percept. Mot. Ski. 1982, 55, 651–653.
65. Janidarmian, M.; Fekr, A.R.; Radecka, K.; Zilic, Z. Designing and evaluating a vibrotactile language for sensory substitution systems. In International Conference on Wireless Mobile Communication and Healthcare; Springer: Berlin/Heidelberg, Germany, 2017; pp. 58–66.
66. Janidarmian, M.; Fekr, A.R.; Radecka, K.; Zilic, Z. Wearable Vibrotactile System as an Assistive Technology Solution. Mob. Netw. Appl. 2019, 2019, 1–9.
67. Wu, J.; Zhang, J.; Yan, J.; Liu, W.; Song, G. Design of a vibrotactile vest for contour perception. Int. J. Adv. Robot. Syst. 2012, 9, 166.
  68. Barralon, P.; Ng, G.; Dumont, G.; Schwarz, S.K.; Ansermino, M. Development and evaluation of multidimensional tactons for a wearable tactile display. In Proceedings of the 9th International Conference on Human Computer Interaction with Mobile Devices and Services, Singapore, 9–12 September 2007; pp. 186–189. [Google Scholar]
  69. Brown, L.M.; Brewster, S.A.; Purchase, H.C. Multidimensional tactons for non-visual information presentation in mobile devices. In Proceedings of the 8th conference on Human-Computer Interaction with Mobile Devices and Services, Helsinki, Finland, 12–15 September 2006; pp. 231–238. [Google Scholar]
  70. Van Erp, J.B. Guidelines for the use of vibro-tactile displays in human computer interaction. In Proceedings of the Eurohaptics: Citeseer, Edinburgh, UK, 8–10 July 2002; pp. 18–22. [Google Scholar]
  71. Luzhnica, G.; Veas, E. Vibrotactile patterns using sensitivity prioritisation. In Proceedings of the 2017 ACM International Symposium on Wearable Computers, Maui, HI, USA, 11–15 September 2017; pp. 74–81. [Google Scholar]
  72. Neuhaus, W. Experimentelle Untersuchung der Scheinbewegung. Arch. Gesamte Psychol. 1930, 75, 315–458. [Google Scholar]
  73. Kirman, J.H. Tactile apparent movement: The effects of interstimulus onset interval and stimulus duration. Percept. Psychophys. 1974, 15, 1–6. [Google Scholar] [CrossRef] [Green Version]
  74. Cholewiak, R.W.; Collins, A.A. The generation of vibrotactile patterns on a linear array: Influences of body site, time, and presentation mode. Percept. Psychophys. 2000, 62, 1220–1235. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  75. Lee, J.; Han, J.; Lee, G. Investigating the information transfer efficiency of a 3 × 3 watch-back tactile display. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015; pp. 1229–1232. [Google Scholar]
  76. Trojan, J.; Stolle, A.M.; Mršić Carl, A.; Kleinböhl, D.; Tan, H.Z.; Hölzl, R. Spatiotemporal integration in somatosensory perception: Effects of sensory saltation on pointing at perceived positions on the body surface. Front. Psychol. 2010, 1, 206. [Google Scholar] [CrossRef] [Green Version]
  77. Geldard, F.A. Saltation in somesthesis. Psychol. Bull. 1982, 92, 136. [Google Scholar] [CrossRef] [PubMed]
  78. Von Békésy, G. Sensations on the skin similar to directional hearing, beats, and harmonics of the ear. J. Acoust. Soc. Am. 1957, 29, 489–501. [Google Scholar] [CrossRef]
  79. Park, G.; Choi, S. Tactile information transmission by 2D stationary phantom sensations. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–12. [Google Scholar]
Figure 1. Key components of generic sensory substitution systems.
Figure 2. The major types of mechanoreceptors embedded in the hairless skin of the fingertip, based on the work of Purves et al. [25]. The tactile properties of each mechanoreceptor are summarized in Table 1.
Figure 3. Spatial, spatial–temporal, and overlapping spatial–temporal encoding of tactile patterns.
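To make the three schemes in Figure 3 concrete, a pattern can be written as a list of (tactor, onset, duration) triples: spatial encoding fires all tactors at once, spatiotemporal encoding fires them strictly one after another, and overlapping spatiotemporal encoding staggers onsets by less than the stimulus duration. The sketch below is a minimal illustration under that representation; the tactor indices and timings are example values chosen for this sketch, not parameters from the cited studies.

```python
# Illustrative schedules for the three encodings of Figure 3.
# Each stimulus is (tactor_index, onset_ms, duration_ms); the timings
# are hypothetical example values, not recommendations from the studies.

spatial = [(0, 0, 200), (1, 0, 200), (2, 0, 200)]             # all tactors fire together
spatiotemporal = [(0, 0, 200), (1, 200, 200), (2, 400, 200)]  # strictly sequential
overlapping = [(0, 0, 200), (1, 100, 200), (2, 200, 200)]     # onsets staggered < duration

def total_pattern_time(schedule):
    """Time from first onset to last offset, i.e., the display-time cost."""
    return max(onset + dur for _, onset, dur in schedule)

for name, sched in [("spatial", spatial),
                    ("spatiotemporal", spatiotemporal),
                    ("overlapping", overlapping)]:
    print(name, total_pattern_time(sched), "ms")
# spatial 200 ms, spatiotemporal 600 ms, overlapping 400 ms: overlapping
# encoding trades some temporal separation for a shorter pattern, which is
# the motivation reported for it in [8,20].
```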
Figure 4. The illusion of movement.
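The movement illusion in Figure 4 is typically controlled by two timing parameters: the duration of each vibration burst and the stimulus onset asynchrony (SOA) between neighboring tactors. Israr and Poupyrev [53] report a linear control rule, SOA ≈ 0.32 × duration + 47.3 ms, for smooth apparent motion. A minimal sketch applying that rule follows; the function name and the tactor count in the example are our own illustrative choices.

```python
# Apparent tactile motion (Figure 4): compute onset times for a row of
# tactors using the linear SOA rule reported by Israr and Poupyrev [53]:
# SOA (ms) ~= 0.32 * duration + 47.3.

def apparent_motion_onsets(n_tactors: int, duration_ms: float) -> list:
    """Onset time (ms) of each tactor so that consecutive stimuli are
    perceived as one continuous stroke rather than discrete taps."""
    soa = 0.32 * duration_ms + 47.3
    return [i * soa for i in range(n_tactors)]

print(apparent_motion_onsets(4, 100.0))
# ~ [0.0, 79.3, 158.6, 237.9]: each tactor starts before its predecessor
# stops (SOA < duration), which is what produces the motion illusion.
```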
Table 1. The properties of the different types of mechanoreceptors embedded in the skin, based on the work of Purves et al. [25]. FA = fast-adapting; SA = slowly adapting.

| Property | Meissner’s Corpuscles | Ruffini’s Corpuscles | Pacinian Corpuscles | Merkel’s Discs |
|---|---|---|---|---|
| Perceived stimuli | Touch and moving stimuli | Stretching of the skin | Vibration and deep pressure | Touch and static stimuli |
| Spatial acuity | 3 mm | 7 mm | 10 mm | 0.5 mm |
| Response | FA | SA | FA | SA |
| Location | Primarily hairless skin | All skin | Subcutaneous tissue, all skin | All skin |
| Frequency range | 1–300 Hz | Unknown | 5–1000 Hz | 0.4–100 Hz |
| Peak sensitivity | 50 Hz | 0.5–7 Hz | 250 Hz | 5 Hz |
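As a rough design aid, the frequency data in Table 1 can be folded into a lookup that suggests a drive frequency for targeting a particular receptor population. The sketch below is a minimal illustration of that idea; the helper name and the midpoint chosen for Ruffini’s corpuscles are our own assumptions, not values from [25].

```python
# Peak sensitivities from Table 1 (per Purves et al. [25]); a hypothetical
# helper for picking a vibration frequency aimed at one receptor population.

PEAK_SENSITIVITY_HZ = {
    "meissner": 50.0,   # touch and moving stimuli, FA
    "ruffini": 3.0,     # skin stretch, SA (peak band 0.5-7 Hz; midpoint assumed)
    "pacinian": 250.0,  # vibration and deep pressure, FA
    "merkel": 5.0,      # touch and static stimuli, SA
}

def drive_frequency(target_receptor: str) -> float:
    """Return a vibration frequency (Hz) at the given receptor's peak sensitivity."""
    return PEAK_SENSITIVITY_HZ[target_receptor.lower()]

# Common vibrotactile actuators resonate near 150-250 Hz, which is why
# Pacinian corpuscles dominate the perception of most tactons.
print(drive_frequency("pacinian"))  # 250.0
```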
Table 2. Methods for designing tactile patterns without illusions.

| Reference | Body Location | Number of Tactors | Parameters | Engaged Task | Outcome |
|---|---|---|---|---|---|
| Velázquez, Pissaloux [62,63] | Foot | 4 | Five temporal patterns of short and long durations encoding words. | Recognize words and word sequences (i.e., sentences). | Recognition of sentences in percentage correct scores: one word, 84%; two words, 77%; three words, 77%; four words, 66%. |
| Saida et al. [64] | Back | 100 (10 × 10) | Three modes of tactile encoding: static (i.e., spatial encoding), tracing (i.e., fingerspelling), and moving (characters passed horizontally across the back). | Identify letters of the Japanese alphabet. | Identification accuracy in percentage correct scores: static 27%, moving 39%, tracing 95%. No performance difference between blind and sighted subjects. |
| Loomis [61] | Back | 400 (20 × 20) | Spatial and spatiotemporal encoding divided into five modes of presentation. | Identify the 26 letters of the English alphabet. | Identification performance varied between modes of presentation: static patterns were worst and spatiotemporal patterns were best. |
| Tan et al. [59] | Fingers | 3 | Frequency, magnitude, location. | Identify 120 stimulus patterns. | Achieved an IT of 6.50 bits: 90 patterns were separable. |
| Azadi, Jones [29,60] | Forearm | 8 | Frequency, magnitude, pulse duration. | Identify eight stimulus patterns. | Achieved an IT of 2.41 bits: three patterns were separable. Time and frequency are not integral parameters of vibrotactile stimuli. |
| Novich and Eagleman [9] | Back | 9 (3 × 3) | Space, time, and intensity. | Identify patterns. | Spatiotemporal patterns (engaging space and time) outperformed the rest in identification; purely spatial patterns performed worst. |
| Luzhnica et al. [8,20] | Hand | 6 | Space and time (overlapping spatiotemporal encoding) versus spatial encoding. | Identify letters of the English alphabet. | Prioritizing site sensitivity is critical when encoding information. Overlapping spatiotemporal encoding outperforms spatial encoding in identification. |
| Janidarmian et al. [65,66] | Back | 9 (3 × 3) | Personalized spatiotemporal encoding: vibration variables (i.e., frequency, intensity, duration) adjusted to suit the user’s preference. | Identify numbers. | Personalized encoding achieves better identification performance than generalized encoding. |
| Wu et al. [67] | Back | 48 (8 × 6) | Three spatiotemporal encoding methods: scanning, handwriting, and tracing. | Two tasks: identify physical shapes as acquired by a camera; identify letters of the English alphabet. | In both tasks, identification accuracy was highest with tracing, followed by handwriting and then scanning. |
| Barralon et al. [68] | Waist | 8 | Locus, roughness (amplitude modulation), and rhythm (durations of pulses separated by distinct pauses). | Identify 36 stimulus patterns; investigate the individual effect of each cutaneous parameter on recognition accuracy. | Achieved an IT of 4.18 bits: 18.07 patterns were separable. Rhythm and location yielded high recognition accuracy (96.3% and 91.6%, respectively); roughness was lowest (88.7%). |
| Brown et al. [69] | Forearm | 3 | Spatial location, rhythm, and roughness. | Compare the identification performance of two-dimensional and three-dimensional stimuli. | Tactons encoding two dimensions (rhythm and roughness) achieved an identification rate of 70% with an IT of 3.27 bits; adding a third dimension (spatial location) raised these to 80% and 3.37 bits. |
| Kim et al. [46] | Foot | 25 (5 × 5) | Space and time: two modes of sequential stimulation mimicking tracing by engaging either one or two tactors. | Identify letters. | Sequential activation of two tactors improves identification performance. |
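The IT values quoted in Table 2 are static information-transfer estimates: the mutual information between presented and identified patterns, computed from a stimulus–response confusion matrix as in Tan et al. [59], with 2^IT giving the equivalent number of perfectly separable patterns (e.g., the 18.07 patterns quoted for Barralon et al. is 2^IT for their IT value). A minimal sketch of that computation follows, using a hypothetical three-pattern confusion matrix rather than data from any cited study.

```python
import numpy as np

# Static information transfer (IT) estimated from a stimulus-response
# confusion matrix, the measure quoted in Table 2 (cf. Tan et al. [59]).
# Entry [i, j] counts how often stimulus i elicited response j.

def information_transfer_bits(confusion: np.ndarray) -> float:
    n = confusion.sum()
    row = confusion.sum(axis=1, keepdims=True)  # per-stimulus totals
    col = confusion.sum(axis=0, keepdims=True)  # per-response totals
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = confusion / n * np.log2(confusion * n / (row * col))
    return float(np.nansum(terms))  # empty cells contribute zero

# Toy three-pattern example (hypothetical counts, not data from Table 2):
conf = np.array([[18, 1, 1],
                 [2, 16, 2],
                 [1, 3, 16]])
it = information_transfer_bits(conf)
print(f"IT = {it:.2f} bits -> ~{2 ** it:.1f} separable patterns")
```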