Article

Analysing Touchscreen Gestures: A Study Based on Individuals with Down Syndrome Centred on Design for All

by Jorge Martin-Gutierrez 1,* and Marta Sylvia Del Rio Guerra 2,*
1 Department of Techniques and Projects in Engineering and Architecture, Universidad de La Laguna, Av. Angel Guimerá sn, 38071 Tenerife, Spain
2 Department of Computer Science, Universidad de Monterrey, Av. Ignacio Morones Prieto 4500-Pte., San Pedro Garza García, 66238 Nuevo Leon, Mexico
* Authors to whom correspondence should be addressed.
Sensors 2021, 21(4), 1328; https://doi.org/10.3390/s21041328
Submission received: 24 December 2020 / Revised: 1 February 2021 / Accepted: 11 February 2021 / Published: 13 February 2021
(This article belongs to the Special Issue Meta-User Interfaces for Ambient Environments)

Abstract:
There has been a conscious shift towards developing increasingly inclusive applications. However, despite this fact, most research has focused on supporting those with visual or hearing impairments, and less attention has been paid to cognitive impairments. The purpose of this study is to analyse touch gestures used for touchscreens and identify which gestures are suitable for individuals living with Down syndrome (DS) or other forms of physical or cognitive impairments. With this information, app developers can satisfy Design for All (DfA) requirements by selecting adequate gestures from existing lists of gesture sets. Twenty touch gestures were defined for this study and a sample group containing eighteen individuals with Down syndrome was used. A tool was developed to measure the performance of touch gestures, and participants were asked to perform simple tasks that involved the repeated use of these twenty gestures. Three variables are analysed to establish whether they influence the success rates or completion times of gestures, as they could have a collateral effect on the skill with which gestures are performed. These variables are Gender, Type of Down syndrome, and Socioeconomic Status. Analysis reveals that significant differences are present when pairwise comparisons are performed, meaning individuals with DS cannot perform all gestures with the same ease. The variables Gender and Socioeconomic Status do not influence success rates or completion times, but Type of DS does.

1. Introduction

It has become much more commonplace to find touchscreen devices—tablets, smartphones or e-readers, amongst others—and touch gestures being used in both personal and professional settings. This situation has led to a change in people’s habits [1,2]. The natural and intuitive interaction [3] between the end-user and touchscreens via touch gestures has transformed smartphones and tablets into the most widely used pieces of technology in today’s society [4]. In fact, interactive screens are so natural and intuitive that studies such as that by Plowman confirm that even small children are capable of using these devices, and do so even before verbal communication development is completed [5]. In the Common Sense Media report by Rideout [6] on the use of technologies by children aged under 8 years old, the authors indicate that 38% of participants knew how to use a smartphone, iPad, or similar piece of technology, and that 10% of these participants were aged between 0–23 months old and 39% aged between 2–4 years old. According to the study carried out by Hourcade et al. [7], it is possible to affirm that these percentages are higher today because these sorts of devices are used at increasingly younger ages. Even adults use them when they are placed in their surroundings, i.e., urban interfaces such as parking meters, on-street cycle parking, cash machines, ticket machines at the underground station, etc.
Mobile devices and apps are being used more and more frequently to help resolve work-related tasks, making the end-user more productive and efficient [8,9,10]. There is no doubt that for many people their smartphone or tablet has become an indispensable tool in the workplace. Meanwhile, in the private setting, touchscreen devices are gradually replacing other devices in an attempt to make users’ lives easier and less cluttered, which explains why society is becoming increasingly dependent on such devices [11].
Natural and intuitive multi-touch systems are based on touch and direct manipulation. Shneiderman and collaborators put forward three ideas for the concept of direct manipulation: (1) the visibility of objects and actions of interest; (2) the replacement of typed commands with pointing-actions focused on objects of interest; (3) rapid, reversible and incremental actions that help to control the technology whilst avoiding complex instructions [12]. Hourcade, on the other hand, argues that direct touch on the screen is preferable to using a mouse when selecting options offered onscreen [13].
Ingram et al. indicate that the number and types of multi-touch interactions that end-users and app developers “have instinctively agreed upon” is restricted [14]; in other words, the interactions needed in order to use an app are generally limited to the following: a single touch in order to select something; a single finger to move something; two fingers to scale something; and, to a lesser extent, two fingers to rotate something. Thus, the lack of consensus regarding standardized interaction, and the fact that there are already some universally accepted gestures, makes the need for well-designed multi-touch interactions absolutely crucial.
In order to design touchscreen interfaces that are better suited to end-user needs, Jeong & Liu studied how humans use their fingers whilst using touchscreen devices, and examined the different characteristics of gestures that are used on the touchscreen (gesture performance, time, error rate, speed, and finger movement trajectory) [15].
It is true that user experience (UX) is taken into account when apps or websites are designed in order to adapt them to user needs. It is therefore possible to encounter apps for children living with Down syndrome or autism [16], or for the elderly [17]. However, despite this being the case, little to no consideration has been paid during the design process to touch gestures, to ensure that screen interactions suit all users. Usability studies have been performed for multi-touch screens aimed at different collectives: children [18,19], adults and the elderly [20,21,22], and individuals living with Down syndrome [23,24], amongst others.
According to the statistics portal Statista [25], the app market has grown substantially in recent years. In 2018, figures for app downloads worldwide reached 205.4 billion, and it is estimated that this figure will grow to reach 258.2 billion by 2022. The most frequently downloaded apps (excluding games) in the Google Play Store in 2018 were Facebook, WhatsApp and Google, although mobile games were also downloaded in large volumes on mobile devices. In light of this set of circumstances pertaining to the use of and dependence on technologies that rely on touch gestures, it should be noted that individuals living with disability run the risk of being sidelined and are at risk of social exclusion. This would increase the digital divide and, consequently, social inequality [26].
Some authors state that the digital divide is a gap that exists between those using advanced technologies as part of their day-to-day lives and those who encounter difficulties in using such technologies [27]; these difficulties may emerge as the result of being unfamiliar with such technology, living with some form of disability that makes handling such technology difficult, holding the perception that such technology is too complex, or simply because it has not been designed with all types of users in mind and cannot be manipulated by all users. The digital divide is transforming into an element that separates and excludes people, collectives and countries [28].
The literature usually uses the term “Design for All” to refer to developing applications and products that are accessible to everyone. “Design for Inclusion” encompasses three main design principles: Design for All (DfA), Inclusive Design (ID) and Universal Design (UD). These were created to address diversity in age, gender, culture, abilities, disabilities and technological literacy. Design for Inclusion is based on the assumption that most users are not the standard user [29]. Nevertheless, cognitive impairment remains one of the most overlooked disabilities, and users with DS must be included in a Design for Inclusion approach.
This article focuses on individuals living with Down syndrome (DS) as users of touchscreen devices that require the use of gestures for interaction purposes.
Despite UX conditions having been taken into account when designing applications [30,31] and usability studies performed to improve said design [32,33], the fact remains that certain physical gestures used by end-users (touchscreen—Kinect gestures—Mid-Air gestures) that are then interpreted by devices (screen, Kinect, VR/AR goggles, etc.) influence the interactive capacity between the user and machine. What is more, they have not been studied sufficiently to ensure they address the physical limitations experienced by certain collectives [34].
Individuals with DS have varying degrees of cognitive delay [35], which is reflected in lower IQ scores when compared against the general populace [36]. This indicates the presence of mental impairment [37] and it has been established that this impairment worsens with age [38]. Individuals living with DS find themselves affected by three main types of impairment: cognitive, motor, and perceptual [39]. These affect their gross and fine motor skills, cognitive perception, speech perception, sensorial perception and processing, and communication skills [40,41]. Current designs of mobile devices and touchscreens utilise sets of established gestures. Individuals with DS, however, possess unique physical and cognitive traits that make it difficult for them to perform certain touch gestures. As such, a developer’s design choices and gesture selections may hinder the user experience of current devices by individuals with DS. The driving motivation behind granting individuals with disabilities full access to ICTs is to ensure they do not suffer from another form of social exclusion and that their rights are protected; these are technologies that are used in the personal, professional and education settings, and thus equal access to these technologies guarantees equal access to the breadth of educational and professional opportunities on offer. Furthermore, these technologies provide these individuals with alternative ways to socialize.
The aim of this work is to establish the most suitable touchscreen gestures for individuals with DS, and ensure they are considered as part of UX when designing inclusive apps. From a social inclusion standpoint, this study aims to assist developers in developing apps that use interactive gestures that can be performed by all users. Several contributions to the literature are made by this work: firstly, the development of an app that is capable of measuring gestures and detecting new types of interactions; secondly, findings from an experimental study on all types of possible interaction involving finger and hand gestures by individuals with DS of all ages, socioeconomic statuses and genders; and finally, a set of design guidelines that should be taken into account when designing touchscreen apps.

2. Related Works

Recently, Nacher presented a study that measured touchscreen interactive gestures performed by children with DS [24]. From that research, it should be noted that the motor impairments experienced by individuals living with DS worsen with age, as Barnhart and Connolly had already pointed out [42]. The results presented in Nacher’s work indicate that the seven gestures studied are successfully executed by children with DS. However, the work presented in this paper, unlike the aforementioned, includes a much broader range of gestures: a total of 20 gestures are analysed, including gestures performed with the fingers of one hand, the fingers of two hands, a single hand, and both hands.

2.1. Interaction and Gestures

In recent decades the development of touchscreen technologies has proliferated, and there have been many studies on the use of touchscreen devices and gestures in order to better understand users’ needs [43,44,45,46,47,48]. Although conventional touchscreen technologies only allowed for a limited number of finger gestures, e.g., writing and selecting objects (normally with the index finger), the range of touchscreen gestures and the fingers used has grown in recent years. For example, Mayer et al. propose using the positioning of fingers to enrich input [49]. Crescenzi & Grané found that young children adapted their gestures to the app content, which meant more natural gestures were used than those originally designed by the app developer [50]. Also noteworthy is the classification of gestures by taxonomy category (nature, flow, dimension, complexity, interaction and location) proposed by Dingler and collaborators, who also put forward a methodology for designing ‘consistent gestures across different devices’ [51].
Touchscreen gestures that use fingers generally fall under two categories: single-touch gestures and multi-touch gestures. Single-touch gestures include clicking (or pressing, holding) an object, or sliding (or moving, dragging) an object. Multi-touch gestures include pinching (to zoom out) and stretching (to zoom in).
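The single-touch/multi-touch split above can be illustrated with a minimal classifier. The sketch below is hypothetical (it is not part of the study’s software): it labels a gesture from the number of simultaneous touches, the distance travelled, and the change in spread between two fingers.

```python
def classify_gesture(touch_count, path_length=0.0, spread_delta=0.0):
    """Label a touch gesture from simple geometric features.

    touch_count  -- number of simultaneous fingers on the screen
    path_length  -- distance travelled by the finger(s), in points
    spread_delta -- change in distance between two fingers
                    (positive = moving apart, negative = moving together)
    """
    if touch_count == 1:
        # Single-touch: a short path is a tap/press, a longer one a slide/drag.
        return "tap" if path_length < 10 else "drag"
    if touch_count == 2:
        # Multi-touch: fingers moving apart stretch (zoom in),
        # fingers moving together pinch (zoom out).
        if spread_delta > 0:
            return "stretch"
        if spread_delta < 0:
            return "pinch"
        return "two-finger-drag"
    return "other"
```

For instance, under these assumptions a single finger travelling 80 points would be labelled a drag, while two fingers closing together would be labelled a pinch.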
The multi-touch gesture dictionary and work of Plichta et al. describes a plethora of entries for interacting with desktop computers, tablet computers, notebook computers, handheld computers, media players, mobile phones, and so on [52,53].
According to Lim et al., research on gesture interaction focuses on two facets: on the one hand, it analyses gesture recognition within systems; on the other, it analyses usability factors, studying design and natural and intuitive gesture communication that takes physical and cognitive traits into consideration [54]. Separately, to evaluate the ease of use of gestures, Jeong and Liu analysed the speed at which a series of gestures were executed [15].

2.2. Interaction and Individuals Living with Down Syndrome

The main objective of usability is to include intuitive design for all users, including users living with DS. Unfortunately, conventional research methods do not contemplate all the needs of this group of people [55].
Recent studies show how technology has been used with individuals living with DS, whether it be to assist learning, or performing tests that measure the usability of apps and systems so that they are accessible to users, i.e., BeeSmart and Eye-hand Coordination of Children with Down Syndrome [56]. Users with DS are able to use computers because they can use devices such as the mouse, keyboard and screen [57]. In particular, a lot of research has been performed with this social group using games [58,59,60].
A review of the bibliographic summary performed by Cáliz and collaborators identified 98 scientific contributions on usability studies with mobile devices [55]. Out of these, only five took individuals with DS into account: Nacher and collaborators studied interaction on touchscreen devices, but only reviewed four gestures [24], and Mendoza and collaborators analysed how the eight most commonly used touchscreen gestures are learnt by teenagers with DS [61]. Thus, the authors of this paper felt it necessary to broaden the scope of such research, believing it should include more gestures and focus on gesture use in order to create applications under the paradigm of user-centric design.

3. Problem Statement

Down syndrome is a genetic condition caused by a third copy of chromosome 21, meiotic or mitotic nondisjunction, or an unbalanced translocation of said chromosome pair [62]. During reproduction both parents pass their genes on to their children, which are carried in chromosomes. When the baby’s cells develop, each cell is supposed to receive 23 pairs of chromosomes to ensure the correct total of 46 chromosomes; half of these chromosomes are from the mother and half are from the father. However, in the cells of an individual with DS there is an extra copy of chromosome 21, meaning that instead of having 46 chromosomes they have 47. This extra chromosome has a direct effect on both the brain’s and body’s physical development and has been identified as the cause of the unique physical and cognitive traits found in DS.
One of the predominant traits found in DS is limited motor movement, which is a consequence of delayed motor development [63]. Kim and collaborators affirm that at a young age children with DS display the same development as other children; however, the delay in motor development becomes increasingly apparent as individuals age and remains present throughout all later stages of development. Other studies have made the same observation [64].
The particular physical and cognitive traits of people with Down syndrome affect individual capabilities. Persons with DS typically share a host of recognizable physical traits that include: shorter extremities; stubby fingers; a curved little finger, known clinically as clinodactyly; a short thumb that is more widely separated from the other fingers; and a third finger that is typically longer than the rest [65]. Another trait that tends to be encountered is microcephaly, or in other words, a reduced diameter of the head and a flattened occipital bone. Regarding sight, individuals with DS will often have strabismus, which is almost always a convergent squint [66]. These physical traits might negatively affect the use of domestic electronic devices, as they might not have been designed for inclusion. On the cognitive side, it is common for people with DS to have reasoning and learning problems, though these are usually mild or moderate. They have very short attention spans, a lack of logical thinking, impulsive behavior, slow learning, and delays in language and speech development [67]. Troncoso and del Cerro describe the full range of traits of persons with DS [68].
As briefly mentioned above, from a medical stance there are three types of DS [69]. All three types result from chromosome disorders affecting chromosome pair 21:
Trisomy 21: Accounts for 95% of cases—this is not an inherited condition and happens by chance. It occurs when all cells contain an extra chromosome. This is the most widespread type of DS.
Translocation: Accounts for approximately 4% of cases—this occurs when an extra fragment of chromosome 21 adheres to another chromosome.
Mosaicism: Accounts for 1% of cases—this occurs after the initial cell division process takes place. The nondisjunction of chromosome 21 affects some but not all cells, thus causing some cells to contain the normal number of chromosomes (46), whilst others contain an extra chromosome (47). In other words, some cells have Trisomy 21, and others do not.
According to the National Down Syndrome Society (NDSS), about 1 in 700 infants in the United States is born with DS. It is the most common genetic disorder in the United States and affects all social classes. As individuals with DS are able to manipulate devices such as the mouse, keyboard and computer screen, end-users with DS do use computers [36,57]. In usability studies with individuals with DS relating to work tasks, it was found that when using a keyboard users would type more slowly because they only used one finger of one hand. These studies also examined the usability of apps on touchscreen devices, and the conclusion drawn was that simple gestures are, in theory, easier for users with DS because they only require one finger [70]. It has been demonstrated that the challenges individuals with DS encounter when using machinery arise as a direct result of the physical and cognitive limitations associated with the condition.

3.1. Objectives and Hypothesis

The main objective of this work is to investigate the suitability, for individuals with DS, of a set of touchscreen gestures performed with the fingers and hands. Following analysis, the authors shall establish which of these gestures can be successfully completed by individuals displaying the cognitive delays associated with DS and, from a UX stance, identify those gestures that are best suited to the widest possible audience. The gestures shall be written into Best Practice so that they are taken into account when designing touchscreen devices or apps for touchscreens.
Following the software metrics approach "goal, question, metric" (GQM) [71], it is possible to state that the purpose of this study is to analyse and propose touch gestures that are suitable for individuals with DS to use when interacting with touchscreens.
In this study the authors contemplate the influence of three variables on the success rate and completion time of gestures: gender, type of Down syndrome, and socioeconomic status. The latter is under consideration as a consequence of other studies indicating that differences in intellectual abilities exist in relation to this variable [72], and that it may indirectly influence the dexterity with which gestures are performed. The hypotheses established for this piece of research are as follows:
First Research Hypothesis (HR1):
Touchscreen gestures pose different levels of difficulty for individuals living with Down syndrome.
Gestures are determined to be a ‘success’ or ‘fail’, which is to say different gestures present different degrees of difficulty.
The Null Hypothesis (Ho1):
The gestures analysed have the same degree of difficulty.
Second Research Hypothesis (HR2):
The gesture used has an effect on completion time.
A relationship has been established between time and the successful completion of a gesture; in other words, the authors aim to establish whether the degree of difficulty of a gesture affects the time it takes to execute the gesture.
The Null Hypothesis (Ho2):
Gesture difficulty is not related to completion time.
Third Research Hypothesis (HR3):
Gender has an effect on the success rate of a gesture.
The Null Hypothesis (Ho3):
The gender of a person with DS does not influence the success rate of a gesture.
Fourth Research Hypothesis (HR4):
Gender has an effect on the completion time of a gesture.
The Null Hypothesis (Ho4):
The gender of a person with DS does not influence the completion time of a gesture.
Fifth Research Hypothesis (HR5):
The type of Down syndrome has an effect on the success rate of a gesture.
The Null Hypothesis (Ho5):
The type of Down syndrome does not influence the success rate of a gesture.
Sixth Research Hypothesis (HR6):
Socioeconomic status has an effect on the success rate of a gesture.
The Null Hypothesis (Ho6):
Socioeconomic status does not influence the success rate of a gesture.
Seventh Research Hypothesis (HR7):
The type of Down syndrome has an effect on the completion time of a gesture.
The Null Hypothesis (Ho7):
The completion time of a gesture is not influenced by the type of Down syndrome.
Eighth Research Hypothesis (HR8):
Socioeconomic status has an effect on the completion time of a gesture.
The Null Hypothesis (Ho8):
Socioeconomic status does not influence gesture completion time.

4. Experiment Design

To test the hypotheses presented, the authors of this paper designed a research study that consisted of developing an application capable of registering all gestures performed by users whilst completing tasks inside the app. To do so, a study had to be performed of the large number of previously existing apps that require touchscreen gestures in order to identify the number and type of gestures that would need to be evaluated. Next a purpose-built gaming app was created that would elicit these gestures. The app includes several different tasks that are performed by users using interactive touchscreen gestures. The app logs the personal details of users, and for future analysis it also records gesture completion times, success rates, and the size of the object the user is interacting with onscreen [73]. Once the purpose-built app was ready, the authors contacted centres that support individuals with DS in order to recruit volunteers for the study, ensuring that different social classes and different types of DS were represented in the sample group.

4.1. Participants

For the purposes of this research project, participant recruitment was limited to individuals with DS living in the metropolitan area of Monterrey (Mexico). Prior to commencing the study, a statistical review was performed to determine the appropriate sample size needed to guarantee reliable results.
Given the fact that the research project was going to cover the metropolitan area of Monterrey, the authors began by establishing the number of residents in the area according to the National Institute of Statistics and Geography (INEGI). This figure stood at approximately 4,225,000 individuals in 2017. Data from the same institute indicate that the number of births with DS in the country of Mexico stands at 3.73 out of every 10,000 births, whilst in the state of Nuevo Leon it stands at 1.587 out of every 10,000 births [74]. Having established the number of inhabitants, the percentage of individuals with DS, a statistical confidence level of 95%, a margin of error of 20%, and heterogeneity of 50%, it was then possible to establish that the minimum sample size for the research project is 18 participants. Although qualitative studies indicate that just six to nine participants [75] are required in order to identify usability problems in an application or product during Think Aloud testing, more recent studies [76] suggest 10 ± 2 participants are in fact required. The research project was performed with a sample of 18 individuals with DS aged between 9 and 34 years old (mean M = 21.83, standard deviation SD = 5.85). The sample consisted of 13 men and 5 women; thus, fewer women participated than men, making up only 27% of the sample. Participants were drawn from three centres that offer support to individuals with DS, chosen according to the socioeconomic populations they serve. The first is a public institution located in an underprivileged neighborhood, whose users have very limited resources. The second, located in a middle-class neighborhood, is financed by donations and a fee paid by users. The third (the PISYE Center at Universidad de Monterrey) is a private institution whose users are required to pay a very expensive tuition to attend.
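The sample-size estimate above is not derived step by step in the text. The sketch below applies the standard Cochran formula with a finite-population correction, using the parameters quoted (95% confidence, 20% margin of error, 50% heterogeneity); the population figure used here is a hypothetical derivation from the quoted prevalence, so this illustrates the type of calculation rather than reproducing the authors’ exact figure of 18.

```python
def cochran_n(z, e, p):
    """Infinite-population sample size: n0 = z^2 * p * (1 - p) / e^2."""
    return z * z * p * (1 - p) / (e * e)

def finite_correction(n0, population):
    """Adjust n0 for a finite population: n = n0 / (1 + (n0 - 1) / N)."""
    return n0 / (1 + (n0 - 1) / population)

z = 1.96   # 95% confidence level
e = 0.20   # 20% margin of error
p = 0.50   # 50% heterogeneity (maximum variance)

n0 = cochran_n(z, e, p)          # roughly 24 participants before correction
# Hypothetical DS population of metropolitan Monterrey:
# 4,225,000 inhabitants x 1.587 per 10,000 births
N = 4_225_000 * 1.587 / 10_000   # roughly 670 individuals
n = finite_correction(n0, N)     # slightly smaller than n0
print(round(n0, 2), round(n, 2))
```

The finite-population correction only lowers the estimate slightly here because the DS population is much larger than the required sample; the authors’ smaller figure of 18 presumably reflects parameter choices not detailed in the text.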
Those institutions were selected to ensure that persons of each socioeconomic status were included in the sample (High 28%, Medium 50% and Low 22%). All participants of High and Medium socioeconomic status were already users of mobile devices. Only 2 participants of Low socioeconomic status had never used a tablet.
To avoid skewing the tests, the researchers made themselves available to participants as and when participants and centres were available. Participants’ ages proved not to be an important parameter in the research project, whereas the type of DS that participants had (Trisomy 21, Translocation or Mosaicism) did prove to be an important parameter (see Table 1).

4.2. Gesture Selection and Description of Tasks

As the authors of this paper had little experience with DS users, empirical observations were used first to determine the type of apps these users preferred and how often they used a smartphone. Results varied greatly, with some users proving extremely adept at using a smartphone. In light of these observations, it was decided that tasks similar to those found in videogames would provide the best tool for measuring gesture use, so an app was programmed that included small tasks requiring gestures on a touchscreen. The app would record data such as success rate and completion time. Next, a list of the main actions used in touch gestures was created, and a purpose-built task devised to elicit each gesture. The tasks are described in greater detail in Table 2, Table 3, Table 4 and Table 5.

4.3. Equipment and Software

The purpose of the developed app, referred to as DS-Touch, was to register data from gestures performed on the touch screen. DS-Touch can record the participant’s personal information, and all gestures the person performs. The interaction framework for the experiment was implemented using the iPad’s native Xcode and C# in Visual Studio 2015. It is possible to access tasks grouped in four categories: gestures using the fingers of one hand, gestures using the fingers of both hands, gestures using the whole hand, and gestures using both hands. All information is stored in a SQLite database and can be exported to a CSV file as, and when, it is required. Figure 1a describes the software architecture of the DS-Touch app and Figure 1b displays the menu of the software interface.
The app recognizes a gesture as successful when the target is reached, and considers it a failure when a predetermined amount of time elapses without the target being reached. Once a task is completed, it is repeated at random. The data collected include user identification, success rate, and completion time. The hardware used during the experiment was an iPad 3 tablet with a 2048×1536 resolution (3.1 million pixels at 264 ppi), a multi-touch screen, an A5X processor, quad-core graphics and 1 GB of RAM.
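The logging behaviour described above can be sketched as follows. This is a hypothetical Python reconstruction of the kind of schema DS-Touch could use (the actual app was built for the iPad): one row per attempt, a timeout counted as a failure, and CSV export on demand.

```python
import csv
import sqlite3

TIME_LIMIT = 5.0  # seconds allowed per attempt, as in the study

# Hypothetical stand-in for the app's SQLite database
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE attempts (
        participant_id INTEGER,
        gesture        TEXT,
        success        INTEGER,   -- 1 = target reached in time, 0 = failure
        seconds        REAL
    )
""")

def log_attempt(participant_id, gesture, seconds, target_reached):
    """Record one gesture attempt; a timeout counts as a failure."""
    success = 1 if target_reached and seconds <= TIME_LIMIT else 0
    conn.execute("INSERT INTO attempts VALUES (?, ?, ?, ?)",
                 (participant_id, gesture, success, seconds))

def export_csv(path):
    """Dump every logged attempt to a CSV file."""
    rows = conn.execute("SELECT * FROM attempts").fetchall()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["participant_id", "gesture", "success", "seconds"])
        writer.writerows(rows)

# Example attempts (invented data)
log_attempt(1, "tap", 1.2, True)
log_attempt(1, "two-hand rotate", 5.0, False)
```

The one-table design mirrors what the paper describes being recorded (identification, success, completion time); the column names are assumptions for illustration.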

4.4. Procedure and Measurements

The study was performed in three centres that offer support to individuals living with DS: the Centro Integral Down, A.C., the Down Institute of Monterrey managed by the Down association of Monterrey, and the DIF of Santa Catarina. Participants were selected by the centres based on their availability, to avoid the risk of skewed results. Prior to the experiment, all participants, their families and the directors of the centres were given an informed consent form reviewed by the university’s institutional review board, and provided with a brief introduction to the purpose and nature of the study. All participants gave their informed consent before participating, and the study was conducted in accordance with the Declaration of Helsinki. The study was run over a two-day period for 5 h per day. During this time, researchers worked with 3–4 participants for around one and a half hours per day. The study was supervised at all times in each centre by one of its in-house psychotherapists. Participants were informed that they would be “playing games” on an electronic tablet. A moderator was on hand to assist participants, despite the fact that they were given verbal instructions prior to commencing (Figure 2). For ease of use, the device was placed on a table at all times so that participants could use both hands to perform the gestures. Each task was presented in a loop, so gestures were repeated over a random length of time. For each attempt, the completion time and success rate were registered.
The moderator could cut the time frame short if he judged that the gesture had been completed successfully a sufficient number of times, or if he deemed that the participant could not perform the gesture after repeated attempts. In either case, the participant moved on to the next gesture.
In more than half of the cases the moderator had to show the participants how to perform each of the gestures in order for them to complete the set task. A total of twenty gestures classified into four categories were analysed: those using the fingers of one hand, those using the fingers of both hands, those using one whole hand, and those using both whole hands. Despite a brief set of instructions being displayed onscreen, participants also needed to receive verbal instructions, as the majority of users had poor reading comprehension skills or none at all.
During each task the purpose-built application recorded the number of attempts, the number of successfully completed gestures, and the time taken to complete a gesture; for incomplete gestures, the app measured the percentage by which they were completed. Whenever a gesture was successfully completed, an audible sound was produced to provide users with feedback on their performance. Users were given five seconds to complete a task, but generally gestures would be completed in under five seconds. Once all the data were collected, they were analysed to determine whether there were any differences in the success rates between different gestures. In other words, researchers attempted to establish whether certain gestures were more suited to individuals with DS than others.
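The per-attempt measures described above could be represented, for example, along the following lines. This is an illustrative sketch only; the field names and the class are hypothetical and not taken from the study's actual application.

```python
from dataclasses import dataclass

TIME_LIMIT_S = 5.0  # users were given five seconds to complete a task

@dataclass
class GestureAttempt:
    """Hypothetical per-attempt record (illustrative field names)."""
    user_id: str
    gesture: str
    elapsed_s: float         # time taken on this attempt
    percent_complete: float  # 0-100; how much of the gesture was executed

    @property
    def success(self) -> bool:
        # An attempt counts as a success only if the gesture was fully
        # completed within the five-second window.
        return self.percent_complete >= 100.0 and self.elapsed_s <= TIME_LIMIT_S

attempt = GestureAttempt("P01", "Double tap", elapsed_s=6.2, percent_complete=100.0)
print(attempt.success)  # exceeded the time limit, so recorded as a failure
```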

5. Results

To test the hypotheses set forth in Section 3.1, the data gathered for gestures have been analysed in different steps: the first step consists of comparing the success rates against failure rates for each type of gesture; next, the success rate for each gesture is compared against the variables of gender, socioeconomic status, and type of DS; the third step consists of comparing the completion time for each gesture against the three aforementioned variables; and then data analysis concludes with observations made during the study and other qualitative results.

5.1. Comparing Gestures

Pearson's Chi-square test (χ²) is used to determine whether the gestures have the same degree of difficulty. The Chi-square test allows us to determine whether a relationship exists between two categorical (qualitative) variables, in this case type of gesture and execution outcome: on one hand, the twenty (20) gestures described in Section 4.3, and on the other, the execution of the gesture with two possible outcomes (success or fail).
Once the comparison of all gestures has been completed, the resulting Pearson's Chi-squared value stands at χ² = 1042.863 with p-value = 0.000, indicating that there is significant difference between the gestures. In other words, from a statistical stance there is a significant relationship between success rate and type of gesture. As such, research hypothesis HR1 is accepted. This analysis indicates that, for individuals with DS, the execution of different gestures on a touchscreen entails different degrees of difficulty depending on the gesture in question. Furthermore, Figure 3 shows that in the majority of instances the failure rate for gestures is higher than the success rate: out of a total of 5420 gestures analysed, 3293 were not completed correctly whilst 2127 were completed correctly (60.8% failed attempts against 39.2% successful attempts). It should be noted that some gestures are classified as ‘fails’ either because they were not performed correctly or because they were not fully completed. In other words, both the execution of a gesture (e.g., Tap or Double tap) and the degree of execution of a gesture (e.g., Touch and slide) determine whether it is deemed a ‘success’ or a ‘failure’. In the latter case, users were observed correctly beginning and successfully executing a certain percentage of a gesture, yet failing to complete the entire gesture.
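As a rough illustration of how such a statistic is obtained, a Pearson chi-square value for a gesture-by-outcome contingency table can be computed directly from observed counts. The counts below are invented for the sketch (the study's actual table had twenty gesture rows, not three):

```python
# Invented gesture-by-outcome counts: gesture -> (successes, failures).
observed = {
    "Tap":        (250, 80),
    "Double tap": (90, 240),
    "Pinch":      (100, 230),
}

rows = list(observed.values())
row_totals = [s + f for s, f in rows]
col_totals = [sum(s for s, _ in rows), sum(f for _, f in rows)]
grand = sum(row_totals)

# Pearson chi-square: sum of (observed - expected)^2 / expected,
# where expected counts assume independence of gesture and outcome.
chi2 = 0.0
for (s, f), rt in zip(rows, row_totals):
    for obs, ct in ((s, col_totals[0]), (f, col_totals[1])):
        expected = rt * ct / grand
        chi2 += (obs - expected) ** 2 / expected

dof = (len(rows) - 1) * (2 - 1)  # (rows - 1) * (columns - 1)
print(round(chi2, 1), dof)  # prints: 197.2 2
```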
In light of the observed data, a priori it is possible to state that success rates are higher for the following gestures: Tap; Touch and hold; Stretch; Slide; and Separate. Likewise, it is possible to state that failure rates are higher for the following gestures: Press and drag; Double tap; Rotate using 2 fingers (both on same hand); Rotate using 2 fingers (one on each hand); Rotate with one hand; Pinch; and Move with hand.
Researchers ran a deeper analysis by performing a pair-wise comparison of gestures to test whether success rates were independent of gestures. This analysis was performed for each group of gestures (fingers-one hand, fingers-two hands, one whole hand, two whole hands). Once again, Pearson's Chi-square test was used with a significance level of p < 0.05, and a Bonferroni correction was applied to adjust the significance levels for the large number of comparisons (28, 6, 10, or 3 hypotheses, depending on the group of gestures). Results of the statistical analysis are displayed in Table 6, Table 7, Table 8 and Table 9. Each cell contains the p-value for a pair of gestures (shaded cells display the analysis of success rate; un-shaded cells display the analysis of completion time).
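The pairwise procedure can be sketched as follows, using invented success/failure counts for three gestures in one group. For a 2 × 2 table the test has one degree of freedom, so the p-value has a closed form via the complementary error function; the Bonferroni correction simply divides the significance level by the number of comparisons:

```python
import math
from itertools import combinations

def chi2_2x2(a_succ, a_fail, b_succ, b_fail):
    """Pearson chi-square for a 2x2 success/failure table (df = 1)."""
    table = [[a_succ, a_fail], [b_succ, b_fail]]
    grand = a_succ + a_fail + b_succ + b_fail
    row = [a_succ + a_fail, b_succ + b_fail]
    col = [a_succ + b_succ, a_fail + b_fail]
    chi2 = sum((table[i][j] - row[i] * col[j] / grand) ** 2
               / (row[i] * col[j] / grand)
               for i in range(2) for j in range(2))
    # For df = 1, the chi-square survival function reduces to erfc.
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Invented success/failure counts for three gestures.
counts = {"Tap": (250, 80), "Double tap": (90, 240), "Slide": (240, 90)}

pairs = list(combinations(counts, 2))
alpha = 0.05 / len(pairs)  # Bonferroni-corrected significance level
for g1, g2 in pairs:
    chi2, p = chi2_2x2(*counts[g1], *counts[g2])
    verdict = "different" if p < alpha else "no significant difference"
    print(f"{g1} vs {g2}: p = {p:.4f} -> {verdict}")
```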
These tables reveal that not all gestures have the same success rate and, as such, in relation to research hypothesis HR1, it is once again possible to confirm that “touch gestures have a different degree of difficulty for individuals with DS”. It is evident that there is significant difference in almost all gesture pairs analysed; each cell indicates the comparison of two gestures. Pairs with a p-value greater than 0.001 show no significant difference, and as such there is no difference in the degree of difficulty between the gestures paired in the list below:
  • Tap/Touch and hold/Slide
  • Touch and hold/Slide
  • Touch and slide/Pinch/Stretch
  • Double tap/Pinch
  • Rotate with fingers of both hands/Join with fingers of both hands
  • Close fist/Spread fingers/Stop with hand
  • Spread fingers/Rotate with the hand
  • Move with the hand/Rotate with the hand
In contrast, the use of both hands presents a different degree of difficulty, whatever the gesture. Table A1, which can be found in the Appendix A of this paper, contains pair-wise gesture comparisons to test whether the success rate was independent of gesture (complete comparison). This table is of interest to readers wishing to know the significance between two gestures of different groups.

5.2. Success Rate by Gender, Socioeconomic Status, and Type of Down Syndrome

Figure 4, Figure 5 and Figure 6 show the success/failure rate of each gesture by gender, socioeconomic status and type of Down syndrome, respectively.
The objective is to establish whether the success rate of a gesture is dictated by any of the following factors: gender, socioeconomic status, or type of Down syndrome. Pearson’s chi-square tests were conducted on each gesture to determine the independence of success from these three qualitative factors.
In the Gender column of Table 10, it can be observed that for the majority of gestures gender does not affect the success rate; in other words, male and female participants with DS find these gestures equally easy or difficult to execute. The results in this column show that four gestures are significantly influenced by the factor Gender (gestures 11, 12, 13, and 18). From a statistical standpoint this would be sufficient to argue that the success rate of a gesture is indeed influenced by gender; however, as the data indicate that gender only affects the success rate of four out of a total of 20 gestures, it would also be possible to accept the null hypothesis (Ho3) of research hypothesis HR3. Strictly speaking, though, the null hypothesis is rejected given that there is significant difference for gender in four gestures. For Socioeconomic Status the results were similar. For the same reason mentioned above, it could be argued that overall there is no significant difference, and as such the null hypothesis (Ho6) of research hypothesis HR6 could be accepted. It might be assumed that users possessing touchscreen devices, i.e., smartphones, which middle- and upper-class households are more able to afford, would demonstrate greater dexterity due to increased practice. However, although only five gestures showed significant difference for Socioeconomic Status (gestures 5, 6, 8, 11, and 16), the null hypothesis (Ho6) of research hypothesis HR6 must be rejected given that significant difference does in fact exist for Socioeconomic Status in five gestures. The column displaying results for Type of Down syndrome shows that this is clearly a significant factor, as almost all p-values are under 0.05, with the exception of six gestures that show no significant difference, these being 7, 8, 12, 13, 15, and 16. As such, the null hypothesis (Ho5) of research hypothesis HR5 is rejected.
For this reason, the analysis of the three variables demonstrated that the type of Down syndrome does influence the success rate of gestures, and to a lesser extent so do gender and socioeconomic status. Table 10 is completed with an analysis of the interactions between the variables Gender-Socioeconomic Status, Gender-Type of Down Syndrome, and Socioeconomic Status-Type of Down Syndrome. With the exception of gestures 5 and 7, results show that there is significant difference for the interaction of gender and socioeconomic status, meaning that the success rate varies as a function of socioeconomic status and of whether a participant is male or female; the contingency tables obtained from the calculations show that females of a lower socioeconomic status commit a greater number of errors. In the interaction between gender and type of Down syndrome, 50% of the gestures analysed (10 gestures) reveal significant difference. When observing the data, it can be seen that more errors are committed by female participants regardless of the type of Down syndrome they live with. It can therefore be concluded that women with Down syndrome encounter greater difficulties when performing touch gestures. The interaction between socioeconomic status and type of Down syndrome indicates there is significant difference; at all social levels it is clear that the variable type of Down syndrome influences the success rate of gesture execution, as p-value = 0.000.
In conclusion, this analysis of success rates based on the study variables has demonstrated that although the factor gender does not affect all gestures, the null hypothesis (Ho3) of research hypothesis HR3 is rejected because gender does in fact affect some of the gestures. The null hypotheses (Ho5 and Ho6) of research hypotheses HR5 and HR6 are rejected for the same reason, as both socioeconomic status and type of Down syndrome affect the success rate of a gesture. Whilst this overall analysis is valuable, the detailed data provided in the tables specify the effect, or lack thereof, of each of these variables on each gesture.

5.3. Completion Time—ANOVA

An initial calculation relating type of gesture to completion time, using ANOVA with the dependent variable Time and the independent variable Type of gesture (F = 51.010, p-value = 0.000), indicates that there is significant difference in the duration of different types of gestures. The null hypothesis (Ho2) of research hypothesis HR2 is therefore rejected and HR2 is accepted: gesture difficulty is correlated with completion time.
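The one-way calculation can be sketched by hand from the between-group and within-group sums of squares. The completion times below are invented for illustration; they are not the study's data:

```python
# Invented completion times (seconds) grouped by gesture type.
groups = {
    "Tap":        [1.1, 1.3, 1.2, 1.4],
    "Double tap": [2.6, 2.9, 2.4, 2.5],
    "Rotate":     [4.0, 4.4, 4.1, 3.9],
}

all_times = [t for ts in groups.values() for t in ts]
grand_mean = sum(all_times) / len(all_times)

# Between-group sum of squares: group sizes times squared deviation
# of each group mean from the grand mean.
ss_between = sum(len(ts) * (sum(ts) / len(ts) - grand_mean) ** 2
                 for ts in groups.values())
# Within-group sum of squares: squared deviations from each group mean.
ss_within = sum((t - sum(ts) / len(ts)) ** 2
                for ts in groups.values() for t in ts)

df_between = len(groups) - 1
df_within = len(all_times) - len(groups)
F = (ss_between / df_between) / (ss_within / df_within)
print(round(F, 1))
```

A large F relative to the F-distribution's critical value for (df_between, df_within) indicates that mean completion time differs between gesture types.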
The research project and corresponding study were performed in order to establish the influence of the variables gender, type of Down syndrome, and socioeconomic status on the completion times of gestures. For this analysis the authors considered only gestures that were successfully completed. Figure 7, Figure 8 and Figure 9 display the average times for each of the gestures and for each of the variables being researched.
Two-way ANOVA is applied with the independent variables gender, type of Down syndrome, and socioeconomic status, and the dependent variable completion time (Table 11). This shows that completion time is influenced by gender for three gestures (11, 15, and 19), whereas for the majority of gestures completion time is not significantly influenced by gender (p-value > 0.05). As such, research hypothesis HR4 is rejected and the null hypothesis Ho4 is accepted; overall, completion time is not gender dependent. It has also been demonstrated that completion time is significantly influenced (p-value < 0.05) by the type of Down syndrome (gestures 2, 3, 8, 10, 13, 14, 15, 16, 17, 18, and 19) and by socioeconomic status (gestures 2, 4, 5, 6, 7, 8, 11, 12, 13, 14, 15, 16, 18, and 19); as such, research hypotheses HR7 and HR8 are accepted.
Analysis also reveals that completion time is significantly influenced by the interaction of the following factors: Gender-Type of Down syndrome, Gender-Socioeconomic Status, and Type of Down Syndrome-Socioeconomic Status.
Observation of the data in this analysis indicates that male participants perform gestures faster than female participants, and that individuals of a higher socioeconomic status perform gestures faster than those of a lower socioeconomic status. In general, there are no differences in completion times for the interaction Type of Down Syndrome-Gender, although there are for Type of Down Syndrome-Socioeconomic Status. As such, it appears that the variables socioeconomic status and gender do in fact affect gesture completion times, and thus success rates.

5.4. Observational Findings/Qualitative Results

This section contains valuable information regarding participant behaviour during the study. In addition to providing instructions and assistance to participants during tasks the moderator took notes of any difficulties that influenced whether a task was successful or unsuccessful, as well as any issues that caused participants to take longer to complete a given task.
A common habit shared by all participants was that of pressing the touchscreen forcefully with one finger and holding the finger down in the same position over an extended period of time. This did not give rise to too many complications with gestures such as the ‘Tap’ gesture, although it did pose more of a problem when participants came to use the ‘Double Tap’ gesture and translated into a greater number of ‘fails’. The action of pressing down with force over a continued period of time meant that the device interpreted the action as a ‘Drag’ gesture instead of a ‘Tap’ gesture.
It was observed that certain gestures were often not fully completed. In the case of one-handed finger gestures requiring participants to drag an object over a specific distance (e.g., Touch and hold), participants would stop dragging too soon; and in the case of gestures involving sliding or rotation motions (e.g., Pinch, Rotate, Stretch, Slide, Touch and slide), participants would abandon a gesture before completing the task when force was required, rather than simply positioning their fingers on the screen and sliding them. This led to much frustration among participants: no matter how hard they tried, they were unable to finish a task because they ended the gesture before it was complete, and a ‘failed’ gesture was thus recorded.
In the case of gestures that require the use of two fingers, success rates were higher. However, it was observed that there was a lack of coordination when it came to getting both fingers to the same destination, e.g., in the case of Join, or when separating both fingers. This lack of coordination was most evident in the Rotate gesture using two fingers. For the gesture Press and drag, the challenge proved to be understanding the instructions: participants would instinctively make the Separate gesture, and they felt the need to concentrate hard in order to be mentally alert enough to hold one finger in place whilst moving the other.
All of the physical or observable incidences that were detected are the consequence of the physical traits found in individuals with DS, such as rigidity in the extremities, difficulty moving arms, or difficulty with concentrating.
Regarding interactions involving the Double tap gesture, it was observed that cognitively complex decision-making is involved. Participants were seen repeatedly tapping the screen, up to four or five times, even once an object had disappeared.
Regarding gestures performed with the entire hand, it was observed that some gestures were easier to perform and the success rate for these was almost perfect, i.e., Stop, Move, and Rotate with hand. The gestures Close fist and Spread fingers proved more difficult. These consisted of joining or separating all five fingers of one hand on the touchscreen. As the fingers of individuals with DS are shorter, more curved, and more rigid, the act of opening and closing the fingers of the hand on a screen’s surface proved to be no easy feat for them.
Regarding gestures with two hands, Spread and Rotate had a high failure rate as a consequence of a lack of coordination between the two hands when moving them at the same time. However, the gesture Join using both hands had a higher success rate.
It was also observed that female participants showed greater patience when performing tasks and older individuals made efforts to improve on subsequent attempts. The youngest individuals found it hardest to focus their attention and concentrate on the task at hand, and, even though the tasks involved “playing games” they were easily distracted.

6. Discussion

6.1. UX Considerations

Based on the results obtained, individuals with Down syndrome are able to perform all of the proposed touch gestures (one-handed finger gestures, two-handed finger gestures, one-hand touch gestures, and two-hand touch gestures), although the success rates and completion times for each of these vary considerably.
This study analyses the differences between gestures in terms of execution success rates and completion times. For execution success rates, higher success rates were obtained for the gestures Slide, Tap, Touch and slide, Stretch, Separate, and Join. When performing pairwise comparisons of gestures, it was found that certain pairs had the same success rates: Tap/Touch and Hold; Tap/Slide; Double tap/Pinch; Touch and hold/Stretch; Rotation 2 hands/Join 2 hands; Touch and hold/Slide; Pinch/Touch and Slide.
Comparative analogous research was performed to study differences in completion times. It was found that no differences exist between certain gesture pairs (Stretch/Pinch; Stretch/Touch and slide; Separate 2 hands/Join 2 hands; Move with the hand/Rotate with the hand). The comparisons presented in this paper will help developers select suitable gestures when designing an application.
What is more, the data presented in this paper reveal issues that require further discussion. In all instances, verbal instructions proved insufficient and the moderator needed to physically demonstrate how to perform the gestures so that participants could understand what was required of them. For their part, participants were more interested in completing the tasks themselves than in performing the gesture correctly, and researchers observed that they would find and use shortcuts to complete the tasks. For example, some individuals would use the fingers of both hands despite having received instructions to only use two fingers from the same hand.
During the Tap gesture, some participants would place both hands on the edge of the tablet and swap indiscriminately between the thumbs on either their left or right hand when performing an action in order to use the thumb closest to the object on screen. The Double tap gesture tended to lead to errors. It was observed that the majority of participants waited for too long between the first and second tap, thus producing a failed or ‘false’ gesture. This led to a sense of frustration among users, even causing one participant to forcefully press down on the object. Other individuals reacted by tapping the screen 3 or 4 times instead of twice, despite the fact that the object had already disappeared. Thus, developers are advised to use the Tap gesture rather than the Double tap gesture. This issue could be resolved by adapting the programming behind the gesture to the motor skills of these individuals, eliminating lag between taps or, equally, by ignoring subsequent taps once a task is completed.
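One possible form of the suggested adaptation is sketched below with an illustrative recognizer class. The class name and the generous 1.5 s inter-tap threshold are assumptions for this sketch, not part of the study's software; the point is widening the allowed gap between taps and ignoring surplus taps once the gesture has been recognised.

```python
class DoubleTapRecognizer:
    """Hypothetical double-tap recognizer tolerant of slow second taps."""

    def __init__(self, max_gap_s: float = 1.5):
        # Generous inter-tap window to accommodate motor-skill differences.
        self.max_gap_s = max_gap_s
        self.last_tap_at = None
        self.completed = False

    def on_tap(self, timestamp: float) -> bool:
        """Return True exactly once, when the double tap is recognised."""
        if self.completed:
            return False  # ignore extra taps after the task is complete
        if self.last_tap_at is not None and timestamp - self.last_tap_at <= self.max_gap_s:
            self.completed = True
            return True
        self.last_tap_at = timestamp
        return False

rec = DoubleTapRecognizer()
# A slow (1.2 s) second tap still completes; a third tap is ignored.
print(rec.on_tap(0.0), rec.on_tap(1.2), rec.on_tap(1.4))  # prints: False True False
```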
In the task created to elicit the Hold gesture participants had to stop the Sun as it crossed the sky. Two problems were observed: firstly, instead of stopping the Sun itself some users would use the gesture slightly behind the object as it travelled its path across the screen; secondly, some users stopped the object itself, but did not continue the gesture for long enough. An audio cue was provided to indicate when the task had been successfully completed, however some participants would stop the Hold gesture before hearing this audio cue. It is recommended that developers avoid using moving objects if they intend to use this gesture.
The Rotate gesture is not performed in one single smooth movement, but instead in several steps. One participant used two fingers but did not separate them enough, so the gesture was detected as a single finger and a failed or ‘false’ gesture was recorded. On occasion, other participants lifted their hand off the tablet, and thus a failed or ‘false’ gesture was recorded; they quickly learnt that they had to keep at least one finger on the screen, and did so. It was also observed that participants always performed the Rotate gesture in a clockwise fashion, most likely due to being right-handed, and did so even when a counter-clockwise rotation would have been shorter. Developers should include visual cues indicating the recommended direction for performing the Rotate gesture.
For the Zoom in and Zoom out gestures, participants were only able to perform 75% of either gesture, never the complete gesture, due to having short fingers. Some participants repeatedly used fingers from both hands, ignoring repeated instructions from the moderator informing them that the gesture had to be performed using the fingers from one hand only.
The majority of participants performed the Slide gesture without difficulty. However, two individuals were observed not starting the gesture on the ball itself but just in front of the object, causing a failed gesture to be recorded. Nonetheless, it is believed that this problem comes down to how the task was designed. Developers should keep context in mind to ensure that the gesture is started on the object that has to be moved.
The gesture Press and drag proved to be the most complicated for participants to perform. Some individuals tried to perform symmetrical gestures using both hands. In other words, if the finger of the left hand performed the Press, they mirrored this with the finger of the right hand and, consequently, they stopped pressing on the screen when they then had to perform the Drag action. The easiest way to perform this gesture is to press with the left finger and drag with the right finger, although some individuals did this back to front which caused them to clumsily cross one hand over the other when trying to drag the object.
Most participants could perform the Separate gesture without problem. However, they did not execute the action in one smooth go. Instead they would perform the first half of the gesture and then the second.
The Close fist gesture did not present participants with any difficulties.
Participants could perform the Spread fingers gesture, although their fingers slipped and it generally took them a long time to perform.
For the one-handed Stop gesture, there was one participant who would stop the object with the entire hand but then lift all but two fingers off the screen. Nonetheless, we recommend using the one-handed gesture as it proved easier for participants.
For the Move with hand gesture it was observed that despite users performing the gesture correctly, the tablet did not detect all fingers. It was discovered that performing the gesture with just three fingers was better than performing it with all five fingers.
For the gesture Spread with both hands, participants found it very complicated to use when interacting with small objects.
The Join using both hands gesture did not present participants with any difficulties.
For the Rotate using both hands gesture many participants preferred using only one hand. It became evident that if an object was very small users would place both hands within the object and it became difficult to rotate, but with larger objects the task proved easier.
The results and observations obtained in this study enable designers to follow a Design for All approach that supports the special needs of people with DS. The interface can be adjusted to the needs and development level of these users in an adaptive way.

6.2. Gender, Social Status and Type of Down Syndrome

Several pieces of research analyse the effects of gender, given that this variable can influence a study. In usability studies in particular, gender is a variable that must be taken into account, given that behaviours and use preferences may differ between males and females (e.g., colours, typography, saturation of information, type of image) [77]. In this particular study, performed specifically with individuals with Down syndrome, no significant difference was identified for success rates or completion times by gender when performing touch gestures on a touchscreen. It is thus possible to argue that the difficulty experienced when performing a gesture (dependent on a unique set of intellectual and physical limitations) and the success rates for gestures are the same for both men and women. The authors had believed that the socioeconomic status of participants would prove to be a key factor in the success rate of gestures, given that individuals from high or medium socioeconomic levels have more access to touchscreen devices. However, this study has demonstrated that it is not a significant factor, as neither success rates nor completion times show significant difference by socioeconomic status. From a biological standpoint, there are three types of Down syndrome. Following analysis, significant difference was identified for success rates and completion times by type of Down syndrome: individuals with Mosaicism achieved higher success rates and faster completion times than individuals with Trisomy 21 or Translocation DS, while no differences were identified between individuals with Trisomy 21 and Translocation DS.
Far from attempting to establish intellectual differences between different types of Down syndrome, this paper is merely highlighting the findings obtained from this particular study for the purpose of designing apps that use touch gestures that everyone finds easy and intuitive to use.
Numerous studies indicate that differences in intelligence quotient (IQ) between individuals with Down syndrome are similar to those found in any other group of individuals without learning difficulties [78]. As with any other human characteristic, IQ levels result from genetic inheritance and chance. Differences that exist between individuals are exacerbated to a greater or lesser extent by an individual’s own unique set of personal characteristics, their life experiences and, above all, external stimulation [79]. Every single individual is unique. Everyone has their own personal strengths and weaknesses, and each of us will live different life experiences. The same is true of individuals with Down syndrome. It has been demonstrated that the early stimulation newborns are exposed to in the first few days of life has a significant positive impact on the lives of individuals with learning difficulties [80]. Furthermore, as would be true for anyone, those living with Down syndrome are now achieving things that until recently were considered unimaginable as a result of changing environmental and social expectations, being given opportunities to participate in common areas, and receiving support from others.

6.3. Designing Multi-Touch Applications for DS (or for All)

This work highlights the capacity of individuals with Down syndrome, or likewise any individual with a certain degree of physical or intellectual disability, to perform touch gestures on a touchscreen. The observations and findings from this study indicate which gestures are deemed most suitable when trying to ensure high success rates and rapid gesture execution.
To support this work a visual guidebook has been created (see Appendix B) to assist application developers and manufacturers of touchscreen devices. The guidebook lists available gestures together with their characteristics, and includes recommendations on which gestures should be considered as part of a Design for All approach to ensure app design and development is focused on as many users as possible.

7. Conclusions

This study has evaluated a total of 20 touchscreen gestures, performed with the fingers of one hand, the fingers of both hands, one whole hand, or two whole hands, with individuals living with Down syndrome to determine whether differences are present in terms of ease of use and execution times of said gestures.
The study is founded on the premise of Design for All. From this standpoint it has been argued that design processes should take into consideration the needs of users with Down syndrome to avoid frustrations as a result of poor design, or at least understand what the most suitable design choices and operations are for this group of individuals.
This study’s results reveal that individuals with Down syndrome can perform all the multi-touch gestures evaluated, although success rates vary considerably depending on the gesture in question. The evaluated gestures can be used in future designs of touchscreen apps targeting this group of individuals even though there are differences in execution times and success rates between some participants when compared against each other. Thus, this work can assist developers in discerning between gestures and help them to pick the most suitable gestures based on success rate or execution time, which in turn will help them to create apps with more natural and intuitive interactions.
The study does not find statistically significant results overall for success rates by the variables gender (only four out of 20 gestures showed significant differences) or socioeconomic status (five out of 20). However, the variable type of Down syndrome did affect success rates. Table 12 shows the acceptance/rejection conclusions for the hypotheses examined in this study.
As a future line of research, the authors wish to explore how success rates and execution times are influenced by the interaction principles of Fitts's Law.

Author Contributions

The contributions to this paper are as follows: Conceptualization, M.S.D.R.G.; methodology, M.S.D.R.G. and J.M.-G.; software, M.S.D.R.G.; validation, investigation, formal analysis, M.S.D.R.G. and J.M.-G.; data curation, J.M.-G.; writing—original draft preparation, M.S.D.R.G. and J.M.-G.; writing—review and editing, supervision, J.M.-G. Both authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in the paper.

Acknowledgments

We would like to thank the Down syndrome support centre Centro Integral Down, A.C., the Asociación down de Monterrey, the DIF de Santa Catarina, and the Universidad de Monterrey for so generously agreeing to collaborate with us under the Educational and Social Inclusion Program (Programa de Inclusión Social y Educativa—PISYE).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Comparison of touch gestures by success with Pearson’s chi-squared test of independence χ² (DoF = 1, N = 18). p-values.
Success // Completion Time
Gesture | 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
1. Tap | - 0.000 0.273 0.000 0.000 0.000 0.000 0.573 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.866 0.000
2. Double tap | 0.000 - 0.000 0.001 0.060 0.000 0.000 0.000 0.002 0.000 0.000 0.000 0.691 0.390 0.065 0.000 0.002 0.091 0.000 0.063
3. Touch and hold | 0.000 0.000 - 0.000 0.000 0.019 0.000 0.085 0.000 0.000 0.062 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.306 0.000
4. Rotate | 0.000 0.000 0.000 - 0.000 0.000 0.000 0.000 0.000 0.141 0.000 0.000 0.001 0.029 0.000 0.123 0.963 0.204 0.000 0.000
5. Pinch | 0.004 0.000 0.000 0.000 - 0.000 0.050 0.000 0.131 0.000 0.000 0.007 0.191 0.016 0.774 0.000 0.000 0.002 0.000 0.850
6. Stretch | 0.001 0.000 0.000 0.000 0.405 - 0.002 0.000 0.009 0.000 0.509 0.036 0.000 0.000 0.000 0.000 0.000 0.000 0.001 0.000
7. Touch and slide | 0.000 0.000 0.000 0.000 0.004 0.056 - 0.000 0.954 0.000 0.000 0.351 0.001 0.000 0.190 0.000 0.000 0.000 0.000 0.125
8. Slide | 0.000 0.000 0.000 0.000 0.000 0.000 0.000 - 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.841 0.000
9. Rotate—2 hands | 0.000 0.000 0.000 0.105 0.000 0.000 0.000 0.000 - 0.000 0.001 0.420 0.010 0.000 0.286 0.000 0.000 0.000 0.000 0.220
10. Press and drag—2 hands | 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 - 0.000 0.000 0.000 0.000 0.000 0.821 0.196 0.003 0.000 0.000
11. Separate—2 hands | 0.018 0.000 0.000 0.000 0.161 0.253 0.010 0.000 0.000 0.000 - 0.005 0.000 0.000 0.000 0.000 0.000 0.000 0.005 0.000
12. Join—2 hands | 0.010 0.000 0.000 0.000 0.055 0.165 0.000 0.000 0.000 0.000 0.313 - 0.000 0.000 0.047 0.000 0.000 0.000 0.000 0.025
13. Close fist | 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 - 0.258 0.164 0.000 0.002 0.059 0.000 0.172
14. Spread fingers | 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.001 0.003 0.546 - 0.019 0.000 0.038 0.420 0.000 0.018
15. Stop with hand | 0.000 0.000 0.000 0.000 0.001 0.002 0.000 0.000 0.000 0.000 0.001 0.000 0.000 0.000 - 0.000 0.000 0.003 0.000 0.918
16. Move with hand | 0.000 0.000 0.000 0.000 0.022 0.041 0.021 0.000 0.000 0.000 0.016 0.000 0.000 0.000 0.000 - 0.168 0.004 0.000 0.000
17. Rotate with hand | 0.000 0.000 0.000 0.000 0.221 0.247 0.441 0.000 0.000 0.000 0.007 0.000 0.000 0.000 0.000 0.523 - 0.218 0.000 0.000
18. Spread using both hands | 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 - 0.000 0.003
19. Join using both hands | 0.000 0.000 0.000 0.000 0.001 0.009 0.000 0.000 0.000 0.000 0.060 0.122 0.529 0.897 0.000 0.000 0.000 0.000 - 0.000
20. Rotate using both hands | 0.004 0.000 0.000 0.046 0.000 0.000 0.000 0.001 0.136 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.004 -

Appendix B

Table A2. Recommended touchscreen gestures to apply in user interface Design for All.
Type of Gesture | Name / How Easy to Do | Function | Gestures | Recommendations
Touch gestures using fingers on one hand
Tap
Sensors 21 01328 i044
Select object | Sensors 21 01328 i001 | Individuals only encounter problems when the object is very small (less than 10 mm). | Ensure objects are 10 mm or larger.
Double tap
Sensors 21 01328 i041
Select object | Sensors 21 01328 i003 | Individuals perform the gesture too slowly and it is recognised as two separate tap gestures rather than a double tap gesture. | Do not use; we recommend using the tap gesture.
Touch and hold
Sensors 21 01328 i043
Select and move object | Sensors 21 01328 i005 | This gesture is very easy to perform when used for the function ‘select object’. However, it proves more complicated when used for the function ‘stop object’: individuals tend to get distracted easily and lift their finger off the screen before completing the gesture. | We recommend using this gesture for selection, although the tap gesture is more effective. Do not use it for moving objects; the slide gesture is more suitable.
Rotate
Sensors 21 01328 i041
Rotate object | Sensors 21 01328 i007 | Despite it being easier to perform the gesture in an anticlockwise direction, individuals rotate in a clockwise direction. | We recommend two-handed rotation gestures. Indicate direction of rotation using arrows or visual cues.
Pinch
Sensors 21 01328 i041
Shrink object | Sensors 21 01328 i009 | Individuals experience problems with coordination, especially with small objects. | We recommend zooming out using one finger from each hand. If used, we recommend using this gesture to confirm critical tasks.
Stretch
Sensors 21 01328 i042
Expand object | Sensors 21 01328 i011 | Individuals experience problems with coordination, especially with small objects. | We recommend zooming in using two fingers. If used, we recommend using this gesture to confirm critical tasks.
Touch and slide
Sensors 21 01328 i042
Move object | Sensors 21 01328 i013 | Individuals tend to stop touching the screen before the gesture is finished. | Include a very visual goal and provide audible feedback to indicate when a gesture has been completed successfully.
Slide
Sensors 21 01328 i044
Select and move object | Sensors 21 01328 i015 | Individuals only encounter problems when the object is very small (less than 8 mm). | Use with objects measuring 15 mm or more.
Touch gestures using fingers on both hands
Rotate
Sensors 21 01328 i041
Rotate object | Sensors 21 01328 i017 | Despite it being easier to perform the gesture in an anticlockwise direction, individuals rotate in a clockwise direction. | We recommend two-handed rotation gestures. Indicate direction of rotation using arrows or visual cues.
Press and drag
Sensors 21 01328 i041
Move object | Sensors 21 01328 i019 | Individuals have difficulty coordinating fingers. They stop pressing before they finish dragging the object. | We recommend using one finger instead.
Separate
Sensors 21 01328 i042
Expand and move object | Sensors 21 01328 i021 | Individuals may accidentally lean on the screen with the palm of the hand. | When using this gesture, make sure other parts of the screen are not active.
Join
Sensors 21 01328 i042
Shrink and move object | Sensors 21 01328 i023 | Individuals may accidentally lean on the screen with the palm of the hand. | When using this gesture, make sure other parts of the screen are not active.
One-handed gestures
Close fist
Sensors 21 01328 i042
Shrink object | Sensors 21 01328 i025 | The gesture poses significant challenges. It is not detected because not all fingers are used. | Not recommended. Do not use this gesture. It is better to use the fingers on both hands.
Spread fingers
Sensors 21 01328 i041
Expand object | Sensors 21 01328 i027 | The gesture poses significant challenges. It is not detected because not all fingers are used. | Not recommended. Do not use this gesture. It is better to use the fingers on both hands.
Stop with hand
Sensors 21 01328 i042
Select and move object | Sensors 21 01328 i029 | This gesture is complicated to perform using a touchscreen, but one of the easiest to perform using 3D technology. | The tap gesture is more suitable for selecting objects. The slide gesture is better for moving objects.
Move with hand
Sensors 21 01328 i042
Move object | Sensors 21 01328 i031 | Individuals may need to perform the gesture in several moves. | We recommend the Slide gesture. If used, allow the gesture to be performed in several moves.
Rotate
Sensors 21 01328 i041
Rotate object | Sensors 21 01328 i033 | Individuals overcomplicate the gesture: despite it being easier to perform in an anticlockwise direction, they always rotate in a clockwise direction. | We recommend two-handed rotation gestures. Indicate direction of rotation using arrows or visual cues. If used, allow the gesture to be performed in several moves.
Two-handed gestures
Spread using both hands
Sensors 21 01328 i041
Move object | Sensors 21 01328 i035 | Individuals have a tendency to perform the gesture in several moves. | Allow the gesture to be performed in several moves.
Join using both hands
Sensors 21 01328 i044
Move object | Sensors 21 01328 i037 | Individuals have a tendency to perform the gesture in several moves. | Allow the gesture to be performed in several moves.
Rotate using both hands
Sensors 21 01328 i042
Rotate object | Sensors 21 01328 i039 | Although it takes several moves, it is the easiest way to rotate an object. | Recommended gesture for rotation. Allow the gesture to be performed in several moves.
Design considerations, building on those proposed by Mendoza and collaborators [61]:
  • Use short verbal instructions
  • Do not use double negatives or complex words
  • Never give instructions in which the individual must imagine being in a different setting
  • Favour gestures related to the context
  • Limit colours to the greatest extent possible
  • Use simple images that are not overly realistic
  • Avoid abstract images
  • Avoid ambiguity
  • Avoid using text inside images
  • Use a white background whenever possible
  • Use the most conventional form for objects

References

  1. De Haan, G. A Vision of the Future of Media Technology Design Education-Design and Education from HCI to UbiComp. In Proceedings of the 3rd Computer Science Education Research Conference on Computer Science Education Research, Arnhem, The Netherlands, 4–5 April 2013; pp. 67–72. [Google Scholar]
  2. Peter, C.; Crane, E.; Fabri, M.; Agius, H.; Axelrod, L. Emotion in HCI: Designing for People. In Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction—Volume 2; BCS-HCI ’08, Liverpool, UK, 1–5 September 2008; BCS Learning & Development Ltd.: Swindon, UK, 2008; pp. 189–190. [Google Scholar]
  3. Smith, S.; Burd, E.; Rick, J. Developing, Evaluating and Deploying Multi-Touch Systems. Int. J. Hum. Comput. Stud. 2012, 70, 653–656. [Google Scholar] [CrossRef]
  4. Buxton, B. Multi-Touch Systems that I Have Known and Loved. Available online: http://billbuxton.com/multitouchOverview.html (accessed on 17 March 2019).
  5. Plowman, L. Researching Young Children’s Everyday Uses of Technology in the Family Home. Interact. Comput. 2015, 27, 36–46. [Google Scholar] [CrossRef]
  6. Media, C.S.; Rideout, V. Zero to Eight: Children’s Media Use in America; Common Sense Media: San Francisco, CA, USA, 2011. [Google Scholar]
  7. Hourcade, J.P.; Mascher, S.L.; Wu, D.; Pantoja, L. Look, My Baby Is Using an IPad! An Analysis of YouTube Videos of Infants and Toddlers Using Tablets. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems—CHI ’15, Seoul, Korea, 18–23 April 2015; ACM Press: New York, NY, USA, 2015; pp. 1915–1924. [Google Scholar] [CrossRef]
  8. Bautista, J.R.; Rosenthal, S.; Lin, T.T.C.; Theng, Y.L. Predictors and Outcomes of Nurses’ Use of Smartphones for Work Purposes. Comput. Hum. Behav. 2018, 84, 360–374. [Google Scholar] [CrossRef]
  9. Muangprathub, J.; Boonnam, N.; Kajornkasirat, S.; Lekbangpong, N.; Wanichsombat, A.; Nillaor, P. IoT and Agriculture Data Analysis for Smart Farm. Comput. Electron. Agric. 2019, 156, 467–474. [Google Scholar] [CrossRef]
  10. Sousa, M.J.; Rocha, Á. Digital Learning: Developing Skills for Digital Transformation of Organizations. Futur. Gener. Comput. Syst. 2019, 91, 327–334. [Google Scholar] [CrossRef]
  11. Park, C.S. Examination of Smartphone Dependence: Functionally and Existentially Dependent Behavior on the Smartphone. Comput. Hum. Behav. 2019, 93, 123–128. [Google Scholar] [CrossRef]
  12. Shneiderman, B.; Plaisant, C.; Cohen, M.; Jacobs, S.; Elmqvist, N.; Diakopoulos, N. Designing the User Interface: Strategies for Effective Human-Computer Interaction, 6th ed.; Pearson: New York, NY, USA, 2016. [Google Scholar]
  13. Hourcade, J.P. Interaction Design and Children. Found. Trends® Hum. -Comput. Interact. 2007, 1, 277–392. [Google Scholar] [CrossRef] [Green Version]
  14. Ingram, A.; Wang, X.; Ribarsky, W. Towards the Establishment of a Framework for Intuitive Multi-Touch Interaction Design. In Proceedings of the International Working Conference on Advanced Visual Interfaces—AVI ’12, Capri Island, Naples, Italy, 22–25 May 2012; ACM Press: New York, NY, USA, 2012; p. 66. [Google Scholar] [CrossRef] [Green Version]
  15. Jeong, H.; Liu, Y. Effects of Touchscreen Gesture’s Type and Direction on Finger-Touch Input Performance and Subjective Ratings. Ergonomics 2017, 60, 1528–1539. [Google Scholar] [CrossRef]
  16. Abdul Aziz, N.S.; Ahmad, W.F.W.; Zulkifli, N.J.B. User Experience on Numerical Application between Children with Down Syndrome and Autism. In Proceedings of the International HCI and UX Conference in Indonesia—CHIuXiD ’15, Bandung, Indonesia, 8–10 April 2015; ACM Press: New York, NY, USA, 2015; pp. 26–31. [Google Scholar] [CrossRef]
  17. Wildenbos, G.A.; Jaspers, M.W.M.; Schijven, M.P.; Dusseljee-Peute, L.W. Mobile Health for Older Adult Patients: Using an Aging Barriers Framework to Classify Usability Problems. Int. J. Med. Inform. 2019, 124, 68–77. [Google Scholar] [CrossRef]
  18. Nacher, V.; Jaen, J.; Navarro, E.; Catala, A.; González, P. Multi-Touch Gestures for Pre-Kindergarten Children. Int. J. Hum. Comput. Stud. 2015, 73, 37–51. [Google Scholar] [CrossRef] [Green Version]
  19. Vatavu, R.-D.; Cramariuc, G.; Schipor, D.M. Touch Interaction for Children Aged 3 to 6 Years: Experimental Findings and Relationship to Motor Skills. Int. J. Hum. Comput. Stud. 2015, 74, 54–76. [Google Scholar] [CrossRef]
  20. Gao, Q.; Sun, Q. Examining the Usability of Touch Screen Gestures for Older and Younger Adults. Hum. Factors 2015, 57, 835–863. [Google Scholar] [CrossRef] [PubMed]
  21. Smith, A.L.; Chaparro, B.S. Smartphone Text Input Method Performance, Usability, and Preference with Younger and Older Adults. Hum. Factors 2015, 57, 1015–1028. [Google Scholar] [CrossRef]
  22. Mihajlov, M.; Law, E.L.-C.; Springett, M. Intuitive Learnability of Touch Gestures for Technology-Naive Older Adults. Interact. Comput. 2015, 27, 344–356. [Google Scholar] [CrossRef] [Green Version]
  23. Kumin, L.; Lazar, J.; Feng, J.H. Expanding Job Options: Potential Computer-Related Employment for Adults with Down Syndrome. ACM SIGACCESS Access. Comput. 2012, 103, 14–23. [Google Scholar] [CrossRef]
  24. Nacher, V.; Cáliz, D.; Jaen, J.; Martínez, L. Examining the Usability of Touch Screen Gestures for Children with Down Syndrome. Interact. Comput. 2018, 30, 258–272. [Google Scholar] [CrossRef] [Green Version]
  25. Statista. Available online: https://www.statista.com/statistics/271644/worldwide-free-and-paid-mobile-app-store-downloads/ (accessed on 12 February 2021).
  26. Meneses Fernández, M.D.; Santana Hernández, J.D.; Martín Gutiérrez, J.; Henríquez Escuela, M.R.; Rodríguez Fino, E. Using Communication and Visualization Technologies with Senior Citizens to Facilitate Cultural Access and Self-Improvement. Comput. Hum. Behav. 2017, 66, 329–344. [Google Scholar] [CrossRef]
  27. Serrano Santoyo, A.; Martínez Martínez, E. La Brecha Digital: Mitos y Realidades; Universidad Autónoma de Baja California: Baja California, Mexico, 2003. [Google Scholar]
  28. Cabero, J. Reflexiones Sobre La Brecha Digital y La Educación: Siguiendo El Debate. Inmanecencia 2014, 4, 14–26. [Google Scholar]
  29. Lagatta, J.; Di Nicolatonio, M.; Vallicelli, A. Design for Inclusion. Differences and Similarities between DfA and UD in the Field of Sailing Yacht Design. Procedia Manuf. 2015, 3, 2714–2721. [Google Scholar] [CrossRef] [Green Version]
  30. Lazar, J.; Woglom, C.; Chung, J.; Schwartz, A.; Hsieh, Y.G.; Moore, R.; Crowley, D.; Skotko, B. Co-Design Process of a Smart Phone App to Help People with Down Syndrome Manage Their Nutritional Habits. J. Usability Stud. 2018, 13, 73–93. [Google Scholar]
  31. Porter, J. Entering Aladdin’s Cave: Developing an App for Children with Down Syndrome. J. Comput. Assist. Learn. 2018, 34, 429–439. [Google Scholar] [CrossRef] [Green Version]
  32. Cáliz Ramos, D.C. Usability Testing Guide for Mobile Applications Focused on People with Down Syndrome (USATESTDOWN); Universidad Politécnica de Madrid: Madrid, Spain, 2017. [Google Scholar] [CrossRef]
  33. Cortes, M.Y.; Guerrero, A.; Zapata, J.V.; Villegas, M.L.; Ruiz, A. Study of the Usability in Applications Used by Children with Down Syndrome. In Proceedings of the 8th Computing Colombian Conference (8CCC 2013), Armenia, Colombia, 21–23 August 2013; IEEE: Piscataway, NJ, USA; pp. 1–6. [Google Scholar] [CrossRef]
  34. Del Rio Guerra, M.; Martin Gutierrez, J.; Aceves, L. Design of an Interactive Gesture Measurement System for down Syndrome People. In Universal Access in Human-Computer Interaction. Methods, Technologies, and Users; Antona, M., Stephanidis, C., Eds.; Springer: Cham, Switzerland, 2018; Volume LNCS10907, pp. 493–502. [Google Scholar] [CrossRef]
  35. Chapman, R.S. Language Development in Children and Adolescents with Down Syndrome. Ment. Retard. Dev. Disabil. Res. Rev. 1997, 3, 307–312. [Google Scholar] [CrossRef]
  36. Feng, J.; Lazar, J.; Kumin, L.; Ozok, A. Computer Usage by Children with Down Syndrome. ACM Trans. Access. Comput. 2010, 2, 1–44. [Google Scholar] [CrossRef]
  37. Williams, K.; Wargowski, D.; Eickhoff, J.; Wald, E. Disparities in Health Supervision for Children with Down Syndrome. Clin. Pediatr. 2017, 56, 1319–1327. [Google Scholar] [CrossRef]
  38. Carr, J. The Development of Intelligence. In Current Approaches to Down’s Syndrome; Lane, D., Stratford, B., Eds.; Cassel Educational Limited: London, UK, 1985. [Google Scholar]
  39. Cohen, W.I. Health Care Guidelines for Individuals with Down Syndrome-1999 Revision. In Down Syndrome; John Wiley & Sons, Inc.: New York, NY, USA, 2003; pp. 237–245. [Google Scholar] [CrossRef]
  40. Abbeduto, L.; Pavetto, M.; Kesin, E.; Weissman, M.D.; Karadottir, S.; O’Brien, A.; Cawthon, S. The Linguistic and Cognitive Profile of Down Syndrome: Evidence from a Comparison with Fragile X Syndrome. Down’s Syndr. Res. Pract. 2001, 7, 9–15. [Google Scholar] [CrossRef] [Green Version]
  41. Kumin, L. Early Communication Skills for Children with Down Syndrome: A Guide for Parents and Professionals; Woodbine House: Bethesda, MD, USA, 2003. [Google Scholar]
  42. Barnhart, R.C.; Connolly, B. Aging and Down Syndrome: Implications for Physical Therapy. Phys. Ther. 2007, 87, 1399–1406. [Google Scholar] [CrossRef] [Green Version]
  43. Sears, A.; Revis, D.; Swatski, J.; Crittenden, R.; Shneiderman, B. Investigating Touchscreen Typing: The Effect of Keyboard Size on Typing Speed. Behav. Inf. Technol. 1993, 12, 17–22. [Google Scholar] [CrossRef] [Green Version]
  44. Colle, H.A.; Hiszem, K.J. Standing at a Kiosk: Effects of Key Size and Spacing on Touch Screen Numeric Keypad Performance and User Preference. Ergonomics 2004, 47, 1406–1423. [Google Scholar] [CrossRef]
  45. Kim, H.; Kwon, S.; Heo, J.; Lee, H.; Chung, M.K. The Effect of Touch-Key Size on the Usability of In-Vehicle Information Systems and Driving Safety during Simulated Driving. Appl. Ergon. 2014, 45, 379–388. [Google Scholar] [CrossRef] [PubMed]
  46. Kim, H.; Song, H. Evaluation of the Safety and Usability of Touch Gestures in Operating In-Vehicle Information Systems with Visual Occlusion. Appl. Ergon. 2014, 45, 789–798. [Google Scholar] [CrossRef]
  47. Garcia-Lopez, E.; Garcia-Cabot, A.; de-Marcos, L. An Experiment with Content Distribution Methods in Touchscreen Mobile Devices. Appl. Ergon. 2015, 50, 79–86. [Google Scholar] [CrossRef]
  48. Rusnák, V.; Appert, C.; Chapuis, O.; Pietriga, E. Designing Coherent Gesture Sets for Multi-Scale Navigation on Tabletops. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems—CHI ’18, Montreal, QC, Canada, 21–26 April 2018; ACM Press: New York, NY, USA, 2018; pp. 1–12. [Google Scholar] [CrossRef] [Green Version]
  49. Mayer, S.; Gad, P.; Wolf, K.; Woźniak, P.W.; Henze, N. Understanding the Ergonomic Constraints in Designing for Touch Surfaces. In Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services—MobileHCI ’17, Vienna, Austria, 4–7 September 2017; ACM Press: New York, NY, USA, 2017; pp. 1–9. [Google Scholar] [CrossRef]
  50. Crescenzi Lanna, L.; Grané Oro, M. Touch Gesture Performed by Children under 3 Years Old When Drawing and Coloring on a Tablet. Int. J. Hum. Comput. Stud. 2019, 124, 1–12. [Google Scholar] [CrossRef]
  51. Dingler, T.; Rzayev, R.; Shirazi, A.S.; Henze, N. Designing Consistent Gestures Across Device Types. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems—CHI ’18, Montreal, QC, Canada, 21–26 April 2018; ACM Press: New York, NY, USA, 2018; pp. 1–12. [Google Scholar] [CrossRef]
  52. Elias, J.G.; Westerman, W.C.; Haggerty, M.M. Multi-Touch Gesture Dictionary. U.S. Patent 20070177804 A1, 2 August 2007. [Google Scholar]
  53. Plichta, J.; Kukulski, T.; Beckert, J. System and Method for Developing and Classifying Touch Gestures. U.S. Patent 8436821 B1, 7 May 2013. [Google Scholar]
  54. Lim, J.H.; Chunik, J.; Kim, D.-H. Analysis on User Variability in Gesture Interaction. In Convergence and Hybrid Information Technology; Lee, G., Howard, D., Kang, J.J., Ślęzak, D., Eds.; Springer: Cham, Switzerland, 2012; Volume LNCS2012, pp. 209–302. [Google Scholar]
  55. Cáliz, D.; Martinez, L.; Alaman, X.; Teran, C.; Caliz, R. Usability Testing Process with People with Down Syndrome Interacting with Mobile Applications: A Literature Review. Int. J. Comput. Sci. Inf. Technol. 2016, 8, 117–131. [Google Scholar] [CrossRef] [Green Version]
  56. Amado Sanchez, V.L.; Islas Cruz, O.I.; Ahumada Solorza, E.A.; Encinas Monroy, I.A.; Caro, K.; Castro, L.A. BeeSmart: A Gesture-Based Videogame to Support Literacy and Eye-Hand Coordination of Children with down Syndrome. In Games and Learning Alliance; Dias, J., Santos, P., Veltkamp, R., Eds.; Springer: Cham, Switzerland, 2017; Volume LNCS10653, pp. 43–53. [Google Scholar] [CrossRef]
  57. Feng, J.; Lazar, J.; Kumin, L.; Ozok, A. Computer Usage by Young Individuals with Down Syndrome: An Exploratory Study. In Proceedings of the 10th International ACM Sigaccess Conference on Computers and Accessibility—Assets ’08, Halifax, NS, Canada, 13–15 October 2008; p. 35. [Google Scholar] [CrossRef]
  58. Durango, I.; Carrascosa, A.; Gallud, J.A.; Penichet, V.M.R. Using Serious Games to Improve Therapeutic Goals in Children with Special Needs. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services, Copenhagen, Denmark, 24–27 August 2015; ACM Press: New York, NY, USA, 2015; pp. 743–749. [Google Scholar] [CrossRef]
  59. Macedo, I.; Trevisan, D.G.; Vasconcelos, C.N.; Clua, E. Observed Interaction in Games for Down Syndrome Children. In Proceedings of the 48th Hawaii International Conference on System Sciences, Kauai, HI, USA, 5–8 January 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 662–671. [Google Scholar] [CrossRef]
  60. Torres-Carrión, P.; González-González, C.; Carreño, A.M. Methodology of Emotional Evaluation in Education and Rehabilitation Activities for People with Down Syndrome. In Proceedings of the XV International Conference on Human Computer Interaction, Puerto de la Cruz, Tenerife, Spain, 10–12 September 2014; ACM Press: New York, NY, USA, 2014; pp. 1–4. [Google Scholar] [CrossRef]
  61. Mendoza, A.; Alvarez, J.; Mendoza, R.; Acosta, F.; Muñoz, J. Analyzing Learnability of Common Mobile Gestures Used by Down Syndrome Users. In Proceedings of the XVI International Conference on Human Computer Interaction, Barcelona, Spain, 7–9 September 2015; ACM Press: New York, NY, USA, 2015; pp. 1–8. [Google Scholar] [CrossRef]
  62. Kava, M.P.; Tullu, M.S.; Muranjan, M.N.; Girisha, K. Down Syndrome: Clinical Profile from India. Arch. Med. Res. 2004, 35, 31–35. [Google Scholar] [CrossRef]
  63. Kim, H.I.; Kim, S.W.; Kim, J.; Jeon, H.R.; Jung, D.W. Motor and Cognitive Developmental Profiles in Children with Down Syndrome. Ann. Rehabil. Med. 2017, 41, 97–103. [Google Scholar] [CrossRef] [Green Version]
  64. Malak, R.; Kotwicka, M.; Krawczyk-Wasielewska, A.; Mojs, E.; Samborski, W. Motor Skills, Cognitive Development and Balance Functions of Children with Down Syndrome. Ann. Agric. Environ. Med. 2013, 20, 803–806. [Google Scholar]
  65. Manassero Morales, G. Guía de Práctica Clínica Del Síndrome Down. Revista de la Facultad de Medicina Humana 2016, 16. [Google Scholar] [CrossRef]
  66. Pérez Chavéz, D.A. Síndrome de Down. Rev. Act. Clin. Med. 2014, 45, 2357–2361. [Google Scholar]
  67. Nelson, R.M.; Botkin, J.R.; Levetown, M.; Moseley, K.L.; Truman, J.T.; Wilfond, B.S.; Kazura, A.; Bowes, J.; Krug, E.; Caniano, D.A.; et al. Sterilization of Minors with Developmental Disabilities. Pediatrics 1999, 337–340. [Google Scholar] [CrossRef] [Green Version]
  68. Troncoso, M.V.; del Cerro, M.M. Síndrome de Down: Lectura y Escritura; Masson SA & Fundación Sindrome de Down: Barcelona, Spain, 2005. [Google Scholar]
  69. National Down Syndrome Society. What Is Down Syndrome? Available online: https://www.ndss.org/about-down-syndrome/down-syndrome/ (accessed on 5 March 2019).
  70. Kumin, L.; Lazar, J.; Feng, J.H.; Wentz, B.; Ekedebe, N. A Usability Evaluation of Workplace-Related Tasks on a Multi-Touch Tablet Computer by Adults with Down Syndrome. J. Usability Stud. 2012, 7, 118–142. [Google Scholar]
  71. van Solingen, R.; Basili, V.; Caldiera, G.; Rombach, H.D. Goal Question Metric (GQM) Approach. In Encyclopedia of Software Engineering; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2002. [Google Scholar] [CrossRef]
  72. Peng, P.; Wang, T.; Wang, C.C.; Lin, X. A Meta-Analysis on the Relation between Fluid Intelligence and Reading/ Mathematics: Effects of Tasks, Age, and Social Economics Status. Psychol. Bull. 2019, 145, 189–236. [Google Scholar] [CrossRef]
  73. Fitts, P.M. The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement. J. Exp. Psychol. 1954, 47. [Google Scholar] [CrossRef] [Green Version]
  74. del C. Sierra Romero, M.; Navarrete Hernández, E.; Canún Serrano, S.; Reyes Pablo, A.E.; Valdés Hernández, J. Prevalence of Down Syndrome Using Certificates of Live Births and Fetal Deaths in México 2008–2011. Bol. Med. Hosp. Infant. Mex. 2014, 71, 292–297. [Google Scholar] [CrossRef] [Green Version]
  75. Nielsen, J. Estimating the Number of Subjects Needed for a Thinking Aloud Test. Int. J. Hum. Comput. Stud. 1994, 41, 385–397. [Google Scholar] [CrossRef]
  76. Hwang, W.; Salvendy, G. Number of People Required for Usability Evaluation. Commun. ACM 2010, 53, 130. [Google Scholar] [CrossRef]
  77. Djamasbi, S.; Tullis, T.; Hsu, J.; Mazuera, E.; Osberg, K.; Bosch, J. Gender Preferences in Web Design: Usability Testing through Eye Tracking. In Proceedings of the 13th Americas Conference on Information Systems, Keystone, CO, USA, 9–12 August 2007; AIS Electronic Library (AISeL): Atlanta, GA, USA, 2007; pp. 1–8. [Google Scholar]
  78. Karmiloff-Smith, A.; Al-Janabi, T.; D’Souza, H.; Groet, J.; Massand, E.; Mok, K.; Startin, C.; Fisher, E.; Hardy, J.; Nizetic, D.; et al. The Importance of Understanding Individual Differences in Down Syndrome. F1000Research 2016, 5. [Google Scholar] [CrossRef] [PubMed]
  79. Russell, D.C.; van Heerden, R.; van Vuuren, S.; Venter, A.; Joubert, G. The Impact of the “Developmental Resource Stimulation Programme” (DRSP) on Children with Down Syndrome. S. Afr. J. Occup. 2016, 46, 33–40. [Google Scholar] [CrossRef] [Green Version]
  80. Stagni, F.; Giacomini, A.; Guidi, S.; Ciani, E.; Bartesaghi, R. Timing of Therapies for Down Syndrome: The Sooner, the Better. Front. Behav. Neurosci. 2015, 9, 265. [Google Scholar] [CrossRef] [Green Version]
Figure 1. (a) Architecture of DS-Touch Software. (b) Interface software.
Figure 2. Participants (left and right) interacting with the touchscreen (note the moderator showing participants how to perform the gesture and finger movements).
Figure 3. Success and failure cases by gesture.
Figure 4. Success rate and failure rate by gender.
Figure 5. Success rate and failure rate by type of Down syndrome.
Figure 6. Success rate and failure rate by socioeconomic status.
Figure 7. Mean completion times for each gesture by gender.
Figure 8. Mean completion times for each gesture by type of Down syndrome.
Figure 9. Mean completion times for each gesture by socioeconomic status.
Table 1. Distribution of participants by gender, socioeconomic status and type of Down syndrome (DS): Trisomy 21-T21; Translocation-TR; Mosaicism-MOS.
Gender | No. | High | Med. | Low | T21 | TR | MOS
Male | 13 | 3 | 7 | 3 | 6 | 4 | 3
Female | 5 | 2 | 2 | 1 | 2 | 2 | 1
Total | 18 | 5 | 9 | 4 | 8 | 6 | 4
Table 2. One-handed finger gestures: tasks used to elicit and measure gestures.
Goal | Gesture | Figure | Task | Task Instructions
Select object | 1. Tap | Sensors 21 01328 i001 | Sensors 21 01328 i002 | Using one finger, tap the mole with your finger.
Select object | 2. Double tap | Sensors 21 01328 i003 | Sensors 21 01328 i004 | Using one finger, double tap the bomb to stop it exploding.
Select object | 3. Touch and hold | Sensors 21 01328 i005 | Sensors 21 01328 i006 | Using one finger, stop the sun until you hear the beep.
Rotate object | 4. Rotate | Sensors 21 01328 i007 | Sensors 21 01328 i008 | Using two fingers on the same hand, rotate the star until it covers its shadow.
Scale object | 5. Pinch | Sensors 21 01328 i009 | Sensors 21 01328 i010 | Using two fingers from the same hand, reunite the baby with its mother.
Scale object | 6. Stretch | Sensors 21 01328 i011 | Sensors 21 01328 i012 | Using two fingers from the same hand, stop the cat and dog from fighting.
Move object | 7. Touch and slide | Sensors 21 01328 i013 | Sensors 21 01328 i014 | Using one finger, drag the bee to the flower.
Move object | 8. Slide | Sensors 21 01328 i015 | Sensors 21 01328 i016 | Using one finger, move the football into the goal.
Table 3. Two-handed touch gestures using fingers: tasks used to elicit and measure gestures.
Goal | Gesture | Figure | Task | Task Instructions
Rotate object | 9. Rotate | Sensors 21 01328 i017 | Sensors 21 01328 i018 | Using one finger from each hand, rotate the star until it covers its shadow.
Move object | 10. Press and drag | Sensors 21 01328 i019 | Sensors 21 01328 i020 | Using one finger from your left hand, stop the wolf. Using one finger from your right hand, help Little Red Riding Hood get home safely.
Move object | 11. Separate | Sensors 21 01328 i021 | Sensors 21 01328 i022 | Using one finger from each hand, stop the cat and dog from fighting.
Move object | 12. Join | Sensors 21 01328 i023 | Sensors 21 01328 i024 | Using one finger from each hand, reunite the baby with its mother.
Table 4. One-handed touch gestures using the whole hand: tasks used to elicit and measure gestures.
GoalGestureFigureTasksTask Instructions
Scale object13. Close fist Sensors 21 01328 i025 Sensors 21 01328 i026Using one hand, make a paper ball from the sheet of paper.
Scale object14. Spread fingers Sensors 21 01328 i027 Sensors 21 01328 i028Using all five fingers from one hand, make the starfish bigger.
Select object15. Stop with hand Sensors 21 01328 i029 Sensors 21 01328 i030Using one hand, stop the butterfly.
Move object16. Move with hand Sensors 21 01328 i031 Sensors 21 01328 i032Using one hand, drag the mouse to the cheese.
Rotate object17. Rotate Sensors 21 01328 i033 Sensors 21 01328 i034Using one hand, wave goodbye.
Table 5. Two-handed touch gestures using the whole hand: tasks used to elicit and measure gestures.

| Goal | Gesture | Task instructions |
| --- | --- | --- |
| Move object | 18. Spread using both hands | Using both hands, unroll the treasure map. |
| Move object | 19. Join using both hands | Using both hands, join the two piles of coins. |
| Rotate object | 20. Rotate using both hands | Using both hands, rotate the book so the girl can read it. |
Table 6. Comparison of gestures (fingers, one hand) by success, using Pearson's chi-squared test of independence χ² (DoF = 1, N = 18). Bonferroni-corrected significance: * p < 0.05/28 = 0.00178; ** p < 0.001.

| Success // Completion time | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1. Tap | - | 0.000 ** | 0.273 | 0.000 ** | 0.000 ** | 0.000 ** | 0.000 ** | 0.573 |
| 2. Double tap | 0.000 | - | 0.000 ** | 0.001 * | 0.060 | 0.000 ** | 0.000 ** | 0.000 ** |
| 3. Touch and hold | 0.000 | 0.000 | - | 0.000 ** | 0.000 ** | 0.019 | 0.000 ** | 0.085 |
| 4. Rotate | 0.000 | 0.000 | 0.000 | - | 0.000 * | 0.000 ** | 0.000 ** | 0.000 ** |
| 5. Pinch | 0.004 | 0.000 | 0.000 | 0.000 | - | 0.000 ** | 0.050 | 0.000 ** |
| 6. Stretch | 0.001 | 0.000 | 0.000 | 0.000 | 0.405 | - | 0.002 | 0.000 ** |
| 7. Touch and slide | 0.000 | 0.000 | 0.000 | 0.000 | 0.004 | 0.056 | - | 0.000 ** |
| 8. Slide | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | - |

(* p-value < 0.00178; ** p-value < 0.001).
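As an illustration of the procedure behind these pairwise comparisons, the sketch below computes a Pearson chi-squared statistic for a 2 × 2 success/failure table (DoF = 1, as in Tables 6–9) and the Bonferroni-corrected threshold for the 28 pairwise comparisons among the eight one-handed finger gestures. The success counts used here are hypothetical, not the study's data.

```python
# Sketch of one pairwise comparison: Pearson chi-squared on a 2x2
# success/failure table, judged against a Bonferroni-corrected alpha.
# The counts below are hypothetical, not taken from the study.

def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared for the 2x2 table [[a, b], [c, d]] (DoF = 1)."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        return 0.0
    return n * (a * d - b * c) ** 2 / denom

# 8 one-handed finger gestures -> C(8, 2) = 28 pairwise comparisons,
# so each comparison is tested at 0.05 / 28.
n_gestures = 8
n_pairs = n_gestures * (n_gestures - 1) // 2
alpha_bonferroni = 0.05 / n_pairs  # ~0.00178, as in the Table 6 caption

# Hypothetical counts: gesture A succeeded 16/18 times, gesture B 7/18.
chi2 = chi_squared_2x2(16, 2, 7, 11)
print(f"chi2 = {chi2:.3f}, corrected alpha = {alpha_bonferroni:.5f}")
```

A comparison is flagged significant only when its p-value falls below the corrected alpha, which is what keeps the family-wise error rate at 0.05 across all 28 tests.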
Table 7. Comparison of gestures (fingers, two hands) by success, using Pearson's chi-squared test of independence χ² (DoF = 1, N = 18). Bonferroni-corrected significance: * p < 0.05/6 = 0.008; ** p < 0.001.

| Success // Completion time | 9 | 10 | 11 | 12 |
| --- | --- | --- | --- | --- |
| 9. Rotate—2 hands | - | 0.000 ** | 0.001 * | 0.420 |
| 10. Press and drag—2 hands | 0.000 | - | 0.000 ** | 0.000 ** |
| 11. Separate—2 hands | 0.000 | 0.000 | - | 0.005 * |
| 12. Join—2 hands | 0.000 | 0.000 | 0.313 | - |

(* p-value < 0.008; ** p-value < 0.001).
Table 8. Comparison of gestures (one hand, whole) by success, using Pearson's chi-squared test of independence χ² (DoF = 1, N = 18). Bonferroni-corrected significance: * p < 0.05/10 = 0.005; ** p < 0.001.

| Success // Completion time | 13 | 14 | 15 | 16 | 17 |
| --- | --- | --- | --- | --- | --- |
| 13. Close fist | - | 0.258 | 0.164 | 0.000 ** | 0.002 * |
| 14. Spread fingers | 0.546 | - | 0.019 | 0.000 ** | 0.038 |
| 15. Stop with hand | 0.000 | | - | 0.000 ** | 0.000 ** |
| 16. Move with hand | 0.000 | 0.000 | 0.000 | - | 0.168 |
| 17. Rotate with hand | 0.000 | 0.000 | 0.000 | 0.523 | - |

(* p-value < 0.005; ** p-value < 0.001).
Table 9. Comparison of gestures (two hands, whole) by success, using Pearson's chi-squared test of independence χ² (DoF = 1, N = 18). Bonferroni-corrected significance: * p < 0.05/3 = 0.017; ** p < 0.001.

| Success // Completion time | 18 | 19 | 20 |
| --- | --- | --- | --- |
| 18. Spread using both hands | - | 0.000 ** | 0.003 * |
| 19. Join using both hands | 0.000 | - | 0.000 ** |
| 20. Rotate using both hands | 0.000 | 0.004 | - |

(* p-value < 0.017; ** p-value < 0.001).
Table 10. Pearson's chi-squared test statistics from the success analysis.

| ID Gest | Gender | | | Socioeconomic Status | | | Type of Down Syndrome | | | Gen-ES | | Gen-TD | | ES-TD | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | DoF | χ² | p-Val | DoF | χ² | p-Val | DoF | χ² | p-Val | χ² | p-Val | χ² | p-Val | χ² | p-Val |
| 1 | 1 | 2.933 | 0.087 | 2 | 2.652 | 0.266 | 2 | 18.062 | 0.000 | 29.515 | 0.000 | 3.459 | 0.177 | 93.554 | 0.000 |
| 2 | 1 | 2.317 | 0.128 | 2 | 1.798 | 0.407 | 2 | 14.719 | 0.001 | 57.711 | 0.000 | 10.163 | 0.006 | 192.63 | 0.000 |
| 3 | 1 | 0.007 | 0.933 | 2 | 0.570 | 0.752 | 2 | 12.373 | 0.002 | 18.452 | 0.000 | 0.618 | 0.734 | 42.078 | 0.000 |
| 4 | 1 | 0.003 | 0.958 | 2 | 0.399 | 0.819 | 2 | 15.897 | 0.000 | 17.329 | 0.000 | 38.558 | 0.000 | 84.117 | 0.000 |
| 5 | 1 | 0.613 | 0.434 | 2 | 39.640 | 0.000 | 2 | 12.108 | 0.002 | 3.443 | 0.179 | 5.970 | 0.051 | 115.628 | 0.000 |
| 6 | 1 | 0.163 | 0.686 | 2 | 20.493 | 0.000 | 2 | 11.774 | 0.003 | 8.517 | 0.014 | 0.659 | 0.719 | 99.937 | 0.000 |
| 7 | 1 | 0.073 | 0.787 | 2 | 1.597 | 0.450 | 2 | 1.246 | 0.536 | 5.166 | 0.076 | 36.767 | 0.000 | 94.177 | 0.000 |
| 8 | 1 | 0.913 | 0.339 | 2 | 10.478 | 0.005 | 2 | 5.412 | 0.067 | 20.469 | 0.000 | 4.842 | 0.089 | 191.607 | 0.000 |
| 9 | 1 | 1.775 | 0.183 | 2 | 0.923 | 0.630 | 2 | 27.175 | 0.000 | 12.767 | 0.002 | 0.696 | 0.706 | 60.416 | 0.000 |
| 10 | 1 | 0.853 | 0.356 | 2 | 0.557 | 0.757 | 2 | 11.424 | 0.003 | 14.408 | 0.000 | 10.157 | 0.006 | 246.112 | 0.000 |
| 11 | 1 | 7.937 | 0.005 | 2 | 18.672 | 0.000 | 2 | 14.263 | 0.001 | 19.255 | 0.000 | 0.296 | 0.862 | 127.597 | 0.000 |
| 12 | 1 | 5.321 | 0.021 | 2 | 5.081 | 0.079 | 2 | 2.332 | 0.312 | 33.630 | 0.000 | 5.039 | 0.081 | 93.451 | 0.000 |
| 13 | 1 | 4.611 | 0.032 | 2 | 2.109 | 0.348 | 2 | 0.363 | 0.834 | 8.962 | 0.011 | 18.336 | 0.000 | 108.402 | 0.000 |
| 14 | 1 | 0.684 | 0.408 | 2 | 0.930 | 0.628 | 2 | 6.440 | 0.040 | 39.687 | 0.000 | 11.292 | 0.004 | 105.358 | 0.000 |
| 15 | 1 | 2.782 | 0.095 | 2 | 4.701 | 0.095 | 2 | 5.024 | 0.081 | 4.611 | 0.000 | 9.294 | 0.010 | 66.710 | 0.000 |
| 16 | 1 | 0.031 | 0.860 | 2 | 7.951 | 0.019 | 2 | 4.211 | 0.122 | 22.109 | 0.000 | 26.185 | 0.000 | 162.229 | 0.000 |
| 17 | 1 | 0.396 | 0.529 | 2 | 1.839 | 0.399 | 2 | 6.404 | 0.041 | 42.900 | 0.000 | 1.160 | 0.560 | 58.761 | 0.000 |
| 18 | 1 | 18.346 | 0.000 | 2 | 5.078 | 0.079 | 2 | 8.845 | 0.012 | 42.405 | 0.000 | 15.198 | 0.001 | 62.605 | 0.000 |
| 19 | 1 | 0.025 | 0.874 | 2 | 2.826 | 0.243 | 2 | 15.956 | 0.000 | 18.357 | 0.000 | 0.149 | 0.923 | 24.626 | 0.000 |
| 20 | 1 | 0.015 | 0.902 | 2 | 1.705 | 0.426 | 2 | 6.483 | 0.039 | 46.578 | 0.000 | 18.119 | 0.000 | 55.261 | 0.000 |

p-values < 0.05 indicate that the gesture shows a significant difference for the column variable.
Table 11. F-test statistics from the completion time analysis.

| ID Gest | Gender | | | Socioeconomic Status | | | Type of Down Syndrome | | Gen-ES | | | Gen-TD | | ES-TD | |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | DoF | F | p-Val | DoF | F | p-Val | F | p-Val | DoF | F | p-Val | F | p-Val | F | p-Val |
| 1 | 1 | 0.042 | 0.838 | 2 | 2.525 | 0.083 | 2.899 | 0.058 | 4 | 0.427 | 0.514 | 0.787 | 0.375 | 8.620 | 0.004 |
| 2 | 1 | 0.001 | 0.982 | 2 | 8.379 | 0.000 | 9.238 | 0.000 | 4 | 0.687 | 0.408 | 0.486 | 0.610 | 70.597 | 0.000 |
| 3 | 1 | 1.389 | 0.241 | 2 | 0.605 | 0.548 | 16.288 | 0.000 | 4 | 4.743 | 0.032 | 1.341 | 0.248 | 0.442 | 0.508 |
| 4 | 1 | 0.030 | 0.863 | 2 | 0.127 | 0.000 | 0.924 | 0.398 | 4 | 0.034 | 0.854 | 0.160 | 0.689 | 9.081 | 0.003 |
| 5 | 1 | 1.137 | 0.287 | 2 | 14.003 | 0.000 | 2.190 | 0.114 | 4 | 0.162 | 0.688 | 12.784 | 0.000 | 0.074 | 0.785 |
| 6 | 1 | 2.586 | 0.109 | 2 | 3.774 | 0.024 | 1.009 | 0.366 | 4 | 2.014 | 0.157 | 5.250 | 0.023 | 0.354 | 0.552 |
| 7 | 1 | 3.244 | 0.073 | 2 | 3.089 | 0.047 | 0.415 | 0.661 | 4 | 0.090 | 0.765 | 0.048 | 0.826 | 6.932 | 0.009 |
| 8 | 1 | 12.074 | 0.001 | 2 | 3.654 | 0.027 | 52.245 | 0.000 | 4 | 12.047 | 0.001 | 31.646 | 0.000 | 2.137 | 0.144 |
| 9 | 1 | 0.123 | 0.726 | 2 | 0.491 | 0.613 | 1.702 | 0.186 | 4 | 0.190 | 0.827 | 0.984 | 0.376 | 0.277 | 0.758 |
| 10 | 1 | 0.023 | 0.879 | 2 | 0.016 | 0.985 | 5.184 | 0.006 | 4 | 3.293 | 0.070 | 1.688 | 0.194 | 3.259 | 0.072 |
| 11 | 1 | 14.607 | 0.000 | 2 | 9.952 | 0.000 | 1.947 | 0.145 | 4 | 0.788 | 0.376 | 5.217 | 0.023 | 22.004 | 0.000 |
| 12 | 1 | 1.185 | 0.277 | 2 | 4.954 | 0.008 | 1.360 | 0.259 | 4 | 0.300 | 0.584 | 2.809 | 0.095 | 8.814 | 0.003 |
| 13 | 1 | 0.120 | 0.729 | 2 | 8.540 | 0.000 | 8.365 | 0.000 | 4 | 0.716 | 0.398 | 0.449 | 0.503 | 6.109 | 0.003 |
| 14 | 1 | 19.335 | 0.000 | 2 | 17.607 | 0.000 | 16.403 | 0.000 | 4 | 0.387 | 0.534 | 2.994 | 0.085 | 8.435 | 0.000 |
| 15 | 1 | 0.013 | 0.911 | 2 | 9.213 | 0.000 | 11.690 | 0.000 | 4 | 4.638 | 0.033 | 0.047 | 0.834 | 1.024 | 0.313 |
| 16 | 1 | 1.009 | 0.316 | 2 | 16.019 | 0.000 | 14.374 | 0.000 | 4 | 1.346 | 0.247 | 0.766 | 0.382 | 6.255 | 0.002 |
| 17 | 1 | 1.052 | 0.306 | 2 | 0.099 | 0.905 | 3.898 | 0.022 | 4 | 18.170 | 0.000 | 0.469 | 0.494 | 0.674 | 0.511 |
| 18 | 1 | 3.588 | 0.060 | 2 | 5.835 | 0.003 | 15.691 | 0.000 | 4 | 0.049 | 0.826 | 0.615 | 0.559 | 5.110 | 0.025 |
| 19 | 1 | 10.940 | 0.001 | 2 | 13.300 | 0.000 | 10.328 | 0.000 | 4 | 0.012 | 0.901 | 1.945 | 0.143 | 0.284 | 0.743 |
| 20 | 1 | 0.042 | 0.837 | 2 | 1.079 | 0.342 | 0.204 | 0.815 | 4 | 0.780 | 0.378 | 1.191 | 0.288 | 7.674 | 0.006 |

p-values < 0.05 indicate that the gesture shows a significant difference for the column variable.
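The F statistics in Table 11 compare mean completion times across the levels of each factor. As a minimal sketch of how such a one-way F statistic is formed, the code below computes the ratio of between-group to within-group mean squares for three groups of completion times; the group labels and timing values are hypothetical, not the study's data.

```python
# Sketch of a one-way F statistic over group completion times:
# F = (between-group mean square) / (within-group mean square).
# All data below are hypothetical illustrations.

def one_way_f(groups):
    """One-way F statistic for a list of numeric groups."""
    k = len(groups)                     # number of groups
    n = sum(len(g) for g in groups)     # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )
    ms_between = ss_between / (k - 1)   # DoF between = k - 1
    ms_within = ss_within / (n - k)     # DoF within  = n - k
    return ms_between / ms_within

# Hypothetical completion times (seconds) for one gesture, split by
# type of Down syndrome (trisomy 21, translocation, mosaicism).
f_stat = one_way_f([
    [1.2, 1.5, 1.1, 1.4],
    [2.1, 2.4, 2.0],
    [1.0, 0.9, 1.1],
])
print(f"F = {f_stat:.3f}")
```

A large F indicates that the variation between group means dominates the variation inside the groups, which is what drives the small p-values reported for Type of Down Syndrome in Table 11.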
Table 12. Summary of accepted/rejected hypotheses.

| Hypothesis | Conclusion |
| --- | --- |
| HR1. Touchscreen gestures pose different levels of difficulty for individuals living with Down syndrome. Ho1: The gestures analysed have the same degree of difficulty. | Ho1 is rejected. The statement of HR1 is accepted. |
| HR2. The gesture used has an effect on completion time. Ho2: The gesture used has no effect on completion time. | Ho2 is rejected. The statement of HR2 is accepted. |
| HR3. Gender has an effect on the success rate of a gesture. Ho3: The gender of a person with DS does not influence the success rate of a gesture. | Ho3 is accepted. The statement of HR3 is rejected. |
| HR4. Gender has an effect on the completion time of a gesture. Ho4: The gender of a person with DS does not influence the completion time of a gesture. | Ho4 is accepted. The statement of HR4 is rejected. |
| HR5. The type of Down syndrome has an effect on the success rate of a gesture. Ho5: The type of Down syndrome does not influence the success rate of a gesture. | Ho5 is rejected. The statement of HR5 is accepted. |
| HR6. Socioeconomic status has an effect on the success rate of a gesture. Ho6: Socioeconomic status does not influence the success rate of a gesture. | Ho6 is accepted. The statement of HR6 is rejected. |
| HR7. The type of Down syndrome has an effect on the completion time of a gesture. Ho7: The completion time of a gesture is not influenced by the type of Down syndrome. | Ho7 is rejected. The statement of HR7 is accepted. |
| HR8. Socioeconomic status has an effect on the completion time of a gesture. Ho8: Socioeconomic status does not influence gesture completion time. | Ho8 is accepted. The statement of HR8 is rejected. |