Review

Can We Rely on Mobile Devices and Other Gadgets to Assess the Postural Balance of Healthy Individuals? A Systematic Review

by Alexandre S. Pinho 1,2, Ana P. Salazar 1,3, Ewald M. Hennig 4, Barbara C. Spessato 1,3, Antoinette Domingo 5 and Aline S. Pagnussat 1,2,3,*
1 Movement Analysis and Rehabilitation Laboratory, Universidade Federal de Ciências da Saúde de Porto Alegre (UFCSPA), Porto Alegre, RS 90050-170, Brazil
2 Health Sciences Graduate Program, Universidade Federal de Ciências da Saúde de Porto Alegre (UFCSPA), Porto Alegre, RS 90050-170, Brazil
3 Rehabilitation Sciences Graduate Program, Universidade Federal de Ciências da Saúde de Porto Alegre (UFCSPA), Porto Alegre, RS 90050-170, Brazil
4 Institute of Health & Biomedical Innovation (IHBI), Queensland University of Technology (QUT), Kelvin Grove, Brisbane QLD 4059, Australia
5 School of Exercise and Nutritional Sciences, San Diego State University, San Diego, CA 92182-7251, USA
* Author to whom correspondence should be addressed.
Sensors 2019, 19(13), 2972; https://doi.org/10.3390/s19132972
Submission received: 22 May 2019 / Revised: 29 June 2019 / Accepted: 2 July 2019 / Published: 5 July 2019
(This article belongs to the Special Issue Wearable Motion Sensors Applied in Older Adults)

Abstract

The consequences of falls, the costs, and the complexity of conventional evaluation protocols have motivated researchers to develop more effective balance assessment tools. Healthcare practitioners are incorporating mobile phones and other gadgets (smartphones and tablets) to make balance evaluations more accessible, with reasonable sensitivity and a good cost–benefit ratio. The prospects are evident, as is the need to identify the weaknesses and highlight the strengths of the different approaches. To verify whether mobile devices and other gadgets are able to assess balance, four electronic databases were searched from their inception to February 2019. Studies reporting the use of inertial sensors in mobile devices and other gadgets to assess balance in healthy adults, compared with other evaluation methods, were included. The quality of the nine selected studies was assessed and the protocols currently in use were summarized. Most studies did not provide enough information about their assessment protocols, limiting the reproducibility and the reliability of the results. The data gathered from the studies did not allow us to conclude whether mobile devices and other gadgets have discriminatory power (accuracy) to assess postural balance. Although the approach is promising, the overall quality of the available studies is low to moderate.

1. Introduction

According to the World Health Organization, falls are the second leading cause of death from accidental or unintentional injuries worldwide [1]. The consequences of falls, especially in the elderly population, have drawn attention to the development of fall prevention strategies, focusing on training protocols and more effective and precise balance assessments [1,2,3,4].
A wide variety of mathematical models, evaluation protocols, and instruments have been proposed for quantitative measurements of balance. The cost and complexity of devices for quantitative data make the assessment and interpretation of results challenging and restricted to academic research or expensive private services [5,6]. Recently, healthcare practitioners have been incorporating less expensive sensors for balance assessment. Mobile phones and other gadgets (smartphones and tablets) have been used because they have embedded triaxial accelerometers and gyroscopes, which turn them into wireless inertial measurement units (IMUs). Although these devices have dramatically improved in real-time processing speed and accuracy [7,8,9], there are still challenges to overcome. For instance, it is not well understood how these data can best be interpreted and applied to clinical practice [10,11].
Mobile sensors and processing apps (applications/software) are novel technologies used to make balance evaluations more accessible, with reasonable sensitivity and a good cost–benefit ratio [12,13,14,15,16,17,18]. This technology allows balance to be assessed through the resultant acceleration, calculated with a simplified approach in which the position of the center of mass (COM) is described by an arbitrary (not estimated) single point where the sensor is positioned [19,20]. Some advantages of using the inertial sensors of smartphones or tablets to assess balance are: (1) the equipment is affordable and accessible, (2) it allows real-time evaluations, (3) protocols can be self-administered, (4) feedback is quick and reliable, (5) apps and chart reports are user-friendly, (6) results are easy to disseminate, improving the link between patient, healthcare professionals, and family, and (7) follow-up is easy to understand and monitor [9].
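As a rough illustration of the simplified single-point approach described above, the resultant horizontal acceleration recorded by a device held or worn near the trunk can serve as a sway proxy. The sketch below is hypothetical and is not the algorithm of any of the reviewed apps; it assumes a NumPy array of triaxial accelerometer samples roughly aligned with the body axes.

```python
import numpy as np

def resultant_sway_acceleration(acc_xyz):
    """Return the resultant horizontal (ML/AP) acceleration magnitude.

    acc_xyz: (n_samples, 3) array in m/s^2, columns assumed to be
    [ML, AP, vertical] with the device roughly aligned to the body axes
    (an assumption, not a guarantee for a handheld phone).
    """
    # Remove the mean of each axis so gravity and static bias drop out.
    centered = acc_xyz - acc_xyz.mean(axis=0)
    # Combine the two horizontal axes into a single resultant magnitude.
    return np.sqrt(centered[:, 0] ** 2 + centered[:, 1] ** 2)
```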
As a disadvantage, we can point out the nature of the Micro-Electro-Mechanical Systems (MEMS) sensing technology embedded in these devices, which introduces intrinsic errors into data acquisition, both deterministic (e.g., bias and scale-factor errors) and stochastic, the latter dominated by "white noise". These problems may be overcome, in theory, by carefully analyzing the data and applying specific filters and proper calibration [21]. Smartphone ownership is on the rise in emerging economies, but cost is still an issue. The global median ownership rate is 59%, but it can be as high as 94% in South Korea, 83% in Israel, and 82% in Australia. On the other hand, this rate is reported to be less than 50% in 12 of the 22 countries surveyed by the Pew Research Center (2018), or even lower in poorer countries [22].
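In practice, the "careful data analysis, filtering, and calibration" mentioned above often amounts to subtracting a static bias estimated during a quiet period and low-pass filtering the signal, since postural sway energy is concentrated well below a few hertz. Below is a minimal SciPy sketch under those assumptions; the 5 Hz cutoff, filter order, and two-second calibration window are illustrative choices, not values taken from the reviewed studies.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def calibrate_and_filter(acc, fs=100.0, cutoff=5.0, still_seconds=2.0):
    """Subtract a static bias and low-pass filter accelerometer data.

    acc: (n_samples, 3) raw accelerations; fs: sampling rate in Hz.
    The first `still_seconds` are assumed to be quiet standing and are
    used as the calibration window (an illustrative assumption).
    """
    n_still = int(still_seconds * fs)
    bias = acc[:n_still].mean(axis=0)          # static offset per axis
    b, a = butter(4, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, acc - bias, axis=0)  # zero-phase low-pass filter
```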
Even though the use of mobile phones and other gadgets with built-in sensors has not been fully validated, the prospects are evident, as is the need to question the weaknesses and strengths of the different approaches [4,8,9]. The primary aim of this study was to verify whether mobile devices and other gadgets are able to assess balance. The secondary aims were to review the current protocols used to assess balance with consumer-level mobile devices (mobile phones and tablets) and to summarize: (a) the parameters used to define balance, (b) the main characteristics and technical specifications of devices and sensors, and (c) the mathematical models and algorithms used to process data. Additionally, we examined the potentialities and limitations of the protocols to guide readers toward the most reliable and convenient method of accelerometer-based balance assessment.

2. Materials and Methods

This systematic review was reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and Cochrane guidelines [23,24]. The protocol was registered in the International Prospective Register of Systematic Reviews (PROSPERO, CRD42018103481).

2.1. Eligibility and Inclusion Criteria

We included only articles that compared general balance evaluation methods with the use of mobile devices (smartphones and tablets) as inertial sensors. Regardless of the methods for blinding and randomization, all study designs were included if they assessed standing balance in healthy adults and had been published up to 2019.

2.2. Search Strategy

A systematic search was conducted (from inception to February 2019) in the following databases: PubMed, EMBASE, Scopus, and Cochrane Central. The search strategy included terms such as ‘accelerometry,’ ‘accelerometer,’ ‘gyroscope,’ ‘body wear sensors,’ ‘wearable sensors,’ ‘inertial sensors,’ ‘IMU,’ ‘inertial measurement units,’ ‘mobile application,’ ‘mobile app,’ ‘mobile device,’ and ‘smartphone app,’ combined with words related to ‘postural balance,’ ‘sway,’ or ‘postural control.’ The search was limited to papers written in English, Spanish, and Portuguese, with no date restriction. The complete search strategy is presented in Appendix A (available as supplementary material online).

2.3. Data Extraction, Risk of Bias and Quality Assessment

Two reviewers (ASP and APS) independently screened the studies by title and abstract and deleted duplicates based on the inclusion criteria. After this step, the same reviewers assessed the full texts separately. Authors were contacted by email when data were not available. If the two reviewers did not reach a consensus at any phase of the selection (including the screening for the quality assessment), a third reviewer (BCS) made the final decision. All the reviewers have broad experience in the research field. EndNote™ X7 software (Clarivate Analytics US LLC, Philadelphia, PA, USA) was used to manage and search for articles. During the screening step, the selection was blinded, and there was no disagreement between ASP and APS. The data extracted from the included studies were: type of study, number of participants, type and location of the wearable sensor, acquisition time, general conditions of the assessments, and primary outcomes of each study.
Methodological quality assessments were performed for all studies using the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies from the National Heart, Lung, and Blood Institute (NHLBI) of the United States National Institutes of Health (NIH) [25]. This fourteen-criterion tool is designed for critical appraisal and considers the risk of potential selection bias; it measures the ability of a study to draw associative conclusions about the effects of the exposures being studied on outcomes. Quality was expressed as a percentage of the total possible score, with a maximum of two points for each criterion (“Yes” = 2, “Cannot determine” = 1, “No” = 0). The studies were classified as “high quality” (>75%), “moderate quality” (>50% to 75%), “low quality” (25% to 50%), or “very low quality” (<25%). Considering the characteristics of the papers included in this review, only 12 items were evaluated, with a maximum of 24 points (available as supplementary material online, Appendix B).
Due to the lack of a specific tool to assess the consistency of the reported balance protocols, we created a 10-point checklist covering the main information on balance protocols and addressing aspects related to measurement bias and reproducibility. Even though this tool has not yet been validated, we believe that, owing to its purpose-built design, it sheds light on whether the main parameters that can influence the results of a balance assessment were reported or not (available as supplementary material online, Appendix C). For each topic, two researchers gave a yes (Y) or no (N) score, and the sum of all topics resulted in the paper’s total score (when a topic was “not applicable,” a “Y” was given). If the study scored 8 to 10 points, we classified it as “highly detailed”; in other words, a “highly detailed” study shows great consistency, a low risk of measurement bias, and enough information to allow reproducibility. If the study scored between six and seven points, we classified it as “fairly detailed”: the study carries some risk of measurement bias but is consistent enough to allow reproducibility. Finally, a study with fewer than six points was classified as “poorly detailed,” that is, with a high risk of measurement bias and/or not fully reproducible (available as supplementary material online, Appendix C).
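For concreteness, both rating schemes reduce to simple threshold rules. The snippet below merely restates the cut-offs given above (percentages for the NIH-NHLBI tool, raw points for the 10-point checklist); it is an illustrative paraphrase, not part of either published tool.

```python
def nih_nhlbi_class(score, max_score=24):
    """Classify an NIH-NHLBI quality score (2 points per rated criterion)."""
    pct = 100.0 * score / max_score
    if pct > 75:
        return "high quality"
    if pct > 50:
        return "moderate quality"
    if pct >= 25:
        return "low quality"
    return "very low quality"

def checklist_class(points):
    """Classify the custom 10-point balance-protocol checklist score."""
    if points >= 8:
        return "highly detailed"
    if points >= 6:
        return "fairly detailed"
    return "poorly detailed"

# Example: a study scoring 16/24 and 5/10 points.
print(nih_nhlbi_class(16), "|", checklist_class(5))  # moderate quality | poorly detailed
```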

3. Results

The initial search identified 1309 studies. After excluding duplicates (427) and screening titles and abstracts, nine papers were considered potentially relevant and were included in this systematic review. All studies included healthy individuals [14,15,17,18,26,27,28,29,30], had a cross-sectional design, and were published between 2014 and 2019. A flow diagram of the study selection is provided in Figure 1.

3.1. Sample Characteristics

The sample size varied from 12 to 60 individuals. Studies included males and females from different age groups, ranging between 16.4 and 78.9 years. Five studies included only young adults and teenagers. One study categorized subjects into three age groups (young, middle-aged, and older adults) [14] and another selected only the older population [15]. One study did not report the age of the subjects [29], and another did not present the standard deviation of its sample [27]. Four papers did not report the height and body mass of the included subjects [14,15,27,29]. Hsieh et al. classified their sample into high and low risk of falls [30]. Table 1 shows the sample characteristics of the studies.

3.2. Overview of Studies Objectives

All studies had a cross-sectional design and used either dedicated apps [15,17,18,28,29] or raw-data acquisition apps [14,26,27] to determine the capability of mobile devices to evaluate postural balance. One study did not report this information [30]. Two studies compared the data acquired with gadgets to “gold standard” balance assessment devices: the Biodex Balance System™ [28] or the NeuroCom® Smart Balance Master [26]. Four other studies compared gadgets with kinematic data from a motion capture system [15,27], commercial accelerometers [14], and force plates [30]. Subjective clinical evaluation tests (full or adapted versions) were performed in five papers [15,17,27,28,29]. One paper performed the Physiological Profile Assessment (PPA) test on its participants [30]. The PPA test measures fall risk based on vision, reaction time, leg strength, proprioception, and balance, producing a score that classifies individuals as being at low or high risk of falls [31]. iOS (Apple Inc., Cupertino, CA, USA) was the operating system in seven studies [14,15,17,26,27,28,29], and Android (Google Inc., Palo Alto, CA, USA) in two studies [18,30].

3.3. Balance Assessment Protocols

Tasks used to evaluate postural balance varied across the studies and included: the Balance Error Scoring System (BESS) [15,17,27], the athlete single-leg test [28], the Romberg and tandem Romberg tests [29], the NeuroCom® sensory organization test (SOT) [26], the SWAY balance test [17,28], and other general tasks [14,18,30] (Table 2).
One paper used the six formal conditions of the BESS [27], which is performed with eyes closed for 20 s: (1) double-leg stance, firm surface; (2) single-leg stance, firm surface; (3) tandem stance, firm surface (dominant leg in front of the other); (4) double-leg stance, foam surface; (5) single-leg stance, foam surface; and (6) tandem stance, foam surface. Patterson et al. 2014 [17] adapted the BESS by modifying the hand position during the test. Lastly, one study altered the BESS conditions to adjust the test to an older population [15]; the authors modified the analysis by excluding the single-leg stance and performing some parts of the test with eyes open.
One study followed the NeuroCom® device protocol [26], which uses the NeuroCom® sensory organization test (SOT) and yields an equilibrium score. The protocol includes several procedures that combine stable and unstable surfaces with open and closed eyes, as well as oscillation of the visual references. The authors evaluated 49 individuals with the NeuroCom to determine whether accelerometer and gyroscope data sampled from a consumer electronics device (iPad 2) could provide enough resolution of the center of gravity (COG) movements to accurately quantify postural stability. The six conditions of the SOT were used to compare the scores generated and calculated from both devices. Limits of agreement were defined as the mean bias (NeuroCom, iPad) ± 2 standard deviations. Comparing the real-time center of gravity sway, they found that the smallest mean difference in equilibrium scores was 0.01% for SOT-1 and the largest difference was −6.2% for SOT-5.
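The agreement analysis described above appears to follow the usual Bland–Altman logic: the bias is the mean of the paired differences and the limits of agreement are that bias plus and minus roughly two standard deviations of the differences. A generic sketch is given below, assuming paired equilibrium scores from the two devices; the numbers are hypothetical and this is not the authors' actual code.

```python
import numpy as np

def limits_of_agreement(reference, test):
    """Bland-Altman bias and limits of agreement for paired scores."""
    diff = np.asarray(test, float) - np.asarray(reference, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 2 * sd, bias + 2 * sd)

# Hypothetical paired SOT equilibrium scores (reference device vs. tablet).
bias, (low, high) = limits_of_agreement([92, 88, 75, 81], [91, 89, 73, 82])
```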
Two other papers performed the SWAY Balance Test [17,28]. This test consists of five stances, including single-leg stance, feet together, and tandem, each held for 10 s on a firm surface with eyes closed. One article evaluated 30 young individuals performing a single trial of the Athlete Single Leg Test, asking the subjects to stand on their non-dominant foot for 10 s [28]. Balance scores were generated from arbitrary units of both systems, determined by undisclosed calculations. The balance scores derived from the smartphone accelerometers (SWAY Balance Mobile Application software) were consistent with the balance scores obtained from the Biodex system, showing no significant difference between the means (p = 0.818). A significant correlation between the two data sets was found (r = 0.632, p < 0.01).
Other tasks chosen by the authors included a dual-task protocol with a “letter fluency test” in a parallel stance and a semi-tandem stance with eyes open and closed [14], and a protocol with a concurrent cognitive challenge in which participants simultaneously subtracted seven from a random number between 100 and 200 [30]. Eight different conditions were used with the myAnkle application and are detailed in Table 1 and Table 2 [18]. One paper used the Romberg test and the tandem Romberg test performed with and without noise restriction; subjects went through a combination of sixteen postures, including open and closed eyes, feet together, and tandem, on firm and foam surfaces [29].

3.3.1. Feet and Arms Position

Regarding foot position, some papers followed closed protocols [15,17,26,27]. Other studies evaluated only the non-dominant single-leg stance [28], feet parallel and semi-tandem [14], tandem, feet close together and apart [18], or feet together and tandem [29] (Figure 2). Studies used a barefoot condition [15,18,27] or assessed subjects wearing socks [26] or shoes of an unspecified type [17]. Three studies did not describe the foot condition [14,28,29], and one study described it only partially [30].
Regarding arm or hand position, three articles used a software application protocol in which subjects held the mobile device at the sternum midpoint [17,28,30]. Three papers described the position of the “hands” instead of the “arms,” which were resting on the subjects’ iliac crests [15,27] or hips [18]. Authors also instructed subjects to “rest the arms at the body side,” according to the device’s SOT protocol [26], or to use the same “arm position of the Romberg tests” [29]. One paper did not specify this information [14].
Two studies described a visual target reference during the mobile data acquisition, located 3 m [15] and 4 m [18] ahead, but did not mention the height of the target from the ground or its size. Four papers did not report a visual reference. In the other studies, this aspect could not be analyzed because of an eyes-closed condition [17,27] or a specific visual task [26].

3.3.2. Number of Acquisitions, Sessions and Total Time of Acquisition

Most studies conducted only one trial of data acquisition [14,17,18,27,28,29], while others performed three [26] or two trials [15,30]. The acquisition time ranged between 10 s and 60 s. None of the articles reported using a time window (cropped time) in the analysis (Table 3). Three studies used the well-established sampling rate recommendation of 100 Hz [10] for balance data acquisition [15,26,27]. In three other studies, the rates were 200 Hz [30], 88–92 Hz [14], and 14–15 Hz [18]. Three articles did not fully describe the sampling rates [17,28,29] (Table 3).
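Because the reported sampling rates ranged from roughly 14 Hz to 200 Hz, comparing signals across devices generally requires resampling them onto a common time base (100 Hz being the rate recommended in [10] and used by several studies). Below is a minimal linear-interpolation sketch with NumPy; real comparisons may prefer anti-aliased resampling, and the function and variable names are illustrative.

```python
import numpy as np

def resample_to(t, x, fs_out=100.0):
    """Linearly resample a 1-D signal x, sampled at times t (seconds),
    onto a uniform grid at fs_out Hz."""
    t = np.asarray(t, float)
    t_new = np.arange(t[0], t[-1], 1.0 / fs_out)
    return t_new, np.interp(t_new, t, np.asarray(x, float))
```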
Only one study presented test–retest reliability, repeating the data acquisition twice. Although the description indicates that the retest was performed within the same day with a short interval, no time interval between acquisitions was reported [14]. The intraclass correlation coefficient (ICC) values found for the root mean square (RMS) of the accelerations were 0.83 and 0.90, and for the sway area, the ICCs were 0.81 and 0.91 during parallel stance and semi-tandem stance, respectively.

3.3.3. Measurement Device and Position

Studies used different wearable sensors (Table 2). Three studies used iPads [15,26,27], three used iPods [14,17,28], and three used smartphones: an iPhone [29], an LG Optimus One [18], and a Samsung Galaxy S6 [30]. Four studies reported placing the mobile sensor on the participants’ lumbar or sacral region [14,15,26,27]. Three studies placed the gadget on the sternal midpoint [17,28,30], one on the left upper arm [29], and another positioned three devices on different body locations (malleolus, patella, umbilicus) [18] (Figure 3).

3.3.4. Devices Synchronization

Data acquisition with more than one piece of equipment, in which the time series must be aligned, theoretically presupposes the use of a synchronization method. Three studies did not report any synchronization method [18,28,30], while four studies adequately described this process [14,15,26,27]. In two articles, the synchronization procedure did not apply [17,29].
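When two devices record the same trial without a hardware trigger, one common software fallback is to estimate the time lag between their signals from the peak of the cross-correlation and shift one recording accordingly. The sketch below assumes both signals have already been resampled to the same rate; it is a generic illustration, not necessarily the method used by any of the reviewed papers.

```python
import numpy as np

def estimate_lag(sig_a, sig_b, fs=100.0):
    """Return the delay (in seconds) of sig_a relative to sig_b,
    estimated from the peak of their full cross-correlation.
    A positive value means sig_a lags behind sig_b."""
    a = sig_a - np.mean(sig_a)
    b = sig_b - np.mean(sig_b)
    xcorr = np.correlate(a, b, mode="full")
    lag_samples = np.argmax(xcorr) - (len(b) - 1)  # zero lag sits at len(b)-1
    return lag_samples / fs
```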

3.3.5. Measurements and Signal Processing Parameters

The primary signal processing parameters used in the quantitative continuous data measurements are briefly listed, as stated in each study (available as supplementary material online, Appendix D). Some studies reported using the raw data to run their own post-processing algorithms for computing the balance metrics [14,15,26,27,30]. One group of authors designed their own mobile phone app [18]. The other authors did not report whether they had access to the app’s algorithms or calculations [17,28,29].

3.4. Methodological Quality Assessment

3.4.1. Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies NIH-NHLBI

Three papers were classified as “low quality” (25% to 50%) [17,28,29] and six papers as “moderate quality” (>50% to 75%) [14,15,18,26,27,30]. No articles were considered “very low quality” (<25%) or “high quality” (>75%) (available as supplementary material online, Appendix B).

3.4.2. A 10-Point Checklist for Balance Assessment Protocols

Four studies were classified as “highly detailed” [15,18,26,27]. One study was considered “fairly detailed” [17], and four studies were considered “poorly detailed” [14,28,29,30] (available as supplementary material online, Appendix C).

4. Discussion

This study aimed to systematically review the current protocols used to assess balance with mobile devices. We provided an overview of the parameters used to define balance, the main characteristics and technical specifications of the devices, and the mathematical models and algorithms used to process data. Briefly, we found that the studies showed good internal consistency in their assessment procedures. However, we found a widespread lack of standardization in data acquisition, which compromises data repeatability and reproducibility. In addition, the methods used to evaluate the capability of mobile devices to assess balance varied widely among studies, as did the mathematical models, variables, tasks, and posture conditions.
It is well known that methodological aspects such as anthropometric characteristics, acquisition time, and feet and arm positions can influence the results and the reliability of postural balance measurements. Therefore, these parameters must be controlled and described in detail in scientific papers, as has already been established in the literature [10,20,32,33]. Five articles did not report height and body mass data [14,15,27,29,30]. Normalization methods allowing a proper comparison among subjects were reported by only a few studies [15,27]. Height and body mass are essential anthropometric characteristics affecting the base of support and the COM position; thus, these parameters must be controlled or normalized when comparing groups or describing samples in balance assessments. For individual assessments by clinicians or consumers, this issue may be of less importance, considering that these parameters are less susceptible to change.
Another aspect to be considered in balance evaluation protocols is the acquisition time, which ranged from 10 s to 60 s in the included papers. The literature reports that the acquisition time may cause slight changes in balance parameters [32]. However, the shorter the acquisition time, the tighter the synchronization control of the devices must be, which was not a point of concern for all authors. We also highlight the acquisition time as an aspect that influences decisions about mathematical models and data processing methods [32].
Feet and arm positions are directly related to the physical concepts of stability. Moving the feet apart increases the size of the base of support and the capacity to stabilize, as evidenced by patterns of center of pressure (COP) variables [34]. The position of the arms, in turn, can alter body oscillation and stability through slight changes in the COM relative to the base of support. This is another critical point to be considered. The studies included in this review used a wide range of feet and arm positions, and most studies used different restricted postures during the balance assessment. Body oscillation changes depending on whether or not an individual is restricted to a specific posture. Thus, the data acquired from different protocols may differ, either preventing comparison or, in some cases, not reflecting the general characteristics of balance.
From a physical standpoint, the best way to describe balance and body displacement remains an open question [35,36,37,38,39]. It is very common to use COM sway, estimated as a single point located around the base of the lumbar spine [37]. Another possibility is to use COP trajectories, which represent a weighted average of all the pressures over the surface of the base of support [35,36,38,39]. These two parameters are measured with different techniques, and the position of the sensors can influence the results. The majority of studies positioned the sensors on the pelvis, lumbar spine, or sacrum [14,15,26,27]. Some studies chose the upper limbs [29], lower limbs [18], or chest [17,28,30] for sensor placement. The trunk seems to be the best option because of its proximity to the body’s COM and because it avoids unwanted limb movements interfering with the balance assessment. A previous study showed good to excellent test–retest reliability for acceleration and COP parameters when the sensor is placed in the lumbar region [12], reinforcing this statement.
We cannot determine whether it is better to fix the device to a specific area of the body or simply ask the individual to hold it. Encouraging individuals to hold the device would favor the self-administration of balance tests and empower patients to care for their own health, but it could compromise data acquisition. Positioning the device near or away from the body’s center of mass influences the degrees of freedom of movement arising from the specific joint strategies used for balance control [12]. The choice of device position also affects the relative plane orientation and influences repeatability and data accuracy. Although there is a lack of studies covering these aspects, it is known that the design of the applications and the protocol decisions demand specific and careful data processing. Moreover, the algorithm must be consistent with the theoretical approach [8].
One of the major concerns in balance evaluation protocols is the acquisition time, which ranged from 10 to 60 s in the included studies. The shorter the chosen time interval, the more caution is required because of the accuracy needed to synchronize the devices. Additionally, the acquisition time critically influences the decisions on the mathematical model and data processing methods [32]. Selecting a time window (cropped time window) is a usual procedure in quantitative balance analysis, in which a few seconds are discarded (arbitrarily) from the beginning and end of each trial. Although not clearly justified in the literature, this practice is meant to reduce disturbing movements in the initial posture and attenuate the effects of fatigue, ensuring steadiness with less unwanted “noise” and fewer “artifacts” in the signal. None of the papers reported using this method. Since the main objective of the selected studies was a comparison between sensors and devices, this procedure was probably dispensable, but it remains unclear whether it would have increased the sensitivity of the protocols and the data correlation.
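Cropping a time window, as discussed above, simply discards a few seconds at the start and end of each trial before any metric is computed. A trivial sketch follows; the two-second margins are arbitrary examples, since the literature gives no agreed value.

```python
def crop_window(signal, fs, head_s=2.0, tail_s=2.0):
    """Drop head_s seconds from the start and tail_s seconds from the end
    of a recording sampled at fs Hz (margins are illustrative choices)."""
    start = int(head_s * fs)
    stop = len(signal) - int(tail_s * fs)
    return signal[start:stop]
```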
A test–retest approach might also have strengthened the results, but it was performed in only one study [14]. The same applies to the number of acquisitions, although a previous study comparing acceleration data to the center of pressure reported that data from three trials were similar to those obtained in a single trial [12]. Even though the signal has stochastic characteristics, this suggests that a single trial may be reliable and useful for clinical practice.
Signal processing methods varied among the studies and included the calculation of COM, COP, and raw acceleration through measurements of the RMS, standard deviation, maximum peak displacement, maximum displacement amplitude, and sway area. All of these parameters can be applied in balance assessments [40]. One study used the raw acceleration data as the parameter to define stability, which is not a direct measurement of position and is an unusual way to describe stability. A previous systematic review explored the best outcomes for assessing standing balance and walking stability in subjects with Parkinson’s disease. The authors included 26 studies and identified “jerk” (the time derivative of acceleration) and trunk RMS acceleration as the most useful measures to differentiate patients from healthy controls [41].
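To make the parameters above concrete, the sketch below computes trunk-acceleration RMS, an elliptical sway-area estimate, and mean jerk magnitude from two horizontal acceleration components. It is a generic illustration, assuming filtered AP/ML accelerations and a 95% ellipse based on a chi-square scaling of the covariance; individual studies used their own (sometimes undisclosed) formulas.

```python
import numpy as np

def sway_metrics(ap, ml, fs=100.0):
    """RMS, 95% ellipse area, and mean jerk from AP/ML accelerations."""
    ap = ap - np.mean(ap)
    ml = ml - np.mean(ml)
    rms = np.sqrt(np.mean(ap ** 2 + ml ** 2))
    # 95% confidence ellipse area from the eigenvalues of the covariance
    # (5.991 is the chi-square value for 2 degrees of freedom at 95%).
    eigvals = np.linalg.eigvalsh(np.cov(np.vstack([ap, ml])))
    area_95 = np.pi * 5.991 * np.sqrt(eigvals[0] * eigvals[1])
    # Jerk: time derivative of acceleration, averaged over the trial.
    jerk = np.sqrt(np.gradient(ap, 1.0 / fs) ** 2 + np.gradient(ml, 1.0 / fs) ** 2)
    return {"rms": rms, "area_95": area_95, "mean_jerk": jerk.mean()}
```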
It is important to highlight the use of two “gold standard” clinical devices to evaluate young individuals. One study aimed to validate measurements from a specific mobile software application [28], concluding that the scores from the smartphone were consistent with those of the validated balance system. The other compared equilibrium scores [26] by calculating the limits of agreement between the devices; the authors concluded that mobile hardware provided data of sufficient precision and accuracy to quantify postural stability and is a reasonable approach for clinical and field environments. On the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies (NIH-NHLBI), these studies were ranked as “low quality” [28] and “moderate quality” [26], respectively. On the 10-point checklist for balance assessment protocols, they were rated “poorly detailed” [28] and “highly detailed” [26], respectively.
This systematic review has some limitations that make it challenging to state recommendations about the most appropriate protocol for assessing balance with gadgets. The majority of the included studies did not provide sufficient information about their assessment protocols, which hinders the reproducibility of the evaluations and the reliability of the results and limits judgment of the studies’ discriminatory power (accuracy) for assessing postural balance. The overall quality of the included studies was low to moderate according to the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies from the NHLBI/NIH, weakening the consistency of the studies’ conclusions because of missing information on internal and external validity and a possibly increased risk of bias. It is also important to state that the 10-point checklist for balance assessment protocols used in this review is a custom-developed tool whose efficacy has not been validated, although it was created based on the authors’ expertise and after a detailed discussion of the methods presented.
Considering the quality of the evaluation procedures, technical specifications, and data processing information, only four studies were classified as “highly detailed” [15,18,26,27], restricting the reproducibility of the protocols. Finally, most studies lack a direct sensor comparison against a “gold standard” transducer system to determine the accuracy of the various transducer outputs from mobile devices, a question that still has to be addressed.

5. Conclusions

The results of this systematic review did not allow us to perform the evaluation of diagnostic accuracy that we had expected. Thus, based on our preliminary findings, we cannot endorse the use of mobile devices and other gadgets to assess postural balance. However, two studies presented consistent data supporting sufficient accuracy and good reliability of this method for evaluating healthy young individuals. Because of differences in hardware and operating systems, comparison across the many mobile phone systems currently on the market remains a fragile aspect that needs to be explored. Clear balance protocol information, the anthropometric characteristics of the sample, and the technical specifications of the equipment and sensors are indispensable and must be stated. Further studies with adequate sample sizes, different populations, test–retest measurements, and a low risk of bias are highly encouraged and are necessary to provide a better understanding of this promising approach.

Author Contributions

A.S.P. (Alexandre S. Pinho) and A.P.S. (Ana P. Salazar) conceived the study, analyzed the data, and wrote the manuscript with support from B.C.S. on the experimental design. E.M.H. analyzed the data and reviewed the manuscript with A.D.; A.S.P. (Aline S. Pagnussat) reviewed the manuscript and supervised the project along with E.M.H. All authors discussed the results and contributed to the final manuscript.

Funding

This study was partially financed by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES, Brasília, DF, Brazil), Finance Code 001. The authors would like to thank CAPES for the scholarship granted to Alexandre S.P. [88881.135564/2016-01], for the Ph.D. scholarships granted to Ana P.S. and Alexandre S.P., and for the post-doctoral scholarship of B.C.S.

Conflicts of Interest

The authors have no conflict of interest to declare.

Appendix A. Search Strategy (Databases)

MEDLINE (accessed via PubMed), Embase, Cochrane, and SCOPUS were searched up to February 2019.
The search strategy for the PubMed, Embase, and SCOPUS databases used the following terms:
#1
“Accelerometry”[Mesh]
“Accelerometer”
“Gyroscope”
“Bodywear sensors”
“Wearable sensors”
“Wear sensor”
“Inertial sensors”
“IMU”
“inertial measurement unit”
#2
“App Mobile”[Mesh]
“Apps, Mobile”
“Application, Mobile”
“Applications, Mobile”
“Mobile Application”
“Mobile Apps”
“Portable Electronic Apps”
“App, Portable Electronic”
“Apps, Portable Electronic”
“Electronic App, Portable”
“Electronic Apps, Portable”
“Portable Electronic App”
“Portable Electronic Applications”
“Application, Portable Electronic”
“Applications, Portable Electronic”
“Electronic Application, Portable”
“Electronic Applications, Portable”
“Portable Electronic Application”
“Portable Software Apps”
“App, Portable Software”
“Apps, Portable Software”
“Portable Software App”
“Software App, Portable”
“Software Apps, Portable”
“Portable Software Applications”
“Application, Portable Software”
“Applications, Portable Software”
“Portable Software Application”
“Software Application, Portable”
“Software Applications, Portable”
“mobile device”
“mobile smartphone”
“smartphone application”
“smartphone app”
#3
“Postural Balance”[MESH]
“Balance, Postural”
“Musculoskeletal Equilibrium”
“Equilibrium, Musculoskeletal”
“Postural Equilibrium”
“Equilibrium, Postural”
“sway”
“postural control”
“body sway”
#1 AND #2 AND #3
Database: Cochrane (up to February 2019).
“Accelerometer” OR “Mobile applications” AND “Postural Balance”.

Appendix B. Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies (NIH–NHLBI)

Table A1. Results of the quality assessment tool of the 9 studies (NIH-NHLBI).
Quality Assessment
Study | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | Total | %
Alberts et al., 2015 [26] | 2 | 2 | 2 | 2 | 0 | 0 | 0 | 2 | 2 | 2 | 2 | - | - | 0 | 16 | 67%
Alberts et al., 2015 [27] | 2 | 2 | 2 | 2 | 2 | 0 | 0 | 2 | 0 | 0 | 2 | - | - | 2 | 16 | 67%
Kosse et al., 2015 [14] | 2 | 2 | 2 | 2 | 2 | 0 | 0 | 2 | 0 | 2 | 2 | - | - | 0 | 16 | 67%
Hsieh et al., 2019 [30] | 2 | 2 | 2 | 2 | 2 | 0 | 0 | 2 | 2 | 0 | 2 | - | - | 0 | 16 | 67%
Ozinga et al., 2014 [15] | 2 | 2 | 2 | 2 | 2 | 0 | 0 | 2 | 2 | 0 | 2 | - | - | 2 | 18 | 75%
Patterson et al., 2014 [17] | 2 | 2 | 2 | 2 | 2 | 0 | 0 | 2 | 0 | 0 | 1 | - | - | 0 | 13 | 50%
Patterson et al., 2014 [28] | 2 | 2 | 2 | 2 | 2 | 0 | 0 | 0 | 0 | 0 | 2 | - | - | 0 | 12 | 50%
Shah et al., 2016 [18] | 2 | 2 | 2 | 1 | 2 | 0 | 0 | 2 | 2 | 0 | 2 | - | - | 2 | 17 | 71%
Yvon et al., 2015 [29] | 2 | 1 | 2 | 2 | 0 | 0 | 0 | 2 | 0 | 0 | 1 | - | - | 0 | 10 | 42%
The quality was expressed as a percentage of the total possible score, where each criterion could reach a maximum of two points (“Yes” = 2, “Not clear” = 1, “No” = 0). The studies were classified as: “high quality” (>75%), “moderate quality” (>50% to 75%), “low quality” (25% to 50%), and “very low quality” (<25%). Items 12 and 13 were considered not applicable due to the design characteristics of the studies. Columns 1 to 14 correspond to the questions of the quality assessment tool [25].

Appendix C. 10-Point Checklist for Balance Assessment Protocols

Table A2. Results of the 10-Point Checklist for Balance Assessment tool of the 9 studies.
Author | Sample Information | Tasks Description | Feet Condition | Feet and Arms Position | Visual Reference (Eyes) | Visual Reference (Target) | Cropped Time | Sampling Rates | Data/Signal Processing Method | Synch Method | Total Score
Alberts et al., 2015 [26] | Y | Y | Y | Y | Y | Y | N | Y | Y | Y | 9
Alberts et al., 2015 [27] | N | Y | Y | Y | Y | Y_(NA) | N | Y | Y | Y | 8
Kosse et al., 2015 [14] | N | Y | N | N | Y | N | N | Y | Y | Y | 5
Hsieh et al., 2019 [30] | N | Y | Y | N | Y | N | N | Y | Y | N | 5
Ozinga et al., 2014 [15] | N | Y | Y | Y | Y | Y | N | Y | Y | Y | 8
Patterson et al., 2014 [17] | Y | Y | Y | Y | Y | Y_(NA) | N | N | Y | Y_(NA) | 8
Patterson et al., 2014 [28] | Y | Y | N | Y | Y | N | N | N | Y | N | 5
Shah et al., 2016 [18] | Y | Y | Y | Y | Y | Y | N | Y | Y | N | 8
Yvon et al., 2015 [29] | N | Y | N | Y | Y | N | N | N | N | Y_(NA) | 4
Cropped time = Total time acquired minus Time window analyzed; Synch = Synchronization; Y = yes; N = no; Y_(NA) = for a “not applicable” item a “Y” was given.

Appendix D. Overview (Description of Parameters and Measurements as Stated in the Study and General Comments)

Table A3. The primary signal processing parameters used in quantitative continuous data measurements as stated in the 9 studies.
Alberts et al., 2015 [26]. Overview: Used the NeuroCom® device, through force plate measurements with the sensory organization test (SOT), to determine whether an iPad 2 provides sufficient resolution of center of gravity (COG) movements to accurately quantify postural stability in healthy young people. Parameters and measurements: Center of pressure (COP) of the anterior-posterior (AP) and medial-lateral (ML) sway; three-dimensional (3D) device rotation rates and linear acceleration; the COG AP angle was used for all outcomes. General comments on strengths and limitations: Only sample curves are shown for the COG-AP sway for conditions 1, 4, 5, and 6. No numerical data are given for the actual physical measures of the NeuroCom AP sway compared with the AP sway calculated from the iPad sensors. Nevertheless, the overall performance of the iPad in predicting the NeuroCom equilibrium score appears excellent. The 100 Hz data sampling is more than sufficient to capture low-frequency body sway; using a smaller gadget/mobile rather than the large iPad 2 might have produced even better results.
Alberts et al., 2015 [27]. Overview: Assessed the accuracy of the iPad by comparing postural stability metrics with a 3D motion capture system and proposed a method for quantifying the Balance Error Scoring System (BESS) using center of mass (COM) acceleration data. Parameters and measurements: 3D position and linear and angular accelerations of the COM in the AP and ML directions; 3D linear acceleration and rotation rate; (1) peak-to-peak, (2) normalized path length, (3) root mean square (RMS) of the COM displacements, (4) 95% ellipsoid volume of sway; spectral analysis of ML, AP, and trunk (TR) acceleration. General comments: No numerical data were compared between the motion capture results and the values calculated from the iPad sensors. Only correlations were calculated and no raw data were presented, which leaves readers unsure of the measurements' consistency. This applies to their Table 1, where no raw data for normalized path length (NPL), peak-to-peak (P2P), and RMS are presented, only comparisons with low to medium rho values. The small correlation coefficients (rho of 0.55 and less, their Table 2) for the iBESS volume against the error score are even less convincing, but this may also reflect the low reliability of the subjective error scoring. The iPad sensors are probably much better able to detect balance deficits than the more subjective BESS.
Kosse et al., 2015 [14]. Overview: Compared the data from an iPod with a stand-alone accelerometer unit to establish the validity and reliability of gait and posture assessment with the iPod. Parameters and measurements: AP and ML trunk accelerations and a resultant vector; (1) RMS accelerations of body sway in AP and ML, (2) sway area, (3) median power of the signal from the frequency spectrum. General comments: A good direct comparison study of an iPod with a "gold standard" DynaPort triaxial accelerometer. To compare waveform similarity, cross-correlations were determined after time normalization (100 Hz). The values were around 0.9 for all experimental conditions in the AP and ML directions, suggesting a high-quality acceleration signal and software evaluation. Time-lag values were almost identical between the two transducers. Validity and test–retest reliability intraclass correlation (ICC) values were also excellent for the RMS signals in both the AP and ML directions. Only for the median power frequency (MPF) were lower ICCs found for test–retest reliability, possibly because of the different sampling frequencies, which required time normalization procedures. An excellent and comprehensive analysis, including a measurement section for a pure comparison of transducer technology as well as application to three age groups of participants.
Hsieh et al., 2019 [30]. Overview: Static balance tests were conducted while standing on a force plate and holding a smartphone. COP data from the force plate and acceleration data from the smartphone were compared; validity between the measures was assessed and correlation coefficients were extracted to determine whether a smartphone-embedded accelerometer can measure static postural stability and distinguish older adults at high risk of falls. Parameters and measurements: The COP parameters included in the analysis were (1) the 95% confidence ellipse and (2) velocity in the anteroposterior (AP) and mediolateral (ML) directions. From the smartphone, (1) maximum acceleration in the ML, vertical, and AP directions and (2) RMS in the ML, vertical, and AP axes were exported and processed. General comments: A promising approach was used to distinguish subjects at risk of falling by associating acceleration data and COP parameters with the "physiological profile assessment," an evaluation of fall risk based on multiple domains. Strong significant correlations between measures were found during challenging balance conditions (ρ = 0.42–0.81, p < 0.01–0.05). These correlations were, to some extent, expected, although it seems quite difficult to differentiate between the vertical, AP, and ML components of the force plate and the accelerometer. Especially during challenging balance tasks, there will be considerable movement of the upper extremities relative to the body, creating extra accelerations at the phone. A more trustworthy comparison of the phone's acceleration data with the force plate seems possible only if the phone were fastened at or close to the subject's COG.
Ozinga et al., 2014 [15]. Overview: Simultaneous kinematic measurements from a 3D motion analysis system during balance conditions were used to compare COM movements and investigate whether an iPad can provide sufficient accuracy and quality for quantifying postural stability in older adults. Parameters and measurements: Angular velocities and linear accelerations were processed to allow direct comparison with the position of the whole-body COM; (1) peak-to-peak displacement amplitude, (2) normalized path length, (3) RMS displacements of the COM, (4) 95% ellipsoid volume of sway; spectral analysis of the magnitude of the ML, AP, and trunk accelerations. General comments: Fairly high correlations were present between the cinematographic and the iPad-derived data, suggesting that the iPad would be a good alternative to cinematographic posture analyses. Procedures and methods were well chosen; however, the number of subjects was fairly low.
Patterson et al., 2014 [17]. Overview: Compared the scores of a mobile technology application on an iPod, obtained during balance tasks, with a commonly used subjective balance assessment, the Balance Error Scoring System (BESS). Parameters and measurements: Balance scores derived from 3D acceleration measurements. General comments: An inverse relationship of r = −0.77 (p < 0.01) was found between the BESS score and the SWAY results; thus, the iPod acceleration signals proved to be a fairly good predictor of stability. An elevated BESS score reflects a high number of balance errors, whereas the SWAY Balance score assigns a higher value to more stable performance. The fact that subjects had to press the iPod against the sternum with their hands weakens the test procedures: it restricts their freedom to balance the body in the five exercises and introduces mechanical artifacts from having the hands at the sensor during the balancing task. Because no direct comparisons between two sensors were made, only an indirect estimation of the quality of the iPod touch transducer is possible. Most likely, the limited r = −0.77 is not a function of the quality of the iPod sensor but rather a consequence of poor BESS rating quality, noting that the BESS scoring was performed by only two raters. However, since the BESS is well accepted, this paper shows that sensors integrated in mobile devices are well suited for evaluating postural stability.
Patterson et al., 2014 [28]. Overview: A Biodex© Balance System, which gives an AP stability index from a force platform, was used to evaluate the validity of a balance mobile application that uses the 3D accelerometers of an iPod while subjects performed a single trial of the Athlete Single Leg Test protocol. Parameters and measurements: Degree of tilt about each axis: (1) ML stability index, (2) AP stability index, and (3) overall stability index; the displacement in degrees from level, taken from the AP stability index (APSI), was termed the "balance score." General comments: The APSI score on the balance platform (1.41) was similar to the smartphone SWAY score (1.38), with no statistically significant difference. However, the correlation between the scores was low, only r = 0.632 (p < 0.01). As in Patterson et al., 2014a, the same weakness applies: subjects had to hold the iPod touch with both hands at the sternum, and only sway in the AP direction was measured. Contrary to what the authors indicate, a correlation of only r = 0.632 appears very low considering that the same measure was taken by two systems at the same time for a single-leg stance.
Shah et al., 2016 [18]. Overview: A mobile application was developed to provide a method of objectively measuring standing balance using the phone's accelerometer. Eight independent therapists ranked a balance protocol based on their clinical experience to assess the degree of exercise difficulty, and the concordance between the results was used to determine whether the mobile device can quantify standing balance and distinguish between exercises of varying difficulty. Parameters and measurements: 3D accelerometer data were obtained from three mobile phones and the mean acceleration was calculated; after a correction for static bias was applied, the magnitude of the resultant vector (R) was calculated for each measurement; the metric "mean R," the average magnitude of all resultant vectors, was then used as an index of balance. General comments: Even though Shah et al., 2016 did not make a direct comparison between two sensor systems, accelerometer readings were calculated for each exercise at each ankle, each knee, and the torso. The clear differentiation between the stability exercises, with lower values at the ankle, knee, and torso, indicates that the acceleration results from the mobile phones relate strongly to the subjective rating of the eight experienced clinicians. The results indicate that one sensor location appears sufficient; since all sensors follow the same trend, the knee or torso locations could be used. From a practical point of view, the torso or hip location would be the easiest to mount and use.
Yvon et al., 2015 [29]. Overview: An iPhone application was used to quantify sway while performing the Romberg and tandem Romberg tests in a soundproof room and then in a normal room. Parameters and measurements: The output data ('K' value) represented the area of an ellipse spanning two standard deviations in the anterior-posterior and lateral planes about a mean point. General comments: The article explores an unusual protocol, trying to evaluate the contribution of auditory sensory inputs to balance through a combination of postures in different room sound conditions. No raw data were presented or clearly specified, and the data processing procedures were not reported. Differences in postural sway measurements were found among the different room conditions with the dedicated application.

References

  1. World Health Organization. Falls. The Problem & Key Facts Sheets Reviewed. 2018. Available online: http://www.who.int/mediacentre/factsheets/fs344/en/ (accessed on 7 April 2018).
  2. Lee, J.; Geller, A.I.; Strasser, D.C. Analytical Review: Focus on Fall Screening Assessments. PM R 2013, 5, 609–621. [Google Scholar] [CrossRef] [PubMed]
  3. Van der Kooij, H.; van Asseldonk, E.; van der Helm, F.C.T. Comparison of different methods to identify and quantify balance control. J. Neurosci. Methods 2005, 145, 175–203. [Google Scholar] [CrossRef] [PubMed]
  4. Fabre, J.M.; Ellis, R.; Kosma, M.; Wood, R.H. Falls risk factors and a compendium of falls risk screening instruments. J. Geriatr. Phys. Ther. 2010, 33, 184–197. [Google Scholar] [PubMed]
  5. Baratto, L.; Morasso, P.G.; Re, C.; Spada, G. A new look at the posturographic analysis in the clinical context: Sway-density versus other parameterization techniques. Mot. Control 2002, 6, 246–270. [Google Scholar] [CrossRef]
  6. Clark, S.; Riley, M.A. Multisensory information for postural control: Sway-referencing gain shapes center of pressure variability and temporal dynamics. Exp. Brain. Res. 2007, 176, 299–310. [Google Scholar] [CrossRef] [PubMed]
  7. Wong, S.J.; Robertson, G.A.; Connor, K.L.; Brady, R.R.; Wood, A.M. Smartphone apps for orthopedic sports medicine—A smart move? BMC Sports Sci. Med. Rehabil. 2015, 7, 23. Available online: http://bmcsportsscimedrehabil.biomedcentral.com/articles/10.1186/s13102-015-0017-6 (accessed on 21 April 2018).
  8. Del Rosario, M.; Redmond, S.; Lovell, N. Tracking the Evolution of Smartphone Sensing for Monitoring Human Movement. Sensors 2015, 15, 18901–18933. [Google Scholar] [CrossRef] [Green Version]
  9. Dobkin, B.H.; Dorsch, A. The Promise of mHealth: Daily Activity Monitoring and Outcome Assessments by Wearable Sensors. Neurorehabil. Neural Repair 2011, 25, 788–798. [Google Scholar] [CrossRef]
  10. Ruhe, A.; Fejer, R.; Walker, B. The test-retest reliability of center of pressure measures in bipedal static task conditions—A systematic review of the literature. Gait Posture 2010, 32, 436–445. [Google Scholar] [CrossRef]
  11. Habib, M.; Mohktar, M.; Kamaruzzaman, S.; Lim, K.; Pin, T.; Ibrahim, F. Smartphone-Based Solutions for Fall Detection and Prevention: Challenges and Open Issues. Sensors 2014, 14, 7181–7208. [Google Scholar] [CrossRef] [Green Version]
  12. Whitney, S.L.; Roche, J.L.; Marchetti, G.F.; Lin, C.-C.; Steed, D.P.; Furman, G.R.; Musolino, M.C.; Redfern, M.S. A comparison of accelerometry and center of pressure measures during computerized dynamic posturography: A measure of balance. Gait Posture 2011, 33, 594–599. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Chung, C.C.; Soangra, R.; Lockhart, T.E. Recurrence Quantitative Analysis of Postural Sway using Force Plate and Smartphone. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 1271–1275. [Google Scholar] [CrossRef] [Green Version]
  14. Kosse, N.M.; Caljouw, S.; Vervoort, D.; Vuillerme, N.; Lamoth, C.J.C. Validity and Reliability of Gait and Postural Control Analysis Using the Tri-axial Accelerometer of the iPod Touch. Ann. Biomed. Eng. 2015, 43, 1935–1946. [Google Scholar] [CrossRef] [PubMed]
  15. Ozinga, S.J.; Alberts, J.L. Quantification of postural stability in older adults using mobile technology. Exp. Brain. Res. 2014, 232, 3861–3872. [Google Scholar] [CrossRef] [PubMed]
  16. Ozinga, S.J. Quantification of Postural Stability in Parkinson’s Disease Patients Using Mobile Technology; Cleveland State University: Cleveland, OH, USA, 2015; Available online: http://rave.ohiolink.edu/etdc/view?acc_num=csu1450261576 (accessed on 21 April 2018). [CrossRef]
  17. Patterson, J.A.; Amick, R.Z.; Pandya, P.D.; Hakansson, N.; Jorgensen, M.J. Comparison of a Mobile Technology Application with the Balance Error Scoring System. Int. J. Athl. Ther. Train. 2014, 19, 4–7. [Google Scholar] [CrossRef]
  18. Shah, N.; Aleong, R.; So, I. Novel Use of a Smartphone to Measure Standing Balance. JMIR Rehabil. Assist. Technol. 2016, 3, e4. [Google Scholar] [CrossRef] [PubMed]
  19. Mayagoitia, R.E.; Lötters, J.C.; Veltink, P.H.; Hermens, H. Standing balance evaluation using a triaxial accelerometer. Gait Posture 2002, 16, 55–59. [Google Scholar] [CrossRef]
  20. Neville, C.; Ludlow, C.; Rieger, B. Measuring postural stability with an inertial sensor: Validity and sensitivity. Med. Devices Evid. Res. 2015, 8, 447. [Google Scholar] [CrossRef]
  21. Van Tang, P.; Tan, T.D.; Trinh, C.D. Characterizing Stochastic Errors of MEMS-Based Inertial Sensors. VNU J. Sci. Math. Phys. 2016, 32, 34–42. [Google Scholar]
  22. Poushter, J.; Caldwell, B.; Hanyu, C. Social Media Use Continues to Rise in Developing Countries But Plateaus Across Developed Ones. Available online: https://assets.pewresearch.org/wp-content/uploads/sites/2/2018/06/15135408/Pew-Research-Center_Global-Tech-Social-Media-Use_2018.06.19.pdf (accessed on 21 April 2018).
  23. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; Group, P. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef]
  24. Chandler, J.; Higgins, J.P.; Deeks, J.J.; Davenport, C.; Clarke, M.J. Cochrane Handbook for Systematic Reviews of Interventions, 50. Available online: https://community.cochrane.org/book_pdf/764 (accessed on 3 February 2018).
  25. National Heart, Lung, and Blood Institute (NHLBI) of the United States National Institute of Health (NIH). Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies. Available online: https://www.nhlbi.nih.gov/health-pro/guidelines/in-develop/cardiovascular-risk-reduction/tools/cohort (accessed on 7 April 2018).
  26. Alberts, J.L.; Hirsch, J.R.; Koop, M.M.; Schindler, D.D.; Kana, D.E.; Linder, S.M.; Campbell, S.; Thota, A.K. Using Accelerometer and Gyroscopic Measures to Quantify Postural Stability. J. Athl. Train. 2015, 50, 578–588. [Google Scholar] [CrossRef] [PubMed]
  27. Alberts, J.L.; Thota, A.; Hirsch, J.; Ozinga, S.; Dey, T.; Schindler, D.D.; Koop, M.M.; Burke, D.; Linder, S.M. Quantification of the Balance Error Scoring System with Mobile Technology. Med. Sci. Sports Exerc. 2015, 47, 2233. [Google Scholar] [CrossRef] [PubMed]
  28. Patterson, J.A.; Amick, R.Z.; Thummar, T.; Rogers, M.E. Validation of measures from the smartphone sway balance application: A pilot study. Int. J. Sports Phys. Ther. 2014, 9, 135. [Google Scholar] [PubMed]
  29. Yvon, C.; Najuko-Mafemera, A.; Kanegaonkar, R. The D+R Balance application: A novel method of assessing postural sway. J. Laryngol. Otol. 2015, 129, 773–778. [Google Scholar] [CrossRef] [PubMed]
  30. Hsieh, K.L.; Roach, K.L.; Wajda, D.A.; Sosnoff, J.J. Smartphone technology can measure postural stability and discriminate fall risk in older adults. Gait Posture 2019, 67, 160–165. [Google Scholar] [CrossRef] [PubMed]
  31. Lord, S.R.; Menz, H.B.; Tiedemann, A. A Physiological Profile Approach to Falls Risk Assessment and Prevention. Phys. Ther. 2003, 83, 237–252. [Google Scholar] [CrossRef] [PubMed]
  32. Scoppa, F.; Capra, R.; Gallamini, M.; Shiffer, R. Clinical stabilometry standardization. Gait Posture 2013, 37, 290–292. [Google Scholar] [CrossRef] [PubMed]
  33. Evans, O.M.; Goldie, P.A. Force platform measures for evaluating postural control: Reliability and validity. Arch. Phys. Med. Rehabil. 1989, 70, 510–517. [Google Scholar]
  34. Kirby, R.L.; Price, N.A.; MacLeod, D.A. The influence of foot position on standing balance. J. Biomech. 1987, 20, 423–427. [Google Scholar] [CrossRef]
  35. Winter, D.A. Human balance and posture control during standing and walking. Gait Posture 1995, 3, 193–214. [Google Scholar] [CrossRef]
  36. Zatsiorsky, V.M.; Duarte, M. Instant equilibrium point and its migration in standing tasks: Rambling and trembling components of the stabilogram. Motor Control 1999, 3, 28–38. [Google Scholar] [CrossRef]
  37. Morasso, P.G.; Spada, G.; Capra, R. Computing the COM from the COP in postural sway movements. Hum. Mov. Sci. 1999, 18, 759–767. [Google Scholar] [CrossRef]
38. Zatsiorsky, V.M.; Duarte, M. Rambling and trembling in quiet standing. Motor Control 2000, 4, 185–200. [Google Scholar] [CrossRef]
  39. Lin, D.; Seol, H.; Nussbaum, M.A.; Madigan, M.L. Reliability of COP-based postural sway measures and age-related differences. Gait Posture 2008, 28, 337–342. [Google Scholar] [CrossRef]
  40. Deshmukh, P.M.; Russell, C.M.; Lucarino, L.E.; Robinovitch, S.N. Enhancing clinical measures of postural stability with wearable sensors. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; pp. 4521–4524. [Google Scholar] [CrossRef]
  41. Roeing, K.L.; Hsieh, K.L.; Sosnoff, J.J. A systematic review of balance and fall risk assessments with mobile phone technology. Arch. Gerontol. Geriatr. 2017, 73, 222–226. [Google Scholar] [CrossRef]
Figure 1. Flow diagram.
Figure 2. Feet positions: a = Single leg, b = Feet together, c = Feet apart, d = Semi-tandem, e = Tandem.
Figure 3. Devices and arm positions: (a) Lumbar or sacral region, arms not reported [14,26]; (b) Lumbar or sacral region [15,27]; (c) Sternum, dominant hand [30]; (d) Sternum, both hands [17,28]; (e) Malleolus, patella, umbilicus [18]; (f) Left upper arm [29].
Table 1. Sample demographic characteristics (mean ± standard deviation).

| Author | Sample | Gender | Age (Years) | Height (cm) | Body Mass (kg) |
|---|---|---|---|---|---|
| Alberts et al., 2015 [26] | n = 49 | 22 male | 19.5 ± 3.1 | 167.7 ± 13.2 | 68.5 ± 17.5 |
| Alberts et al., 2015 [27] | n = 32 | 14 male | 20.9 ± NR | NR | NR |
| Kosse et al., 2015 [14] | n = 60 | 28 male | 26 ± 3.9 (young); 45 ± 6.7 (middle); 65 ± 5.5 (older) | NR | NR |
| Hsieh et al., 2019 [30] | n = 30 | 12 male | 64.8 ± 4.5 (Low RF); 72.3 ± 6.6 (High RF) | NR | NR |
| Ozinga et al., 2014 [15] | n = 12 | 5 male | 68.3 ± 6.9 | NR | NR |
| Patterson et al., 2014 [17] | n = 21 | 7 male | 23 ± 3.34 | 171.66 ± 10.2 | 82.76 ± 25.69 |
| Patterson et al., 2014 [28] | n = 30 | 13 male | 26.1 ± 8.5 | 170.1 ± 7.9 | 72.3 ± 15.5 |
| Shah et al., 2016 [18] | n = 48 | 21 male | 22 ± 2.5 | 175 ± 9.7 | 72.57 ± 1.29 |
| Yvon et al., 2015 [29] | n = 50 | 13 male | NR | NR | NR |

cm = centimeters, kg = kilograms, n = sample size, NR = not reported, RF = risk of falls.
Table 2. Tasks and balance assessment protocol.

| Author | Assessed Tasks | Feet Condition | Feet Position | Hands/Arms Position | Visual Input | Visual Reference |
|---|---|---|---|---|---|---|
| Alberts et al., 2015 [26] | Six conditions of the NeuroCom® SOT | According to SOT | According to SOT | According to SOT | EO/EC | According to SOT |
| Alberts et al., 2015 [27] | Six conditions of the BESS | Wearing socks | According to BESS | Resting on the iliac crests | EC | NA |
| Kosse et al., 2015 [14] | Two conditions: (1) quiet standing; (2) dual task (letter fluency test) | NR | Parallel; semi-tandem | NR | EO/EC | NR |
| Hsieh et al., 2019 [30] | (1) Quiet standing; (2) dual task (subtracting numbers) | Wearing socks | NC; semi-tandem; tandem; single leg | Dominant hand holding the phone medially against the chest | EO/EC | NR |
| Ozinga et al., 2014 [15] | Six conditions adapted from the BESS | Barefoot | According to BESS | Resting on the iliac crests | EO/EC | 3 m target |
| Patterson et al., 2014 [17] | Six conditions of the BESS (adapted); five conditions of the Sway Test | Shod | According to BESS | Holding the mobile at the sternum mid-point | EC | NA |
| Patterson et al., 2014 [28] | Single condition: Athlete's Single Leg Test | NR | Non-dominant foot stance | Holding the mobile at the sternum mid-point | EO | NR |
| Shah et al., 2016 [18] | Eight conditions | Barefoot | Apart; together; tandem | On the hips | EO/EC | 4.37 m target |
| Yvon et al., 2015 [29] | Romberg and tandem Romberg tests in sixteen conditions | NR | Apart; together; tandem | Arms at the sides | EO/EC | NR |

SOT = sensory organization test, EO = eyes open, EC = eyes closed, BESS = balance error scoring system, NA = not applicable, NR = not reported, NC = not clearly stated.
Table 3. Balance protocol procedures, devices and technical specifications.

| Author | Number of Trials | Total Time (Time Cropped) | Device I / Device II (Sampling Rate) | Device Position | App Used for Acquisition | Synchronization |
|---|---|---|---|---|---|---|
| Alberts et al., 2015 [26] | 3 | 20 s (NR) | iPad 2 (100 Hz) / NeuroCom® (100 Hz) | Sacrum | Sensor Data by Wavefront Labs | LabVIEW data collection program |
| Alberts et al., 2015 [27] | 1 | 20 s (NR) | iPad (SNR) (100 Hz) / Eagle 3D Motion Analysis System (100 Hz) | Sacrum | Cleveland Clinic Concussion | Arduino Pro Mini 3.3 V and an LED light |
| Kosse et al., 2015 [14] | 1 | 60 s (NR) | iPod Touch (88–92 Hz) / DynaPort® hybrid unit accelerometer (100 Hz) | L3 vertebra | iMoveDetection | Cross-correlation analysis |
| Hsieh et al., 2019 [30] | 2 | 30 s (NR) | Samsung Galaxy S6 (200 Hz) / Force plate (Bertec Inc., Columbus, OH, USA) (1000 Hz) | Sternum | NR | NR |
| Ozinga et al., 2014 [15] | 2 | 60 s (NR) | iPad 3 (100 Hz) / Eagle 3D Motion Analysis System (100 Hz) | Second sacral vertebra | Cleveland Clinic Balance Assessment | Arduino Pro Mini 3.3 V and an LED light |
| Patterson et al., 2014 [17] | 1 | 10 s STS; 20 s BESS (NA) | iPod Touch (NR) / NA | Sternum midpoint | SWAY Balance Mobile | NA |
| Patterson et al., 2014 [28] | 1 | 10 s (NR) | iPod Touch (NR) / Biodex© Balance System (NR) | Sternum midpoint | SWAY Balance Mobile | NR |
| Shah et al., 2016 [18] | 1 | NR | LG Optimus One (14–15 Hz) / NA | Malleolus, patella, umbilicus | myAnkle | NR |
| Yvon et al., 2015 [29] | 1 | 30 s (NR) | iPhone (SNR) / NA | Participant's left upper arm | D+R Balance | NR |

NR = not reported, SNR = specification of the device not reported, STS = Sway Test Software, NA = not applicable, BESS = balance error scoring system.
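As a practical note on the synchronization column of Table 3: Kosse et al. [14] report aligning the iPod Touch and DynaPort® signals by cross-correlation analysis. The snippet below is a minimal sketch of that general idea, not the authors' implementation: two acceleration traces recorded at different sampling rates are interpolated onto a common uniform grid and the lag that maximizes their cross-correlation is taken as the temporal offset. The function names, the 100 Hz target rate, and the synthetic signals are illustrative assumptions.

```python
# Sketch of cross-correlation-based synchronization of two sway recordings
# (e.g., a smartphone at ~92 Hz and a reference device at 100 Hz).
import numpy as np

def resample_uniform(t, x, fs=100.0):
    """Linearly interpolate an (possibly irregularly) sampled signal onto a uniform fs grid."""
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    return t_uniform, np.interp(t_uniform, t, x)

def estimate_lag(x, y, fs=100.0):
    """Return the lag (seconds) of y relative to x that maximizes their cross-correlation."""
    x = (x - np.mean(x)) / np.std(x)
    y = (y - np.mean(y)) / np.std(y)
    corr = np.correlate(y, x, mode="full")      # full cross-correlation
    lags = np.arange(-len(x) + 1, len(y))       # lag indices in samples
    return lags[np.argmax(corr)] / fs           # best-matching lag in seconds

# Synthetic example: the "phone" trace lags the reference by 0.5 s.
fs = 100.0
t_ref = np.arange(0, 30, 1 / fs)
ref = np.sin(2 * np.pi * 0.3 * t_ref)              # reference anteroposterior sway signal
t_phone = np.arange(0, 30, 1 / 92.0)                # phone sampled at ~92 Hz
phone = np.sin(2 * np.pi * 0.3 * (t_phone - 0.5))   # same sway, delayed by 0.5 s

_, phone_rs = resample_uniform(t_phone, phone, fs)
n = min(len(ref), len(phone_rs))
print(f"Estimated lag: {estimate_lag(ref[:n], phone_rs[:n], fs):.2f} s")  # ≈ 0.50 s
```

In practice the reference stream would come from the laboratory system listed in Table 3 (force plate, optical motion capture, or a validated inertial unit) and the mobile stream from the acquisition app, after which the estimated lag is used to align the two recordings before comparing sway measures.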
