Article

Use of a Single Wireless IMU for the Segmentation and Automatic Analysis of Activities Performed in the 3-m Timed Up & Go Test

by Paulina Ortega-Bastidas 1, Pablo Aqueveque 2,*, Britam Gómez 2, Francisco Saavedra 2 and Roberto Cano-de-la-Cuerda 3

1 Kinesiology Department, Faculty of Medicine, Universidad de Concepción, 4030000 Concepción, Chile
2 Electrical Engineering Department, Faculty of Engineering, Universidad de Concepción, 219 Edmundo Larenas St., 4030000 Concepción, Chile
3 Physiotherapy, Occupational Therapy, Rehabilitation and Physical Medicine Department, Universidad Rey Juan Carlos, 28922 Madrid, Spain
* Author to whom correspondence should be addressed.
Sensors 2019, 19(7), 1647; https://doi.org/10.3390/s19071647
Submission received: 15 February 2019 / Revised: 12 March 2019 / Accepted: 19 March 2019 / Published: 6 April 2019
(This article belongs to the Special Issue Wearable Sensors for Gait and Motion Analysis 2018)

Abstract

Falls represent a major public health problem in the elderly population. The Timed Up & Go test (TU & Go) is the most widely used tool to measure this risk of falling; it yields a single parameter, in seconds, that represents dynamic balance. However, it does not determine in which activity the subject presents greater difficulty. To address this, a feature-based segmentation method using a single wireless Inertial Measurement Unit (IMU) is proposed to analyze the inertial sensor data and provide a complete report on the risk of falls. Twenty-five young subjects and 12 older adults were measured with an IMU on the back and with video recording to validate the proposed method. The measurement system yielded data similar to the conventional, video-recorded test, with a Pearson correlation coefficient of 0.9884 and a mean error of 0.17 ± 0.13 s for the young subjects, and a correlation coefficient of 0.9878 and a mean error of 0.20 ± 0.22 s for the older adults. Our methodology automatically identifies all the TU & Go sub-tasks with a single IMU, providing information on variables such as: duration of sub-tasks, standing and sitting accelerations, rotation velocity during turning, number of steps during walking and turns, and the inclination of the trunk, in degrees, during standing and sitting.

1. Introduction

The number of people over 60 years of age is rapidly increasing worldwide. The main reasons for this demographic change are the increase in life expectancy and the fall in the birth rate [1]. This has become a public health problem, since aging is generally associated with a decrease in physical and psychological capacity, as well as an increase in the risk of disability, dependence and a number of comorbidities [1,2].
One of the consequences of aging is an increased risk of falls. Falls have been defined as accidental events in which the person falls after losing control of the center of gravity and either makes no effort to restore balance or does so ineffectively [3].
Falls represent a major public health problem in the elderly population. Approximately one third of the population over 65 years old experiences at least one fall per year, and this frequency increases by 50% in individuals older than 85 years. Between 20% and 30% of falls result in an injury that requires medical attention, constituting the leading cause of death or non-fatal injury in older adults [4,5,6]. Therefore, the early detection of decreased function in the elderly is important so that preventive measures can be initiated early to maintain functional independence [7].
Normally, rehabilitation professionals perform evaluations through observation, as well as through the application of scales or assessment instruments, which provide a certain level of objectivity to the evaluations. There is a large number of tests and scales for evaluating the static balance, dynamic balance and gait of healthy subjects or subjects with motor problems [8], the Timed Up & Go test (TU & Go) being the most widely used worldwide. This test measures dynamic balance and functional mobility in older adults, as well as in neurological populations [9,10,11].
The TU & Go is a simple test that can be performed anywhere. It consists of the subject rising from a chair from the sitting to the standing position, walking three meters, turning, returning, and sitting on the chair again, as illustrated in Figure 1. The measured variable is the total time taken to complete the test, in seconds, which correlates with the risk of falls [9,12].
Among the advantages of the TU & Go test are the simplicity of its application and its short duration. In addition, it requires little equipment and can be performed by people with a functional impairment. One limitation, however, is that the TU & Go, although it provides relevant information about the risk of falls, is not capable of objectively determining in which activities subjects have greater difficulties. Barry et al. [13] noted that the limited predictive value of the test could be explained by the fact that it is a single test evaluating overall balance and mobility; this can be improved by combining it with technological tools for movement analysis, such as optoelectronic laboratories with passive reflective markers, considered the gold-standard instrument for the analysis of human movement, or alternatives such as wireless motion sensors, e.g., inertial sensors or inertial measurement units (IMUs).
Optoelectronic laboratories, despite providing accurate measurements, are expensive and time-consuming to use, since training and experience are required to interpret the results. In addition, in several countries there are rural or remote locations without resources for these advanced technological evaluation systems.
In recent years, a variety of devices based on inertial sensors, applications and/or smartphones have been developed as low-cost alternatives to optoelectronic systems, allowing specific visualization of the phases in which subjects present greater problems, with the consequent probability of falling, during the application of the TU & Go test [14,15]. Different authors [7,8,11] indicate that the phases of the TU & Go test are: the transition from the sitting position to standing, walking towards the turning mark, turning, walking back to the chair, turning, and sitting back down. Phase-level analysis has been shown to increase the predictive value of the test with respect to the risk of falls and to reveal the phases in which subjects present greater difficulties.
Thus, it has been demonstrated that these phases or activities can be segmented automatically using inertial sensors with feature-based algorithms [11,16], complex algorithms based on machine learning [7,17], principal component analysis (PCA) [18], or techniques based on wavelet decomposition [19]. Feature-based algorithms have the advantage of being simple to implement, but their performance is diminished by the great variability in the morphology of the signals used for segmentation (angular velocity and acceleration), which depend on the environment and on the execution time of the activities to be identified. Algorithms based on machine learning respond well to the great variability of the features present in the signals to be processed, but they are complex to implement, as are those based on principal component analysis, and to guarantee good performance they require a large amount of data in which these variabilities are represented.
One way to reduce the disadvantages of feature-based methods is to use the orientation data from Inertial Measurement Units (IMUs) [14,20]. These orientation angles, measured at different parts of the body, have been used for the segmentation of activities during the execution of the test, since they have low inter-subject variability, allowing the characteristics of each phase to be extracted independently [21,22].
In relation to the above, this study presents an automatic segmentation methodology that uses a feature-based algorithm to identify the typical sub-tasks carried out during a three-meter TU & Go test in two groups of subjects (young and older adults), using the orientation angles of a single wireless IMU on the back (L3–L4, approximately) to analyze the inertial sensor data independently, in order to provide a complete report on the risk of falls and to promote the use of low-cost, objective, simple-to-use technological tools in hospitals or rehabilitation centers in rural or remote locations.

2. Material and Methods

2.1. Design and Setting

A study with a descriptive design is presented, in which experimental tests were performed to analyze the segmentation of the TU & Go test stages measured with a wireless IMU on the lower back, contrasted with video recordings. A total of 25 healthy young subjects (18 men and 7 women) between 25 and 33 years old and 12 elderly subjects (7 men and 5 women) between 59 and 93 years old were recruited in the city of Concepción, Chile.
Exclusion criteria were: diagnosis of a neurological, vestibular, musculoskeletal or systemic disease that could alter the ability to walk; diagnosis of any cardiovascular, respiratory or metabolic disease or other condition that could interfere with the present study; surgery on the trunk or lower limbs within the two years prior to the present study; the use of assistive devices for walking; and the presence of serious visual alterations that could alter the gait pattern.
The measurements were carried out at the Biomedical Engineering laboratory and in the facilities of the Kinesiology Department of the Universidad de Concepción. Prior to the measurements, the test was explained to the participants and two trials without data collection were performed to check each participant's understanding. Authorization was obtained through informed consent, under a protocol approved by the Biosecurity, Bioethical and Ethical Committee of the Universidad de Concepción (Number 3180551).

2.2. IMU Sensor

A custom-built IMU, developed at the Biomedical Engineering laboratory of the Universidad de Concepción, was used.
The sensor chip used [23] has a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer (see Table 1), as well as an embedded processor able to fuse the magnetic and inertial data using an extended Kalman filter and deliver the orientation accurately as quaternions, avoiding the singularities present in the Euler and navigation representations. The orientation data are then converted to an angle representation with an accuracy of ±1 deg. The data are sampled at 100 Hz by a low-cost 32-bit microcontroller with an Advanced RISC Machine (ARM, where RISC stands for reduced instruction set computer) Cortex-M0+ processor and sent to a software application via Bluetooth 3.0 over a maximum distance of 20 m without risk of occlusion. The entire system is powered by a 500-mAh LiPo battery, which gives 10 h of autonomy.
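For illustration, the standard conversion from a unit quaternion to roll, pitch and yaw angles is sketched below in Python. The paper reports no code, so NumPy and the function name are our assumptions; the sensor performs its fusion on-chip, and a conversion of this kind is only the final angle step a host application might apply.

```python
import numpy as np

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to (roll, pitch, yaw) in degrees.

    Illustrative host-side conversion; the paper uses the sensor's own
    fused output, so this only sketches the final angle-representation step.
    """
    # Roll: rotation about the antero-posterior axis
    roll = np.arctan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: trunk inclination; clip guards against NaN at the gimbal-lock poles
    pitch = np.arcsin(np.clip(2.0 * (w * y - z * x), -1.0, 1.0))
    # Yaw: heading, used later for turn detection
    yaw = np.arctan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return np.degrees(roll), np.degrees(pitch), np.degrees(yaw)
```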
Figure 2 shows the implemented sensor and its placement on the subjects.

2.3. Test Procedure

The developed sensor is positioned on the back at the height of L3–L4 for both young and elderly subjects. It has been shown that the use of a single IMU in that position allows for the detection of all gait events, biomechanical elements of the pelvis and other spatial and temporal kinematic factors [24,25,26].
The TU & Go test was performed following the recommendations of [22]. Three meters away from the chair, a cone marked the location where the patients had to make the turn. Before carrying out the tests, the procedure was explained with a demonstration to resolve any doubts. Each participant was recorded at 60 fps using a GOPRO HERO 7 high-resolution digital video camera (GoPro, Inc., San Mateo, CA, USA) in linear FOV (Field of View) mode to reduce image distortion.
The older adults group performed three repetitions of the TU & Go test, and the repetition with the longest completion time was used for the analysis. The young group performed only one test.
Figure 3 shows the set up used for the measurements.

2.4. Segmentation Algorithm

An algorithm was designed for the segmentation of the standing, walking, turning and sitting activities. This algorithm processes the inclination ($Pitch$) and rotation ($Yaw$) signals of an inertial sensor placed on the back of the subject at L3–L4, approximately.

2.4.1. Standing/Sitting Events Identification

To determine the standing and sitting events, the $Pitch$ signal was used, which captures the trunk inclination during the activities to be identified. When standing up or sitting down, a subject tends to lean slightly forward with respect to the resting position until the initial inclination is regained (see Figure 4), which can be observed even in subjects with reduced mobility [27].
In order to condition the $Pitch$ signal, it is smoothed by an average filter of order N = 5 (see Equation (1)) and normalized by the absolute maximum of the smoothed signal, as seen in Equation (2):

$$Pitch_{smooth}(n) = \frac{1}{N}\left[Pitch(n) + Pitch(n+1) + \dots + Pitch(n+N-1)\right], \tag{1}$$

$$Pitch_{norm}(n) = \frac{Pitch_{smooth}(n)}{\max(|Pitch_{smooth}|)}. \tag{2}$$
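A minimal sketch of this conditioning step follows. The authors analyzed their data in MATLAB; the Python/NumPy rendering and the function name here are illustrative.

```python
import numpy as np

def condition(signal, N=5):
    """Moving-average smoothing of order N (Eq. (1)) followed by
    normalization by the absolute maximum of the smoothed signal (Eq. (2))."""
    kernel = np.ones(N) / N
    smooth = np.convolve(signal, kernel, mode="valid")  # N-sample average
    return smooth / np.max(np.abs(smooth))
```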
Then, the $Pitch$ signal is processed using a local maximum detector, finding $T_{peak1}$ and $T_{peak2}$, which correspond to the maximum angles of inclination in the standing and sitting events, respectively (see Figure 5).
Then, the difference between samples to the left of $T_{peak1}$ is calculated, as indicated in Equation (3), and to the right of $T_{peak1}$, as indicated in Equation (4), to search for the standing action, where $i$ is a sample iterator. The start and end of the standing action are then obtained using Equations (5) and (6), respectively, with a factor of 0.05, found experimentally, as the slope-stop threshold:

$$\Delta_1 = |Pitch_{norm}(T_{peak1} - i - 0.1) - Pitch_{norm}(T_{peak1} - i)| < 0.05, \quad i = 0, 0.1, 0.2, \dots \tag{3}$$

$$\Delta_2 = |Pitch_{norm}(T_{peak1} + i + 0.1) - Pitch_{norm}(T_{peak1} + i)| < 0.05, \quad i = 0, 0.1, 0.2, \dots \tag{4}$$

$$standing_i = T_{peak1} - i, \tag{5}$$

$$standing_f = T_{peak1} + i. \tag{6}$$
Similarly, through the same methodology, the sitting action is sought using the difference between samples to the left of $T_{peak2}$, as indicated in Equation (7), and to the right of $T_{peak2}$, as indicated in Equation (8). The start and end of the sitting action are then obtained using Equations (9) and (10), respectively:

$$\Delta_3 = |Pitch_{norm}(T_{peak2} - i - 0.1) - Pitch_{norm}(T_{peak2} - i)| < 0.05, \quad i = 0, 0.1, 0.2, \dots \tag{7}$$

$$\Delta_4 = |Pitch_{norm}(T_{peak2} + i + 0.1) - Pitch_{norm}(T_{peak2} + i)| < 0.05, \quad i = 0, 0.1, 0.2, \dots \tag{8}$$

$$sitting_i = T_{peak2} - i, \tag{9}$$

$$sitting_f = T_{peak2} + i. \tag{10}$$
Figure 6 shows the final result of the search method and Figure 7 shows the proposed standing/sitting activities identification method.
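The standing/sitting search can be summarized in code as follows. This is a sketch under the definitions of Equations (3)–(10), assuming the 100 Hz sampling rate stated earlier; the peak-picking gate and the one-sample search step are illustrative choices, not parameters taken from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 100              # sampling rate [Hz]
LAG = int(0.1 * FS)   # 0.1-s comparison lag of Eqs. (3)-(4) and (7)-(8)
SLOPE_STOP = 0.05     # experimentally found slope-stop threshold

def event_bounds(pitch_norm, t_peak):
    """Walk outwards from an inclination peak until the local slope
    flattens below SLOPE_STOP, returning (start, end) sample indices."""
    start = t_peak
    while start - LAG > 0 and abs(pitch_norm[start - LAG] - pitch_norm[start]) >= SLOPE_STOP:
        start -= 1
    end = t_peak
    while end + LAG < len(pitch_norm) and abs(pitch_norm[end + LAG] - pitch_norm[end]) >= SLOPE_STOP:
        end += 1
    return start, end

def detect_standing_sitting(pitch_norm):
    """Locate T_peak1 (standing) and T_peak2 (sitting) as the two most
    prominent maxima of the normalized Pitch, then bound each event."""
    peaks, props = find_peaks(pitch_norm, prominence=0.2)  # illustrative gate
    top_two = peaks[np.argsort(props["prominences"])[-2:]]
    t_peak1, t_peak2 = np.sort(top_two)  # standing occurs before sitting
    return event_bounds(pitch_norm, t_peak1), event_bounds(pitch_norm, t_peak2)
```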

2.4.2. Turning Events Identification

To determine the turn around the 3-m mark and the turn prior to sitting, the $Yaw$ signal is used, which captures changes in the orientation of the sensor during turns. When performing the TU & Go test, the subject follows a circuit that requires turns of approximately 180 degrees, which can be unequivocally measured by the sensor on the back, since during turns the sensor changes its orientation together with the trunk (see Figure 8).
To condition the orientation change signal, as with the inclination signal, it is smoothed by an average filter of order N = 5 (see Equation (11)) and normalized by the absolute maximum of the signal, as seen in Equation (12):

$$Yaw_{smooth}(n) = \frac{1}{N}\left[Yaw(n) + Yaw(n+1) + \dots + Yaw(n+N-1)\right], \tag{11}$$

$$Yaw_{norm}(n) = \frac{Yaw_{smooth}(n)}{\max(|Yaw_{smooth}|)}. \tag{12}$$
Then, the derivative of the rotation signal is computed and processed to identify the maximum value $\max(d/dt)$ and the minimum value $\min(d/dt)$, which are used to identify the start and end points of the first turn and the second turn, respectively, as seen in Figure 9.
Thus, through a sliding window of 0.1 s, or 10 samples (Equations (13) and (14)), the start of the 3-m turning is sought by calculating the mean of the samples in the window to the left of $\max(d/dt)$, and the end of the 3-m turning by calculating the mean of the samples in the window to the right of $\max(d/dt)$, using Equations (15) and (16), respectively:

$$W_1 = [T_{\max(d/dt)} - 10 - i,\; T_{\max(d/dt)} - i], \tag{13}$$

$$W_2 = [T_{\max(d/dt)} + i,\; T_{\max(d/dt)} + 10 + i], \tag{14}$$

$$\overline{W_1} > 0.02, \quad i = 0, 0.01, 0.02, 0.03, \dots \tag{15}$$

$$\overline{W_2} < 0.9, \quad i = 0, 0.01, 0.02, 0.03, \dots \tag{16}$$

Finally, the start and the end of the 3-m turning are obtained through Equations (17) and (18), respectively:

$$turn1_i = \min(W_1), \tag{17}$$

$$turn1_f = \max(W_2). \tag{18}$$
Then, using a sliding window of 0.1 s, or 10 samples (Equations (19) and (20)), the start of the pre-sitting turning is sought by calculating the mean of the samples in the window to the left of $\min(d/dt)$, and the end of the pre-sitting turning by calculating the mean of the samples in the window to the right of $\min(d/dt)$, using Equations (21) and (22), respectively:

$$W_3 = [T_{\min(d/dt)} - 10 - i,\; T_{\min(d/dt)} - i], \tag{19}$$

$$W_4 = [T_{\min(d/dt)} + i,\; T_{\min(d/dt)} + 10 + i], \tag{20}$$

$$\overline{W_3} < 0.9, \quad i = 0, 0.01, 0.02, 0.03, \dots \tag{21}$$

$$\overline{W_4} > 0.02, \quad i = 0, 0.01, 0.02, 0.03, \dots \tag{22}$$

Similarly, the start and the end of the pre-sitting turning are obtained through Equations (23) and (24), respectively:

$$turn2_i = \min(W_3), \tag{23}$$

$$turn2_f = \max(W_4). \tag{24}$$
Figure 10 shows the final result of the search method and Figure 11 indicates the proposed turning activities’ identification method.
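A compact sketch of the turn search under the window definitions and thresholds of Equations (13)–(24) is given below. Reading the window-mean inequalities as conditions that hold while the window keeps sliding, and using `np.gradient` for the derivative, are our assumptions; the thresholds come from the text.

```python
import numpy as np

WIN = 10  # 0.1-s window at 100 Hz (Eqs. (13)-(14) and (19)-(20))

def turn_bounds(yaw_norm, t_ext, left_done, right_done):
    """Slide 0.1-s windows outwards from a derivative extremum t_ext
    until their means satisfy the plateau conditions, then return the
    outer edges (turn_i, turn_f) as in Eqs. (17)-(18) and (23)-(24)."""
    i = 0
    while t_ext - WIN - i > 0 and not left_done(np.mean(yaw_norm[t_ext - WIN - i:t_ext - i])):
        i += 1
    j = 0
    while t_ext + WIN + j < len(yaw_norm) and not right_done(np.mean(yaw_norm[t_ext + j:t_ext + WIN + j])):
        j += 1
    return t_ext - WIN - i, t_ext + WIN + j

def detect_turns(yaw_norm):
    d_yaw = np.gradient(yaw_norm)  # derivative of the rotation signal (Fig. 9)
    # 3-m turn: anchored at max(d/dt); pre-turn plateau near 0, post-turn near 1.
    turn1 = turn_bounds(yaw_norm, int(np.argmax(d_yaw)),
                        left_done=lambda m: m <= 0.02, right_done=lambda m: m >= 0.9)
    # Pre-sitting turn: anchored at min(d/dt); plateau conditions mirrored (Eqs. (21)-(22)).
    turn2 = turn_bounds(yaw_norm, int(np.argmin(d_yaw)),
                        left_done=lambda m: m >= 0.9, right_done=lambda m: m <= 0.02)
    return turn1, turn2
```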

3. Results

The data were analyzed using MATLAB R2017b (The MathWorks, Inc., Natick, MA, USA) to obtain the results presented in this section. The video recording was analyzed using the Wondershare Filmora version 8.4.0 video editor (Wondershare Software Ltd., Shenzhen, China).

3.1. IMU Measurements Validation versus the Standard Clinical Procedure

To evaluate the performance of the proposed methodology against the typical visual clinical procedure, the total time of each TU & Go test captured on video for the young population was tabulated as the average of the times observed frame by frame by two different evaluators, to compensate for the bias introduced by the subjectivity present in real practice. From these data, the measurement error and the Pearson correlation coefficient between the video-derived and IMU-derived total times were computed to assess the strength of their association. The results are tabulated in Table 2 and shown in Figure 12.
The measurement system with the proposed segmentation algorithm and methodology delivered data similar to those obtained by the observational clinical application of the TU & Go test measured in the videos for the young population (see Figure 12), with a Pearson correlation coefficient of 0.9884 (see Figure 12a) and a mean error of 0.17 ± 0.13 s, as shown in Figure 12b.
The proposed methodology was also applied to the older adults to evaluate its performance in the target population of this procedure. Again, two different evaluators independently analyzed the videos of the tests carried out by each subject, obtaining the results of Table 3 and Figure 13.
The measurement system with the proposed methodology performed similarly to the case of the healthy young subjects, with a Pearson correlation coefficient of 0.9878 (see Figure 13a) and a mean error of 0.20 ± 0.22 s, as shown in Figure 13b. Moreover, the proposed methodology correctly classified 92% of the measured subjects according to their risk of falling, based on the total test time assessed by the IMU, including a subject with a high risk of falling (see subject 12 of Table 3).

3.2. Activities Segmentation Analysis of the TU & Go Test

To evaluate the performance of the proposed segmentation algorithm, the recorded videos of each subject's test were analyzed and the time of each stage was tabulated for comparison with the times obtained by processing the inclination ($Pitch$) and rotation ($Yaw$) data of the sensor.
Figure 14 shows the correlations of the activity segmentation times for the young subjects. The lowest correlation is obtained for the transition between the end of the standing and the start of the first walk, with a Pearson correlation coefficient of 0.8138 (see Figure 14a). The best correlation occurs for the transition between the end of the 3-m turning and the start of the second walk, with a Pearson correlation coefficient of 0.9854 (see Figure 14d). This indicates that the measurement system has a high degree of agreement with visual segmentation.
Figure 15 shows the measurement errors of the proposed segmentation methodology for each sub-task of the TU & Go test in the young subjects, compared to the video analysis. In Figure 15a, it can be observed that the segmentation algorithm identifies the transition between the end of the standing and the start of the first walk with an average error of −0.02 s. Figure 15b shows that the transition between the end of the first walk and the start of the 3-m turning is identified with an average error of 0.36 s. Figure 15c shows that the transition between the end of the 3-m turning and the start of the second walk is identified with an average error of 0.11 s. Figure 15d shows that the transition between the end of the second walk and the start of the pre-sitting turn is identified with an average error of 0.25 s. The end of the pre-sitting turning is identified with an average error of 0.16 s (see Figure 15e). Regarding the sitting sub-task, the algorithm identifies the start with an average error of 0.18 s (see Figure 15f) and the end with an average error of 0.24 s (see Figure 15g).
Figure 16 shows the measurement errors of the proposed segmentation methodology for each sub-task of the TU & Go test in the older adults group, compared to the video analysis. In Figure 16a, it can be observed that the segmentation algorithm identifies the transition between the end of the standing and the start of the first walk with an average error of 0.07 s. Figure 16b shows that the transition between the end of the first walk and the start of the 3-m turning is identified with an average error of 0.29 s. Figure 16c shows that the transition between the end of the 3-m turning and the start of the second walk is identified with an average error of 0.43 s. Figure 16d shows that the transition between the end of the second walk and the start of the pre-sitting turn is identified with an average error of 0.63 s. The end of the pre-sitting turning is identified with an average error of 0.21 s (see Figure 16e). Regarding the sitting sub-task, the algorithm identifies the start with an average error of 0.25 s (see Figure 16f) and the end with an average error of 0.26 s (see Figure 16g).

3.3. Characterization of the Signals Acquired from the Stages of the TU & Go Test

After the segmentation, each action performed by the subjects is characterized independently, and the following variables are obtained: duration of each sub-task, standing acceleration (Acc. Su), sitting acceleration (Acc. Sd), rotation velocity of the 3-m turning (Vel. T1), rotation velocity of the pre-sitting turning (Vel. T2), number of steps during the first walk (W1), number of steps during the second walk (W2), number of steps during the 3-m turning (T1), number of steps during the pre-sitting turning (T2), and inclination of the trunk, in degrees, during standing (Pitch Su) and sitting (Pitch Sd), as shown in Figure 17.
The durations of the sub-tasks performed by the young subjects are shown in Figure 18 and in Table 4. Table 5 presents the parameters extracted from the angular velocity and acceleration signals of the IMU's inertial sensors.
Figure 18a illustrates that the standing times obtained by the segmentation algorithm have a distribution close to the values observed in the video recordings, with an average error of −0.08 ± 0.15 s. Figure 18b presents one of the most dissimilar distributions, for the first walk times, with an average error of 0.42 ± 0.20 s. Figure 18c presents a less dispersed distribution for the 3-m turning times than that tabulated from the video recording, estimated with an error of −0.19 ± 0.21 s. Regarding the second walk times (see Figure 18d) and sitting times (see Figure 18f), estimations close to the values tabulated from the video analysis are observed, with errors of 0.17 ± 0.11 s and −0.01 ± 0.19 s, respectively. Finally, the pre-sitting turning times are estimated with an error of −0.08 ± 0.11 s, despite the slight difference in the distributions observed in Figure 18e.
On the other hand, the durations of the sub-tasks performed by the older adults group are shown in Figure 19 and in Table 6. Table 7 presents the parameters extracted from the angular velocity and acceleration signals of the IMU's inertial sensors.
In Figure 19, it can be observed that the proposed method delivers distributions similar to those measured by video, but less variable, owing to the subjectivity involved in visually discriminating the transition from one sub-task to another. The atypical data points in all the sub-plots of Figure 19 correspond to the measured subject with a high risk of falling, demonstrating that the algorithm and the measurement methodology identify this subject as well as the visual method does, but automatically. In general, the data presented in Table 6 indicate that the older adults group takes longer than the young subjects to perform each sub-task (see Table 4), while maintaining an error similar to that observed with respect to the video.
From Table 5 and Table 7, it can be observed that the parameter that differs most between the two measured populations, young and older adults, is the maximum speed of both turnings, which is lower in the elderly subjects than in the young subjects. The largest numbers of steps measured for each walking sub-task in Table 7 belong to subject 12, who has a high risk of falls; the step count can also be measured automatically using a simple local peak detection algorithm on the acceleration signal once the segmentation is performed, as sketched below.
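A sketch of such a step counter follows, under the description above; the minimum step interval and peak prominence are illustrative tuning values, not parameters reported in the paper.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 100  # sampling rate [Hz]

def count_steps(acc_z, start, end, min_step_s=0.3, prominence=0.5):
    """Count steps inside one segmented sub-task by detecting local
    peaks of the vertical acceleration (roughly one peak per step)."""
    segment = np.asarray(acc_z[start:end], dtype=float)
    peaks, _ = find_peaks(segment,
                          distance=int(min_step_s * FS),  # refractory period between steps
                          prominence=prominence)          # reject small oscillations
    return len(peaks)
```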

4. Discussion

In this study, a segmentation of the TU & Go test activities using a single wireless IMU was performed in two different age groups. To this end, the measurements obtained by the typical observational analysis were compared with those of the proposed methodology, the TU & Go test activities were segmented, and the inertial signals acquired from the test stages were characterized, with interesting results.
Effective treatment, specifically of gait disturbances, and assessment of the risk of falls require reliable tools. Three-dimensional motion capture systems are the gold standard for gait assessment but, due to their space and time requirements and the high cost of the equipment, their use in clinical practice is far from routine [29]. In addition, only a limited number of consecutive strides can be measured, and the required cameras and markers may limit their use in clinical practice.
In this context, IMUs may be used to assess gait performance and the risk of falls. These technologies present several advantages: they are portable, low-cost and lightweight; they are good at measuring accelerations and turns; they are suitable for measuring brief, high-speed events; they can be used indoors and outdoors regardless of lighting conditions; and they can evaluate continuously over long periods of time. However, their usefulness as clinical tools must still be demonstrated [30]. Thus, these technological devices must be assessed in terms of clinical feasibility and psychometric properties [31,32].
The proposed model obtained a high correlation with respect to the video recording (see Figure 12), allowing all the events to be identified with only a single wireless IMU whose orientation signals present low inter-subject variability. This allows the characteristics of each phase to be extracted independently, in contrast with other works that use a body-fixed sensor array [11,16]. In general, during the proof of concept of the proposed methodology in young subjects, a high degree of concordance was obtained with respect to the segmentation performed by video, as demonstrated by the correlation coefficients of Figure 14. In the case of the older adults group, the correlation analysis was omitted because the results could be unrepresentative due to the low number of subjects and the atypical data of the subject with a high risk of falls (see subject 12 of Table 3). The worst correlation was found in identifying the moment at which the person finishes the sit-to-stand transition, due to the difficulty of visually identifying the exact point of the end of the standing and the beginning of the first walk sub-task. Nevertheless, the proposed analysis algorithm was capable of identifying the transition points with a maximum average error of 0.36 s for young subjects (see Figure 15b) and a maximum average error of 0.63 s for older adults (see Figure 16d).
The segmentation presented in this study adds objectivity to the clinical evaluation of patients' performance. By observing the standing/sitting phases using $Pitch$, it is possible to determine the inclination of the trunk during the sit-to-walk transfer in young subjects (see Table 5) and older adults (see Table 7). This is relevant because, as mentioned by Pozaic et al., the transition from sitting to walking is the event with the highest number of falls in the elderly population [33].
On the other hand, the analysis of the turning phases through $Yaw$ allows the degrees turned and the time the subject takes to carry out this activity to be observed (Figure 18c,e). The ability to turn safely is relevant to functional independence, turning being another difficult task with a high risk of falling [34,35].
Although the conventional TU & Go test is a clinical tool that allows the risk of falling to be measured, it uses a single global parameter to estimate it, and the stage that hinders execution of the task is not specifically discriminated. The current methodology provides information on variables such as: standing acceleration (Acc. Su), sitting acceleration (Acc. Sd), rotation velocity of the 3-m turning (Vel. T1), rotation velocity of the pre-sitting turning (Vel. T2), number of steps during the first walk (Steps W1), number of steps during the second walk (Steps W2), number of steps during the 3-m turning (Steps T1), number of steps during the pre-sitting turning (Steps T2), and inclination of the trunk during standing (Pitch Su) and sitting (Pitch Sd) (see Table 5).
Other authors [36,37,38,39] have explored the use of IMU technology for gait assessment during the TU & Go test. However, the present approach segments the transitions between each sub-task exactly and automatically, using a simple algorithm and a low-cost movement sensor, and, thanks to the positioning of the IMU, allows the characteristics of each sub-task to be extracted in both the young and older adults groups. This can be extrapolated to 6-m and 10-m tests to extract more gait parameters during the first and second walks.
This study presents some limitations. First, the segmentation of each event during the analysis of the videos was performed observationally. Second, it is not possible to determine how the algorithm would perform in other populations, such as people with neurological disorders. However, the recruited sample was intended as a starting point toward validating the algorithm and sensors in pathological populations in future research. Clinical validation studies of these devices should be carried out in populations with specific characteristics related to gait and balance impairments.

5. Conclusions

In conclusion, an IMU located on the back can detect the main gait events and spatio-temporal kinematic factors during the TU & Go test, with excellent correlation with the conventional visual clinical procedure in young and older adults. Thus, this user-friendly technological tool may offer health-care professionals objective measurements for the segmentation of activities such as standing, walking, turning and sitting, which are related to critical events associated with the risk of falls.

Author Contributions

Conceptualization, P.O.B., B.G., P.A. and R.C.d.l.C.; Methodology, P.O.B., B.G. and R.C.d.l.C.; Software, B.G. and F.S.; Data curation, P.O.B. and B.G.; Formal analysis, B.G.; Resources, P.O.B., B.G., F.S. and P.A.; Writing—original draft, P.O.B. and B.G.; Writing—review and editing, P.O.B. and B.G.; Supervision, P.A. and R.C.d.l.C.

Funding

This research was funded by FONDECYT Postdoctorado grant No. 3180551.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jeon, M.Y.; Jeong, H.; Petrofsky, J.; Lee, H.; Yim, J. Effects of a randomized controlled recurrent fall prevention program on risk factors for falls in frail elderly living at home in rural communities. Med. Sci. Monit. 2014, 20, 2283. [Google Scholar]
  2. Kingston, A.; Wohland, P.; Wittenberg, R.; Robinson, L.; Brayne, C.; Matthews, F.E.; Jagger, C.; Green, E.; Gao, L.; Barnes, R.; et al. Is late-life dependency increasing or not? A comparison of the Cognitive Function and Ageing Studies (CFAS). Lancet 2017, 390, 1676–1684. [Google Scholar] [CrossRef]
  3. Sharif, S.I.; Al-Harbi, A.B.; Al-Shihabi, A.M.; Al-Daour, D.S.; Sharif, R.S. Falls in the elderly: Assessment of prevalence and risk factors. Pharm. Pract. 2018, 16, 1206. [Google Scholar] [CrossRef] [PubMed]
  4. Talarska, D.; Strugała, M.; Szewczyczak, M.; Tobis, S.; Michalak, M.; Wróblewska, I.; Wieczorowska-Tobis, K. Is independence of older adults safe considering the risk of falls? BMC Geriatr. 2017, 17, 66. [Google Scholar] [CrossRef] [PubMed]
  5. Qiu, H.; Rehman, R.Z.U.; Yu, X.; Xiong, S. Application of Wearable Inertial Sensors and a New Test Battery for Distinguishing Retrospective Fallers from Non-fallers among Community-dwelling Older People. Sci. Rep. 2018, 8, 16349. [Google Scholar] [CrossRef] [PubMed]
  6. Sun, D.Q.; Huang, J.; Varadhan, R.; Agrawal, Y. Race and fall risk: data from the National Health and Aging Trends Study (NHATS). Age Ageing 2016, 45, 120–127. [Google Scholar] [CrossRef] [PubMed]
  7. Hellmers, S.; Izadpanah, B.; Dasenbrock, L.; Diekmann, R.; Bauer, J.; Hein, A.; Fudickar, S. Towards an Automated Unsupervised Mobility Assessment for Older People Based on Inertial TUG Measurements. Sensors 2018, 18, 3310. [Google Scholar] [CrossRef]
  8. Park, S.H. Tools for assessing fall risk in the elderly: A systematic review and meta-analysis. Aging Clin. Exp. Res. 2018, 30, 1–16. [Google Scholar] [CrossRef] [PubMed]
  9. Zasadzka, E.; Borowicz, A.M.; Roszak, M.; Pawlaczyk, M. Assessment of the risk of falling with the use of timed up and go test in the elderly with lower extremity osteoarthritis. Clin. Interv. Aging 2015, 10, 1289. [Google Scholar] [CrossRef] [PubMed]
  10. Persson, C.U.; Danielsson, A.; Sunnerhagen, K.S.; Grimby-Ekman, A.; Hansson, P.O. Timed Up & Go as a measure for longitudinal change in mobility after stroke–Postural Stroke Study in Gothenburg (POSTGOT). J. Neuroeng. Rehabil. 2014, 11, 83. [Google Scholar]
  11. Nguyen, H.; Lebel, K.; Boissy, P.; Bogard, S.; Goubault, E.; Duval, C. Auto detection and segmentation of daily living activities during a Timed Up and Go task in people with Parkinson’s disease using multiple inertial sensors. J. Neuroeng. Rehabil. 2017, 14, 26. [Google Scholar] [CrossRef] [PubMed]
  12. Podsiadlo, D.; Richardson, S. The timed “Up & Go”: A test of basic functional mobility for frail elderly persons. J. Am. Geriatr. Soc. 1991, 39, 142–148. [Google Scholar] [PubMed]
  13. Barry, E.; Galvin, R.; Keogh, C.; Horgan, F.; Fahey, T. Is the Timed Up and Go test a useful predictor of risk of falls in community dwelling older adults: A systematic review and meta-analysis. BMC Geriatr. 2014, 14, 14. [Google Scholar] [CrossRef] [PubMed]
  14. Galán-Mercant, A.; Barón-López, F.J.; Labajos-Manzanares, M.T.; Cuesta-Vargas, A.I. Reliability and criterion-related validity with a smartphone used in timed-up-and-go test. BioMed. Eng. Online 2014, 13, 156. [Google Scholar] [CrossRef]
  15. Kleiner, A.F.R.; Pacifici, I.; Vagnini, A.; Camerota, F.; Celletti, C.; Stocchi, F.; De Pandis, M.F.; Galli, M. Timed Up and Go evaluation with wearable devices: Validation in Parkinson’s disease. J. Bodyw. Mov. Ther. 2018, 22, 390–395. [Google Scholar] [CrossRef]
  16. Weiss, A.; Mirelman, A.; Buchman, A.S.; Bennett, D.A.; Hausdorff, J.M. Using a body-fixed sensor to identify subclinical gait difficulties in older adults with IADL disability: Maximizing the output of the timed up and go. PLoS ONE 2013, 8, e68885. [Google Scholar] [CrossRef]
  17. Nguyen, H.; Lebel, K.; Bogard, S.; Goubault, E.; Boissy, P.; Duval, C. Using inertial sensors to automatically detect and segment activities of daily living in people with Parkinson’s disease. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 197–204. [Google Scholar] [CrossRef]
  18. Tanaka, N.; Zakaria, N.A.; Kibinge, N.K.; Kanaya, S.; Tamura, T.; Yoshida, M. Fall-risk classification of the timed up-and-go test with principle component analysis. Int. J. Neurorehabilit. 2014, 1, 106. [Google Scholar]
  19. Soangra, R.; Lockhart, T.E.; Van de Berge, N. An approach for identifying gait events using wavelet denoising technique and single wireless IMU. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2011, 55, 1990–1994. [Google Scholar] [CrossRef]
  20. Lebel, K.; Nguyen, H.; Duval, C.; Plamondon, R.; Boissy, P. Capturing the cranio-caudal signature of a turn with inertial measurement systems: methods, parameters robustness and reliability. Front. Bioeng. Biotechnol. 2017, 5, 51. [Google Scholar] [CrossRef] [PubMed]
  21. Vervoort, D.; Vuillerme, N.; Kosse, N.; Hortobágyi, T.; Lamoth, C.J. Multivariate analyses and classification of inertial sensor data to identify aging effects on the Timed-Up-and-Go test. PLoS ONE 2016, 11, e0155984. [Google Scholar] [CrossRef]
  22. Salarian, A.; Horak, F.B.; Zampieri, C.; Carlson-Kuhta, P.; Nutt, J.G.; Aminian, K. iTUG, a sensitive and reliable measure of mobility. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 303–310. [Google Scholar] [CrossRef]
  23. Bosch Sensortec. Intelligent 9-Axis Absolute Orientation Sensor. BNO055 Datasheet. 2014. Available online: https://easyeda.com/gerrychen/BNO055_Project-aPDdNKIiS (accessed on 5 February 2019).
  24. Kose, A.; Peruzzi, A.; Cereatti, A.; Laudani, L.; Della Croce, U. Detection of heel strikes and toe-offs during gait using a single inertial measurement unit attached to the waist. In Proceedings of the Second National Congress of Bioengineering, Turin, Italy, 8–10 July 2010; Volume 233. [Google Scholar]
  25. Weinberg, H. Using the ADXL202 in Pedometer and Personal Navigation Applications. Analog Devices AN-602 Application Note. 2002, Volume 2, pp. 1–6. Available online: http://www.bdtic.com/download/adi/513772624an602.pdf (accessed on 5 February 2019).
  26. Sejdić, E.; Lowry, K.A.; Bellanca, J.; Perera, S.; Redfern, M.S.; Brach, J.S. Extraction of stride events from gait accelerometry during treadmill walking. IEEE J. Transl. Eng. Health Med. 2016, 4, 1–11. [Google Scholar] [CrossRef]
  27. Pham, M.H.; Warmerdam, E.; Elshehabi, M.; Schlenstedt, C.; Bergeest, L.M.; Heller, M.; Haertner, L.; Ferreira, J.; Berg, D.; Schmidt, G.; et al. Validation of a lower back “wearable”-based sit-to-stand and stand-to-sit algorithm for patients with Parkinson’s disease and older adults in a home-like environment. Front. Neurol. 2018, 9, 652. [Google Scholar] [CrossRef]
  28. Mancilla, E.; Valenzuela, J.; Escobar, M. Timed up and go right and left unipodal stance results in Chilean older people with different degrees of disability. Rev. Med. Chile 2015, 143, 39–46. [Google Scholar]
  29. Gor-Garcia-Fogeda, M.D.; de la Cuerda, R.C.; Tejada, M.C.; Alguacil-Diego, I.M.; Molina-Rueda, F. Observational gait assessments in people with neurological disorders: A systematic review. Arch. Phys. Med. Rehabil. 2016, 97, 131–140. [Google Scholar] [CrossRef]
  30. Linares-Del Rey, M.; Vela-Desojo, L.; Cano-de la Cuerda, R. Mobile phone applications in Parkinson’s disease: A systematic review. Neurología 2019, 34, 38–54. (In English) [Google Scholar] [CrossRef]
  31. Hobart, J.C.; Lamping, D.L.; Thompson, A.J. Evaluating neurological outcome measures: the bare essentials. J. Neurol. Neurosurg. Psychiatry 1996, 60, 127. [Google Scholar] [CrossRef]
  32. Andresen, E.M. Criteria for assessing the tools of disability outcomes research. Arch. Phys. Med. Rehabil. 2000, 81, S15–S20. [Google Scholar] [CrossRef]
  33. Pozaic, T.; Lindemann, U.; Grebe, A.K.; Stork, W. Sit-to-stand transition reveals acute fall risk in activities of daily living. IEEE J. Transl. Eng. Health Med. 2016, 4, 1–11. [Google Scholar] [CrossRef]
  34. Chan, W.N.; Tsang, W.W. The performance of stroke survivors in turning-while-walking while carrying out a concurrent cognitive task compared with controls. PLoS ONE 2017, 12, e0189800. [Google Scholar] [CrossRef]
  35. Mancini, M.; Schlueter, H.; El-Gohary, M.; Mattek, N.; Duncan, C.; Kaye, J.; Horak, F.B. Continuous monitoring of turning mobility and its association to falls and cognitive function: A pilot study. J. Gerontol. Ser. A Biomed. Sci. Med. Sci. 2016, 71, 1102–1108. [Google Scholar] [CrossRef]
  36. Zampieri, C.; Salarian, A.; Carlson-Kuhta, P.; Nutt, J.G.; Horak, F.B. Assessing mobility at home in people with early Parkinson’s disease using an instrumented Timed Up and Go test. Parkinsonism Relat. Disord. 2011, 17, 277–280. [Google Scholar] [CrossRef]
  37. Beyea, J.; McGibbon, C.A.; Sexton, A.; Noble, J.; O’Connell, C. Convergent validity of a wearable sensor system for measuring sub-task performance during the timed Up-and-Go Test. Sensors 2017, 17, 934. [Google Scholar] [CrossRef]
  38. Pedrana, A.; Comotti, D.; Locatelli, P.; Traversi, G. Development of a telemedicine-oriented gait analysis system based on inertial sensors. In Proceedings of the 2018 7th International Conference on Modern Circuits and Systems Technologies (MOCAST), Thessaloniki, Greece, 7–9 May 2018; pp. 1–4. [Google Scholar]
  39. Reinfelder, S.; Hauer, R.; Barth, J.; Klucken, J.; Eskofier, B.M. Timed Up-and-Go phase segmentation in Parkinson’s disease patients using unobtrusive inertial sensors. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milano, Italy, 25–29 August 2015; pp. 5171–5174. [Google Scholar]
Figure 1. Activities performed in the 3-m TU & Go test. 1 = standing, 2 = first walk, 3 = 3-m turning, 4 = second walk, 5 = pre-sitting turning, and 6 = sitting.
Figure 2. IMU sensor developed in the Biomedical Engineering laboratory of the Universidad de Concepción, Chile.
Figure 3. Setup used for data acquisition during the 3-m TU & Go test.
Figure 4. Identification of standing/sitting events using $Pitch$.
Figure 5. Peaks identified for the search of start and end of standing and sitting actions.
Figure 6. Result of the proposed method for the search of the beginning ($standing_i$) and the end ($standing_f$) of the standing event and of the beginning ($sitting_i$) and the end ($sitting_f$) of the sitting event.
Figure 7. Automatic feature-based segmentation algorithm for the standing and sitting activities in the 3-m TU & Go test.
Figure 8. Turning events identification using $Yaw$.
Figure 9. Maximum and minimum of the derivative of the rotation signal, used to start the search for turning events.
Figure 10. Result of the proposed method for the search of the beginning ($turn1_i$) and the end ($turn1_f$) of the 3-m turning event and of the beginning ($turn2_i$) and the end ($turn2_f$) of the pre-sitting turning event.
Figure 11. Automatic feature-based segmentation algorithm for the turning actions in the 3-m TU & Go test.
Figure 12. Results from the measurements obtained by the proposed methodology versus the typical visual clinical procedure in young subjects. (a) Pearson correlation between IMU and the video record analysis; (b) Bland–Altman plot between IMU and the video record analysis.
Figure 13. Results from the measurements obtained by the proposed methodology versus the typical visual clinical procedure in the older adults group. (a) Pearson correlation between IMU and the video record analysis without the high risk of falling subject; (b) Bland–Altman plot between IMU and the video record analysis.
Figure 14. Pearson correlation coefficient for the segmentation time for each sub-task of the TU & Go test in young subjects. (a) end of the standing/start of the first walk; (b) end of the first walk/start of the 3-m turn; (c) end of the 3-m turn/start of the second walk; (d) end of the second walk/start of the pre-sitting turn; (e) end of the pre-sitting turn; (f) start of the sitting; (g) end of the sitting.
Figure 15. Error plots for the segmentation time for each sub-task of the TU & Go test in young subjects. The dashed lines correspond to the mean error. (a) end of the standing/start of the first walk; (b) end of the first walk/start of the 3-m turn; (c) end of the 3-m turn/start of the second walk; (d) end of the second walk/start of the pre-sitting turn; (e) end of the pre-sitting turn; (f) start of the sitting; (g) end of the sitting.
Figure 16. Error plots for the segmentation time for each sub-task of the TU & Go test in the older adults group. The dashed lines correspond to the mean error. (a) end of the standing/start of the first walk; (b) end of the first walk/start of the 3-m turn; (c) end of the 3-m turn/start of the second walk; (d) end of the second walk/start of the pre-sitting turn; (e) end of the pre-sitting turn; (f) start of the sitting; (g) end of the sitting.
Figure 17. Characteristics measured after the segmentation of the activities carried out in the 3-m TU & Go test. The measured characteristics correspond to the signals of vertical acceleration (AccZ) and sagittal angular velocity (GyrY).
Figure 18. Distribution plots of the duration times obtained with the segmentation algorithm compared with the video analysis in the young subjects. (a) duration distribution for standing time; (b) duration distribution for the first walk time; (c) duration distribution for the 3-m turning; (d) duration distribution for the second walk time; (e) duration distribution for the pre-sitting turning time; (f) duration distribution for sitting time.
Figure 19. Distribution plots of the duration times obtained with the segmentation algorithm compared with the video analysis in the older adults group. (a) duration distribution for standing time; (b) duration distribution for the first walk time; (c) duration distribution for the 3-m turning; (d) duration distribution for the second walk time; (e) duration distribution for the pre-sitting turning time; (f) duration distribution for sitting time.
Table 1. Characteristics of the inertial and magnetic sensors of the IMU used.

| Sensor | Axis | Range | Bandwidth | Resolution | Output Rate |
|---|---|---|---|---|---|
| Accelerometer | X, Y, Z | ±16 g | 62.5 Hz | 14 bits (≈1.95 mg) | 100 Hz |
| Gyroscope | X, Y, Z | ±2000 dps | 32 Hz | 16 bits (≈0.061 dps) | 100 Hz |
| Magnetometer | X, Y | ±1300 µT | 10 Hz | 13 bits (≈317 nT) | 20 Hz |
| Magnetometer | Z | ±2500 µT | 10 Hz | 15 bits (≈152 nT) | 20 Hz |
Table 2. Results from the measurements obtained by the proposed methodology versus the typical visual clinical procedure in young subjects.

| Subject | Height [cm] | Weight [kg] | Age [years] | Gender | Total Video Time [s] | Total IMU Time [s] |
|---|---|---|---|---|---|---|
| Subject 1 | 179 | 93 | 25 | male | 10.46 | 10.26 |
| Subject 2 | 168 | 70 | 25 | male | 10.73 | 10.56 |
| Subject 3 | 176 | 81 | 28 | male | 10.66 | 10.52 |
| Subject 4 | 164 | 68 | 26 | female | 8.60 | 8.24 |
| Subject 5 | 160 | 68 | 25 | female | 8.63 | 8.43 |
| Subject 6 | 162 | 70 | 29 | male | 8.00 | 7.92 |
| Subject 7 | 168 | 80 | 26 | female | 10.40 | 9.83 |
| Subject 8 | 152 | 48 | 25 | female | 10.16 | 9.29 |
| Subject 9 | 195 | 90 | 26 | female | 10.30 | 10.01 |
| Subject 10 | 190 | 80 | 29 | male | 10.66 | 10.63 |
| Subject 11 | 169 | 68 | 29 | male | 10.60 | 10.62 |
| Subject 12 | 153 | 77 | 27 | male | 10.60 | 10.61 |
| Subject 13 | 160 | 73 | 33 | female | 10.93 | 10.76 |
| Subject 14 | 172 | 81 | 31 | female | 10.26 | 10.09 |
| Subject 15 | 168 | 70 | 26 | male | 10.50 | 10.38 |
| Subject 16 | 173 | 79 | 28 | male | 10.46 | 10.40 |
| Subject 17 | 170 | 73 | 25 | male | 10.50 | 10.36 |
| Subject 18 | 163 | 60 | 32 | male | 8.86 | 8.83 |
| Subject 19 | 175 | 70 | 28 | male | 8.83 | 8.72 |
| Subject 20 | 171 | 68 | 33 | male | 9.06 | 8.98 |
| Subject 21 | 165 | 70 | 32 | male | 8.36 | 8.18 |
| Subject 22 | 179 | 73 | 26 | male | 10.10 | 9.94 |
| Subject 23 | 185 | 79 | 26 | male | 9.16 | 8.99 |
| Subject 24 | 190 | 83 | 30 | male | 10.33 | 10.33 |
| Subject 25 | 160 | 58 | 29 | male | 10.30 | 9.99 |
Table 3. Results from the measurements obtained by the proposed methodology versus the typical visual clinical procedure in the older adults group, where rof = risk of falling, high = high risk of falling, low = low risk of falling and no = no risk of falling. The classification of the risk of falling of the subjects was obtained by following the manual of the Ministry of Health (MINSAL) for the Chilean population [28].

| Subject | Height [cm] | Weight [kg] | Age [years] | Gender | Total Video Time [s] (rof) | Total IMU Time [s] (rof) |
|---|---|---|---|---|---|---|
| Subject 1 | 168 | 75 | 60 | male | 11.00 (low) | 10.51 (low) |
| Subject 2 | 168 | 72 | 60 | male | 9.63 (no) | 8.98 (no) |
| Subject 3 | 156 | 90 | 63 | male | 10.73 (low) | 10.57 (low) |
| Subject 4 | 170 | 68 | 65 | male | 9.76 (no) | 9.59 (no) |
| Subject 5 | 179 | 64 | 71 | male | 12.00 (low) | 11.95 (low) |
| Subject 6 | 157 | 62 | 60 | female | 13.76 (low) | 13.13 (low) |
| Subject 7 | 178 | 93 | 67 | male | 9.36 (no) | 9.27 (no) |
| Subject 8 | 160 | 51 | 63 | female | 9.03 (no) | 8.87 (no) |
| Subject 9 | 145 | 70 | 59 | female | 10.43 (low) | 9.92 (no) |
| Subject 10 | 173 | 84 | 59 | male | 10.60 (low) | 10.23 (low) |
| Subject 11 | 160 | 68 | 59 | female | 9.33 (low) | 9.44 (low) |
| Subject 12 | 157 | 61 | 93 | female | 43 (high) | 42.99 (high) |
Table 4. Average time duration and error from the IMU activities segmentation algorithm compared to the video analysis of the young subjects.

| | Standing | First Walk | 3-m Turning | Second Walk | Pre-Sitting Turning | Sitting |
|---|---|---|---|---|---|---|
| Video duration [s] | 1.16 ± 0.15 | 2.51 ± 0.35 | 1.65 ± 0.25 | 2.39 ± 0.36 | 1.03 ± 0.14 | 1.48 ± 0.17 |
| IMU duration [s] | 1.24 ± 0.07 | 2.10 ± 0.45 | 1.85 ± 0.16 | 2.22 ± 0.37 | 1.10 ± 0.06 | 1.48 ± 0.15 |
| IMU error estimation [s] | −0.08 ± 0.15 | 0.42 ± 0.20 | −0.19 ± 0.21 | 0.17 ± 0.11 | −0.08 ± 0.11 | −0.01 ± 0.19 |
Table 5. Characteristics obtained after the implemented segmentation of the activities performed by the young subjects during the TU & Go test.

| | Acc. Su [m/s²] | Acc. Sd [m/s²] | Vel. T1 [deg/s] | Vel. T2 [deg/s] | Steps W1 | Steps W2 | Steps T1 | Steps T2 | Pitch Su [deg] | Pitch Sd [deg] |
|---|---|---|---|---|---|---|---|---|---|---|
| Minimum | 3.9102 | 4.459 | 126.3125 | 181.6875 | 2 | 3 | 2 | 1 | 2.2979 | 3.236 |
| Maximum | 9.1532 | 7.497 | 240.6875 | 264.5625 | 7 | 6 | 5 | 3 | 8.9954 | 9.9904 |
| Mean | 6.49 ± 1.42 | 6.09 ± 0.87 | 156.25 ± 28.49 | 213.62 ± 22.31 | 4.29 ± 1.29 | 4.22 ± 1.05 | 3.55 ± 0.89 | 1.88 ± 0.80 | 5.35 ± 1.89 | 6.25 ± 1.82 |
Table 6. Average time duration and error from the IMU activities segmentation algorithm compared to the video analysis of the older adults group.

| | Standing | First Walk | 3-m Turning | Second Walk | Pre-Sitting Turning | Sitting |
|---|---|---|---|---|---|---|
| Video duration [s] | 1.41 ± 0.71 | 3.49 ± 3.26 | 2.23 ± 1.82 | 3.75 ± 3.75 | 1.32 ± 0.14 | 1.65 ± 0.56 |
| IMU duration [s] | 1.36 ± 0.50 | 3.25 ± 3.70 | 2.08 ± 1.30 | 3.56 ± 3.13 | 1.46 ± 1.30 | 1.64 ± 0.62 |
| IMU error estimation [s] | 0.05 ± 0.30 | 0.23 ± 0.61 | 0.14 ± 0.62 | 0.19 ± 0.78 | −0.01 ± 0.27 | 0.01 ± 0.29 |
Table 7. Characteristics obtained after the implemented segmentation of the activities performed by the older adults group during the TU & Go test.

| | Acc. Su [m/s²] | Acc. Sd [m/s²] | Vel. T1 [deg/s] | Vel. T2 [deg/s] | Steps W1 | Steps W2 | Steps T1 | Steps T2 | Pitch Su [deg] | Pitch Sd [deg] |
|---|---|---|---|---|---|---|---|---|---|---|
| Minimum | 3.89 | 3.15 | 43.87 | 62.06 | 3 | 3 | 3 | 1 | 2.82 | 1.33 |
| Maximum | 9.52 | 10.16 | 228.18 | 295.43 | 35 | 32 | 10 | 7 | 11.06 | 15.71 |
| Mean | 7.20 ± 1.63 | 7.08 ± 2.18 | 138.75 ± 42.21 | 189.79 ± 57.38 | 6.91 ± 8.89 | 7.66 ± 7.81 | 4.62 ± 1.92 | 2.58 ± 1.62 | 7.48 ± 2.81 | 8.68 ± 4.59 |
