Article

Detecting Falls with Wearable Sensors Using Machine Learning Techniques

1 Department of Electrical and Electronics Engineering, Erciyes University, Melikgazi, Kayseri TR-38039, Turkey
2 Department of Electrical and Electronics Engineering, Bilkent University, Bilkent, Ankara TR-06800, Turkey
* Author to whom correspondence should be addressed.
Sensors 2014, 14(6), 10691-10708; https://doi.org/10.3390/s140610691
Submission received: 4 April 2014 / Revised: 30 May 2014 / Accepted: 5 June 2014 / Published: 18 June 2014
(This article belongs to the Section Physical Sensors)

Abstract

Falls are a serious public health problem and possibly life-threatening for people in fall risk groups. We develop an automated fall detection system with wearable motion sensor units fitted to the subjects' body at six different positions. Each unit comprises three tri-axial devices (accelerometer, gyroscope, and magnetometer/compass). Fourteen volunteers perform a standardized set of movements including 20 voluntary falls and 16 activities of daily living (ADLs), resulting in a large dataset with 2520 trials. To reduce the computational complexity of training and testing the classifiers, we focus on the raw data for each sensor in a 4 s time window around the point of peak total acceleration of the waist sensor, and then perform feature extraction and reduction. Most earlier studies on fall detection employ rule-based approaches that rely on simple thresholding of the sensor outputs. We successfully distinguish falls from ADLs using six machine learning techniques (classifiers): the k-nearest neighbor (k-NN) classifier, least squares method (LSM), support vector machines (SVM), Bayesian decision making (BDM), dynamic time warping (DTW), and artificial neural networks (ANNs). We compare the performance and the computational complexity of the classifiers and achieve the best results with the k-NN classifier and LSM, with sensitivity, specificity, and accuracy all above 99%. These classifiers also have acceptable computational requirements for training and testing. Our approach would be applicable in real-world scenarios where data records of indeterminate length, containing multiple activities in sequence, are recorded.

1. Introduction

With the world's aging population, health-enabling technologies and ambulatory monitoring of the elderly have become a prominent area of multi-disciplinary research [1,2]. Rapidly developing technology has made mobile and wireless devices part of daily life. An important aspect of context-aware systems is recognizing, interpreting, and monitoring the basic activities of daily living (ADLs) such as standing, sitting, lying down, walking, ascending/descending stairs, and most importantly, emergent events such as falls. If a sudden change in the center of mass of the human body results in a loss of balance, the person falls. The World Health Organization defines falls as involuntary, unexpected, and uncontrollable events resulting in a person impacting and coming to rest on the ground or at a lower level [3].

Falls need to be considered within the same framework as ADLs since they typically occur unexpectedly while performing daily activities. Falls are a public health problem and a health threat, especially for adults of age 65 and older [4]. Statistics indicate that one in every three adults of age 65 or older experiences at least one fall every year. Besides the elderly, children, disabled individuals, workers, athletes, and patients with visual, balance, gait, orthopedic, neurological, and psychological disorders also suffer from falls. The intrinsic factors associated with falls are aging, mental impairment, neurological and orthopedic diseases, and vision and balance disorders. The extrinsic factors are multiple drug usage, slippery floors, poor lighting, loose carpets, lack of handrails near bathtubs and toilets, electric or power cords, and clutter and obstacles on stairways [5]. Although some of the extrinsic risk factors can be eliminated by taking necessary precautions, intrinsic factors are not readily eliminated and falls cannot be completely prevented. Since the consequences of falls can be serious and costly, falls should be detected reliably and promptly to reduce the occurrence of related injuries and the costs of healthcare. Accurate, reliable, and robust fall detection algorithms that work in real time are essential.

Monitoring people in fall risk groups should occur without intruding on their privacy, restricting their independence, or degrading their quality of life. User-activated fall detection systems do not have much practical usage. Fall detection systems need to be completely automated and may rely on multiple sources of sensory information for improved robustness. A commonly used approach is to fix various sensors to the environment, such as cameras, acoustic, pressure, vibration, force, and infrared sensors, lasers, Radio Frequency Identification (RFID) tags, inertial sensors, and magnetometers [6,7]. Smart environments can be designed through the use of one or more of these sensors in a complementary fashion, usually with high installation cost [8]. Other people or pets moving around may easily confuse such systems and cause false alarms. The main advantage of this approach is that the person at risk does not have to wear or carry any sensors or devices on the body. This approach may be acceptable when the activities of the person are confined to certain parts of a building. However, when the activities performed take place both indoors and outdoors and involve going from one place to another (e.g., riding a vehicle, going shopping, commuting, etc.), this approach becomes unsuitable. It imposes restrictions on the mobility of the person since the system operates only in the limited environment monitored by the sensors that are fixed to the environment.

Although most earlier studies have followed the above approach for monitoring people in fall risk groups, wearable motion sensors have several advantages. The 1-D signals acquired from the multiple axes of motion sensors are much simpler to process and can directly provide the required 3-D motion information. Unlike visual motion-capture systems that require a free line of sight, inertial sensors can be flexibly used inside or behind objects without occlusion. Because they are light, comfortable, and easy to carry, wearable sensors do not restrict people to a studio-like environment and can operate both indoors and outdoors, allowing free pursuit of activities. The required infrastructure and associated costs of wearable sensors are much lower than those of smart environments, and they do not intrude on privacy. Unlike acoustic sensors, they are not affected by ambient noise. Wearable sensors are thus suitable for developing automated fall detection systems. In this study, we follow this approach for robust and accurate detection and classification of falls that occur while performing ADLs.

Fall detection is surveyed in [9,10]. Earlier work is fragmented, of limited scope, and not very systematic. The lack of common ground among researchers makes results published so far difficult to compare, synthesize, and build upon in a manner that allows broad conclusions to be reached. Sensor configuration and modality, subject number and characteristics, considered fall types and activities, feature extraction, and acquired signal processing differ from study to study [11–14]. Although most studies have investigated voluntary (simulated) falls, a limited number of involuntary falls have been recorded in recent studies [15–17]. The latter is a very difficult and time-consuming task [16]. The few recorded real-world falls usually come from rare disease populations and cannot be generalized to fall risk groups at large.

Machine learning techniques have been used to distinguish six activities, including falls, using an infrared motion capture system [18]. Studies that use support vector machines are reported in [19,20]. In the latter study, a computer-vision-based fall recognition system is proposed that combines a depth map with normal RGB color information. Better results are achieved with this combination because the depth map reduces the errors and provides more information about the scene. Falls are then recognized and distinguished from ADLs using support vector machines, with accuracy above 95%.

To achieve robust and reliable fall detection and to enable comparisons between different studies, open datasets acquired through standardized experimental procedures are necessary. We found only three works that provide guidelines for fall experiments [21–23] and only one that follows them [8]. In [23], it is stated that there is no open database for falls, and the desirable structure and characteristics of a fall database are described.

Although some commercial devices and patents on fall detection exist, these devices are not satisfactory [22]. The main reasons are their high false alarm rates, high initial and maintenance costs, and non-ergonomic nature. Wearable fall detection systems are criticized mainly because people may forget, neglect, or not want to wear them. If they are battery operated, the batteries have to be replaced or recharged from time to time. However, with the advances in Micro-Electro-Mechanical Systems (MEMS) technology, these devices have recently become much smaller, more compact, and less expensive. They can be easily integrated into other available alarm systems in the vicinity or into the accessories that the person carries. The lightness, low power consumption, and wireless operation of these devices have eliminated the concerns related to their portability and discomfort. Furthermore, smartphones that usually contain embedded accelerometers are suitable devices for executing fall detection algorithms [24–26].

Through wearable sensors and machine learning techniques, this study aims to robustly and accurately detect falls that occur while performing ADLs. Instead of using simple rule-based algorithms that rely on thresholding the sensor outputs (as in most earlier works), we employ features of the recorded signals around the point of peak acceleration. To be able to acquire a sufficient amount of data for algorithm development according to the guidelines provided in [23], we limit our study to voluntary (simulated) falls.

The rest of this article is organized as follows: in Section 2, we describe data acquisition and briefly overview the six machine learning techniques. In Section 3, we compare the performance and the computational requirements of the techniques based on experiments on the same dataset. We discuss the results in Section 4, and draw conclusions and indicate directions for future research in Section 5.

2. Material and Methods

2.1. Data Acquisition

We used the six MTw sensor units that are part of the MTw Software Development Kit manufactured by Xsens Technologies [27]. Each unit comprises three tri-axial devices (accelerometer, gyroscope, and magnetometer/compass) with respective ranges of ±120 m/s², ±1200°/s, and ±1.5 Gauss, and an atmospheric pressure meter with a 300–1100 hPa operating range, which we did not use. We calibrated the sensors before each volunteer began the experiments and captured and recorded raw motion data with a sampling frequency of 25 Hz. Acceleration, rate of turn, and the strength of the Earth's magnetic field along three perpendicular axes (x, y, z) were recorded for each unit. Measurements were transmitted over an RF connection (ZigBee) to Xsens' Awinda Station, which was connected to a remote PC through a USB interface.

2.2. Experimental Procedure

We followed the guidelines provided in [23] for designing the fall experiments. With Erciyes University Ethics Committee approval, seven male (24 ± 3 years old, 67.5 ± 13.5 kg, 172 ± 12 cm) and seven female (21.5 ± 2.5 years old, 58.5 ± 11.5 kg, 169.5 ± 12.5 cm) healthy volunteers participated in the study with informed written consent. We performed the tests at the Erciyes University Clinical Research and Technology Center. We fitted the six wireless sensor units tightly with special straps to the subjects' head, chest, waist, right wrist, right thigh, and right ankle (Figure 1). Unlike cabled systems, wireless data acquisition allows users to perform motions more naturally. To prevent injuries, volunteers wore a helmet, wrist guards, and knee and elbow pads, and performed the activities on a soft crash mat; each trial lasted about 15 s on average.

A set of trials consists of 20 fall actions and 16 ADLs (Table 1) adopted from [23]; the 14 volunteers repeated each set five times. We thus acquired a considerably diverse dataset comprising 1400 falls (20 tasks × 14 volunteers × 5 trials) and 1120 ADLs (16 tasks × 14 volunteers × 5 trials), resulting in 2520 trials. Many of the non-fall actions included in our dataset are high-impact events that may be easily confused with falls. Such a large dataset is useful for testing/validating fall detection and classification algorithms.

2.3. Feature Selection and Reduction

Earlier studies on fall detection mostly use simple thresholding of the sensory outputs (e.g., accelerations, rotational rates) because of its simplicity and low processing time. This approach is not sufficiently robust or reliable because there are different fall types and their nature shows variations for each individual. Furthermore, certain ADLs can be easily confused with falls. For improved robustness, we consider additional features of the recorded signals. The total acceleration of the waist accelerometer is given by:

$$A_T = \sqrt{A_x^2 + A_y^2 + A_z^2}$$

where A_x, A_y, and A_z are the accelerations along the x, y, and z axes, respectively. We first identify the time index corresponding to the peak A_T value of the waist accelerometer in each record. Then, we take the two-second intervals (25 Hz × 2 s = 50 samples) before and after this point, corresponding to a time window of 101 samples (50 + A_T index + 50), and ignore the rest of the record. Data from the remaining axes of each sensor unit are reduced in the same way, using the time index obtained from the waist sensor as reference, resulting in six 101 × 9 arrays of data. Each column of data is represented by an N × 1 vector s = [s_1, s_2, …, s_N]^T, where N = 101. The extracted features consist of the minimum, maximum, and mean values, as well as the variance, skewness, kurtosis, the first 11 values of the autocorrelation sequence, and the first five peaks of the discrete Fourier transform (DFT) of the signal with the corresponding frequencies:
$$\text{mean}(\mathbf{s}) = \mu = \frac{1}{N}\sum_{n=1}^{N} s_n \qquad \text{variance}(\mathbf{s}) = \sigma^2 = \frac{1}{N}\sum_{n=1}^{N} (s_n - \mu)^2$$
$$\text{skewness}(\mathbf{s}) = \frac{1}{N\sigma^3}\sum_{n=1}^{N} (s_n - \mu)^3 \qquad \text{kurtosis}(\mathbf{s}) = \frac{1}{N\sigma^4}\sum_{n=1}^{N} (s_n - \mu)^4$$
$$\text{autocorrelation}(\mathbf{s}, \Delta) = \frac{1}{N-\Delta}\sum_{n=0}^{N-\Delta-1} (s_n - \mu)(s_{n+\Delta} - \mu), \quad \Delta = 0, 1, \ldots, N-1$$
$$\text{DFT}_q(\mathbf{s}) = \sum_{n=0}^{N-1} s_n\, e^{-j 2\pi q n / N}, \quad q = 0, 1, \ldots, N-1$$
Here, DFT_q(s) is the qth element of the 1-D N-point DFT. We performed feature extraction for the 15,120 records (36 motions × 14 volunteers × 5 trials × 6 sensors). The first five features extracted from each axis of a sensor unit are the minimum, maximum, mean, skewness, and kurtosis values. Because each unit contains nine axes, 45 features were obtained (9 axes × 5 values). Autocorrelation produces 99 features (9 axes × 11 values). The DFT produces 5 frequency and 5 amplitude values per axis, resulting in a total of 90 features (9 axes × 10 values). Thus, 234 features are extracted from each sensor unit in total (45 + 99 + 90), resulting in a feature vector of dimension 1404 × 1 (= 234 features × 6 sensor units) for each trial.
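For concreteness, the following is a minimal Python sketch of the windowing and per-axis feature extraction described above, assuming each trial is stored as one (T × 9) NumPy array per sensor unit (accelerometer, gyroscope, and magnetometer axes in that order) sampled at 25 Hz; the array layout, the waist-unit index, the edge padding, and all function names are illustrative assumptions rather than the authors' implementation (which was carried out in MATLAB). It follows the 26-features-per-axis breakdown stated above (5 + 11 + 10, giving 234 per unit and 1404 per trial).

```python
# Sketch of windowing around the waist peak acceleration and per-axis features.
import numpy as np

FS = 25             # sampling frequency (Hz)
HALF_WIN = 2 * FS   # 2 s on each side of the peak -> 101-sample window

def peak_index(waist_acc):
    """Index of the peak total acceleration A_T of the waist accelerometer."""
    a_t = np.sqrt(np.sum(waist_acc ** 2, axis=1))     # (T,) total acceleration
    return int(np.argmax(a_t))

def window(signal, center, half=HALF_WIN):
    """101-sample window around 'center' (edges padded by repetition if needed)."""
    padded = np.pad(signal, ((half, half), (0, 0)), mode='edge')
    return padded[center:center + 2 * half + 1]

def axis_features(s):
    """26 features of one axis: min, max, mean, skewness, kurtosis,
    first 11 autocorrelation values, and 5 DFT magnitudes + their frequencies."""
    n = len(s)
    mu, sigma = s.mean(), s.std()
    skew = np.mean((s - mu) ** 3) / sigma ** 3
    kurt = np.mean((s - mu) ** 4) / sigma ** 4
    acorr = [np.sum((s[d:] - mu) * (s[:n - d] - mu)) / (n - d) for d in range(11)]
    spectrum = np.abs(np.fft.rfft(s))
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    top5 = np.argsort(spectrum)[-5:]   # largest-magnitude bins as a stand-in for the "first five peaks"
    return np.concatenate([[s.min(), s.max(), mu, skew, kurt],
                           acorr, spectrum[top5], freqs[top5]])

def trial_features(units, waist_idx=2):
    """1404-D feature vector for one trial: 6 units x 9 axes x 26 features.
    Assumes the accelerometer occupies columns 0-2 of the waist unit."""
    center = peak_index(units[waist_idx][:, 0:3])
    feats = [axis_features(window(u, center)[:, ax])
             for u in units for ax in range(9)]
    return np.concatenate(feats)
```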

Because the initial set of features was quite large (1404) and not all features were equally useful in discriminating between the falls and ADLs, to reduce the computational complexity of training and testing the classifiers, we reduced the number of features from 1404 to M = 30 through principal component analysis (PCA) [28] and normalized the resulting features between 0 and 1. PCA is a transformation that finds the optimal linear combinations of the features, in the sense that they represent the data with the highest variance in a feature subspace, without taking the intra-class and inter-class variances into consideration separately. The reduced dimension of the feature vectors is determined by observing the eigenvalues of the covariance matrix of the 1404 × 1 feature vectors, sorted in Figure 2a in descending order. The largest 30 eigenvalues constitute 72.38% of the total variance of the principal components and account for much of the variability of the data. The 30 eigenvectors corresponding to the largest 30 eigenvalues (Figure 2b) are used to form the transformation matrix, resulting in 30 × 1 feature vectors.
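A corresponding sketch of the PCA reduction to M = 30 components and the [0, 1] normalization is given below, using scikit-learn purely for illustration (the study itself was carried out in MATLAB). In a strict per-fold evaluation, the PCA transform and scaler would be fit on the training partitions only.

```python
# Reduce the 1404-dimensional feature vectors to M = 30 via PCA and normalize
# each resulting feature to [0, 1]. X is an (n_trials, 1404) array; names are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

def reduce_features(X, m=30):
    pca = PCA(n_components=m)                 # eigenvectors of the covariance matrix
    Z = pca.fit_transform(X)                  # project onto the m leading components
    print("retained variance: %.2f%%" % (100 * pca.explained_variance_ratio_.sum()))
    Z = MinMaxScaler().fit_transform(Z)       # min-max normalization to [0, 1]
    return Z, pca
```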

2.4. Classification Using Machine Learning Techniques

A reliable fall detection system requires well-designed, fast, effective, and robust algorithms to make a binary decision on whether a fall has occurred. Its performance can be measured by the following success criteria:

  • Sensitivity (Se) is the capacity of the system to detect falls and corresponds to the ratio of true positives to the total number of falls:

    $$Se = \frac{TP}{TP + FN} \times 100$$

  • Specificity (Sp) is the capacity of the system to detect falls only when they occur:

    $$Sp = \frac{TN}{TN + FP} \times 100$$

  • Accuracy (Acc) corresponds to the correct differentiation between falls and non-falls:

    $$Acc = \frac{TP + TN}{TP + TN + FP + FN} \times 100$$

Here, TP (a fall occurs; the algorithm detects it), TN (a fall does not occur; the algorithm does not detect a fall), FP (a fall does not occur but the algorithm reports a fall), and FN (a fall occurs but the algorithm misses it) are the numbers of true positives and negatives, and false positives and negatives, respectively. Obviously, there is an inverse relationship between sensitivity and specificity. For instance, in an algorithm that employs simple thresholding, as the threshold level is decreased, the FN rate decreases and the sensitivity of the algorithm increases. On the other hand, the FP rate increases and specificity decreases. As the threshold level is increased, the opposite happens: sensitivity decreases and specificity increases. Based on these definitions, the FP and FN ratios can be obtained as:

$$\text{FP ratio} = 1 - Sp \qquad \text{FN ratio} = 1 - Se$$
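The three criteria can be computed directly from the entries of the confusion matrix; the short Python function below does this for a binary labeling in which 1 denotes a fall and 0 an ADL (the label encoding is an assumption made for illustration).

```python
# Sensitivity, specificity, and accuracy from a binary confusion matrix
# (labels: 1 = fall, 0 = ADL).
import numpy as np

def fall_metrics(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    se = 100.0 * tp / (tp + fn)               # sensitivity
    sp = 100.0 * tn / (tn + fp)               # specificity
    acc = 100.0 * (tp + tn) / (tp + tn + fp + fn)
    return se, sp, acc
```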

In this study, we consider falls together with ADLs because falls typically occur unexpectedly while performing daily activities. An ideal fall detection system should especially be able to distinguish between falls and ADLs that can cause high acceleration of body parts (e.g., jumping, sitting down suddenly). The algorithms must be sufficiently robust, intelligent, and sensitive to minimize FPs and FNs. False alarms (FPs) caused by misclassified ADLs, although a nuisance, can be canceled by the user. However, it is crucial not to misclassify falls as some other activity. FNs, which indicate missed falls, must be avoided by all means, since user intervention may not be possible if a fall results in physical and/or mental impairment. For example, long periods of inactivity (such as those that may occur after a fall) may be confused with sleeping or resting.

We distinguish falls from ADLs with six machine learning techniques and compare their performances based on their sensitivity, specificity, accuracy, and computational complexity. In training and testing, we randomly split the dataset into p = 10 equal partitions and employ p-fold cross validation. We use p − 1 partitions for training and reserve the remaining partition for testing (validation). When this is repeated for each partition, training and validation partitions cross over in p successive rounds and each record in the dataset gets a chance of validation.
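A sketch of this p-fold protocol is given below; it randomly splits the data into p = 10 partitions and reuses the fall_metrics function from the sketch above. The classifier object clf can be anything with fit/predict methods, and the fixed random seed is an arbitrary choice, not a detail from the study.

```python
# p-fold cross validation (p = 10): each partition is held out once for testing
# while the remaining p - 1 partitions are used for training.
import numpy as np
from sklearn.model_selection import KFold

def cross_validate(clf, Z, y, p=10, seed=0):
    Z, y = np.asarray(Z), np.asarray(y)
    scores = []
    for train_idx, test_idx in KFold(n_splits=p, shuffle=True,
                                     random_state=seed).split(Z):
        clf.fit(Z[train_idx], y[train_idx])
        scores.append(fall_metrics(y[test_idx], clf.predict(Z[test_idx])))
    return np.mean(scores, axis=0)            # average Se, Sp, Acc over the p folds
```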

2.4.1. The k-Nearest Neighbor Classifier (k-NN)

The k-NN method classifies a given object based on the closest training object(s) [28]. The class decision is made by majority voting among a chosen number of nearest neighbors k, where k > 0. There is no standard value for k because the k-NN algorithm is sensitive to the local structure of the data. Smaller k values increase the variance and make the results less stable, whereas larger k values increase the bias but reduce the sensitivity to noise. Therefore, the proper choice of k depends on the particular dataset. In this work, we determined the value of k experimentally as k = 7, based on our dataset.
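A one-line equivalent in scikit-learn (Euclidean distance, majority voting) is shown below, purely as an illustrative stand-in for the MATLAB implementation used in the study.

```python
# k-NN with k = 7, the value chosen experimentally in this study.
from sklearn.neighbors import KNeighborsClassifier

knn = KNeighborsClassifier(n_neighbors=7)
# se, sp, acc = cross_validate(knn, Z, y)     # using the cross-validation sketch above
```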

2.4.2. The Least Squares Method (LSM)

In LSM, two average reference vectors are calculated for the two classes that correspond to falls and ADLs [28]. A given test vector x = [x_1, …, x_M]^T is compared with each reference vector r_i = [r_{i1}, …, r_{iM}]^T, i = 1, 2, by calculating the sum of the squared differences between them:

$$\varepsilon_i^2 = \sum_{m=1}^{M} (x_m - r_{im})^2$$

The class decision is made by minimizing ε_i².
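Because LSM reduces to a nearest-class-mean rule, it can be sketched in a few lines; the class interface below mirrors the cross-validation skeleton above and is an illustration, not the authors' code.

```python
# Least squares method: one average reference vector per class; a test vector
# is assigned to the class with the smallest sum of squared differences.
import numpy as np

class LeastSquaresClassifier:
    def fit(self, Z, y):
        self.classes_ = np.unique(y)
        self.refs_ = np.array([Z[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, Z):
        # squared Euclidean distance of each test vector to each reference vector
        d2 = ((Z[:, None, :] - self.refs_[None, :, :]) ** 2).sum(axis=2)
        return self.classes_[np.argmin(d2, axis=1)]
```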

2.4.3. Support Vector Machines (SVM)

The choice of the kernel and its parameters affects the classification outcome of SVMs. The training set (x_j, l_j), j = 1, …, J, is of length J, where x_j ∈ ℝ^M and the class labels are l_j ∈ {1, −1} for the two classes (falls and ADLs). We used a radial basis function kernel K(x, x_j) = e^(−γ‖x − x_j‖²) with γ = 0.2, implemented with the LIBSVM library in the MATLAB environment [29].
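An equivalent configuration in scikit-learn is shown below; its SVC class wraps the same LIBSVM library (the study used the MATLAB interface, so this is a close but not identical stand-in), and the remaining hyperparameters are left at their defaults as an assumption.

```python
# RBF-kernel SVM with gamma = 0.2, i.e., K(x, x_j) = exp(-0.2 * ||x - x_j||^2).
from sklearn.svm import SVC

svm = SVC(kernel='rbf', gamma=0.2)
# se, sp, acc = cross_validate(svm, Z, y)
```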

2.4.4. Bayesian Decision Making (BDM)

BDM is a robust and widely used approach in statistical pattern classification. We use the normal density discriminant function for the likelihood in BDM, where the parameters are the mean µ and the covariance matrix C of the training vectors for each class. These are calculated based on the training records of the two classes and are constant for each fold. A given test vector x is assigned to the class with the larger likelihood calculated as follows [28]:

$$L(\text{class}_i) = -\frac{1}{2}\left\{ (\mathbf{x} - \boldsymbol{\mu}_i)^T \mathbf{C}_i^{-1} (\mathbf{x} - \boldsymbol{\mu}_i) + \log\left[\det(\mathbf{C}_i)\right] \right\}, \quad i = 1, 2$$
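A minimal sketch of this decision rule follows: per class, the mean and covariance are estimated from the training vectors, and a test vector is assigned to the class with the larger value of L(class_i). The class interface is illustrative.

```python
# Bayesian decision making with a Gaussian (normal density) discriminant per class.
import numpy as np

class BayesianDecisionMaker:
    def fit(self, Z, y):
        self.classes_ = np.unique(y)
        self.mu_ = [Z[y == c].mean(axis=0) for c in self.classes_]
        self.Cinv_, self.logdet_ = [], []
        for c in self.classes_:
            C = np.cov(Z[y == c], rowvar=False)   # class covariance matrix
            self.Cinv_.append(np.linalg.inv(C))
            self.logdet_.append(np.linalg.slogdet(C)[1])
        return self

    def predict(self, Z):
        L = []
        for mu, Cinv, logdet in zip(self.mu_, self.Cinv_, self.logdet_):
            d = Z - mu
            mahal = np.einsum('ij,jk,ik->i', d, Cinv, d)   # (x - mu)^T C^-1 (x - mu)
            L.append(-0.5 * (mahal + logdet))
        return self.classes_[np.argmax(L, axis=0)]         # larger likelihood wins
```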

2.4.5. Dynamic Time Warping (DTW)

DTW provides a measure of the similarity between two time sequences that may vary in time or speed [30]. The sequences are warped nonlinearly in time to find the least-cost warping path between the test vector and the stored reference vectors. Typically, the Euclidean distance is used as a cost measure between the elements of the test and reference vectors. DTW is employed in applications such as automatic speech recognition to handle different speaking speeds, signature and gait recognition, ECG signal classification, fingerprint verification, word spotting in handwritten historical documents on electronic media and machine-printed documents, and face localization in color images. Here, DTW is used for classifying feature vectors of different activities extracted from the signals of motion sensor units.
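The following is a generic sketch of the DTW distance (dynamic programming with a Euclidean local cost) and of a nearest-reference classification rule built on it; the choice of reference vectors and any warping-path constraints used in the study are not specified here, so this illustrates the technique rather than the authors' exact setup.

```python
# Minimal dynamic-programming DTW distance plus a 1-nearest-reference rule.
import numpy as np

def dtw_distance(a, b):
    """Cost of the least-cost warping path between 1-D sequences a and b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])       # local (Euclidean) cost
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dtw_classify(test_vec, refs, labels):
    """Assign the label of the reference vector with the smallest DTW distance."""
    d = [dtw_distance(test_vec, r) for r in refs]
    return labels[int(np.argmin(d))]
```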

2.4.6. Artificial Neural Networks (ANNs)

ANNs are composed of a set of independent processing units that receive their inputs through weighted connections [31]. We implemented a three-layer ANN with 30 neurons each in the input and hidden layers, and a single neuron in the output layer. In the hidden layer, we use the sigmoid activation function. At the output neuron, we use the purelin linear activation function, which makes the class decision according to the rule:

If OUT < 0.5, then ADL; else, fall.

We created the ANN using the Neural Networks Toolbox in the MATLAB environment and trained it with the Levenberg-Marquardt algorithm.
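A comparable network in Python is sketched below: 30 inputs, one hidden layer of 30 sigmoid neurons, and a single output thresholded at 0.5. The original network was built with the MATLAB Neural Networks Toolbox and trained with Levenberg-Marquardt; MLPClassifier does not offer that optimizer, so LBFGS is used here as an approximate stand-in.

```python
# Approximate stand-in for the three-layer ANN described above.
from sklearn.neural_network import MLPClassifier

ann = MLPClassifier(hidden_layer_sizes=(30,),  # one hidden layer of 30 neurons
                    activation='logistic',     # sigmoid hidden units
                    solver='lbfgs',            # Levenberg-Marquardt is not available
                    max_iter=2000)
# se, sp, acc = cross_validate(ann, Z, y)      # labels: 1 = fall, 0 = ADL
```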

3. Results

The framework used for the study is subject-independent; the classifiers considered here were used to process the complete dataset, instead of designing different classifiers for each subject. We present the performance comparison of the six classifiers in Table 2. The k-NN classifier gives the best accuracy (99.91%), followed by LSM, SVM, BDM, DTW, and ANN. The k-NN has 100% sensitivity, indicating that falls are never missed with this method; however, two to three of the 2520 trials (all ADLs) were misclassified in each of the 10 rounds (Table 3). The average accuracies and standard deviations of the classifiers over the 10 rounds are provided in Table 3, where we observe the similarity of the results in each round, indicating their repeatability. Because the k-NN classifier and LSM do not miss any falls, we consider them both reliable classifiers. ROC curves for the classifiers are depicted in Figure 3.

We compare the computational requirements of the six machine learning techniques in the last two rows of Table 2 in terms of the training and testing times required for a single fold of the dataset that contains 252 feature vectors. We implemented the algorithms in a MATLAB 7.7.0 environment on a Windows 7 computer with a 2.67 GHz quad core 64-bit Intel Core i5 processor and 4 GB of RAM. In terms of the required training time, the classifiers can be sorted as BDM, LSM, DTW, k-NN, SVM, and ANN in increasing order. In terms of the testing time, the order is ANN, SVM, LSM, BDM, k-NN, and DTW.

4. Discussion

The availability of standardized open databases allows researchers to compare their results with those of others. The diversity of the subjects, the activity spectrum, and the number of trials are important factors in constructing a database. When a limited number of activities that are easy to discriminate between are performed by a small number of subjects, it may be possible to achieve very high accuracies. However, such performance may not be maintained when the set of activities is broadened or new subjects participate in the tests. Although some studies with very high (∼100%) sensitivities and specificities exist [32,33], the performance of these algorithms degrades when they are implemented in the real world under realistic conditions and with new users. There are many academic works with promising results but no reliable off-the-shelf product on the market.

The ADLs that we recorded in this study and included in our dataset are a subset of real-world ADLs, many of which are high-impact events that may be easily confused with falls. Since laboratory-recorded ADLs/falls and those that occur in a natural setting may have some differences, we compared the average and peak acceleration values of the voluntary falls that we recorded with those in [17], where some involuntary falls by the elderly were recorded. Figure 4 shows sample signals recorded by the waist sensor in our experiments (which is also the location of the sensor in [17]). Recordings of back sitting, back lying, and rolling out of bed (Table 1; fall actions 9, 10, and 20, respectively) are illustrated, showing the total acceleration averaged separately over the 35 (= 7 subjects × 5 trials) trials of the female and male volunteers for each action, together with the minimum/maximum total acceleration. The minimum/maximum values are determined over all records and may belong to a female or a male volunteer. We observe that, for a given type of fall, features of the signals recorded from voluntary and involuntary falls are similar in nature. The average duration of the impact, from the maximum to the minimum value of total acceleration, is about 0.2 s in both fall types (voluntary and involuntary). Thus, our experimental records are consistent with the involuntary falls recorded in an independently conducted study.

Our approach would be applicable to real-world settings where continuous data streams of indeterminate length, containing multiple activities, are recorded. If the data stream contains falls in between a sequence of ADLs, the multiple acceleration peaks can be easily identified. The signal pattern in the time window around each peak can then be processed with machine learning techniques to evaluate if it indeed corresponds to a fall. In real-world testing, we expect our system to give slightly lower accuracies than under laboratory conditions.
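An illustrative sketch of this continuous-stream scenario is given below: candidate events are located as peaks of the waist total acceleration, and the 4 s window around each peak is passed to a trained classifier. The peak-height threshold, the minimum gap between peaks, and the helper trial_features_at (windowing, feature extraction, and PCA projection around a given index) are hypothetical, not taken from the paper.

```python
# Locate candidate events in a continuous recording and classify each one.
import numpy as np
from scipy.signal import find_peaks

def detect_events(waist_acc, units, clf, fs=25, min_gap_s=4.0, height=15.0):
    a_t = np.sqrt(np.sum(waist_acc ** 2, axis=1))      # total acceleration (m/s^2)
    peaks, _ = find_peaks(a_t, height=height, distance=int(min_gap_s * fs))
    events = []
    for p in peaks:
        # trial_features_at: hypothetical helper that windows the six units around
        # index p, extracts the 1404 features, and applies the trained PCA/scaler.
        feats = trial_features_at(units, p)
        events.append((p, int(clf.predict(feats[None, :])[0])))  # 1 = fall, 0 = ADL
    return events
```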

The algorithms can be easily embedded into portable devices or accessories carried on the body that can be connected to a telephone network [34]. This feature will allow prompt medical attention, improve the safety, independence, and quality of living of those in fall risk groups, and contribute to the economy by reducing the costs of medical healthcare.

5. Conclusions

We employ six classifiers based on machine learning to distinguish between falls and ADLs using previously proposed, standardized experimental procedures. We compare the performance and computational requirements of the machine learning techniques on the same dataset and achieve accuracies above 95%. The repeatability of the results over the 10 runs indicates the robustness of the classifiers. The k-NN and LSM methods do not miss any falls; thus, we consider them reliable classifiers. These classifiers also have acceptable computational requirements for training and testing, making them suitable for real-time applications. The fact that we use standardized experimental procedures to perform a comprehensive set of fall experiments sets an example in the fall detection area. This also makes our approach more applicable to real-world scenarios where data records of indeterminate length, containing multiple activities in sequence, are recorded. We plan to test the system with continuous data streams acquired from falls and ADLs. To enable comparison among the algorithms developed in different studies, we intend to make our dataset publicly available at the University of California, Irvine (UCI) Machine Learning Repository [35]. Our daily and sports activities dataset is already available at the same website [36]. In our current work, we are investigating which of the six motion sensor units and which axes of these sensors are most useful in activity and fall detection [37]. Incorporating information from biomedical sensors for vital signs and from audio sensors may further improve the robustness of our fall detection system. Our ongoing work considers embedding the fall detection algorithms in a mobile device (e.g., a smartphone) worn at waist level.

Acknowledgments

This work was supported by Erciyes University Scientific Research Project Coordination Department under grant number FBA-11-3579 and the Scientific and Technological Research Council of Turkey (TÜBİTAK) 2218 National Postdoctoral Research Scholarship Program. The authors wish to thank Kenan Danışman, Director of Clinical Research and Technology Center at Erciyes University, for providing the experimental area during the course of the project. The authors also would like to thank the 14 volunteers who participated in our experiments for their efforts, dedication, and time.

Author Contributions

Ahmet Turan Özdemir coordinated the experimental work, analyzed the experimental data, and prepared the initial draft of the manuscript. Billur Barshan proposed the research topic and the machine learning approach and supervised the study. She also contributed significantly to the writing and organization of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rodríguez-Martín, D.; Pérez-López, C.; Samà, A.; Cabestany, J.; Català, A.A. Wearable inertial measurement unit for long-term monitoring in the dependency care area. Sensors 2013, 13, 14079–14104. [Google Scholar]
  2. Ludwig, W.; Wolf, K.-H.; Duwenkamp, C.; Gusew, N.; Hellrung, N.; Marschollek, M.; Wagner, M.; Haux, R. Health-enabling technologies for the elderly—an overview of services based on a literature review. Comput. Meth. Prog. Biomed. 2012, 106, 70–78. [Google Scholar]
  3. World Health Organization. Available online: http://www.who.int/violence_injury_prevention/other_injury/falls/en/ (accessed on 15 June 2014).
  4. Zakaria, N.A.; Kuwae, Y.; Tamura, T.; Minato, K.; Kanaya, S. Quantitative analysis of fall risk using TUG test. Comput. Meth. Biomech. Biomed. Eng. 2013. [Google Scholar] [CrossRef]
  5. Bianchi, F.; Redmond, S.J.; Narayanan, M.R.; Cerutti, S.; Lovell, N.H. Barometric pressure and triaxial accelerometry-based falls event detection. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 619–627. [Google Scholar]
  6. Doukas, C.N.; Maglogiannis, I. Emergency fall incidents detection in assisted living environments utilizing motion, sound and visual perceptual components. IEEE Trans. Inf. Technol. B. 2011, 15, 277–289. [Google Scholar]
  7. Roetenberg, D.; Slycke, P.J.; Veltink, P.H. Ambulatory position and orientation tracking fusing magnetic and inertial sensing. IEEE Trans. Bio-Med. Eng. 2007, 54, 883–890. [Google Scholar]
  8. Rimminen, H.; Lindström, J.; Linnavuo, M.; Sepponen, R. Detection of falls among the elderly by a floor sensor using the electric near field. IEEE Trans. Inf. Technol. B. 2010, 14, 1475–1476. [Google Scholar]
  9. Mubashir, M.; Shao, L.; Seed, L. A survey on fall detection: Principles and approaches. Neurocomputing 2013, 100, 144–152. [Google Scholar]
  10. Yang, C.-C.; Hsu, Y.-L. A review of accelerometry-based wearable motion detectors for physical activity monitoring. Sensors 2010, 10, 7772–7788. [Google Scholar]
  11. Malhi, K.; Mukhopadhyay, S.C.; Schnepper, J.; Haefke, M.; Ewald, H. A Zigbee-based wearable physiological parameters monitoring system. IEEE Sens. J. 2012, 12, 423–430. [Google Scholar]
  12. Aziz, O.; Robinovitch, S.N. An analysis of the accuracy of wearable sensors for classifying the causes of falls in humans. IEEE Trans. Neural Syst. Rehabil. Eng. 2011, 19, 670–676. [Google Scholar]
  13. Mariani, B.; Rochat, S.; Büla, C.J.; Aminian, K. Heel and toe clearance estimation for gait analysis using wireless inertial sensors. IEEE Trans. Biomed. Eng. 2012, 59, 3162–3168. [Google Scholar]
  14. Zhang, M.; Sawchuk, A.A. Human daily activity recognition with sparse representation using wearable sensors. IEEE J. Biomed. Health Inform. 2013, 17, 553–560. [Google Scholar]
  15. Bagalà, F.; Becker, C.; Cappello, A.; Chiari, L.; Aminian, K.; Hausdorff, J.M.; Zijlstra, W.; Klenk, J. Evaluation of accelerometer-based fall detection algorithms on real-world falls. PLoS One 2012, 7, e37062. [Google Scholar]
  16. Klenk, J.; Becker, C.; Lieken, F.; Nicolai, S.; Maetzler, W.; Alt, W.; Zijlstra, W.; Hausdorff, J.M.; van Lummel, R.C.; Chiari, L.; Lindemann, U. Comparison of acceleration signals of simulated and real-world backward falls. Med. Eng. Phys. 2011, 33, 368–373. [Google Scholar]
  17. Kangas, M.; Vikman, I.; Nyberg, L.; Korpelainen, R.; Lindblom, J.; Jämsä, T. Comparison of real-life accidental falls in older people with experimental falls in middle-aged test subjects. Gait Posture 2012, 35, 500–505. [Google Scholar]
  18. Luštrek, M.; Kaluža, B. Fall detection and activity recognition with machine learning. Informatica 2009, 33, 205–212. [Google Scholar]
  19. Liu, S.-H.; Cheng, W.-C. Fall detection with the support vector machine during scripted and continuous unscripted activities. Sensors 2012, 12, 12301–12316. [Google Scholar]
  20. Dubey, R.; Ni, B.; Moulin, P. A depth camera based fall recognition system for the elderly. Image Anal. Recognit., Lect. Notes Comput. Sci. 2012, 7325, 106–113. [Google Scholar]
  21. Noury, N.; Fleury, A.; Rumeau, P.; Bourke, A.K.; Laighin, G.O.; Rialle, V.; Lundy, J.E. Fall detection—principles and methods. Proceedings of the 29th Annual International Conference of the IEEE EMBS, Lyon, France, 23–26 August 2007; pp. 1663–1666.
  22. Noury, N.; Rumeau, P.; Bourke, A.K.; ÓLaighin, G.; Lundy, J.E. A proposal for the classification and evaluation of fall detectors. IRBM 2008, 29, 340–349. [Google Scholar]
  23. Abbate, S.; Avvenuti, M.; Corsini, P.; Vecchio, A.; Light, J. Monitoring of Human Movements for Fall Detection and Activities Recognition in Elderly Care Using Wireless Sensor Network: A Survey. In Application-Centric Design Book, 1st ed; InTech: Rijeka, Croatia, 2010; Chapter 9; pp. 147–166. [Google Scholar]
  24. Kwapisz, J.R.; Weiss, G.M.; Moore, S.A. Activity recognition using cell phone accelerometers. ACM SIGKDD Explor. Newsl. 2010, 12, 74–82. [Google Scholar]
  25. Dai, J.; Bai, X.; Yang, Z.; Shen, Z.; Xuan, D. Mobile phone-based pervasive fall detection. Pers. Ubiquitous Comput. 2010, 14, 633–643. [Google Scholar]
  26. Lee, R.Y.W.; Carlisle, A.J. Detection of falls using accelerometers and mobile phone technology. Age Ageing 2011, 40, 690–696. [Google Scholar]
  27. MTw User Manual and Technical Documentation; Xsens Technologies B.V.: Enschede, The Netherlands, 2014. Available online: http://www.xsens.com (accessed on 15 June 2014).
  28. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification, 2nd ed; John Wiley & Sons, Inc.: New York, NY, USA, 2001. [Google Scholar]
  29. Chang, C.-C.; Lin, C.-J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 2. Article No: 27. [Google Scholar]
  30. Keogh, E.; Ratanamahatana, C.A. Exact indexing of dynamic time warping. Knowl. Inf. Syst. 2005, 7, 358–386. [Google Scholar]
  31. Haykin, S. Neural Networks and Learning Machines; Prentice Hall: Upper Saddle River, NJ, USA, 2009. [Google Scholar]
  32. Liu, J.; Lockhart, T.E. Automatic individual calibration in fall detection—an integrative ambulatory measurement framework. Comput. Meth. Biomech. Biomed. Eng. 2013, 16, 504–510. [Google Scholar]
  33. Bourke, A.K.; van de Ven, P.; Gamble, M.; O'Connor, R.; Murphy, K.; Bogan, E.; McQuade, E.; Finucane, P.; ÓLaighin, G.; Nelson, J. Evaluation of waist-mounted tri-axial accelerometer based fall-detection algorithms during scripted and continuous unscripted activities. J. Biomech. 2010, 43, 3051–3057. [Google Scholar]
  34. Chang, S.-Y.; Lai, C.-F.; Chao, H.-C.J.; Park, J.H.; Huang, Y.-M. An environmental-adaptive fall detection system on mobile device. J. Med. Syst. 2011, 35, 1299–1312. [Google Scholar]
  35. University of California Irvine (UCI) Machine Learning Repository. Available online: http://archive.ics.uci.edu/ml/ (accessed on 15 June 2014).
  36. Barshan, B.; Altun, K. Daily and Sports Activities Dataset. UCI Machine Learning Repository, University of California, Irvine, School of Information and Computer Sciences, 2013. Available online: http://archive.ics.uci.edu/ml/datasets/Daily+and+Sports+Activities (accessed on 15 June 2014).
  37. Dobrucalı, O.; Barshan, B. Sensor-Activity Relevance in Human Activity Recognition with Wearable Motion Sensors and Mutual Information Criterion. In Information Sciences and Systems 2013; Proceedings of the 28th International Symposium on Computer and Information Sciences, Paris, France, 28–29 October 2013, Gelenbe, E., Lent, R., Eds.; Springer International Publishing: Cham, Switzerland, 2013; pp. 285–294.
Figure 1. (a–c) The configuration of the six MTw units on a volunteer's body; (d) single MTw unit, encasing three tri-axial devices (accelerometer, gyroscope, and magnetometer) and an atmospheric pressure sensor; (e) the three perpendicular axes of a single MTw unit; (f) remote computer, Awinda Station, and the six MTw units.
Figure 2. (a) All eigenvalues (1404) and (b) the first 50 eigenvalues of the covariance matrix, sorted in descending order.
Figure 3. ROC curves for some of the classifiers.
Figure 4. Total acceleration of the waist sensor during the fall actions: (a) back sitting; (b) back lying; and (c) rolling out of bed. The average total acceleration for female/male volunteers and the overall minimum/maximum total acceleration values that occurred during the experiments are shown.
Table 1. Fall and non-fall actions (ADLs) considered in this study.

Fall Actions

#    Label                     Description
1    front-lying               from vertical falling forward to the floor
2    front-protecting-lying    from vertical falling forward to the floor with arm protection
3    front-knees               from vertical falling down on the knees
4    front-knees-lying         from vertical falling down on the knees and then lying on the floor
5    front-right               from vertical falling down on the floor, ending in right lateral position
6    front-left                from vertical falling down on the floor, ending in left lateral position
7    front-quick-recovery      from vertical falling on the floor and quick recovery
8    front-slow-recovery       from vertical falling on the floor and slow recovery
9    back-sitting              from vertical falling on the floor, ending sitting
10   back-lying                from vertical falling on the floor, ending lying
11   back-right                from vertical falling on the floor, ending lying in right lateral position
12   back-left                 from vertical falling on the floor, ending lying in left lateral position
13   right-sideway             from vertical falling on the floor, ending lying
14   right-recovery            from vertical falling on the floor with subsequent recovery
15   left-sideway              from vertical falling on the floor, ending lying
16   left-recovery             from vertical falling on the floor with subsequent recovery
17   syncope                   from standing falling on the floor following a vertical trajectory
18   syncope-wall              from standing falling down slowly slipping on a wall
19   podium                    from vertical standing on a podium going on the floor
20   rolling-out-bed           from lying, rolling out of bed and going on the floor

Non-Fall Actions (ADLs)

#    Label                     Description
21   lying-bed                 from vertical lying on the bed
22   rising-bed                from lying to sitting
23   sit-bed                   from vertical to sitting with a certain acceleration onto a bed (soft surface)
24   sit-chair                 from vertical to sitting with a certain acceleration onto a chair (hard surface)
25   sit-sofa                  from vertical to sitting with a certain acceleration onto a sofa (soft surface)
26   sit-air                   from vertical to sitting in the air exploiting the muscles of legs
27   walking-fw                walking forward
28   jogging                   running
29   walking-bw                walking backward
30   bending                   bending about 90 degrees
31   bending-pick-up           bending to pick up an object on the floor
32   stumble                   stumbling with recovery
33   limp                      walking with a limp
34   squatting-down            squatting, then standing up
35   trip-over                 bending while walking and then continuing walking
36   coughing-sneezing         coughing or sneezing
Table 2. Comparison of the results and the computational requirements of the six machine learning techniques in terms of the training and testing times for a single fold (P: positive, N: negative).

Confusion Matrices

                k-NN              LSM               SVM               BDM               DTW               ANN
              P       N        P       N        P       N        P       N        P       N        P       N
True P     1400       0     1400       0     1393.9    6.1     1398       2     1381.4   18.6    1364.6   35.4
True N      2.3   1117.7      8.7   1111.3      7     1113      16.7   1103.3    35.5   1084.5    73.5   1046.5

Se (%)        100               100               99.56             99.86             98.67             97.47
Sp (%)       99.79             99.22              99.38             98.51             96.83             93.44
Acc (%)      99.91             99.65              99.48             99.26             97.85             95.68

Computational Time (ms)

Training     318.2              2.2              893.7               1.9               2.5          10,089.0
Test          76.6             32.7               16.2              72.6          33,816.6              13.5
Table 3. Classifier results over 10 successive rounds. AVG: average, STD: standard deviation.

(a) k-NN

Run        1      2      3      4      5      6      7      8      9     10     AVG      STD
Se (%)   100    100    100    100    100    100    100    100    100    100     100      0
Sp (%)  99.73  99.82  99.82  99.73  99.73  99.82  99.82  99.82  99.82  99.82   99.79   0.0431
Acc (%) 99.88  99.92  99.92  99.88  99.88  99.92  99.92  99.92  99.92  99.92   99.91   0.0192
TN       1117   1118   1118   1117   1117   1118   1118   1118   1118   1118  1117.7   0.4830
FP          3      2      2      3      3      2      2      2      2      2     2.3   0.4830
TP       1400   1400   1400   1400   1400   1400   1400   1400   1400   1400    1400      0
FN          0      0      0      0      0      0      0      0      0      0       0      0

(b) LSM

Run        1      2      3      4      5      6      7      8      9     10     AVG      STD
Se (%)   100    100    100    100    100    100    100    100    100    100     100      0
Sp (%)  99.29  99.29  99.20  99.20  99.20  99.11  99.11  99.38  99.20  99.29   99.22   0.0847
Acc (%) 99.68  99.68  99.64  99.64  99.64  99.60  99.60  99.72  99.64  99.68   99.65   0.0376
TN       1112   1112   1111   1111   1111   1110   1110   1113   1111   1112  1111.3   0.9487
FP          8      8      9      9      9     10     10      7      9      8     8.7   0.9487
TP       1400   1400   1400   1400   1400   1400   1400   1400   1400   1400    1400      0
FN          0      0      0      0      0      0      0      0      0      0       0      0

(c) SVM

Run        1      2      3      4      5      6      7      8      9     10     AVG      STD
Se (%)  99.64  99.50  99.64  99.57  99.50  99.57  99.50  99.50  99.57  99.64   99.56   0.0625
Sp (%)  99.46  99.29  98.93  99.46  99.46  99.55  99.29  99.55  99.29  99.46   99.38   0.1882
Acc (%) 99.56  99.40  99.33  99.52  99.48  99.56  99.40  99.52  99.44  99.56   99.48   0.0825
TN       1114   1112   1108   1114   1114   1115   1112   1115   1112   1114    1113   2.1082
FP          6      8     12      6      6      5      8      5      8      6       7   2.1082
TP       1395   1393   1395   1394   1393   1394   1393   1393   1394   1395  1393.9   0.8756
FN          5      7      5      6      7      6      7      7      6      5     6.1   0.8756

(d) BDM

Run        1      2      3      4      5      6      7      8      9     10     AVG      STD
Se (%)  99.86  99.86  99.86  99.86  99.86  99.86  99.86  99.86  99.86  99.86   99.86      0
Sp (%)  98.57  98.57  98.48  98.48  98.39  98.57  98.48  98.57  98.48  98.48   98.51   0.0603
Acc (%) 99.29  99.29  99.25  99.25  99.21  99.29  99.25  99.29  99.25  99.25   99.26   0.0268
TN       1104   1104   1103   1103   1102   1104   1103   1104   1103   1103  1103.3   0.6749
FP         16     16     17     17     18     16     17     16     17     17    16.7   0.6749
TP       1398   1398   1398   1398   1398   1398   1398   1398   1398   1398    1398      0
FN          2      2      2      2      2      2      2      2      2      2       2      0

(e) DTW

Run        1      2      3      4      5      6      7      8      9     10     AVG      STD
Se (%)  98.71  98.71  98.79  98.79  98.57  98.64  98.79  98.43  98.50  98.79   98.67   0.1313
Sp (%)  96.79  96.96  97.23  97.14  96.61  97.23  96.96  96.61  96.25  96.52   96.83   0.3321
Acc (%) 97.86  97.94  98.10  98.06  97.70  98.02  97.98  97.62  97.50  97.78   97.85   0.1992
TN       1084   1086   1089   1088   1082   1089   1086   1082   1078   1081  1084.5   3.7193
FP         36     34     31     32     38     31     34     38     42     39    35.5   3.7193
TP       1382   1382   1383   1383   1380   1381   1383   1378   1379   1383  1381.4   1.8379
FN         18     18     17     17     20     19     17     22     21     17    18.6   1.8379

(f) ANN

Run        1      2      3      4      5      6      7      8      9     10     AVG      STD
Se (%)  97.64  97.93  96.57  98.00  97.29  97.50  97.86  97.00  97.21  97.71   97.47   0.4545
Sp (%)  93.39  93.21  94.11  93.75  92.86  93.57  93.84  94.38  92.86  92.41   93.44   0.6132
Acc (%) 95.73  95.83  95.48  96.11  95.32  95.75  96.07  95.83  95.28  95.36   95.68   0.3048
TN       1046   1044   1054   1050   1040   1048   1051   1057   1040   1035  1046.5   6.8678
FP         74     76     66     70     80     72     69     63     80     85    73.5   6.8678
TP       1367   1371   1352   1372   1362   1365   1370   1358   1361   1368  1364.6   6.3631
FN         33     29     48     28     38     35     30     42     39     32    35.4   6.3631
