Article

A Cost-Effective Fall-Detection Framework for the Elderly Using Sensor-Based Technologies

Ch. Anwar Ul Hassan, Faten Khalid Karim, Assad Abbas, Jawaid Iqbal, Hela Elmannai, Saddam Hussain, Syed Sajid Ullah and Muhammad Sufyan Khan
1 Department of Creative Technologies, Air University, Islamabad 44000, Pakistan
2 Department of Computer Sciences, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia
3 Department of Computer Science, COMSATS University, Islamabad 44000, Pakistan
4 Faculty of Computing, Riphah International University, Islamabad 45210, Pakistan
5 Department of Information Technology, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia
6 School of Digital Science, Universiti Brunei Darussalam, Jalan Tungku Link, Gadong BE1410, Brunei
7 Department of Information and Communication Technology, University of Agder (UiA), N-4898 Grimstad, Norway
8 Department of Software Engineering, Capital University of Science and Technology, Islamabad 44000, Pakistan
* Authors to whom correspondence should be addressed.
Sustainability 2023, 15(5), 3982; https://doi.org/10.3390/su15053982
Submission received: 26 January 2023 / Revised: 19 February 2023 / Accepted: 19 February 2023 / Published: 22 February 2023

Abstract

Falls are critical events for elderly people living alone in their rooms and can have severe consequences, such as the person being left lying on the floor for a long time after the fall. Elderly falling is a serious healthcare issue that has been investigated by researchers for over a decade, and several techniques and methods have been proposed to detect fall events. To overcome and mitigate elderly fall issues, such as being left to lie for a long time after a fall, this project presents a low-cost, motion-based technique for detecting fall events. In this study, we used IRA-E700ST0 pyroelectric infrared (PIR) sensors mounted on the walls around or near the patient's bed, with a horizontal field of view, to detect regular motions and fall events; the PIR sensors were used together with an Arduino Uno to detect patient falls and save the collected data to an Arduino SD module for classification. For data collection, 20 persons contributed as patients performing fall events. When a patient or elderly person falls, a signal of different (higher) intensity is produced, which clearly differs from the signals generated by normal motion. A set of parameters was extracted from the signals generated by the PIR sensors during falls and regular motions to build the dataset. When the system detects a fall event, it turns on the green signal, an alarm is generated, and a message is sent to inform the person's family members or caregivers. Furthermore, we classified the fall-event dataset using five machine learning (ML) classifiers, namely random forest (RF), decision tree (DT), support vector machine (SVM), naïve Bayes (NB), and AdaBoost (AB). Our results reveal that the RF and AB algorithms achieved almost 99% accuracy in elderly fall detection.

1. Introduction

Elderly falling is a serious issue that has been investigated by researchers for over a decade. A Centers for Disease Control report states that, in the United States, falls among the elderly cause numerous deaths and injuries each year. Approximately 61% of falls occur indoors, causing around 10,000 deaths [1]. Providing quick assistance to the victims of such events significantly reduces the risk of hospitalization (by around 26%) and decreases the death rate by around 80% [1]. Accordingly, several mechanisms that detect fall incidents and provide quick assistance have been introduced to control and overcome this problem instead of leaving the person lying on the floor for a long time after falling. In [1], the authors proposed smart tiles for elderly fall detection; however, this is an expensive solution.
The World Health Organization states that accidental falls among the elderly are a main cause of injuries and, in some cases, death [2]. When a fall happens, immediate help is needed. To this end, researchers have devoted considerable effort to detecting and preventing elderly falls [3,4,5]; however, some limitations remain in their work. In [6,7,8], the authors used pressure- and vibration-based schemes in which sensors were placed on the wall or floor in a specific direction to analyze patient movement. However, these sensors, RFID systems, and smart tiles are not cost-effective solutions for detecting elderly falls. RFID technologies, for instance, detect elderly falls but do not protect the person from fall injuries, and they are not cost-effective solutions [9]. We need to provide the elderly with better shelter to offer quality of life and minimize the risks of living alone. Furthermore, accelerometer-based systems have been introduced that trigger wearable airbags to protect the patient from fall-related injuries [10,11,12]. In a wearable scenario, however, wearing the airbag all the time is frustrating and annoying, and the elderly must also remember to wear the device, which becomes harder with growing age. Over the last few decades, non-wearable sensors and technologies have gained importance because they do not affect the privacy of the elderly or of patients and do not require wearable devices. The PIR sensor is one of the most widely used technologies for detecting elderly falls; the authors of [13,14] used PIR sensors to detect human motion.

1.1. Motivation

Elderly falling is a serious issue. Several mechanisms have been introduced to control and overcome this problem by detecting fall incidents and providing quick assistance instead of leaving the person lying for a long time after falling. In this regard, researchers have proposed several techniques and methods to detect fall events, such as smart tiles [1], RFID tags [9], accelerometers [12], and home-monitoring-based techniques [13]. However, constantly watching the elderly affects their privacy and makes them uncomfortable; in addition, these are expensive solutions. To overcome and mitigate elderly fall issues, such as being left to lie for a long time after a fall, this study presents a low-cost, sensor-based approach for fall-event detection.

1.2. Paper Organization

The remainder of the paper is organized as follows: Related work is described in Section 2. Section 3 describes the proposed methodology. Section 4 gives implementation details. ML classifiers are described in Section 5. The results of the experiments are presented in Section 6. Finally, Section 7 presents conclusions and future work.

2. Related Work

In this section, we focus on research efforts related to elderly fall detection, a topic on which numerous papers have been published. Researchers have designed and implemented several motion detection systems that detect elderly fall events using radio-frequency identification (RFID) tags. The frequency values measured from the tags differ between static postures, normal movements, and sudden falls, and this difference is reflected in the received signal strength (RSS) and Doppler frequency value (DFV); in this way, the authors could detect elderly fall events. Battery-less technologies, such as RFID tags and readers, were incorporated into smart floor carpets to detect elderly falls and subsequently inform caregivers [15]. However, the authors used distinct equipment and devices, and the accuracy was not adequately high. In [16], the authors also used a floor-based RFID technique, with RFID tags arranged in a two-dimensional grid on a smart carpet to form an unobtrusive monitoring zone. Heuristic algorithms and ML are used to identify falls and prevent a situation where the elderly person is left to lie for a long time.
Vision-based systems are frequently used to detect patient falls; researchers have used digital cameras to monitor patient activities and detect fall events. In [17], cameras installed in the ceiling detected 77% of fall cases. In [18,19,20,21], the authors also used vision-based systems and analyzed the data using PCA and SVM classifiers, achieving a sensitivity of 89.2% and a specificity of 90.3% in the detection of patient fall events. In [22], the authors achieved 83.2% accuracy in fall detection with an SVM classifier. Other practitioners used neural network techniques [23,24,25,26] to analyze patient fall data obtained through vision-based systems [27,28,29,30]. Three-dimensional depth image analysis, i.e., elderly fall detection based on 3D shape analysis of images captured by a Kinect sensor in the room environment, was proposed in [31]. Subtraction methods are used to analyze the depth images, and the human body centroid is used to measure the angle between the human body and the floor level: when this angle is smaller than the defined threshold, a fall event is detected. Several other techniques have also been proposed [32,33]. However, in all the abovementioned cases, particular attention has not been devoted to addressing fall-related injuries.
In [34,35,36,37], the authors proposed wearable-based solutions, for example, sensors and wearable devices, to detect fall events and protect individuals from fall injuries. These methods introduced wearable devices to protect the wearers from harm. The key component of such fall sensors is the fall-sensing algorithm, which uses both accelerometer and angular velocity data [38]. The fall-detection algorithm detects the fall signal, and the airbags protect the head and thighs of the falling person [39]. However, it seems impractical to wear an airbag and device at all times. A micro-inertial measurement unit (mIMU) consisting of three-dimensional MEMS Bluetooth, gyroscopes, accelerometers, a microcontroller unit (MCU), and high-speed cameras was used to record and analyze human motion [37,38,39,40]. This technique protects the thighs of a falling person but may never protect the elderly person from a head injury, as in [41]. In [42], the Elman neural network algorithm was used for fall detection. This algorithm was implemented in a wearable device combined with low-energy Bluetooth, which detected fall signals based on a second-order training method and sent a response to a remote PC.
In [43], the authors established an intelligent fall-detection system based on infrared array detectors. This system monitors the elderly in the house and generates an alarm in case a person is out of range or falls. Infrared sensors are extensively used to detect human motion owing to their sensitivity; this novel sensing scheme also uses PIR sensors to detect elderly falls. Stereo infrared is intended to expand the sensing pitch so that the infrared signals are captured entirely [44]. Fall detection based on image classification, achieved by observing human activities, classifying them using ML techniques, and sending a notification to the caretakers, was performed in [45,46,47,48]. As discussed above, multiple methods have been proposed to detect falls, but very few are used in practice.
A vision-based system to detect falls was highlighted in [49,50]. It is a type of nonobtrusive fall-detection structure. This system uses image-processing methods for fall detection based on a series of images or images captured from video clips recorded by a camera. The image analysis identifies the elderly person in the image and can thereby identify an elderly fall. However, a visual system cannot provide good results in dark and shady environments [51,52,53].
Furthermore, installing a camera affects the privacy of the elderly. Researchers have designed numerous RFID-based motion detection systems to detect elderly fall events [54,55,56,57,58]. Instead of utilizing on-body sensors, the proposed project uses PIR sensors mounted on the room's walls. In [59,60], the authors used a wearable accelerometer with three datasets to detect elderly falls accurately, while another study used a Bluetooth-based, low-energy-enabled accelerometer sensor to detect patient bed falls. However, these authors focused on detecting falls without informing caregivers, and their solutions were not cost efficient. In [61,62], the authors used a video camera to capture human skeleton images and ultrasonic sensors combined with hybridized video techniques to detect elderly falls. A similar video-based fall-detection method was used in [63]: first, a person's silhouette is extracted through a feature extraction technique, and then a set of characteristics is measured to determine whether or not a fall has occurred. A finite-state machine (FSM) was introduced to estimate the head position and compute the vertical head velocity. This algorithm was evaluated using the Le2i dataset, containing over 2700 labelled frames, to train three distinct classifiers. In [64,65], the authors also used video-based fall-detection techniques based on neural network approaches. The authors of [66] proposed an intelligent healthcare framework, and [67] examined the contribution and effectiveness of each axis of a tri-axial accelerometer sensor for accurate activity recognition. Machine-learning-assisted cognitive strength assessment in smart houses, multi-agent healthcare systems based on contextual logic with situational awareness in BDI, and the detection of early COVID-19 symptoms through smartwatches for improving the healthcare sector were explored in [68]. However, these techniques are not cost effective. We use a sensor-based, cost-effective technique for elderly fall detection and generate an alarm to inform caregivers; in this way, we can save the patient's life and improve fall-detection accuracy. Another option is a vision-based solution [69,70], which is attractive because it covers an extensive area; however, constantly watching the elderly affects their privacy and makes them uncomfortable. ML and deep learning models have been successfully used in several areas, including the classification of fall datasets [71,72,73,74,75].
In our proposed system, we use a low-cost, sensor-based approach. We mounted the sensors on the wall to detect patient falls and analyzed the obtained data using highly mature, state-of-the-art, and representative algorithms.
In this regard, we used PIR sensors and an Arduino Uno to detect elderly falls and collected datasets of normal motions and fall events for classification. To form the dataset, 20 persons contributed by performing normal motions and fall events; the intensity of the signal generated by the PIR sensors differed between normal motions and fall events. These data were saved to the Arduino SD module. Additionally, a set of parameters was extracted to build the datasets. The obtained dataset was classified using DT, SVM, RF, naïve Bayes (NB), and AdaBoost (AB) to increase the accuracy of elderly fall-event detection. Our proposed solution is cost effective: we can detect elderly falls at a very low cost of approximately PKR 1945 per square foot. The cost of the equipment is shown in Table 1.

3. Proposed Methodology

In the proposed solution, we intend to reduce health issues and support the autonomous living of the elderly. The proposed project uses pyroelectric infrared (PIR) sensors to identify and detect human motions. The analog output of PIR sensors depends on numerous factors, such as the distance from the body, the direction of the sensor, the movement speed, and the shape of the body. By using the output of the infrared sensors, we can detect near-to-fall events of a body per unit of time in a unit area.
In our proposed system, the Arduino Uno microcontroller is programmed using the Arduino IDE and integrated with the circuit. The human-motion-detection device consists of PIR sensors, an Arduino Uno, an Arduino SD module, and an Arduino GSM module. The stages of our working methodology are shown in Figure 1.
The PIR sensors are mounted on the wall near the patient's or elderly person's bed. First, the data of normal and falling motions are collected through the PIR sensors. Typically, when a human walks around the room normally, the sensors identify the movements as usual motion based on speed and displacement per unit time. When an individual falls, the signal produced by the body increases significantly, because the sensors detect a greater-than-usual speed of the body while the distance between the body and the sensors decreases very fast. To differentiate normal walking from fall actions, machine-learning-based classifiers, such as SVM, DT, NB, RF, and AB, are used for data reduction and to classify the events as walking or falling. The distance and speed of the body are compared with the already stored datasets. If the classifier detects an event such as a fall, a green signal is issued, and an alarm is generated to inform the family members or caregivers of the elderly person and prompt further assistance.
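As an illustration only, the following Python sketch shows how such a detection-and-alert loop could be wired together; the trained classifier, the feature window, and the send_sms helper are hypothetical placeholders rather than the exact firmware or scripts used in this study.

```python
# Hypothetical sketch of the fall-detection loop described above.
# Assumptions: a trained scikit-learn classifier is available, features are
# extracted from the amplified PIR signal, and send_sms() wraps the GSM step.
import numpy as np

FALL = 1  # label assigned to fall events (the "green signal" case)

def send_sms(message: str) -> None:
    # Placeholder for the Arduino GSM notification step.
    print("SMS to caregiver:", message)

def detect_and_alert(classifier, feature_window: np.ndarray) -> bool:
    """Classify one window of PIR features and raise an alert on a fall."""
    label = classifier.predict(feature_window.reshape(1, -1))[0]
    if label == FALL:
        send_sms("Fall detected near the bed. Please check on the patient.")
        return True
    return False
```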
The workflow of our proposed architecture is shown in Figure 2. Figure 2a shows the view of the PIR sensors, and the experimental setup for the collection of normal-motion data is shown in Figure 2b. The PIR sensors generate an analog signal by detecting the speed of normal motion, and we developed an operational amplifier (op-amp) circuit to amplify the PIR signal. In the proposed architecture, we use the following equipment to detect elderly falls, as shown in Figure 3: PIR sensors, Arduino Uno, Arduino SD, Arduino GSM, a transistor, a photodiode, a transformer, capacitors, resistors, diodes, a buzzer, and male–female jumper wires.

Data Extraction Framework

We collected datasets of regular motions and fall events for classification. For data collection, 20 persons contributed by performing normal movements and falls. From the signals generated by the PIR sensors, a set of parameters was extracted in different scenarios to build the datasets. The experiments were performed in a living room. The PIR sensor module was mounted on the wall with a horizontal field of view to cover the maximum area, especially around the elderly person's bed. When an elderly person performs a regular motion, the PIR sensor detects the thermal radiation emitted from the human body and produces analog output signals of different intensities. These signals are compared with the existing datasets and analyzed to check whether certain conditions hold, i.e., whether the range or intensity of the current output signal matches the intensity range of the near-to-fall dataset. If the intensity of the current signal equals the intensity of the fall signals, an alarm is generated and the caregivers are informed. In addition, ML classifiers, i.e., NB, SVM, DT, RF, and AB, are used to increase elderly fall-detection accuracy.
We collected data on normal motions and fall events using the PIR sensors and extracted the data generated by the sensors to separate fall-event data from normal-motion data and detect elderly fall events. Feature extraction identified the signal-intensity ranges of fall and normal-motion data from the collected data. Unlike normal motion, fall activity consists of abrupt, fast motion with velocity v. If the elderly person is near the sensor, the effective field of view decreases, and, if the elderly person is far from the sensor, the field of view expands, as the person's angular size appears smaller to the sensor. The PIR sensor p identifies and detects the motion and generates a corresponding output signal. We then classified normal motions and fall events from the obtained dataset using ML classifiers. In vector form, the output of sensor p is given by Equation (1):
v_{p,t} = \varphi_p \cdot x_t = \sum_{i=1}^{D} \varphi_{p,i} \, x_{i,t}, \quad p \in \{1, 2, 3, \ldots, O\}   (1)
where "·" denotes the dot product, φ_p is the D-dimensional row vector that describes the visibility of all D cells to the p-th PIR sensor, and O is the number of PIR sensors. The visibility function φ_{p,i} specifies whether the i-th cell is visible to PIR sensor p: the i-th cell is invisible to sensor p if φ_{p,i} = 0 and visible if φ_{p,i} = 1.
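For concreteness, the sum in Equation (1) can be written as a masked dot product; the cell values and visibility mask in the sketch below are illustrative assumptions only, not measured data.

```python
# Illustrative computation of Equation (1): v_{p,t} = sum_i phi_{p,i} * x_{i,t}.
# The visibility mask phi_p and cell activations x_t are made-up example values.
import numpy as np

D = 6                                            # number of cells in the monitored area
phi_p = np.array([1, 1, 0, 0, 1, 1])             # visibility of each cell to sensor p
x_t = np.array([0.2, 0.8, 0.5, 0.1, 0.9, 0.4])   # cell activations at time t

v_pt = phi_p @ x_t                               # dot product = sum over visible cells
print(f"sensor output v_(p,t) = {v_pt:.2f}")
```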

4. Implementation Details

We created a circuit board, as shown in Figure 4, using male–female jumper wires to obtain amplified PIR sensor signals with op-amp circuits; the PIR sensor outputs were connected to the analog inputs of an Arduino Uno with an Arduino SD module, and an Arduino GSM module was used to inform caregivers. The analog signals were displayed on an oscilloscope or on a laptop with the Arduino plotter installed, as shown in Figure 5. The Arduino Uno was configured to store each analog input as a time series on an Arduino SD (memory) card.
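For readers who wish to log the same kind of stream on a PC, a minimal sketch using the pyserial package is shown below; the serial port name, baud rate, and CSV file name are assumptions and must be adapted to the actual setup.

```python
# Hypothetical logger for the analog values the Arduino prints over serial.
# Assumes the Arduino sketch writes one comma-separated sample per line.
import csv
import serial  # pip install pyserial

PORT = "/dev/ttyACM0"   # assumed port; on Windows this might be "COM3"
BAUD = 9600             # assumed baud rate matching Serial.begin() on the Arduino

with serial.Serial(PORT, BAUD, timeout=1) as link, \
        open("pir_log.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["sensor1", "sensor2", "sensor3"])
    for _ in range(1000):                      # log 1000 samples and stop
        line = link.readline().decode(errors="ignore").strip()
        if line:
            writer.writerow(line.split(","))
```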

5. Analysis and Discussion

In this section, a state-of-the-art analysis is shown in Table 2, and experimental results are discussed.

5.1. PIR Analog Signals

The PIR analog signal simulation results for normal motion and high-intensity motion (falling/moving fast) are shown in Figure 6 and Figure 7. The difference between Figure 6 and Figure 7 is that, when the movement speed increased, the length of the signal increased compared with normal motion. The analog data were extracted using one, two, and three PIR sensors, respectively, to make the detection more precise and accurate, and were then converted into numeric data for analysis and classification. The PIR sensors generated the data as analog signals, as shown in Figure 6 and Figure 7, and the Arduino then converted the generated data from analog to digital form using its built-in analog-to-digital converter. We then used hand-crafted feature techniques; the main reason for adopting hand-crafted features is their efficient, state-of-the-art performance. The data were used to train the approaches to detect and classify elderly falls. Once we had obtained the data through the Arduino, we split them into training and testing sets at an 80:20 ratio and applied the ML approaches, i.e., NB, AB, SVM, DT, and RF, in the Python Jupyter Notebook platform to obtain the accuracy of each approach.
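The split-and-compare step described above can be reproduced with scikit-learn roughly as follows; the CSV file name and column names are assumptions, not the exact files used in this study.

```python
# Sketch of the 80:20 split and five-classifier comparison (assumed file/column names).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.naive_bayes import GaussianNB

data = pd.read_csv("pir_features.csv")          # assumed feature file
X, y = data.drop(columns=["label"]), data["label"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

models = {
    "SVM": SVC(kernel="rbf"),
    "DT": DecisionTreeClassifier(),
    "NB": GaussianNB(),
    "RF": RandomForestClassifier(n_estimators=100),
    "AB": AdaBoostClassifier(n_estimators=100),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, round(accuracy_score(y_test, model.predict(X_test)), 4))
```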

5.2. Machine Learning Classifiers

ML is the application of computational intelligence that gives systems the ability to automatically learn and improve from experience without being explicitly programmed; it centers on the development of computer programs that can access data and use it to learn. By applying ML, the classifiers compare the current data value to a predefined threshold and, based on this threshold, can distinguish between different postures, such as falling, sitting, and standing. Feature (parameter) extraction identifies relevant attributes or characteristics from the existing and collected data [76] and allows normal motions and fall events to be identified and classified efficiently. The features have to be carefully chosen to obtain smaller and more descriptive output datasets, from which the practical feature values are then extracted.

5.3. Support Vector Machine (SVM)

This section briefly introduces the SVM technique for binary classification. A binary classifier can be expressed as a function f: R^n → ±1, which maps a pattern y onto its correct classification x as x = f(y). In the case of the SVM, the function f is formed as in [77], Equation (2):
f(y) = \sum_{i=1}^{N} x_i \alpha_i \, k(y, y_i) + b   (2)
where N is the number of training patterns, (x_i, y_i) are the training patterns with their classifications, the learned α_i and b are weights, and k(·,·) is the kernel function. We used the linear kernel k(y, y_i) = y · y_i and the radial basis function kernel k(y, y_i) = e^{-\|y - y_i\|^2 / 2\sigma^2}. The patterns with α_i > 0 are called support vectors.
The kernel k(·,·) implicitly maps the patterns into a feature space in which the decision surface f(y) = 0 is a hyperplane. The margins between the support vectors and the hyperplane are maximized, and the weights α_i and b are chosen to minimize the number of incorrect classifications within the training set. This is achieved by solving the optimization problem [78] of maximizing Equation (3):
L_D = \sum_{i=1}^{N} \alpha_i - \frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} x_i x_j \alpha_i \alpha_j \, k(y_i, y_j)   (3)
Subject to Equation (4):
0 \le \alpha_i \le C, \quad \sum_{i=1}^{N} x_i \alpha_i = 0   (4)
The tolerance to incorrect classifications is controlled by the constant C. Given the optimal parameters α_i, b can be found by substituting any support vector (x_i, y_i) from the data into Equation (2); see [78] for a comprehensive explanation of the SVM. The SVM recognizes which event has occurred by detecting the change in the signal [79].
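As a rough illustration of the kernel choice discussed above, the following sketch fits the SVM with both kernels; X_train, y_train, X_test, and y_test are assumed to come from the split shown earlier, and C = 1.0 is an untuned default.

```python
# Compare the linear and RBF kernels mentioned above (reuses the earlier split).
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

for kernel in ("linear", "rbf"):
    svm = SVC(kernel=kernel, C=1.0)     # C controls the tolerance to misclassification
    svm.fit(X_train, y_train)
    acc = accuracy_score(y_test, svm.predict(X_test))
    print(f"{kernel} kernel: accuracy = {acc:.3f}, support vectors = {len(svm.support_)}")
```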

5.4. Decision Tree

A DT classifier can be used for regression as well as classification. In decision analysis, a DT can be used to explicitly and visually represent decisions and decision making. When the target variable takes continuous values or real numbers, the tree is called a regression tree. A DT is a tree-like or flowchart-like structure in which each internal node represents a test on a feature, each leaf node represents a class label (computed after evaluating the features along the path), and the branches represent the conjunctions of features that lead to those class labels. The paths from the root to the leaves represent the classification rules [80].
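The root-to-leaf rules mentioned above can be printed directly from a fitted tree; the depth limit below is an illustrative choice, and the feature names are taken from the assumed training frame used earlier.

```python
# Fit a shallow decision tree and print its root-to-leaf rules.
from sklearn.tree import DecisionTreeClassifier, export_text

tree = DecisionTreeClassifier(max_depth=3)      # a small depth keeps the rules readable
tree.fit(X_train, y_train)
print(export_text(tree, feature_names=list(X_train.columns)))
```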

5.5. Random Forest

RF is a supervised ML classifier. An RF classifier is built from multiple decision trees whose outputs are merged to make a decision, so the decision becomes more accurate and precise. It builds a forest and introduces randomness into it: the "forest" is an ensemble of DTs, usually trained with the "bagging" approach. The general idea of the bagging technique is that a combination of learning models improves the overall result [81].
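A short sketch of this ensemble idea follows: each tree votes and the forest aggregates the votes. The parameter values are illustrative, not tuned, and the training data are assumed from the earlier split.

```python
# Random forest as a bag of trees; n_estimators controls the number of trees.
from sklearn.ensemble import RandomForestClassifier

forest = RandomForestClassifier(
    n_estimators=200,       # number of bagged decision trees
    max_features="sqrt",    # random feature subset considered at each split
    oob_score=True,         # out-of-bag estimate that comes free with bagging
    random_state=42,
)
forest.fit(X_train, y_train)
print("out-of-bag accuracy estimate:", round(forest.oob_score_, 3))
```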

5.6. Naïve Bayes

The NB approach is a probabilistic classifier used for classification tasks and is based on the Bayes theorem. Using the Bayes theorem, we determine the probability of c occurring given that x has happened; here, x is the evidence, and c is the hypothesis. The assumption is made that the predictors (features) are independent, meaning that the presence of one particular feature does not influence or affect the others; this is why the classifier is called naïve [82]. As shown in Equation (5), P(c|x) is the posterior probability, P(c) is the class prior probability, P(x|c) is the likelihood, and P(x) is the predictor prior probability.
P(c \mid x) = \frac{P(x \mid c) \, P(c)}{P(x)}   (5)
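A tiny worked example of Equation (5) may help; all probabilities below are invented for illustration and are not measured values from this study.

```python
# Worked Bayes-rule example with made-up numbers.
p_fall = 0.10                 # prior P(c): probability that an event is a fall
p_high_given_fall = 0.90      # likelihood P(x|c): high-intensity signal given a fall
p_high = 0.18                 # evidence P(x): overall probability of a high-intensity signal

p_fall_given_high = p_high_given_fall * p_fall / p_high
print(f"P(fall | high-intensity signal) = {p_fall_given_high:.2f}")  # 0.50
```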

5.7. Adaboost

AB is commonly used for classification problems and aims to combine a set of weak classifiers into a strong classifier. The final classification can be represented by Equation (6) as:
F(x) = \operatorname{sign}\left( \sum_{m=1}^{N} \theta_m f_m(x) \right)   (6)
where f_m denotes the m-th weak classifier and θ_m its corresponding weight; F(x) is precisely the weighted combination of N weak classifiers. Given a dataset containing n points, where:
x_i \in \mathbb{R}^{d}, \quad y_i \in \{-1, 1\}   (7)
In Equation (7), the negative class is denoted by −1, and the positive class is denoted by 1. The weight of each data point is initialized as shown in Equation (8):
w(x_i, y_i) = \frac{1}{n}, \quad i = 1, \ldots, n   (8)
Most detection approaches are based on hypothesis testing and statistical detection [83]. In our proposed approach, we use SVM, DT, RF, NB, and AB to detect the change that occurs in the signal characteristics and to indicate that an elderly fall event has happened. Since the program outputs the value "1" for a fall event, this value triggers the alarm and informs the caregivers by sending a message through the Arduino GSM module.
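The sketch below shows the boosted combination of Equation (6) in scikit-learn form; the number of boosting rounds is an illustrative choice, and the training data are assumed from the earlier split.

```python
# AdaBoost as a weighted sum of weak learners, mirroring Equation (6).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

# The default weak learner f_m is a depth-1 decision stump; N = n_estimators rounds.
ada = AdaBoostClassifier(n_estimators=100, random_state=42)
ada.fit(X_train, y_train)

# estimator_weights_ corresponds to the theta_m weights in Equation (6).
print("first five weak-learner weights:", np.round(ada.estimator_weights_[:5], 3))
print("test accuracy:", round(ada.score(X_test, y_test), 3))
```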

5.8. Classifier Performance Assessment

Five ML algorithms (SVM, DT, NB, RF, and AB) were used to analyze the PIR sensor data we collected; their performance was then compared using the precision, recall, accuracy, and F-measure metrics shown in Table 3.
In this study, we primarily use accuracy, specificity, sensitivity, and the F-measure [84] to assess the efficiency of the ML classifiers. We calculate the specificity (precision) and sensitivity (recall) of the target class to evaluate each algorithm's predictive accuracy. For ML, the true positive (TP), false positive (FP), true negative (TN), and false negative (FN) counts are used to determine accuracy, recall, precision, and the F-measure, and every prediction made by the models falls into one of these four categories. Here, a TP is a fall that is correctly predicted as a fall, an FN is a fall that is incorrectly predicted as a non-fall, an FP is a non-fall that is incorrectly predicted as a fall, and a TN is a non-fall that is correctly predicted as a non-fall.
Accuracy is calculated as the number of accurately classified instances divided by the total number of cases in the dataset, as shown in Equation (9):
\text{Accuracy} = \frac{TP + TN}{TP + FP + TN + FN}   (9)
Specificity (precision) is the average probability of relevant retrieval, as described in Equation (10):
\text{Specificity} = \frac{TP}{TP + FP}   (10)
Sensitivity (recall) is the average probability of complete retrieval, as defined in Equation (11):
\text{Sensitivity} = \frac{TP}{TP + FN}   (11)
The F-measure is calculated from precision and recall, as shown in Equation (12):
\text{F-Measure} = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}}   (12)
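These four metrics can be computed from a confusion matrix as follows; the predictions are taken from any of the models fitted in the earlier sketches, which are assumed to be available.

```python
# Compute Equations (9)-(12) from a confusion matrix for one fitted model.
from sklearn.metrics import confusion_matrix

y_pred = forest.predict(X_test)                    # any fitted classifier works here
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()

accuracy = (tp + tn) / (tp + fp + tn + fn)
precision = tp / (tp + fp)       # called "specificity" in this paper
recall = tp / (tp + fn)          # sensitivity
f_measure = 2 * precision * recall / (precision + recall)
print(round(accuracy, 3), round(precision, 3), round(recall, 3), round(f_measure, 3))
```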

6. Experimental Results and Discussion

We experimented by applying one, two, and three PIR sensors. The results demonstrate that increasing the number of sensors had a positive effect: accuracy increased because a larger number of sensors covered more of the fall-detection area, the matching ratio improved, and the signal was obtained promptly. Moreover, when we compared the classifiers, we observed that the boosting and ensemble algorithms performed better because they are built by combining two or more classifiers; for example, RF classifiers work like DTs but use groups of DTs, which increases the chance of an accurate decision, while the SVM is a vector-based classification method that performs well on such datasets. The advantages of the proposed framework are that it is low cost, the elderly do not need to wear a device all the time, and it maintains the privacy of the elderly.
Background circumstances may alter the results, such as the movement of birds, interior walls, and objects causing clutter and ghost readings due to scattered signals. We tested our system in such situations to assess the effectiveness of our proposed model. In this regard, we obtained the minimum and maximum accuracy for one, two, and three PIR sensors by applying the ML classifiers, as shown in Figure 8, Figure 9 and Figure 10. In such situations, our experimental results indicated that the AdaBoost (AB) algorithm performed best, achieving a minimum accuracy of 87% on the PIR-sensor elderly fall dataset classification, as shown in Figure 8.
In normal scenarios, the experimental results revealed that the AdaBoost (AB) algorithm produced the best results, 98% and 99%, respectively, when two or three IR sensors were installed or mounted in the room, as shown in Figure 11 and Figure 12. However, in the case of a single sensor, the SVM gave 89% accuracy on the elderly fall dataset classification, as shown in Figure 13. The specificity and sensitivity of these ML classifiers are shown in Figure 14, Figure 15 and Figure 16 for three, two, and one mounted sensor, respectively.

7. Conclusions and Future Work

This paper proposed a sensor-based fall-detection scheme. The system detects elderly falls using IRA-E700ST0 pyroelectric infrared sensors mounted on the wall with a horizontal field of view. We considered physiological falls, lower-level falls, falls on a single level, and swing falls, which enhances fall-detection accuracy and helps save the patient's life. When a person falls, the PIR sensor detects the high-intensity signal, turns on the green light, and generates an alarm to inform the caregivers that a fall event is happening. Furthermore, fall-event dataset classification was performed using the SVM, DT, NB, RF, and AB ML classifiers, and it was observed that ensemble and boosting algorithms classify fall-event data more effectively.
In future work, we will integrate an ultrasonic sensor into the system to increase fall-detection accuracy and enhance privacy and data security by combining blockchain technologies with the system to protect the data relating to the elderly person. Moreover, we will try to incorporate floor airbags with the PIR sensors, operated using IoT devices, for reliable and efficient services, so that injuries resulting from falls are controlled. Earlier researchers used wearable airbags, which cannot protect the entire body; moreover, having to wear airbags at all times seems bothersome for elderly people whose health may already be declining. Therefore, in future work, we will try to integrate floor airbags with the PIR sensors: when the elderly person falls, the sensor perceives the fall motion and starts the motor to fill the airbag, so that the person lands on the inflated airbag and is protected from fall injuries.

Author Contributions

Writing—original draft, C.A.U.H., F.K.K., A.A., J.I., H.E., S.H., S.S.U. and M.S.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research project was funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2023R300), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Data Availability Statement

Not applicable.

Acknowledgments

The authors acknowledge the Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2023R300), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Daher, M.; Diab, A.; El Najjar, M.E.B.; Khalil, M.A.; Charpillet, F. Elder tracking and fall detection system using smart tiles. IEEE Sens. J. 2016, 17, 469–479. [Google Scholar] [CrossRef]
  2. Falls. Available online: http://www.who.int/mediacentre/factsheets/fs344/en/ (accessed on 24 January 2019).
  3. Igual, R.; Medrano, C.; Plaza, I. Challenges, issues and trends in fall detection systems. Biomed. Eng. Online 2013, 12, 66. [Google Scholar] [CrossRef] [Green Version]
  4. Zhang, Z.; Conly, C.; Athitsos, V. A survey on vision-based fall detection. In Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Corfu, Greece, 1–3 July 2015; pp. 1–7. [Google Scholar]
  5. Mubashir, M.; Shao, L.; Seed, L. A survey on fall detection: Principles and approaches. Neurocomputing 2013, 100, 144–152. [Google Scholar] [CrossRef]
  6. Alwan, M.; Rajendran, P.J.; Kell, S.; Mack, D.; Dalal, S.; Wolfe, M.; Felder, R. A smart and passive floor-vibration based fall detector for elderly. In Proceedings of the 2006 2nd International Conference on Information & Communication Technologies, Damascus, Syria, 24–28 April 2006; IEEE: New York, NY, USA, 2006; Volume 1, pp. 1003–1007. [Google Scholar]
  7. Chaccour, K.; Darazi, R.; El Hassans, A.H.; Andres, E. Smart carpet using differential piezoresistive pressure sensors for elderly fall detection. In Proceedings of the 2015 IEEE 11th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), Abu Dhabi, United Arab Emirates, 19–21 October 2015; IEEE: New York, NY, USA, 2015; pp. 225–229. [Google Scholar]
  8. Tzeng, H.W.; Chen, M.Y.; Chen, J.Y. Design of fall detection system with floor pressure and infrared image. In Proceedings of the 2010 International Conference on System Science and Engineering, Taipei, Taiwan, 1–3 July 2010; IEEE: New York, NY, USA, 2010; pp. 131–135. [Google Scholar]
  9. Zhu, L.; Wang, R.; Wang, Z.; Yang, H. TagCare: Using RFIDs to monitor the status of the elderly living alone. IEEE Access 2017, 5, 11364–11373. [Google Scholar] [CrossRef]
  10. Palmerini, L.; Bagalà, F.; Zanetti, A.; Klenk, J.; Becker, C.; Cappello, A. A wavelet-based approach to fall detection. Sensors 2015, 15, 11575–11586. [Google Scholar] [CrossRef] [Green Version]
  11. Álvarez-García, J.A.; Soria Morillo, L.M.; Concepción, M.Á.Á.D.L.; Fernández-Montes, A.; Ortega Ramírez, J.A. Evaluating wearable activity recognition and fall detection systems. In Proceedings of the 6th European Conference of the International Federation for Medical and Biological Engineering, Dubrovnik, Croatia, 7–11 September 2014; Springer: Cham, Switzerland, 2015; pp. 653–656. [Google Scholar]
  12. Aziz, O.; Musngi, M.; Park, E.J.; Mori, G.; Robinovitch, S.N. A comparison of accuracy of fall detection algorithms (threshold-based vs. machine learning) using waist-mounted tri-axial accelerometer signals from a comprehensive set of falls and non-fall trials. Med. Biol. Eng. Comput. 2017, 55, 45–55. [Google Scholar] [CrossRef]
  13. Guan, Q.; Li, C.; Guo, X.; Shen, B. Infrared signal based elderly fall detection for in-home monitoring. In Proceedings of the 2017 9th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 26–27 August 2017; IEEE: New York, NY, USA, 2017; Volume 1, pp. 373–376. [Google Scholar]
  14. Shinmoto Torres, R.L.; Wickramasinghe, A.; Pham, V.N.; Ranasinghe, D.C. What if your floor could tell someone you fell? A device free fall detection method. In Proceedings of the Conference on Artificial Intelligence in Medicine in Europe, Pavia, Italy, 17–20 June 2015; Springer: Cham, Switzerland, 2015; pp. 86–95. [Google Scholar]
  15. Wickramasinghe, A.; Torres, R.L.S.; Ranasinghe, D.C. Recognition of falls using dense sensing in an ambient assisted living environment. Pervasive Mob. Comput. 2017, 34, 14–24. [Google Scholar] [CrossRef]
  16. Lee, T.; Mihailidis, A. An intelligent emergency response system: Preliminary development and testing of automated fall detection. J. Telemed. Telecare 2005, 11, 194–198. [Google Scholar] [CrossRef]
  17. Rougier, C.; Meunier, J.; St-Arnaud, A.; Rousseau, J. Robust video surveillance for fall detection based on human shape deformation. IEEE Trans. Circuits Syst. Video Technol. 2011, 21, 611–622. [Google Scholar] [CrossRef]
  18. Wang, S.; Chen, L.; Zhou, Z.; Sun, X.; Dong, J. Human fall detection in surveillance video based on PCANet. Multimed. Tools Appl. 2016, 75, 11603–11613. [Google Scholar] [CrossRef]
  19. Lee, D.W.; Jun, K.; Naheem, K.; Kim, M.S. Deep Neural Network–Based Double-Check Method for Fall Detection Using IMU-L Sensor and RGB Camera Data. IEEE Access 2021, 9, 48064–48079. [Google Scholar] [CrossRef]
  20. Gutiérrez, J.; Rodríguez, V.; Martin, S. Comprehensive review of vision-based fall detection systems. Sensors 2021, 21, 947. [Google Scholar] [CrossRef]
  21. Zhou, Z.; Stone, E.E.; Skubic, M.; Keller, J.; He, Z. Nighttime in-home action monitoring for eldercare. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–1 September 2011; IEEE: New York, NY, USA, 2011; pp. 5299–5302. [Google Scholar]
  22. Alhimale, L.; Zedan, H.; Al-Bayatti, A. The implementation of an intelligent and video-based fall detection system using a neural network. Appl. Soft Comput. 2014, 18, 59–69. [Google Scholar] [CrossRef]
  23. Chen, J.; Romero, R.; Thompson, L.A. Motion Analysis of Balance Pre and Post Sensorimotor Exercises to Enhance Elderly Mobility: A Case Study. Appl. Sci. 2023, 13, 889. [Google Scholar] [CrossRef]
  24. Wang, X.; Ellul, J.; Azzopardi, G. Elderly fall detection systems: A literature survey. Front. Robot. AI 2020, 7, 71. [Google Scholar] [CrossRef]
  25. Khraief, C.; Benzarti, F.; Amiri, H. Elderly fall detection based on multi-stream deep convolutional networks. Multimed. Tools Appl. 2020, 79, 19537–19560. [Google Scholar] [CrossRef]
  26. Yacchirema, D.; de Puga, J.S.; Palau, C.; Esteve, M. Fall detection system for elderly people using IoT and ensemble machine learning algorithm. Pers. Ubiquitous Comput. 2019, 23, 801–817. [Google Scholar] [CrossRef]
  27. Hussain, F.; Umair, M.B.; Ehatisham-ul-Haq, M.; Pires, I.M.; Valente, T.; Garcia, N.M.; Pombo, N. An efficient machine learning-based elderly fall detection algorithm. arXiv 2019, arXiv:1911.11976. [Google Scholar]
  28. Bridenbaugh, S.A.; Kressig, R.W. Laboratory review: The role of gait analysis in seniors’ mobility and fall prevention. Gerontology 2011, 57, 256–264. [Google Scholar] [CrossRef] [Green Version]
  29. Baldewijns, G.; Claes, V.; Debard, G.; Mertens, M.; Devriendt, E.; Milisen, K.; Tournoy, J.; Croonenborghs, T.; Vanrumste, B. Automated in-home gait transfer time analysis using video cameras. J. Ambient. Intell. Smart Environ. 2016, 8, 273–286. [Google Scholar] [CrossRef]
  30. Yang, L.; Ren, Y.; Zhang, W. 3D depth image analysis for indoor fall detection of elderly people. Digit. Commun. Netw. 2016, 2, 24–34. [Google Scholar] [CrossRef] [Green Version]
  31. Leone, A.; Rescio, G.; Caroppo, A.; Siciliano, P.; Manni, A. Human Postures Recognition by Accelerometer Sensor and ML Architecture Integrated in Embedded Platforms: Benchmarking and Performance Evaluation. Sensors 2023, 23, 1039. [Google Scholar] [CrossRef]
  32. Youssfi Alaoui, A.; Tabii, Y.; Oulad Haj Thami, R.; Daoudi, M.; Berretti, S.; Pala, P. Fall Detection of Elderly People Using the Manifold of Positive Semidefinite Matrices. J. Imaging 2021, 7, 109. [Google Scholar] [CrossRef]
  33. Tamura, T.; Yoshimura, T.; Sekine, M.; Uchida, M.; Tanaka, O. A wearable airbag to prevent fall injuries. IEEE Trans. Inf. Technol. Biomed. 2009, 13, 910–914. [Google Scholar] [CrossRef]
  34. Shi, G.; Chan, C.S.; Luo, Y.; Zhang, G.; Li, W.J.; Leong, P.H.; Leung, K.S. Development of a human airbag system for fall protection using mems motion sensing technology. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; IEEE: New York, NY, USA, 2006; pp. 4405–4410. [Google Scholar]
  35. Shi, G.; Chan, C.S.; Li, W.J.; Leung, K.S.; Zou, Y.; Jin, Y. Mobile human airbag system for fall protection using MEMS sensors and embedded SVM classifier. IEEE Sens. J. 2009, 9, 495–503. [Google Scholar] [CrossRef]
  36. Saleh, M.; Jeannès, R.L.B. Elderly fall detection using wearable sensors: A low cost highly accurate algorithm. IEEE Sens. J. 2019, 19, 3156–3164. [Google Scholar] [CrossRef]
  37. Wang, Z.; Ramamoorthy, V.; Gal, U.; Guez, A. Possible life saver: A review on human fall detection technology. Robotics 2020, 9, 55. [Google Scholar] [CrossRef]
  38. Chuma, E.L.; Roger, L.L.B.; De Oliveira, G.G.; Iano, Y.; Pajuelo, D. Internet of things (IoT) privacy–protected, fall-detection system for the elderly using the radar sensors and deep learning. In Proceedings of the 2020 IEEE International Smart Cities Conference (ISC2), Piscataway, NJ, USA, 28 September–1 October 2020; IEEE: New York, NY, USA, 2020; pp. 1–4. [Google Scholar]
  39. Yu, X.; Jang, J.; Xiong, S. Machine learning-based pre-impact fall detection and injury prevention for the elderly with wearable inertial sensors. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, virtually, 25–29 July 2021; Springer: Cham, Switzerland, 2021; pp. 278–285. [Google Scholar]
  40. Sankaran, S.; Thiyagarajan, A.P.; Kannan, A.D.; Karnan, K.; Krishnan, S.R. Design and Development of Smart Airbag Suit for Elderly with Protection and Notification System. In Proceedings of the 2021 6th International Conference on Communication and Electronics Systems (ICCES), Coimbatre, India, 8–10 July 2021; IEEE: New York, NY, USA, 2021; pp. 1273–1278. [Google Scholar]
  41. Chu, C.T.; Chang, C.H.; Chang, T.J.; Liao, J.X. Elman neural network identify elders fall signal base on second-order train method. In Proceedings of the 2017 6th International Symposium on Next Generation Electronics (ISNE), Keelung, Taiwan, 23–25 May 2017; IEEE: New York, NY, USA, 2017; pp. 1–4. [Google Scholar]
  42. Sixsmith, A.; Johnson, N. A smart sensor to detect the falls of the elderly. IEEE Pervasive Comput. 2004, 3, 42–47. [Google Scholar] [CrossRef]
  43. Hayashida, A.; Moshnyaga, V.; Hashimoto, K. New approach for indoor fall detection by infrared thermal array sensor. In Proceedings of the 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS), Boston, MA, USA, 6–9 August 2017; IEEE: New York, NY, USA, 2017; pp. 1410–1413. [Google Scholar]
  44. Jeffin Gracewell, J.; Pavalarajan, S. Fall detection based on posture classification for smart home environment. J. Ambient. Intell. Humaniz. Comput. 2021, 12, 3581–3588. [Google Scholar] [CrossRef]
  45. Badgujar, S.; Pillai, A.S. Fall Detection for Elderly People using Machine Learning. In Proceedings of the 2020 11th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Kharagpur, India, 1–3 July 2020; IEEE: New York, NY, USA, 2020; pp. 1–4. [Google Scholar]
  46. Kalinga, T.; Sirithunge, C.; Buddhika, A.G.; Jayasekara, P.; Perera, I. A Fall Detection and Emergency Notification System for Elderly. In Proceedings of the 2020 6th International Conference on Control, Automation and Robotics (ICCAR), Singapore, 20–23 April 2020; IEEE: New York, NY, USA, 2020; pp. 706–712. [Google Scholar]
  47. Nahian, M.; Raju, M.H.; Tasnim, Z.; Mahmud, M.; Ahad, M.A.R.; Kaiser, M.S. Contactless fall detection for the elderly. In Contactless Human Activity Analysis; Springer: Cham, Switzerland, 2021; pp. 203–235. [Google Scholar]
  48. Edgcomb, A.; Vahid, F. Automated fall detection on privacy-enhanced video. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; IEEE: New York, NY, USA, 2012; pp. 252–255. [Google Scholar]
  49. Ngo, Y.T.; Nguyen, H.V.; Pham, T.V. Study on fall detection based on intelligent video analysis. In Proceedings of the 2012 International Conference on Advanced Technologies for Communications, Ha Noi, Vietnam, 10–12 October 2012; IEEE: New York, NY, USA, 2013; pp. 114–117. [Google Scholar]
  50. Shieh, W.Y.; Huang, J.C. Speedup the multi-camera video-surveillance system for elder falling detection. In Proceedings of the 2009 International Conference on Embedded Software and Systems, Hangzhou, China, 25–27 May 2009; IEEE: New York, NY, USA, 2009; pp. 350–355. [Google Scholar]
  51. Anderson, D.; Luke, R.H.; Keller, J.M.; Skubic, M.; Rantz, M.; Aud, M. Linguistic summarization of video for fall detection using voxel person and fuzzy logic. Comput. Vis. Image Underst. 2009, 113, 80–89. [Google Scholar] [CrossRef] [Green Version]
  52. Foroughi, H.; Aski, B.S.; Pourreza, H. Intelligent video surveillance for monitoring fall detection of elderly in home environments. In Proceedings of the 2008 11th International Conference on Computer and Information Technology, Khulna, Bangladesh, 24–27 December 2008; IEEE: New York, NY, USA, 2009; pp. 219–224. [Google Scholar]
  53. Ruan, W.; Yao, L.; Sheng, Q.Z.; Falkner, N.J.; Li, X. Tagtrack: Device-free localization and tracking using passive rfid tags. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, London, UK, 2–5 December 2014; pp. 80–89. [Google Scholar]
  54. Su, B.Y.; Ho, K.C.; Rantz, M.J.; Skubic, M. Doppler radar fall activity detection using the wavelet transform. IEEE Trans. Biomed. Eng. 2014, 62, 865–875. [Google Scholar] [CrossRef]
  55. Liu, L.; Popescu, M.; Rantz, M.; Skubic, M. Fall detection using doppler radar and classifier fusion. In Proceedings of the 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics, Hong Kong, China, 5–7 January 2012; IEEE: New York, NY, USA, 2012; pp. 180–183. [Google Scholar]
  56. Liu, L.; Popescu, M.; Skubic, M.; Rantz, M.; Cuddihy, P. An automatic in-home fall detection system using Doppler radar signatures. J. Ambient. Intell. Smart Environ. 2016, 8, 453–466. [Google Scholar] [CrossRef] [Green Version]
  57. Kim, Y.; Moon, T. Human detection and activity classification based on micro-Doppler signatures using deep convolutional neural networks. IEEE Geosci. Remote Sens. Lett. 2015, 13, 8–12. [Google Scholar] [CrossRef]
  58. Goodfellow, I.; Bengio, Y.; Courville, A. Machine learning basics. In Deep Learning; MIT Press: Cambridge, MA, USA, 2016; pp. 98–164. [Google Scholar]
  59. Lin, W.Y.; Chen, C.H.; Lee, M.Y. Design and Implementation of a Wearable Accelerometer-Based Motion/Tilt Sensing Internet of Things Module and Its Application to Bed Fall Prevention. Biosensors 2021, 11, 428. [Google Scholar] [CrossRef]
  60. Ramirez, H.; Velastin, S.A.; Meza, I.; Fabregas, E.; Makris, D.; Farias, G. Fall detection and activity recognition using human skeleton features. IEEE Access 2021, 9, 33532–33542. [Google Scholar] [CrossRef]
  61. Hsu, F.S.; Chang, T.C.; Su, Z.J.; Huang, S.J.; Chen, C.C. Smart Fall Detection Framework Using Hybridized Video and Ultrasonic Sensors. Micromachines 2021, 12, 508. [Google Scholar] [CrossRef]
  62. Zheng, L.; Zhao, J.; Dong, F.; Huang, Z.; Zhong, D. Fall detection algorithm based on inertial sensor and hierarchical decision. Sensors 2023, 23, 107. [Google Scholar] [CrossRef]
  63. Sehairi, K.; Chouireb, F.; Meunier, J. Elderly fall detection system based on multiple shape features and motion analysis. In Proceedings of the 2018 International Conference on Intelligent Systems and Computer Vision (ISCV), Fez, Morocco, 2–4 April 2018; IEEE: New York, NY, USA, 2018; pp. 1–8. [Google Scholar]
  64. Amsaprabhaa, M. Multimodal spatiotemporal skeletal kinematic gait feature fusion for vision-based fall detection. Expert Syst. Appl. 2023, 2, 118–681. [Google Scholar]
  65. Chen, Z.; Wang, Y.; Yang, W. Video Based Fall Detection Using Human Poses. In Proceedings of the Big Data: 9th CCF Conference, BigData 2021, Guangzhou, China, 8–10 January 2022; Springer: Singapore, 2022; pp. 283–296. [Google Scholar]
  66. Javed, A.R.; Sarwar, M.U.; Beg, M.O.; Asim, M.; Baker, T.; Tawfik, H. A collaborative healthcare framework for shared healthcare plan with ambient intelligence. Hum. Cent. Comput. Inf. Sci. 2020, 10, 40. [Google Scholar] [CrossRef]
  67. Javed, A.R.; Sarwar, M.U.; Khan, S.; Iwendi, C.; Mittal, M.; Kumar, N. Analyzing the effectiveness and contribution of each axis of tri-axial accelerometer sensor for accurate activity recognition. Sensors 2020, 20, 2216. [Google Scholar] [CrossRef] [Green Version]
  68. Javed, A.R.; Shahzad, F.; ur Rehman, S.; Zikria, Y.B.; Razzak, I.; Jalil, Z.; Xu, G. Future smart cities requirements, emerging technologies, applications, challenges, and future aspects. Cities 2022, 129, 103794. [Google Scholar] [CrossRef]
  69. Hussain, T.; Hussain, D.; Hussain, I.; AlSalman, H.; Hussain, S.; Ullah, S.S.; Al-Hadhrami, S. Internet of Things with Deep Learning-Based Face Recognition Approach for Authentication in Control Medical Systems. Comput. Math. Methods Med. 2022, 2022, 5137513. [Google Scholar] [CrossRef]
  70. Hussain, I.; Hussain, D.; Kohli, R.; Ismail, M.; Hussain, S.; Sajid Ullah, S.; Alroobaea, R.; Ali, W.; Umar, F. Evaluation of Deep Learning and Conventional Approaches for Image Recaptured Detection in Multimedia Forensics. Mob. Inf. Syst. 2022, 2022, 2847580. [Google Scholar] [CrossRef]
  71. Bourouis, S.; Sallay, H.; Bouguila, N. A Competitive Generalized Gamma Mixture Model for Medical Image Diagnosis. IEEE Access 2021, 9, 13727–13736. [Google Scholar] [CrossRef]
  72. Faruk, O.; Ahmed, E.; Ahmed, S.; Tabassum, A.; Tazin, T.; Bourouis, S.; Khan, M.M. A Novel and Robust Approach to Detect Tuberculosis Using Transfer Learning. J. Healthc. Eng. 2021, 2021, 1002799. [Google Scholar] [CrossRef]
  73. Bourouis, S.; Laalaoui, Y.; Bouguila, N. Bayesian frameworks for traffic scenes monitoring via view-based 3D cars models recognition. Multimed. Tools Appl. 2019, 78, 18813–18833. [Google Scholar] [CrossRef]
  74. Bourouis, S.; Pawar, Y.; Bouguila, N. Entropy-Based Variational Scheme with Component Splitting for the Efficient Learning of Gamma Mixtures. Sensors 2022, 22, 186. [Google Scholar] [CrossRef]
  75. Kumar, S.; Jain, A.; Rani, S.; Alshazly, H.; Ahmed Idris, S.; Bourouis, S. Deep neural network based vehicle detection and classification of aerial images. Intell. Autom. Soft Comput. 2022, 34, 119–131. [Google Scholar] [CrossRef]
  76. Sidenbladh, H. Detecting human motion with support vector machines. In Proceedings of the 17th International Conference on Pattern Recognition ICPR, Cambridge, UK, 26 August 2004; IEEE: New York, NY, USA, 2004; Volume 2, pp. 188–191. [Google Scholar]
  77. Grahn, J.; Kjellstromg, H. Using SVM for efficient detection of human motion. In Proceedings of the 2005 IEEE International Workshop on Visual Surveillance and Performance Evaluation of Tracking and Surveillance, Beijing, China, 15–16 October 2005; IEEE: New York, NY, USA, 2006; pp. 231–238. [Google Scholar]
  78. Cao, D.; Masoud, O.T.; Boley, D.; Papanikolopoulos, N. Human motion recognition using support vector machines. Comput. Vis. Image Underst. 2009, 113, 1064–1075. [Google Scholar] [CrossRef]
  79. Charbuty, B.; Abdulazeez, A.; Abdulazeez, A. Classification based on decision tree algorithm for machine learning. J. Appl. Sci. Technol. Trends 2021, 2, 20–28. [Google Scholar] [CrossRef]
  80. Liu, J.; Ulishney, C.; Dumitrescu, C.E. Random forest machine learning model for predicting combustion feedback information of a natural gas spark ignition engine. J. Energy Resour. Technol. 2021, 143, 012301. [Google Scholar] [CrossRef]
  81. Vangara, V.; Vangara, S.P.; Thirupathur, K. Opinion Mining Classification using Naive Bayes Algorithm. Int. J. Innov. Technol. Explor. Eng. (IJITEE) 2020, 9, 495–498. [Google Scholar] [CrossRef]
  82. Shahraki, A.; Abbasi, M.; Haugen, Ø. Boosting algorithms for network intrusion detection: A comparative evaluation of Real AdaBoost, Gentle AdaBoost and Modest AdaBoost. Eng. Appl. Artif. Intell. 2020, 94, 103770. [Google Scholar] [CrossRef]
  83. Al Nahian, J.; Ghosh, T.; Al Banna, H.; Aseeri, M.A.; Uddin, M.N.; Ahmed, M.R.; Mahmud, M.; Kaiser, M.S. Towards an accelerometer-based elderly fall detection system using cross-disciplinary time series features. IEEE Access 2021, 9, 39413–39431. [Google Scholar] [CrossRef]
  84. Hashim, H.A.; Mohammed, S.L.; Gharghan, S.K. Accurate fall detection for patients with Parkinson’s disease based on a data event algorithm and wireless sensor nodes. Measurement 2020, 156, 107573. [Google Scholar] [CrossRef]
Figure 1. A low-cost fall-detection framework for the elderly.
Figure 2. Data collection based on normal motion.
Figure 3. Equipment used to detect elderly falls.
Figure 4. Circuit created to detect elderly falls.
Figure 5. Circuit and analog signals shown on the laptop and oscilloscope.
Figure 6. Simulation results of normal motion analog signals to detect elderly falls.
Figure 7. Simulation results of high-intensity motion analog signals to detect elderly falls.
Figure 8. Accuracy on elderly fall dataset using three PIR sensors.
Figure 9. Accuracy on elderly fall dataset using two PIR sensors.
Figure 10. Accuracy on elderly fall dataset using one PIR sensor.
Figure 11. Accuracy using three IR sensors.
Figure 12. Accuracy using two IR sensors.
Figure 13. Accuracy using one IR sensor.
Figure 14. Specificity and sensitivity of elderly fall dataset of three PIR sensors.
Figure 15. Specificity and sensitivity of elderly fall dataset of two PIR sensors.
Figure 16. Specificity and sensitivity of elderly fall dataset of one PIR sensor.
Table 1. Equipment cost (prices in PKR).
Equipment | Price
Infrared Sensor | 500/- × 3 = 1500/-
Arduino Uno Microcontroller | 200/- × 1 = 200/-
Transistor | 20/- × 2 = 40/-
Photodiode | 20/- × 3 = 60/-
Transformer | 80/- × 1 = 80/-
Capacitors | 5/- × 5 = 25/-
Resistors | 5/- × 5 = 25/-
Diodes | 3/- × 5 = 15/-
LCD Display | 400/- × 1 = 400/-
Buzzer | 20/- × 1 = 20/-
Table 2. State of the art.
Author | Methods/Classifiers | Hardware/Evaluation Parameters | Limitations
[15] | Radio-frequency identification (RFID) tags | RFID tags, received signal strength (RSS), and Doppler frequency value (DFV) | The authors used distinct equipment and devices; however, the accuracy was not adequately high
[16] | Metaheuristic algorithms | Floor-based RFID technique, RFID tags arranged in a two-dimensional grid on a smart carpet | Inadequate accuracy
[17,18,19,20,21,22,23,24,25,26,27,28,29,30,31] | Digital camera, 3D image shape analysis, analysis using PCA, SVM, and NN algorithms | Vision-based system | Cameras installed in the ceiling detected 77% of fall cases with 90% accuracy; monitoring patient activities also affected the privacy of the elderly
[34,35,36,37] | Wearable-based solutions to protect the head and thighs | Sensors and wearable devices using both an accelerometer and angular velocity to detect fall events | It seems impractical to wear an airbag and device all the time
[37,38,39,40,41] | Three-dimensional MEMS Bluetooth, accelerometers, microcontroller unit (MCU), gyroscopes, and high-speed cameras | High-speed cameras used to record and analyze human motion | It seems impractical to use the device all the time
[42] | Neural network algorithm for fall detection | Implemented in a wearable device combined with low-energy Bluetooth | It seems impractical to wear the device all the time
[43] | Smart-inactivity array-based detectors | Intelligent fall-indicator system based on infrared array detectors | Changes in infrared radiation impact elderly fall detection
Proposed Methodology | IRA-E700ST0 pyroelectric infrared (PIR) sensors, Arduino Uno, SVM, DT, RF, NB, AB | Accuracy, precision (specificity), recall (sensitivity); RF achieves 99% accuracy in the detection of elderly fall events | Low-cost, sensor-based system with highly mature, state-of-the-art, and representative algorithms
Table 3. Outcomes of ML models.
Three PIR Sensors Dataset
ML Algorithm | Accuracy | Precision (Specificity) | Recall (Sensitivity) | F-Measure
SVM | 0.9711 (97%) | 0.97 | 0.97 | 0.97
DT | 0.9708 (97%) | 0.97 | 0.97 | 0.97
NB | 0.8942 (89%) | 0.90 | 0.89 | 0.89
RF | 0.9809 (98%) | 0.98 | 0.98 | 0.98
AB | 0.9904 (99%) | 0.99 | 0.99 | 0.99
Two PIR Sensors Dataset
SVM | 93% | 0.93 | 0.93 | 0.93
DT | 92% | 0.91 | 0.92 | 0.92
NB | 86% | 0.86 | 0.86 | 0.86
RF | 96% | 0.96 | 0.96 | 0.96
AB | 98% | 0.98 | 0.98 | 0.98
One PIR Sensor Dataset
SVM | 89% | 0.89 | 0.89 | 0.89
DT | 89% | 0.89 | 0.89 | 0.89
NB | 82% | 0.82 | 0.81 | 0.81
RF | 87% | 0.87 | 0.87 | 0.87
AB | 86% | 0.86 | 0.86 | 0.86

