Article

A Non-Contact Fall Detection Method for Bathroom Application Based on MEMS Infrared Sensors

Chunhua He, Shuibin Liu, Guangxiong Zhong, Heng Wu, Lianglun Cheng, Juze Lin and Qinwen Huang
1 School of Computer, Guangdong University of Technology, Guangzhou 510006, China
2 Guangdong Provincial People’s Hospital, Guangdong Academy of Medical Sciences, Guangdong Institute of Gerontology, Guangzhou 510080, China
3 No. 5 Electronics Research Institute of the Ministry of Industry and Information Technology, Guangzhou 510610, China
* Authors to whom correspondence should be addressed.
Micromachines 2023, 14(1), 130; https://doi.org/10.3390/mi14010130
Submission received: 12 November 2022 / Revised: 23 December 2022 / Accepted: 30 December 2022 / Published: 3 January 2023
(This article belongs to the Special Issue Advances in MEMS Theory and Applications, 2nd Edition)

Abstract

The elderly account for more than 10% of the global population, and about 30% of the elderly are injured by falls each year. Accidental falls, especially bathroom falls, account for a large proportion of these injuries, so detecting fall events of the elderly is of great importance. In this article, a non-contact fall detector based on a Micro-Electromechanical Systems Pyroelectric Infrared (MEMS PIR) sensor and a thermopile IR array sensor is designed to detect bathroom falls. In addition, image processing algorithms with a low-pass filter and double boundary scans are put forward in detail. Then, the statistical features of the area, center, duration, and temperature are extracted. Finally, a 3-layer BP neural network is adopted to identify fall events. Taking into account the key factors of ambient temperature, subject, illumination, fall speed, fall state, fall area, and fall scene, 640 tests were performed in total, and 5-fold cross validation was adopted. Experimental results demonstrate that the averages of the precision, recall, detection accuracy, and F1-Score are 94.45%, 90.94%, 92.81%, and 92.66%, respectively, which indicates that the novel detection method is feasible. Thereby, this IoT detector is low-cost and privacy-preserving and can be extensively used for household bathroom fall detection.

1. Introduction

With the development of the economy and the progress of science and technology, the human lifespan continues to extend, and the corresponding issue of population aging has become increasingly prominent, which is a worldwide problem [1]. The elderly population is predicted to increase to 1.4 billion by 2030 and 2.1 billion by 2050 [2]. People aged 65 years and older are more vulnerable to falls, with a 28–35% risk of falling [3]. The elderly account for more than 10% of the world's population, and this ratio is increasing gradually. According to the World Health Organization (WHO), about 30% of the elderly are injured by falls each year [3], and accidental falls account for a large proportion of these injuries. Bathroom falls are among the most common fall events. Therefore, the capability to detect fall events of the elderly is of great importance, since a fall may lead to a long-term hospital stay or even death. So far, there are three main fall detection techniques, namely wearable, vision-based, and ambient-based techniques [4,5].
The wearable techniques are mainly based on gyroscopes, accelerometers, or an Inertial Measurement Unit (IMU) [6,7,8]. These sensors are embedded in various products, such as belts, watches, necklaces, rings, shoes, bracelets, or wristbands [9,10,11,12,13]. Different fall events can be recognized from changes in movement characteristics. Generally, the activity signals are easy to acquire, and the detection accuracies are high. Unfortunately, wearable devices are intrusive since they are attached to the body, which can cause discomfort. In addition, they consume power, and the elderly are apt to forget to charge them.
Vision-based techniques are mainly based on video cameras, depth cameras, or thermal cameras [14,15,16,17,18]. They can continuously record movement images and process the data with pattern recognition algorithms. Once a dangerous fall action is captured, an alarm is triggered. Although they are non-contact techniques that avoid potential discomfort, their key limitations are the limited application space (within the field of view of the cameras), high cost, and privacy violation.
The ambient-based techniques are mainly based on pressure sensors, WiFi, or radar sensors [19,20,21]. They can be embedded in furniture, clothing, the floor, and so on. Fall events are identified by changes in pressure signals, vital-sign signals, or movement signals in specific places. They are unobtrusive and non-invasive; however, they are very expensive, and installation is also costly.
Therefore, to address the limitations mentioned above, low-resolution infrared (IR) sensors have been adopted to achieve fall detection [22,23]. They have a series of merits, such as being low-cost, non-wearable, unobtrusive, non-invasive, and privacy-preserving. Nevertheless, if the number of IR pixels is small and the field of view (FOV) is narrow, such as 8 by 8 pixels with a 60° by 60° FOV, 16 by 4 pixels with a 120° by 25° FOV, or 32 by 32 pixels with a 33° by 33° FOV, the resolution and sensitivity will be low [24,25,26,27]. Thus, these IR sensors can only be applied to localized fall detection. Finding a balance between privacy protection and image resolution, as well as a compromise between the monitoring area and the sensor's FOV, is therefore vital. In addition, the motions of the elderly are closely related to fall judgments, whereas small motions are difficult to detect because they produce almost no difference in the thermal image of a low-resolution IR sensor.
Pyroelectric Infrared (PIR) sensors are useful motion detectors [28] that are sensitive to the Infrared Radiation Changes (IRC) induced by human motion. The advantages of PIR sensors are similar to those of thermopile IR sensors, whereas PIR sensors cannot measure the thermogram before or after a fall happens. The lack of contour recognition from the measured thermogram may lead to misjudgment in fall detection. Therefore, a single sensor, such as an IR or PIR sensor, is hardly competent for fall detection. Recently, fusions of multiple sensors, comprising two or more of the IR sensor, gyroscope, accelerometer, ElectroCardioGraph (ECG), ultrasonic sensor, depth sensor, or other sensors [29,30,31,32], have been proposed and proven effective in advancing the fall detection accuracy.
Given that the consequences of bathroom falls are so severe for the elderly, this paper focuses on a fall detection method for bathroom application. According to the analysis mentioned above, a novel non-contact fall detector is designed based on the fusion of a thermopile IR array sensor and a PIR sensor, which is high-accuracy, unobtrusive, non-wearable, non-invasive, low-cost, and privacy-preserving.
As for the IR monitoring technique, the fall detection method is mainly composed of three steps [4,5], namely pre-processing, feature extraction, and pattern recognition. Pre-processing is the basis of the whole analysis and mainly includes filtering, position recognition, and contour extraction. However, there are few reports on this topic, especially for position recognition and contour extraction. In order to reduce the computational load and enhance the recognition accuracy, feature extraction is a key step of dimension-reduction analysis. Kinematic features (such as contour center, position, velocity, and acceleration) and Mel-Frequency Cepstral Coefficients (MFCCs) are often extracted as eigenvectors for fall/non-fall classification [33,34], whereas these feature extraction methods are of limited use for fall detection. Finally, pattern recognition algorithms are applied to accomplish automatic and real-time analysis; among them, the support vector machine, Principal Component Analysis (PCA), random forest, fuzzy clustering, and the Convolutional Neural Network (CNN) are prevalent and effective classification algorithms [35,36,37,38,39]. However, some of them are too complex to be realized in a local Microprogrammed Control Unit (MCU).
The algorithms mentioned above can be executed in the cloud or on the edge [40,41,42]. Given the requirement that fall detection should be achieved promptly and efficiently, edge computing is the better choice, since cloud computing inevitably fails once the network connection becomes unstable. In general, the wireless communication and remote alarm of an IoT device can be achieved via the WiFi protocol [4]. That is, a WiFi module can be applied for fall detection [43] or communication, and it is a main component of an IoT device. However, considering that the WiFi signal is sometimes unstable and the device can easily drop off the network, a redundant alarm mechanism can be added to perform online alarm and positioning in combination with GSM [44,45], which increases the reliability of life-saving alarms. Owing to cost, it is not suitable to transmit large amounts of data via GSM, so edge computing is essential. Algorithms realized in the MCU are low-cost, low-power, high-efficiency, and high-reliability. However, edge computing in a local MCU requires that the processing algorithms be simple and need few computing resources. Hence, this paper proposes a novel data processing method to satisfy this requirement.

2. Materials and Methods

2.1. System Design

The system architecture of an intelligent fall detector for bathroom monitoring is depicted in Figure 1, which mainly includes three subsystems, as follows:
(1)
Power supply subsystem: a Low Dropout Regulator (LDO) and a DC/DC converter are powered by a power adapter and then power the whole system.
(2)
Processor subsystem: an STM32F411 ARM MCU is applied as the edge-computing processor. A WiFi module (WIFI_WRG1, powered by Tuya Co., Ltd., Hangzhou, China) is adopted for remote communication. The alarm information is sent to the management system operated by the caregiver. Meanwhile, the emergency contacts registered in the APP are contacted by IP call and message. Given that the WiFi signal is sometimes unstable and the detector can easily drop off the network, a 4G module (PAD_ML302, powered by China Mobile Co., Ltd., Chongqing, China) is added to the detector. In this way, the success rate of the alarm can be greatly improved through dual WiFi and 4G communication. Furthermore, positioning with the WiFi and 4G modules is also conducive to rapid rescue.
(3)
Sensor subsystem: A PIR sensor and a thermopile IR array sensor are applied to detect the body movement and the thermal image, respectively, which are utilized for fall recognition. If a fall event is detected, the detector will send a remote alarm with wireless modules, and the LED indicator will light up in red.
The fall detector based on two MEMS IR sensors is shown in Figure 2. It includes an edge-computing MCU, a thermopile IR array sensor, and a PIR sensor. The AS312 and the 8102-2, both made by SENBA Sensing Technology Co., Ltd. (Shenzhen, China), are chosen as the PIR sensor and the Fresnel lens, respectively. The functions of the Fresnel lens are as follows: firstly, it focuses the light and filters out light outside the infrared band; secondly, it divides the detection area into several bright and dark zones, so that a moving object entering the detection area generates a change in the thermal infrared signal on the PIR sensor in the form of a temperature change. The detection distance is about 5 m, while the FOV is 120°, as depicted in Figure 3.
The HTPA32x32dR2L2.1 (made by HEIMANN Sensor GmbH, Dresden, Germany) is used as the thermopile IR array sensor. It outputs the absolute temperature distribution over 32 by 32 pixels within a 90° by 90° FOV via the I2C bus at a maximum of 5 frames per second. The inclination angles of the PIR sensor and the IR array sensor are both 45°; thus, the detector can monitor whether there is human activity or a fall in the area below and in front of it, as illustrated in Figure 4. Here, the detector is mounted on the sidewall at a height of 1.8 m rather than on the ceiling. The advantages of this arrangement are that: (1) the humidity on the wall is lower than that on the ceiling, which improves reliability; and (2) the detector is easy to mount.
In order to improve the moisture-proof performance, the shell of the fall detector is sealed by ultrasonic welding, and the sensor opening is waterproofed with a silica-gel ring, so as to prevent vapor from seeping into the circuit and thus improve reliability. The detector achieves an IP65 waterproof rating. Combining the FOV and the inclination angle, the monitored area is calculated to be about 1.8 m by 1.8 m (width by length). Considering the edge effect, the actual effective monitored area is about 1.2 m by 1.5 m, as depicted in Figure 5. In general, the wet area of the bathroom is slippery, and someone in it is apt to slip, so the detector should be installed in the wet area of the bathroom. In fact, the wet area is often smaller than 1.2 m by 1.5 m (i.e., the monitored area).
The side view of someone standing up or falling down is illustrated in Figure 6. It is clear that the human body appears in the middle and upper part of the thermal image before falling down, whereas it appears in the middle and lower part after falling down. That is, the center of the locked body area moves down, and the locked body area becomes smaller since the distance to the sensor increases. These changes can therefore be utilized for fall recognition. In addition, combined with a PIR sensor, the detection accuracy can be improved. The fall recognition method consists of image processing, feature extraction, and pattern recognition algorithms.

2.2. Image Processing

Image processing includes signal filtering and body positioning.

2.2.1. Signal Filtering

The thermopile IR array sensor outputs 1024 (= 32 × 32) object temperature values and 1 ambient temperature value, and the sampling rate fs is set to 5 Hz. In order to filter the noise, a first-order Low Pass Filter (LPF) is adopted, whose transfer function in the continuous frequency domain (s domain) is defined as:
$$\frac{Y(s)}{X(s)} = \frac{\omega_l}{s + \omega_l} \qquad (1)$$
where the complex frequency s equals jω, and ω is the angular frequency of the signal. x(k) and y(k) are the k-th input and output signals, respectively, and X and Y are the Laplace transforms of the time-domain signals x and y. ωl is the cut-off angular frequency of the LPF, which is set to 1 rad/s since temperature changes slowly; thus, the cut-off frequency is about 0.16 Hz, and the Bode diagram of the filter is depicted in Figure 7. After the bilinear transformation in (2), the transfer function in the discrete frequency domain (z domain) can be derived as (3).
$$s = \frac{2 f_s (z - 1)}{z + 1} \qquad (2)$$
$$\frac{Y(z)}{X(z)} = \frac{\omega_l + \omega_l z^{-1}}{(\omega_l + 2 f_s) + (\omega_l - 2 f_s) z^{-1}} \qquad (3)$$
Hence, the high-frequency noise can be suppressed by the LPF, as depicted in Figure 8. After filtering, the thermal image is depicted in Figure 9. The background colors green and red represent a low temperature and a high temperature, respectively.
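From (3), the corresponding difference equation is y[k] = [ωl(x[k] + x[k − 1]) − (ωl − 2fs)y[k − 1]]/(ωl + 2fs), which needs only a few multiply–accumulate operations per pixel and is therefore well suited to the edge-computing MCU. The following Python sketch is a hypothetical illustration of applying this filter (with ωl = 1 rad/s and fs = 5 Hz) to a stream of 32 × 32 temperature frames; it is not the authors' firmware.

import numpy as np

def lpf_step(x_k, x_prev, y_prev, wl=1.0, fs=5.0):
    # One step of the first-order LPF obtained from (1) via the bilinear transform (2)-(3):
    # y[k] = (wl*(x[k] + x[k-1]) - (wl - 2*fs)*y[k-1]) / (wl + 2*fs)
    return (wl * (x_k + x_prev) - (wl - 2.0 * fs) * y_prev) / (wl + 2.0 * fs)

def filter_frames(frames, wl=1.0, fs=5.0):
    # Filter a stream of 32x32 temperature frames pixel by pixel.
    frames = np.asarray(frames, dtype=float)
    out = np.empty_like(frames)
    out[0] = frames[0]                      # initialize with the first frame
    for k in range(1, len(frames)):
        out[k] = lpf_step(frames[k], frames[k - 1], out[k - 1], wl, fs)
    return out

# Example: a noisy constant-temperature stream sampled at 5 Hz
rng = np.random.default_rng(0)
raw = 25.0 + 0.5 * rng.standard_normal((50, 32, 32))
smooth = filter_frames(raw)
print(raw[:, 0, 0].std(), smooth[10:, 0, 0].std())     # the filtered noise is visibly smaller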

2.2.2. Body Positioning

Generally, the background temperature is lower than the human body temperature, so body positioning can be accomplished on this basis. Meanwhile, abnormal temperature points higher than 40 °C or lower than 0 °C should be deleted. Then the minimum and maximum of the remaining points are found as Tmin and Tmax, respectively, and their average is set as the threshold Tth. Temperature points in the range [Tth, Tmax] can then be selected and labeled with a block number larger than 0. Assume that T is the current IR temperature array, whose size is 32 × 32. Here, double-boundary-scan processing is applied for positioning, which includes three steps:
(1)
The first scan: Define a label array L whose initial values are all 0. Taking boundary extension into account, the size of L is 34 × 34, and the block number bn is set to 1. Define a new set P1 as {L[r − 1][c − 1], L[r − 1][c], L[r − 1][c + 1], L[r][c − 1], L[r][c + 1], L[r + 1][c − 1], L[r + 1][c], L[r + 1][c + 1]}, where r (1 ≤ r ≤ 32) and c (1 ≤ c ≤ 32) are the row index and column index, respectively. Deleting repeated values and 0 from P1 yields a new set P2. During progressive scanning, if P2 is empty, then bn is assigned to L[r][c], and bn is updated to (bn + 1); otherwise, the minimum of P2 is assigned to L[r][c]. In addition, if the size of P2 is more than 1, the corresponding blocks are adjacent, and P2 is added to a relationship table Q, a two-dimensional (2D) array used to save a series of sets. The pseudo-code is shown in (4):
bn = 1; i = 1; Q = ∅;
for r = 1:32
    for c = 1:32 {
        if T[r][c] ∈ [Tth, Tmax] {
            if (P2 == ∅) { L[r][c] = bn; bn = bn + 1; }
            else {
                L[r][c] = min(P2);
                if (size(P2) > 1) Q[i++] = P2;
            }
        }
    }                                                    (4)
for i = 1:size(Q)
    for j = (i + 1):size(Q) {
        if (Q[i] ∩ Q[j] ≠ ∅) { Q[i] = Q[i] ∪ Q[j]; Q[j] = ∅; }
    }
for r = 1:32
    for c = 1:32 {
        for i = 1:size(Q)
            if (L[r][c] ∈ Q[i]) { L[r][c] = min(Q[i]); cnt[min(Q[i])]++; }
    }                                                    (5)
(2)
The second scan: After the first scan, there may be adjacent blocks; as depicted in Figure 10, blocks 3, 4, and 5 are connected. Thus, they should be merged, and a second scan is necessary. Firstly, the elements of Q are compared in pairs; if their intersection is not empty, they are merged to form a union. Secondly, for each remaining set in Q, the points corresponding to all the block numbers in the set are selected, and their labels are modified to the minimum block number of the set. Thus, all adjacent blocks are merged. As illustrated in Figure 11, blocks 3, 4, and 5 are merged to form block 3. The pseudo-code is depicted in (5), where cnt is a counter vector used to record the number of points in every block.
(3)
Owing to environmental interference, several high-temperature blocks may be picked out. Considering that the area of the human's block should be the largest, only the largest block is reserved, and all the others are removed. The pseudo-code is shown in (6), where id is the block number of the largest block. As depicted in Figure 12, blocks 1, 2, and 6 have been eliminated. If a locked potential body area appears, the signal output by the PIR sensor is combined with it to judge whether there is a fall event, and then feature extraction becomes important.
id = find(max(cnt));
for r = 1:32
    for c = 1:32 {
        if (L[r][c] ≠ id) L[r][c] = 0;
    }                                                    (6)
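For illustration, the following Python sketch is a hypothetical re-implementation of the three steps in (4)–(6): the first 8-neighborhood scan, the merging of adjacent blocks via the relationship table Q, and the retention of the largest block. The MCU firmware would use static arrays rather than Python sets and NumPy.

import numpy as np

def lock_body_area(T, T_th, T_max):
    # Label warm pixels of a 32x32 frame and keep only the largest block
    # (the locked potential body area), mirroring pseudo-code (4)-(6).
    rows, cols = T.shape
    L = np.zeros((rows + 2, cols + 2), dtype=int)   # boundary-extended label array
    Q = []                                          # relationship table of adjacent labels
    bn = 1
    # First scan (4): propagate labels from the 8-neighborhood
    for r in range(1, rows + 1):
        for c in range(1, cols + 1):
            if T_th <= T[r - 1, c - 1] <= T_max:
                P2 = {L[rr, cc] for rr in (r - 1, r, r + 1)
                      for cc in (c - 1, c, c + 1)
                      if (rr, cc) != (r, c) and L[rr, cc] > 0}
                if not P2:
                    L[r, c] = bn
                    bn += 1
                else:
                    L[r, c] = min(P2)
                    if len(P2) > 1:
                        Q.append(P2)
    # Second scan (5): merge adjacency sets, then relabel points to each set's minimum
    for i in range(len(Q)):
        for j in range(i + 1, len(Q)):
            if Q[i] & Q[j]:
                Q[i] = Q[i] | Q[j]
                Q[j] = set()
    for q in Q:
        if q:
            L[np.isin(L, list(q))] = min(q)
    # Final step (6): keep only the block with the largest area
    labels, counts = np.unique(L[L > 0], return_counts=True)
    if labels.size:
        L[L != labels[np.argmax(counts)]] = 0
    return L[1:-1, 1:-1]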

2.3. Feature Extraction

A bathroom fall refers to someone slipping in the wet area of the bathroom, and it is related to space and time.
After locking the potential body area, the center coordinate (Xc, Yc) of the locked area can be calculated by averaging the abscissas and ordinates of all the locked points. Correspondingly, the difference of the center coordinates at adjacent moments (i.e., adjacent frames) is computed as (dXc, dYc), and the standard deviation and average of the latest 5 center coordinates (i.e., within 1 s) are (stdXc, stdYc) and (mXc, mYc), respectively. If stdXc and stdYc are both smaller than 1, and the absolute values of dXc and dYc are both less than 2, then the locked area is stable, and this moment is named a stable moment; (mXc, mYc) is then assigned to the stable center coordinate (SXc, SYc). If the locked area is stable, flag_sta is set to 1; otherwise it is set to 0 and (SXc, SYc) is not updated, as shown in (7). Similarly, the difference of the stable center coordinates at adjacent stable moments is computed as (dSXc, dSYc).
if ((stdXc < 1) && (stdYc < 1) && (|dXc| < 2) && (|dYc| < 2))
    flag_sta = 1;
else
    flag_sta = 0;                                        (7)
Assume that TSo and TSn are the previous and current IR temperature arrays at adjacent stable moments, respectively, whose sizes are both 32 × 32. The change in the temperatures of the new locked area between adjacent stable moments can be estimated by the Euclidean distance (ED), as depicted in (8). On the other hand, the mean temperature of the locked area is calculated as Tc, and the number of locked points is Nc. Likewise, if the locked body area is stable, Tc and Nc are updated; otherwise, they remain unchanged. The difference in the mean temperatures of the locked areas at adjacent stable moments is recorded as dTc, while the ratio of the numbers of locked points at adjacent stable moments (the latest divided by the previous) is recorded as RNc. SPIR is the output of the PIR sensor; if human activity is detected, SPIR is 1 and remains 1 for a further 2 s; otherwise, it is 0. If condition (9) is met, a fall action is recognized, and flag_act is set to 1.
ED = 0;
for r = 1:32
    for c = 1:32 {
        if (L[r][c] ≠ 0) ED = ED + (TSn[r][c] − TSo[r][c])^2;
    }
ED = sqrt(ED);                                           (8)
if ((|dTc| < 2) && (0.5 < RNc < 1) && (20 < Nc < 200) && (|dSXc| < LY/3) && (|dSYc| > LY/2)
    && (SPIR == 1) && (1 < SXc < 30) && (ED > 10) && (flag_sta == 1))
    flag_act = 1;                                        (9)
where LY is the last span of the locked area in the y-axis direction. Generally, there is a large displacement in the y-axis direction after falling, while the displacement in the x-axis direction is relatively small. Thus, if someone falls down, the absolute value of dSXc should be less than LY/3, while the absolute value of dSYc should be larger than LY/2. In addition, the difference in the mean temperatures of the locked area before and after falling should be less than 2 °C. The number of locked points after falling should be smaller than that before falling, but RNc must be larger than 0.5, as depicted in Figure 13. Simultaneously, owing to the limited area of the body, Nc should be more than 20 and less than 200. The stable center cannot approach the boundary, so SXc should be more than 1 and less than 30. To suppress the influence of residual hot water or other existing heat sources, ED should be more than 10: if the new locked area already existed at the last stable moment, ED will be smaller than 10, and the change is not a real fall.
At the initial moment when flag_act becomes 1, the corresponding Tc and Nc are recorded as Tc0 and Nc0, respectively. Once flag_act and flag_sta both equal 1, a timer is launched to record the duration td, and the PIR sensor is applied to detect human activity. The time of body movement within the latest 1 min is measured as tbm. If condition (10) is satisfied, the fall action disappears, and flag_act is set to 0.
if ((|dXc| > 2) || (|dYc| > 2) || (td > 120) || (|Tc − Tc0| > 2) || (|Nc − Nc0| > Nc0/3))
    flag_act = 0;                                        (10)
That is, if the locked center is unstable, if td exceeds 120 s, if the change in Tc exceeds 2 °C, or if the locked area changes by more than 1/3, then flag_act is cleared. These constraints are conducive to avoiding interference from residual hot water on the ground or other factors.
Generally, if there are many body movements after the thermal image moves down, the monitored subject may be squatting down to take a bath or trying to get up after falling, so no alarm is required in these cases. Only if the locked area is basically stable after moving down, and body movements become fewer and fewer, can the event be identified as a fall. Because body movements are detected by the PIR sensor, abnormal interference resulting from light, hot water, and sunlight can be eliminated effectively.
In addition, the standard deviations of Nc and Tc within 1 s are calculated as stdNc and stdTc, respectively. Then stdXc, stdYc, td, tbm, stdNc, stdTc, flag_sta, and flag_act are applied to accomplish pattern recognition. To sum up, the risk of misjudgment is reduced by this series of constraints. Since some key parameters are extracted from the data of the past minute, the response time of the detection system is about one minute; this response time is hard to shorten without increasing the risk of misjudgment.
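For clarity, the stability condition (7), the fall-action condition (9), and the reset condition (10) can be transcribed into a few lines of code. The following Python sketch of the decision logic is hypothetical (feature names follow the text, and the thresholds are the ones stated above); it is not the authors' firmware.

def is_stable(stdXc, stdYc, dXc, dYc):
    # Stability of the locked area, condition (7)
    return int(stdXc < 1 and stdYc < 1 and abs(dXc) < 2 and abs(dYc) < 2)

def is_fall_action(dTc, RNc, Nc, dSXc, dSYc, SPIR, SXc, ED, LY, flag_sta):
    # Recognized fall action, condition (9)
    return int(abs(dTc) < 2 and 0.5 < RNc < 1 and 20 < Nc < 200 and
               abs(dSXc) < LY / 3 and abs(dSYc) > LY / 2 and
               SPIR == 1 and 1 < SXc < 30 and ED > 10 and flag_sta == 1)

def fall_action_cleared(dXc, dYc, td, Tc, Tc0, Nc, Nc0):
    # Reset condition (10): the recognized fall action disappears
    return (abs(dXc) > 2 or abs(dYc) > 2 or td > 120 or
            abs(Tc - Tc0) > 2 or abs(Nc - Nc0) > Nc0 / 3)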

2.4. Pattern Recognition

A BP (back-propagation) neural network has strong generalization and nonlinear mapping abilities, so it is widely used in the learning, prediction, and identification of nonlinear systems. In this paper, a three-layer BP neural network is adopted for fall recognition, as illustrated in Figure 14. EX is the input matrix of the input layer, EZ is the output matrix of the hidden layer, and EY is the output matrix of the output layer. In addition, tansig is the transfer function of the hidden layer, while relu is the transfer function of the output layer. Trainlm is selected as the network's training function, and the mean square error is utilized for performance evaluation. The weight matrices ω1 and ω2 and the threshold matrices b1 and b2 are adjusted with the steepest descent method until the training error (ERR) reaches the set target, as shown in (11).
$$\mathrm{ERR} = (E_Y - T_g) \times (E_Y - T_g) / 2 \qquad (11)$$
where Tg is the expected target. After training, ω1, ω2, b1, b2 are confirmed, and the network can be applied to accomplish the fall recognition.
As mentioned above, the input matrix is [stdXc, stdYc, td, tbm, stdNc, stdTc, flag_sta, flag_act], and the output matrix is [non-fall or fall]. Hence, the number of neurons in the input layer is 8, and the number of neurons in the output layer is 1. To improve the learning effect, the number of neurons in the hidden layer is set to 20. Hence, this BP neural network is simple enough to be realized in the MCU.
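To give a sense of the network's size (8 inputs, 20 hidden neurons, 1 output), a hypothetical NumPy sketch of the forward pass is shown below; the weights are random placeholders, and the actual training with trainlm (Levenberg–Marquardt) is not reproduced here.

import numpy as np

def tansig(x):
    # Hyperbolic-tangent sigmoid transfer function of the hidden layer
    return np.tanh(x)

def relu(x):
    # Transfer function of the output layer
    return np.maximum(0.0, x)

def bp_forward(EX, w1, b1, w2, b2):
    # Forward pass of the 3-layer BP network: 8 -> 20 -> 1
    EZ = tansig(w1 @ EX + b1)      # hidden-layer output
    EY = relu(w2 @ EZ + b2)        # network output (fall score)
    return EY

# Shapes implied by the text: w1 is 20x8, b1 is 20x1, w2 is 1x20, b2 is 1x1
rng = np.random.default_rng(1)
w1, b1 = rng.standard_normal((20, 8)), np.zeros((20, 1))
w2, b2 = rng.standard_normal((1, 20)), np.zeros((1, 1))
EX = rng.standard_normal((8, 1))   # [stdXc, stdYc, td, tbm, stdNc, stdTc, flag_sta, flag_act]
print(bp_forward(EX, w1, b1, w2, b2).shape)   # (1, 1)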

3. Experimental Results

3.1. Performance Indices

Based on the test platform and detection method described in Section 2, a series of experiments was performed. Here, recall (RE), precision (PR), detection accuracy (ACC), and F1-Score are four important performance indices used to evaluate the recognition performance, as defined in (12).
$$RE = \frac{TP}{TP + FN},\quad PR = \frac{TP}{TP + FP},\quad ACC = \frac{TP + TN}{TP + TN + FP + FN},\quad F1\text{-}Score = \frac{2 \times RE \times PR}{RE + PR} \qquad (12)$$
where TP is the number of fall events detected correctly, FN is the number of fall events detected incorrectly (missed falls), TN is the number of non-fall events detected correctly, and FP is the number of non-fall events detected incorrectly (false alarms).
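The indices in (12) follow directly from the four counts. A minimal Python sketch, using the Fold No. 1 counts from Table 2 as an example, is shown below.

def performance_indices(TP, FN, TN, FP):
    # Recall, precision, accuracy, and F1-Score as defined in (12)
    RE = TP / (TP + FN)
    PR = TP / (TP + FP)
    ACC = (TP + TN) / (TP + TN + FP + FN)
    F1 = 2 * RE * PR / (RE + PR)
    return RE, PR, ACC, F1

# Fold No. 1 of Table 2: TP = 59, FN = 5, TN = 61, FP = 3
print(performance_indices(59, 5, 61, 3))   # approx (0.922, 0.952, 0.938, 0.937)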

3.2. Test Scheme

The key factors for bathroom fall detection are listed in Table 1. Considering that the ambient temperature has the greatest impact on the IR detection performance, five common room temperatures (18 °C, 21 °C, 24 °C, 27 °C, and 30 °C) were selected for the experimental tests. The temperature can be controlled by a bathroom heater or a central air conditioner, and the tests were carried out in the bathroom depicted in Figure 5. In this work, a central air conditioner was adopted to control the ambient temperature. The controller and temperature sensor of the air conditioner were placed in the living room, so the controlled ambient temperature was actually that of the living room. If no hot water is running in the bathroom, the temperature there is basically the same as in the living room; if hot water is running, the bathroom temperature is slightly higher than the living-room ambient temperature. In this way, the actual ambient temperature can be simulated.
Furthermore, the other factors each have only two levels to reduce the number of tests. For the tested subjects, a young woman 1.6 m tall and a young man 1.8 m tall were recruited to simulate the bathroom falls of the elderly; they signed an informed consent form and a privacy protection agreement. To estimate the influence of illumination, LED light and sunlight are taken into account. To estimate the response speed, fast falls and slow falls are selected. Given that the state after falling is also important, the two cases of sitting and lying on the ground are considered. In addition, falling at the boundary or in the center are the two common cases. In general, a bathroom fall may happen while taking a shower or not. For each combination of factors, both fall and non-fall events are considered; thus, the total number of tests is 640 (2^6 = 64 factor combinations × 2 fall/non-fall events = 128 tests per temperature, × 5 temperatures). For each test, nine data values, composed of eight input parameters and one output, are obtained.

3.3. Test Results

Considering that the number of test cases is limited, five-fold cross validation is adopted. Based on the five ambient temperatures, the test data are separated into five sets (i.e., S1, S2, S3, S4, S5), each with 128 tests. Here, S1 is the data set corresponding to 18 °C, while S5 corresponds to 30 °C. For every fold, one data set is used for validation, and the other four are used for training. Taking Fold No. 5 as an example, S5 is used for validation, and {S1, S2, S3, S4} are used for training.
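A minimal sketch of this temperature-wise split, assuming each of the 640 tests is stored together with its ambient temperature, could look like the following hypothetical Python helper (not the authors' code).

TEMPS = (18, 21, 24, 27, 30)   # ambient temperatures defining S1..S5

def temperature_folds(samples):
    # samples: iterable of (features, label, ambient_temperature) tuples.
    # Yields one (train, validation) split per ambient temperature.
    sets = {t: [s for s in samples if s[2] == t] for t in TEMPS}
    for t in TEMPS:                     # Fold No. k validates on Sk
        validation = sets[t]
        train = [s for u in TEMPS if u != t for s in sets[u]]
        yield train, validation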
For every fold, the TP, FN, TN, and FP of the validation set are calculated and shown in Table 2; the PR, RE, ACC, and F1-Score can then be obtained, as shown in Figure 15. The averages of TP, FN, TN, and FP are 58.2, 5.8, 60.6, and 3.4, respectively. In addition, the averages of PR, RE, ACC, and F1-Score are 94.45%, 90.94%, 92.81%, and 92.66%, respectively. These results indicate that the misjudgment of non-falls is less frequent than that of falls.
In addition, the ACCs of Fold No. 2 and Fold No. 3 are both 95.31%, with corresponding ambient temperatures of 21 °C and 24 °C. The ambient temperatures of Fold No. 4 and Fold No. 5 are 27 °C and 30 °C, and their ACCs are 92.19% and 87.5%, respectively. Although the ambient temperature of Fold No. 1 is the lowest (i.e., 18 °C), its training data set {S2, S3, S4, S5} is worse than the training data sets of Fold No. 2 and Fold No. 3, which is why the ACC of Fold No. 1 (93.75%) is lower than those of Fold No. 2 and Fold No. 3. To sum up, for IR sensor applications, a high ambient temperature degrades the fall detection accuracy.

4. Discussion

It is obvious that FN is still somewhat large and that this performance is not yet sufficient. Considering that the ambient temperature has a great impact on ACC, more fusion sensors may need to be adopted to improve it. For instance, a voice alarm module could be added: a MEMS microphone could acquire the ambient noise and the human voice, and voice recognition could then be performed to judge whether there was a fall event or a voice alarm. This will be our future work.
A comparison of different fall detection methods is summarized in Table 3. It indicates that the accuracies of wearable and vision-based techniques are the best, ranging from 96% to 100%. However, wearable techniques are not easily accepted by the elderly, and vision-based techniques are high-cost and raise privacy concerns. The accuracies of ambient-based techniques are the worst, and they are expensive. The accuracies of low-resolution IR sensors and multi-sensor methods are similar to that of this work, but those methods do not take complex bathroom application scenes into consideration. Hence, this work is significant.

5. Conclusions

In this article, a non-contact fall detector based on a MEMS PIR sensor and a thermopile IR array sensor was designed to detect bathroom falls. In addition, image processing algorithms with a low-pass filter and double boundary scans were put forward in detail. Then, the statistical features of the area, center, duration, and temperature were extracted. Finally, a three-layer BP neural network was adopted to identify fall events. Taking into account the key factors of ambient temperature, subject, illumination, fall speed, fall state, fall area, and fall scene, 640 tests were performed in total, and five-fold cross validation was adopted. Experimental results demonstrate that the averages of the precision, recall, detection accuracy, and F1-Score were 94.45%, 90.94%, 92.81%, and 92.66%, respectively, which indicates that the novel detection method is feasible. In addition, for IR sensor applications, the fall detection accuracy decreases as the ambient temperature increases.
Hence, this IoT detector, which is low-cost and privacy-preserving, can be extensively used for household bathroom fall detection. Given that the detection accuracy is not yet high enough, more fusion sensors and more tests should be adopted, and this will be our future work.

Author Contributions

Conceptualization, H.W.; methodology, C.H.; software, G.Z.; validation, C.H. and G.Z.; formal analysis, J.L.; investigation, S.L.; resources, J.L.; data curation, Q.H.; writing—original draft preparation, C.H.; writing—review and editing, H.W.; visualization, S.L.; supervision, L.C.; project administration, C.H.; funding acquisition, C.H. and L.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partially supported by the National Natural Science Foundation of China (Grant Nos. U22A2012, 62104047, 62173098 and U2001201).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank all anonymous reviewers for their kind suggestions for improving this work. The authors thank everyone who provided assistance for this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. United Nations, Department of Economic and Social Affairs. World Population Ageing. 2015. Available online: http://www.un.org/en/development/desa/population/publications/pdf/ageing/WPA2015_Report.pdf (accessed on 1 June 2019).
  2. Naja, S.; Makhlouf, M.M.E.D.; Chehab, M.A.H. An ageing world of the 21st century: A literature review. Int. J. Community Med. Public Health 2017, 4, 4363. [Google Scholar] [CrossRef] [Green Version]
  3. World Health Organization. Number of People Over 60 Years Set to Double by 2050; Major Societal Changes Required. 2015. Available online: https://www.who.int/news/item/30-09-2015-who-number-of-people-over-60-years-set-to-double-by-2050-major-societal-changes-required (accessed on 1 June 2019).
  4. Singh, A.; Rehman, S.U.; Yongchareon, S.; Chong, P.H.J. Sensor Technologies for Fall Detection Systems: A Review. IEEE Sens. J. 2020, 20, 6889–6919. [Google Scholar] [CrossRef]
  5. Chaccour, K.; Darazi, R.; El Hassani, A.H.; Andres, E. From Fall Detection to Fall Prevention: A Generic Classification of Fall-Related Systems. IEEE Sens. J. 2016, 17, 812–822. [Google Scholar] [CrossRef]
  6. Yacchirema, D.; De Puga, J.S.; Palau, C.; Esteve, M. Fall detection system for elderly people using IoT and big data. Procedia Comput. Sci. 2018, 130, 603–610. [Google Scholar] [CrossRef]
  7. Al Nahian, J.; Ghosh, T.; Al Banna, H.; Aseeri, M.A.; Uddin, M.N.; Ahmed, M.R.; Mahmud, M.; Kaiser, M.S. Towards an Accelerometer-Based Elderly Fall Detection System Using Cross-Disciplinary Time Series Features. IEEE Access 2021, 9, 39413–39431. [Google Scholar] [CrossRef]
  8. Boutellaa, E.; Kerdjidj, O.; Ghanem, K. Covariance matrix based fall detection from multiple wearable sensors. J. Biomed. Inform. 2019, 94, 103189. [Google Scholar] [CrossRef]
  9. Hashim, H.A.; Mohammed, S.L.; Gharghan, S.K. Accurate fall detection for patients with Parkinson’s disease based on a data event algorithm and wireless sensor nodes. Measurement 2020, 156, 107573. [Google Scholar] [CrossRef]
  10. Kostopoulos, P.; Kyritsis, A.I.; Deriaz, M.; Konstantas, D. F2D: A Location Aware Fall Detection System Tested with Real Data from Daily Life of Elderly People. Procedia Comput. Sci. 2016, 98, 212–219. [Google Scholar] [CrossRef] [Green Version]
  11. Xi, X.; Jiang, W.; Lü, Z.; Miran, S.M.; Luo, Z.-Z. Daily Activity Monitoring and Fall Detection Based on Surface Electromyography and Plantar Pressure. Complexity 2020, 2020, 9532067. [Google Scholar] [CrossRef]
  12. Cotechini, V.; Belli, A.; Palma, L.; Morettini, M.; Burattini, L.; Pierleoni, P. A dataset for the development and optimization of fall detection algorithms based on wearable sensors. Data Brief 2019, 23, 103839. [Google Scholar] [CrossRef]
  13. Astriani, M.S.; Bahana, R.; Kurniawan, A.; Yi, L.H. Promoting Data Availability Framework by Using Gamification on Smartphone Fall Detection Based Human Activities. Procedia Comput. Sci. 2021, 179, 913–919. [Google Scholar] [CrossRef]
  14. Cippitelli, E.; Fioranelli, F.; Gambi, E.; Spinsante, S. Radar and RGB-Depth Sensors for Fall Detection: A Review. IEEE Sens. J. 2017, 17, 3585–3604. [Google Scholar] [CrossRef] [Green Version]
  15. Shu, F.; Shu, J. An eight-camera fall detection system using human fall pattern recognition via machine learning by a low-cost android box. Sci. Rep. 2021, 11, 2417. [Google Scholar] [CrossRef] [PubMed]
  16. de Miguel, K.; Brunete, A.; Hernando, M.; Gambao, E. Home Camera-Based Fall Detection System for the Elderly. Sensors 2017, 17, 2864. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Kong, X.; Meng, Z.; Nojiri, N.; Iwahori, Y.; Meng, L.; Tomiyama, H. A HOG-SVM Based Fall Detection IoT System for Elderly Persons Using Deep Sensor. Procedia Comput. Sci. 2019, 147, 276–282. [Google Scholar] [CrossRef]
  18. Rafferty, J.; Synnott, J.; Nugent, C.; Morrison, G.; Tamburini, E. Fall detection through thermal vision sensing. In Ubiquitous Computing and Ambient Intelligence; Springer: Cham, Switzerland, 2016; pp. 84–90. [Google Scholar]
  19. Chaccour, K.; Darazi, R.; el Hassans, A.H.; Andres, E. Smart carpet using differential piezoresistive pressure sensors for elderly fall detection. In Proceedings of the 2015 IEEE 11th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), Abu-Dhabi, United Arab Emirates, 19–21 October 2015; pp. 225–229. [Google Scholar]
  20. Nakamura, T.; Bouazizi, M.; Yamamoto, K.; Ohtsuki, T. Wi-fi-CSI-based fall detection by spectrogram analysis with CNN. In Proceedings of the GLOBECOM 2020–2020 IEEE Global Communications Conference, Taipei, Taiwan, 7–11 December 2020; pp. 1–6. [Google Scholar]
  21. Jokanovic, B.; Amin, M. Fall Detection Using Deep Learning in Range-Doppler Radars. IEEE Trans. Aerosp. Electron. Syst. 2017, 54, 180–189. [Google Scholar] [CrossRef]
  22. Tateno, S.; Meng, F.; Qian, R.; Hachiya, Y. Privacy-Preserved Fall Detection Method with Three-Dimensional Convolutional Neural Network Using Low-Resolution Infrared Array Sensor. Sensors 2020, 20, 5957. [Google Scholar] [CrossRef]
  23. Liu, Z.; Yang, M.; Yuan, Y.; Kan, K.Y. Fall Detection and Personnel Tracking System Using Infrared Array Sensors. IEEE Sens. J. 2020, 20, 9558–9566. [Google Scholar] [CrossRef]
  24. Fan, X.; Zhang, H.; Leung, C.; Shen, Z. Robust unobtrusive fall detection using infrared array sensors. In Proceedings of the 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Daegu, Republic of Korea, 16–18 November 2017; pp. 194–199. [Google Scholar]
  25. Adolf, J.; Macas, M.; Lhotska, L.; Dolezal, J. Deep neural network based body posture recognitions and fall detection from low resolution infrared array sensor. In Proceedings of the 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Madrid, Spain, 3–6 December 2018; pp. 2394–2399. [Google Scholar] [CrossRef]
  26. Hayashida, A.; Moshnyaga, V.; Hashimoto, K. The use of thermal ir array sensor for indoor fall detection. In Proceedings of the 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Banff, AB, Canada, 5–8 October 2017; pp. 594–599. [Google Scholar]
  27. Liang, Q.; Yu, L.; Zhai, X.; Wan, Z.; Nie, H. Activity recognition based on thermopile imaging array sensor. In Proceedings of the 2018 IEEE International Conference on Electro/Information Technology (EIT), Rochester, MI, USA, 3–5 May 2018; pp. 770–773. [Google Scholar]
  28. Guan, Q.; Li, C.; Guo, X.; Shen, B. Infrared signal based elderly fall detection for in-home monitoring. In Proceedings of the 2017 9th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 26–27 August 2017; Volume 1, pp. 373–376. [Google Scholar]
  29. Jansi, R.; Amutha, R. Detection of fall for the elderly in an indoor environment using a tri-axial accelerometer and Kinect depth data. Multidimens. Syst. Signal Process. 2020, 31, 1207–1225. [Google Scholar] [CrossRef]
  30. Nadeem, A.; Mehmood, A.; Rizwan, K. A dataset build using wearable inertial measurement and ECG sensors for activity recognition, fall detection and basic heart anomaly detection system. Data Brief 2019, 27, 104717. [Google Scholar] [CrossRef]
  31. Chen, Z.; Wang, Y. Infrared–ultrasonic sensor fusion for support vector machine–based fall detection. J. Intell. Mater. Syst. Struct. 2018, 29, 2027–2039. [Google Scholar] [CrossRef] [Green Version]
  32. Casilari, E.; Santoyo-Ramón, J.A.; Cano-García, J.M. UMAFall: A Multisensor Dataset for the Research on Automatic Fall Detection. Procedia Comput. Sci. 2017, 110, 32–39. [Google Scholar] [CrossRef]
  33. Karayaneva, Y.; Sharifzadeh, S.; Jing, Y.; Chetty, K.; Tan, B. Sparse Feature Extraction for Activity Detection Using Low-Resolution IR Streams. In Proceedings of the 2019 18th IEEE International Conference on Machine Learning and Applications (ICMLA), Boca Raton, FL, USA, 16–19 December 2019; pp. 1837–1843. [Google Scholar]
  34. Mazurek, P.; Wagner, J.; Morawski, R.Z. Use of kinematic and mel-cepstrum-related features for fall detection based on data from infrared depth sensors. Biomed. Signal Process. Control 2018, 40, 102–110. [Google Scholar] [CrossRef]
  35. Singh, K.; Rajput, A.; Sharma, S. Human Fall Detection Using Machine Learning Methods: A Survey. Int. J. Math. Eng. Manag. Sci. 2020, 5, 161–180. [Google Scholar] [CrossRef]
  36. Iuga, C.; Drăgan, P.; Bușoniu, L. Fall monitoring and detection for at-risk persons using a UAV. IFAC PapersOnLine 2018, 51, 199–204. [Google Scholar] [CrossRef]
  37. Jacob, J.; Nguyen, T.; Lie, D.Y.C.; Zupancic, S.; Bishara, J.; Dentino, A.; Banister, R.E. A fall detection study on the sensors placement location and a rule-based multi-thresholds algorithm using both accelerometer and gyroscopes. In Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), Taipei, Taiwan, 27–30 June 2011; pp. 666–671. [Google Scholar] [CrossRef]
  38. Farsi, M. Application of ensemble RNN deep neural network to the fall detection through IoT environment. Alex. Eng. J. 2021, 60, 199–211. [Google Scholar] [CrossRef]
  39. El Kaid, A.; Baïna, K.; Baïna, J. Reduce False Positive Alerts for Elderly Person Fall Video-Detection Algorithm by convolutional neural network model. Procedia Comput. Sci. 2019, 148, 2–11. [Google Scholar] [CrossRef]
  40. Mrozek, D.; Koczur, A.; Małysiak-Mrozek, B. Fall detection in older adults with mobile IoT devices and machine learning in the cloud and on the edge. Inf. Sci. 2020, 537, 132–147. [Google Scholar] [CrossRef]
  41. Divya, V.; Sri, R.L. Docker-Based Intelligent Fall Detection Using Edge-Fog Cloud Infrastructure. IEEE Internet Things J. 2020, 8, 8133–8144. [Google Scholar] [CrossRef]
  42. Chen, Y.; Kong, X.; Meng, L.; Tomiyama, H. An Edge Computing Based Fall Detection System for Elderly Persons. Procedia Comput. Sci. 2020, 174, 9–14. [Google Scholar] [CrossRef]
  43. Nakamura, T.; Bouazizi, M.; Yamamoto, K.; Ohtsuki, T. Wi-Fi-Based Fall Detection Using Spectrogram Image of Channel State Information. IEEE Internet Things J. 2022, 9, 17220–17234. [Google Scholar] [CrossRef]
  44. Dumitrache, M.; Pasca, S. Fall detection system for elderly with GSM communication and GPS localization. In Proceedings of the 2013 8th International Symposium on Advanced Topics in Electrical Engineering (ATEE), Bucharest, Romania, 23–25 May 2013; pp. 1–6. [Google Scholar]
  45. Chauhan, H.; Rizwan, R.; Fatima, M. IoT Based Fall Detection of a Smart Helmet. In Proceedings of the 2022 7th International Conference on Communication and Electronics Systems (ICCES), Coimbatore, India, 22–24 June 2022; pp. 407–412. [Google Scholar]
Figure 1. System architecture of an intelligent fall detector.
Figure 2. Fall detector based on a MEMS IR sensor and a PIR sensor.
Figure 3. Key parameters of the Fresnel lens and PIR sensor.
Figure 4. Fall detector is mounted on the sidewall at a height of 1.8 m.
Figure 5. The actual effective monitored area in the bathroom is about 1.2 m by 1.5 m.
Figure 6. Side view of somebody standing up or falling down.
Figure 7. Bode diagram of the first-order low-pass filter.
Figure 8. Noise is suppressed by the first-order low-pass filter.
Figure 9. Thermal image of the thermopile MEMS IR array sensor.
Figure 10. Every point is labeled with a block number after the first boundary scan.
Figure 11. Adjacent blocks are merged after the second boundary scan.
Figure 12. Only the block with the largest area is reserved.
Figure 13. The locked body area moves down, and the number of locked points decreases after falling down.
Figure 14. Architecture of a simple BP neural network system.
Figure 15. Different performance indices of 5-fold cross validation.
Table 1. Key factors for bathroom fall tests.

Factor | Level
Ambient temperature | 18 °C, 21 °C, 24 °C, 27 °C, 30 °C
Subject | female (1.6 m), male (1.8 m)
Illumination | LED light, sunlight
Fall speed | fast, slow
Fall state | sitting, lying
Fall area | at the boundary, in the center
Fall scene | shower, without shower
Table 2. Test results of bathroom fall experiments.

Fold No. | TP | FN | TN | FP
1 | 59 | 5 | 61 | 3
2 | 60 | 4 | 62 | 2
3 | 60 | 4 | 62 | 2
4 | 58 | 6 | 60 | 4
5 | 54 | 10 | 58 | 6
Average | 58.2 | 5.8 | 60.6 | 3.4
Table 3. Comparison of different fall detection methods.

Detection Method | Sensor | Accuracy | Comment | References
Wearable techniques | inertial sensors, IMU | 96~100% | The elderly are not willing to wear the product and are apt to forget to charge it. | [6,7,8,9,10,11,12,13]
Vision-based techniques | video cameras, depth cameras, or thermal cameras | 96~100% | high-cost and privacy violation | [14,15,16,17,18]
Ambient-based techniques | pressure sensors, WiFi, or radar sensors | 85~90% | expensive, and the accuracy is not high | [19,20,21]
IR sensors | low-resolution IR sensors | 85~97% | Complex bathroom application scenes are not considered. | [22,23,24,25,26,27]
Multi-sensors | gyroscope, accelerometer, ECG, ultrasonic sensor, depth sensor, etc. | 90~97% | Complex bathroom application scenes are not considered. | [29,30,31,32]
This work | PIR + low-resolution IR sensor | 87.5~95.31% | suitable for bathroom application | /
