Article

Advancing Remote Life Sensing for Search and Rescue: A Novel Framework for Precise Vital Signs Detection via Airborne UWB Radar

Department of Military Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
*
Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2025, 25(17), 5232; https://doi.org/10.3390/s25175232
Submission received: 17 June 2025 / Revised: 25 July 2025 / Accepted: 20 August 2025 / Published: 22 August 2025

Highlights

What are the main findings?
  • An airborne bio-radar system was developed to remotely sense the vital signs of survivors for post-disaster search and rescue.
  • The impact of interference from UAV platform motion and from background-environment echoes on radar detection performance was analyzed theoretically.
What is the implication of the main finding?
  • A signal processing framework based on blind source separation was proposed to precisely extract respiration and heartbeat, combining a higher-order statistical analysis tool with a feedback notch filter.
  • The remote high-resolution vital signs detection approach is suitable for real-world applications such as search and rescue.

Abstract

Non-contact vital signs detection of survivors based on bio-radar, to identify their life states, is significant for field search and rescue. However, when transportation is interrupted, rescue workers and equipment cannot reach the disaster area promptly. In this paper, we report a hovering airborne radar for non-contact vital signs detection that overcomes this challenge. The airborne radar system supports a wireless data link, enabling remote control and communication over distances of up to 3 km. In addition, a novel framework based on blind source separation is proposed for vital signal extraction. First, range migration caused by the platform motion is compensated by envelope alignment. Then, the respiratory waveform of the human target is extracted by the joint approximate diagonalization of eigenmatrices algorithm. Finally, the heartbeat signal is recovered by suppressing respiratory harmonics with a feedback notch filter. Field experiment results demonstrate that the proposed method precisely extracts vital signals with outstanding robustness and adaptability in cluttered environments. This work provides a technical basis for remote high-resolution vital signs detection to meet the increasing demands of real rescue applications.

1. Introduction

Wilderness search and rescue (SAR) is a key focus and challenge of current emergency rescue medicine [1]. In harsh post-disaster environments, roads are damaged, traffic is interrupted, the disaster area is vast, and the distribution of survivors is unknown, all of which challenge the deployment of SAR operations [2]. In recent years, unmanned aerial vehicles (UAVs) have gradually been used to assist in rescuing trapped hikers and missing persons, and several incidents have demonstrated the capability and superiority of UAVs in large-scale disaster risk reduction owing to their flexibility, mobility, and freedom from crew casualties [3,4,5].
At present, the sensors carried by a UAV-based SAR system mainly include optical sensors (e.g., high-resolution cameras [6,7,8], thermal imaging cameras [9,10,11], and multi-spectral cameras [12,13,14]) and radio frequency sensors (e.g., bio-radar [15,16]). These sensors differ in principle, advantages, and applicable scenarios. Optical sensors identify targets based on the morphological features, optical features, and infrared radiation energy of the human body; however, they are susceptible to interference from environmental conditions such as temperature, smoke, fog, and obstruction. Bio-radar senses the presence and physiological activities of humans by transmitting electromagnetic waves and analyzing the received echoes, which is the most effective way to detect vital signs under such conditions.
Based on this, our previous study proposed an unmanned SAR scheme based on multi-source sensors [17], which involves two stages. First, three optical camera-based UAV systems are released to detect human targets by scanning the disaster-affected area and to acquire their positions through the global positioning system (GPS). Second, the radar-based UAV system proceeds to the corresponding positions and hovers to sense the physiological information of the human targets. In addition, the radar-based UAV system can also find unconscious victims under dense smoke or in ruins, as shown in Figure 1, providing a scientific basis for the formulation of SAR strategies and the allocation of rescue resources.
However, two challenges make vital signs detection with an airborne radar system difficult. First, the UAV platform inevitably generates relative movement, and human physiological signals are comparatively weak, so they are easily overwhelmed by the platform motion [18]. Second, background clutter from the environment where the human target is located further interferes with the radar echoes. Consequently, to detect vital signs with this technology, the wanted signals must be precisely separated from these multiple interferences.
In recent years, several methods have been proposed to address the platform motion compensation problem. In [18], Cardillo et al. first extracted the radar self-motion from the signals reflected by stationary objects through identifying the clutter range without any additional sensor. In [19], Rong et al. proposed a method based on phase residual, which compensates the platform motion by calculating the residual phase between the human subject and the static background, and then decomposing the residual for vital signals extraction. In [20], Zhang et al. proposed an algorithm to extract respiratory information of the victims by segmenting the UAV motion into multiple time intervals and computing the average energy ratio of the UAV motion signal and human echo signal in the selected frequency band to estimate the compensation coefficient. In [21], a rotating radar UAV system was designed to locate the trapped individuals in collapsed buildings, and a new unwrapping algorithm was proposed to estimate and compensate for the UAV motion based on the reflected echoes. However, the methods mentioned above are mostly for indoor location and based on signals reflected by stationary objects, and the performance may degrade when there is no stationary object. Furthermore, the impact of the background clutter where the human target is located on the detection performance of the airborne radar system has not been thoroughly investigated.
In this paper, a portable airborne radar vital signs sensing system is developed for remote data collection, which adopts impulse-radio ultra-wideband (IR-UWB) radar and Mesh network technology. To address the aforementioned problems, the effects of the platform motion and the background noise on radar detection capacity are theoretically analyzed. Then, a signal processing method based on blind source separation is proposed to separate the vital signals from multiple interferences. First, range migration caused by the platform motion is coarsely compensated by the adjacent envelope alignment method based on cross-correlation. Second, we analyze the statistical characteristics of the measured dynamic background clutter from the grassland scenario and find that it approximately obeys a Gaussian distribution or a Gaussian mixture distribution; in other words, it can be regarded as Gaussian noise. Then, the respiratory waveform of the human target is extracted by the joint approximate diagonalization of eigenmatrices (JADE) algorithm, which uses higher-order statistical analysis to remove Gaussian components. Finally, the heartbeat signal is recovered by suppressing respiratory harmonics with a feedback notch filter. The performance of this method has been evaluated in field experiments in different scenarios, and the results confirm that the proposed method improves the performance of the airborne radar system without additional sensors.
The remainder of this paper is organized as follows. The airborne radar system composition is introduced in Section 2. The signal model is described in Section 3. The proposed method is deliberated in Section 4. The experimental setup and results are presented in Section 5. Finally, the conclusions and future works are given in Section 6.

2. Airborne Bio-Radar System Design

This section introduces the hardware and software design of the portable airborne radar system. Figure 2 presents the hardware block diagram of the system, which consists of an onboard end and a ground receiving end. The hardware components of the onboard end include the sensor module, the data processing center, and the data transmission module. The ground receiving end is used to visualize the sensor data and the casualty information on the map.
The quadcopter UAV is equipped with three types of sensors: X4M200 IR-UWB radar, high-definition camera, and GPS. The HZHY-AI313 UAV onboard computer drives the three sensors for data acquisition, and it can process data simultaneously. Then, the collected data are transmitted to the ground receiving end through a Mesh network constructed by image transmission radio stations [22].
The software control program of the system is developed based on the topic and service architecture of ROS2 (Robot Operating System). First, the ground control station operates the UAV to take off and runs the MAVROS program to obtain real-time position and attitude information of the UAV. After the UAV arrives at the search area, it hovers above the designated location, and then the sensor-driven node is operated to activate the UWB radar and visible light camera to recognize and perceive the injured target. Finally, the data transmission node transmits the position and attitude information of the UAV, the physiological information of the target, and the image information to the ground control station for further processing.
Pictures of the airborne radar system are shown in Figure 3. It weighs about 3 kg, and its flight endurance is about 25 min. Its communication distance is up to 3 km in an unobstructed environment without any data packet loss.

3. Sensing Model

3.1. Principle of UWB Radar for Vital Signs Detection

Figure 4 shows the block diagram and detection principle of a typical UWB radar, which measures the physiological activities of a human target by demodulating the received signal phase [23]. The response of radar can be expressed as:
$$h(t,\tau) = a_v\,\delta\big(\tau - \tau_v(t)\big) + \sum_i a_i\,\delta(\tau - \tau_i)$$
where $a_v\delta(\tau - \tau_v(t))$ is the response of the human target and $\sum_i a_i\delta(\tau - \tau_i)$ is the response of other static clutter.
Assuming that there is a stationary human target at a distance $d_0$, when the radar is mounted on a mobile platform to detect the human target, the instantaneous distance from the antenna to the surface of the human chest cavity can be expressed as [16]:
$$d(t) = d_0 + A_r \sin(2\pi f_r t) + A_h \sin(2\pi f_h t) + d_{UAV}(t)$$
where $A_r$ and $f_r$ are the amplitude and frequency of the respiratory signal, $A_h$ and $f_h$ are the amplitude and frequency of the heartbeat signal, and $d_{UAV}(t)$ is the platform motion. In this scenario, the delay of a static object becomes dynamic; the radar response can be further expressed as [24]:
$$h(t,\tau) = a_v\,\delta\big(\tau - \tau_v(t)\big) + a_s\,\delta\big(\tau - \tau_s(t)\big)$$
The time delay $\tau_v(t)$ of the human target is:
$$\tau_v(t) = \frac{2\,d(t)}{c}$$
The time delay $\tau_s(t)$ of a static object is:
$$\tau_s(t) = \frac{2\left[d_0 + d_{UAV}(t)\right]}{c}$$
where $c$ is the propagation speed of the transmitted signal. The received signal can be expressed as:
$$R(t,\tau) = s(\tau) * h(t,\tau) = a_v\, s\big(\tau - \tau_v(t)\big) + a_s\, s\big(\tau - \tau_s(t)\big)$$
where $s(\tau)$ is the radar transmitted pulse.
The radar echo signal of the human target can be expressed as:
$$S(t) = A e^{j\left(2\pi f_c t + \theta(t) + \varphi_0\right)}$$
where $f_c$ = 7.29 GHz is the carrier frequency and $\varphi_0$ is the initial phase. The phase signal can be calculated as [25]:
$$\theta(t) = \frac{4\pi}{\lambda}\big(X(t) + d_{UAV}(t)\big) = \frac{4\pi}{\lambda}X(t) + \frac{4\pi}{\lambda}d_{UAV}(t)$$
where $\lambda$ is the wavelength of the transmitted signal and $X(t) = A_r \sin(2\pi f_r t) + A_h \sin(2\pi f_h t)$.
The above analysis indicates that, for a moving radar system, the phase of the human echo is composed of the radar platform phase shift and the displacement due to physiological activity, whereas the phase of a stationary target's echo includes only the platform phase shift. The aim of the proposed method is to recover the vital signals from the mixed observed signals.
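To make the composition of the phase concrete, the model above can be simulated numerically. All numeric values below (sampling rate, displacement amplitudes, motion frequencies) are illustrative assumptions for the sketch, not measured system parameters:

```python
import numpy as np

fs = 100.0                      # slow-time sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)    # 30 s observation window

fc = 7.29e9                     # carrier frequency of the radar (from the text)
lam = 3e8 / fc                  # wavelength of the transmitted signal

# Chest displacement X(t): respiration plus heartbeat (typical magnitudes, assumed)
Ar, fr = 5e-3, 0.3              # respiration: ~5 mm at 0.3 Hz
Ah, fh = 0.5e-3, 1.25           # heartbeat: ~0.5 mm at 1.25 Hz
X = Ar * np.sin(2 * np.pi * fr * t) + Ah * np.sin(2 * np.pi * fh * t)

# Platform motion d_UAV(t): slow hover drift, much larger than X(t) (assumed)
d_uav = 20e-3 * np.sin(2 * np.pi * 0.1 * t)

# Total phase of the human echo: platform phase shift + physiological displacement
theta = (4 * np.pi / lam) * (X + d_uav)
```

Even a modest 20 mm hover drift dominates the millimeter-scale physiological displacement, which is why platform motion compensation must precede vital sign extraction.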

3.2. Background Clutter

In the airborne radar detection problem, the interference mainly comes from two sources: (1) the motion of the UAV platform; (2) echoes from the background environment. When the human target lies on exposed ground, the background clutter mainly comes from other detected objects (stationary objects and the ground); we refer to this as the static background environment. When the target lies on vegetation-covered ground, propeller downwash and wind move the plant blades; we refer to this as the dynamic background environment.

3.2.1. Static Background Environment

Figure 5a is a sketch of the static background environment scenario. Previous studies have observed that echo signals from static objects are modulated only by the platform motion, which means that a separate measurement of the radar platform motion can be inferred from the static clutter in the range profile [20].
As in Equation (8), the phase signal of a stationary object can be expressed as:
$$\theta_1(t) = \frac{4\pi}{\lambda} d_{UAV}(t)$$
Assuming that there are several stationary objects in the environment, the raw echo data can be represented as:
$$R(t,\tau) = a_v\, s\big(\tau - \tau_v(t)\big) + \sum_i a_{si}\, s\big(\tau - \tau_{si}(t)\big) + n(t,\tau)$$
where $n(t,\tau)$ is the noise signal. The above equation can be represented in the following matrix form:
$$R = AS,\quad R = \begin{bmatrix} r_1(t,\tau) \\ r_2(t,\tau) \\ \vdots \\ r_a(t,\tau) \end{bmatrix},\quad A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ \vdots & \vdots \\ a_{a1} & a_{a2} \end{bmatrix},\quad S = \begin{bmatrix} v(t,\tau) \\ d_{UAV}(t,\tau) \end{bmatrix}$$
where $R \in \mathbb{R}^{a \times n}$ is the multi-range-channel echo signal matrix, $A \in \mathbb{R}^{a \times 2}$ is the weight matrix, $S \in \mathbb{R}^{2 \times n}$ is the source signal matrix, and $v(t,\tau)$ is the vital signal.
Based on the above echo model, the problem of detecting human vital signals with the airborne radar system can be described as a source signals estimation problem, which estimates unknown source signals (vital signals and platform motion signals) from the mixed signals.

3.2.2. Dynamic Background Environment

In this study, we take grassland as a typical dynamic background scenario, as shown in Figure 5b. The grassland scenario has a layered structure; when electromagnetic waves irradiate this rough surface, the scattered waves are no longer plane waves [26]. Assuming that the scattered echo of a target point in the grass is $e$, the scattered echo of one range bin can be expressed as:
$$E_i(t,\tau) = \sum_{j=1}^{n_g} e_j(t,\tau)$$
where $n_g$ is the number of scatter points.
In addition, the propeller rotation will cause the grass blades to move; the grass movement signal of the $i$-th range bin is then given as:
$$N_i(t,\tau) = \sum_{j=1}^{n_{gm}} n_j(t,\tau)$$
where $n_j(t,\tau)$ is the motion signal of one blade and $n_{gm}$ is the number of blades.
When the human target is also in this range bin, the phase shift of the radar echo of this range bin can be represented as:
$$\theta(t,\tau) = \frac{4\pi}{\lambda}\left[ r(t,\tau) + d_{UAV}(t,\tau) + E_i(t,\tau) + N_i(t,\tau) \right]$$
In this scenario, to accurately extract target information, it is necessary to address two issues: platform motion interference and background clutter interference.

3.2.3. Statistical Characteristic Analysis of Measured Grass-Surface Clutter

It has been observed that surface conditions in the form of vegetation and roughness significantly degrade radar detection performance. Hence, a comprehensive understanding of the clutter's statistical characteristics is essential for the design of signal processing techniques.
In this section, we analyze the measured grass clutter through experimental data processing, which has been the main way to analyze the statistical characteristics of surface clutter [27]. The time-domain waveform and spectrum of the grassland clutter signal measured by the airborne radar are shown in Figure 6; the clutter occupies the same frequency range as respiration and heartbeat. The frequency distribution histogram of the clutter is shown in Figure 7a. Its distribution exhibits a unimodal characteristic, suggesting a fit to a normal distribution curve.
First, the Anderson-Darling test is used to assess the normality of the overall distribution of the clutter data; the obtained p-value equals 0.1655, indicating no significant difference between the distribution of the clutter data and the normal distribution. Then, the best fit of the clutter signal amplitude probability density function (PDF) is sought, and the regression equation is calculated by the nonlinear least-squares method. The coefficient of determination of the regression model equals 0.9873, confirming the accuracy of the fitting result.
The results indicate that the clutter signal may obey a Gaussian distribution or a Gaussian mixture distribution, which cannot be removed by preprocessing. Accordingly, higher-order cumulants can theoretically suppress the influence of Gaussian noise completely, because the higher-order cumulants of a Gaussian process are always zero.
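The normality check and PDF fit can be reproduced on any clutter recording. The sketch below substitutes synthetic Gaussian samples for the measured clutter; the data, bin count, and significance level are all assumptions:

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
# Stand-in for measured grass-clutter amplitude samples (synthetic here)
clutter = rng.normal(loc=0.0, scale=1.0, size=5000)

# Anderson-Darling normality test: a statistic below the 5% critical value
# means normality is not rejected at that significance level.
ad = stats.anderson(clutter, dist='norm')
gaussian_plausible = ad.statistic < ad.critical_values[2]   # index 2 -> 5% level

# Nonlinear least-squares fit of a Gaussian PDF to the amplitude histogram
hist, edges = np.histogram(clutter, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

def gauss_pdf(x, mu, sig):
    return np.exp(-(x - mu) ** 2 / (2 * sig ** 2)) / (sig * np.sqrt(2 * np.pi))

(mu, sig), _ = curve_fit(gauss_pdf, centers, hist, p0=[0.0, 1.0])

# Coefficient of determination of the regression model
resid = hist - gauss_pdf(centers, mu, sig)
r2 = 1.0 - np.sum(resid ** 2) / np.sum((hist - hist.mean()) ** 2)
```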

4. Proposed Vital Signals Extraction Method

Human vital signals include respiration and heartbeat, which implicate the life state of the human body. The overall process of the proposed vital signals extraction method is shown in Figure 8, which mainly includes pre-processing, respiratory signal extraction, and heartbeat signal extraction.

4.1. Pre-Processing

4.1.1. Range Migration Compensation

The drone hovers above the human target to be detected, and the radar continuously transmits electromagnetic waves to extract vital signs from the radar echo sequences. During this process, the drone generates relative movements that vary the distance between the antenna array and the target. The position of the target point in the echo therefore changes, resulting in range migration between radar echo sequences. The aim of range migration compensation is to correct this offset, which includes envelope alignment and phase compensation. Figure 9 shows the schematic diagram of range migration compensation.
Envelope alignment is the process of translating the echo data to eliminate distance misalignment, also known as coarse compensation. For two adjacent echo sequences, the distance between the target and the radar changes very little, and their envelopes are strongly correlated. Based on this, the adjacent envelope correlation method calculates the corresponding delay from the peak of their cross-correlation function, which realizes alignment on range bins [28]. Assuming that $r_1(t)$ and $r_2(t)$ are strongly correlated signal sequences of adjacent echoes, their cross-correlation function is:
$$R_{12}(\tau) = \int r_1(t)\, r_2(t - \tau)\, dt$$
The displacement can be estimated from the position of the maximum of the cross-correlation function:
$$\tau_{12} = \arg\max_{\tau} R_{12}(\tau)$$
Then, shifting the corresponding signal by this value completes the range alignment of the two signals. This operation is repeated until all echo signals are aligned in range.
After completing envelope alignment, the next step is phase compensation, also known as fine compensation. Taking the weighted average of the first $N_a$ envelope-aligned echoes as the benchmark, the average difference between the phase of each echo signal and the benchmark is that echo's phase compensation value. Figure 10 illustrates an example after range migration compensation.
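The adjacent-envelope correlation step above can be sketched as follows; the Gaussian pulse and the three-bin shift used in the test are purely illustrative:

```python
import numpy as np

def align_envelopes(frames):
    """Coarse range-migration compensation: align each fast-time profile to the
    previously aligned one via the peak of their cross-correlation."""
    aligned = [np.asarray(frames[0], dtype=float)]
    for cur in frames[1:]:
        ref = aligned[-1]
        # Cross-correlation R12; its peak position gives the range-bin shift.
        xcorr = np.correlate(cur, ref, mode='full')
        shift = int(np.argmax(xcorr)) - (len(ref) - 1)
        aligned.append(np.roll(cur, -shift))   # undo the estimated shift
    return np.asarray(aligned)
```

Applying the function to a profile and a copy of it shifted by a few range bins returns both profiles aligned to the first.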

4.1.2. Background Clutter Removal

The raw radar data contain direct-current (DC) components caused by static objects and baseline drift caused by environmental factors. These two types of noise are known as background clutter and strongly interfere with the wanted signal. The adopted 100th-order DC drift removal method subtracts the mean of the most recent 100 slow-time samples from each sample:
$$R_{DC}(m, n) = R_{MC}(m, n) - \frac{1}{100}\sum_{k=n-99}^{n} R_{MC}(m, k)$$
where $R_{DC}(m,n)$ is the radar data after processing and $R_{MC}(m,n)$ is the data after range migration compensation. The 2D pseudo-color images of the raw UWB radar data and of the data after preprocessing are displayed in Figure 11. The measurement was conducted indoors with the radar fixed 2 m above a human target lying on the ground. The respiratory and heartbeat signals can be obtained by extracting the echo signals of the corresponding range bin and performing band-pass filtering; the waveforms are displayed in Figure 12.
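One plausible reading of this step (an assumption on our part, since the text states only the order) is a sliding-mean subtraction along slow time:

```python
import numpy as np

def remove_background(R_mc, order=100):
    """Subtract a sliding mean over `order` slow-time frames from every range
    bin, suppressing DC components of static objects and slow baseline drift."""
    kernel = np.ones(order) / order
    baseline = np.apply_along_axis(
        lambda x: np.convolve(x, kernel, mode='same'), axis=1, arr=R_mc)
    return R_mc - baseline
```

A constant (purely static) range bin is driven to zero away from the window edges, while slow-time oscillations faster than the window survive.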

4.1.3. Human Target Localization

After the above processing, the background interference has been removed, and it is necessary to locate and select the optimal range unit, i.e., the location of the human target. Computing the square sum over slow time, the range unit with the maximum sum is the position of the target:
$$S_j = \sum_{i=1}^{n} R_{DC}^2(j, i),\qquad R_{TP} = \max_{j = 1, \dots, m} S_j$$
where $R_{DC}(j,i)$ is the slow-time signal of one range bin, $S_j$ is the accumulated energy of that bin, and $R_{TP}$ is the range bin with the largest accumulation.
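The slow-time energy criterion amounts to a one-liner; the synthetic matrix in the test (a weak-noise background with a sinusoidal "target" row) stands in for preprocessed radar data:

```python
import numpy as np

def locate_target(R_dc):
    """Return the index of the range bin with the largest slow-time energy
    (the square sum S_j peaks at the human target's range)."""
    energy = np.sum(np.asarray(R_dc, dtype=float) ** 2, axis=1)
    return int(np.argmax(energy))
```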

4.2. Respiratory Signal Extraction

4.2.1. Observed Signals Extraction

Pulse radar divides the entire detection range into equidistant bins, each of which is also known as a range gate. The distributed scatter points on the thorax of a human target can extend over several centimeters of range and appear in multiple range gates around the range gate with the largest accumulation [24]. A practical solution is to extract the two range gates above and the two below the energy-maximum range gate, giving five channel signals in total. Then, the BEADS algorithm is adopted to remove baseline drift [29].

4.2.2. Joint Approximate Diagonalization of Eigenmatrices Algorithm

According to Section 3, the airborne radar vital signs detection problem can be described as a source signal separation problem. The source signals are the respiratory signal, the heartbeat signal, the platform motion, and interference from the environment, and the observed signals are instantaneous linear mixtures of the source signals. From Equation (11), the observed signals can be written as:
$$\begin{cases} r_1(t) = a_{11}s_1(t) + a_{12}s_2(t) + \cdots + a_{1m}s_m(t) \\ r_2(t) = a_{21}s_1(t) + a_{22}s_2(t) + \cdots + a_{2m}s_m(t) \\ \quad\vdots \\ r_a(t) = a_{a1}s_1(t) + a_{a2}s_2(t) + \cdots + a_{am}s_m(t) \end{cases}$$
The observed signals can be modeled as [30]:
$$R = AS$$
where $R$ is the data matrix, $A$ is the mixing matrix, and $S$ is the source component matrix.
The JADE blind source separation algorithm first whitens the observed signals; the whitening matrix $V$ satisfies [31]:
$$VA = U$$
where $U$ is the unitary matrix to be calculated. The covariance matrix of the observed data is:
$$R_{rr} = E\left[R(k) R(k)^H\right]$$
Assuming there is no noise, $R_{rr}$ is not full rank; it has $m$ non-zero eigenvalues $\mu_1, \mu_2, \dots, \mu_m$ with corresponding eigenvectors $g_1, g_2, \dots, g_m$, and the whitening matrix can be expressed as:
$$V = \left(\mu_1^{-1/2} g_1, \mu_2^{-1/2} g_2, \dots, \mu_m^{-1/2} g_m\right)^T$$
When additive Gaussian white noise is present, $R_{rr}$ is generally full rank. Arranging the eigenvalues in descending order, $\mu_1 \ge \mu_2 \ge \cdots \ge \mu_m \ge \mu_{m+1} \ge \cdots \ge \mu_n$, the noise variance can be estimated from the $n-m$ smallest eigenvalues:
$$\sigma^2 = \frac{1}{n-m}\sum_{i=1}^{n-m} \mu_{m+i}$$
$$V = \left[\left(\mu_1 - \sigma^2\right)^{-1/2} g_1, \left(\mu_2 - \sigma^2\right)^{-1/2} g_2, \dots, \left(\mu_m - \sigma^2\right)^{-1/2} g_m\right]^T$$
The whitened signal can be represented as:
$$\tilde{r}(k) = V R(k) = V A S(k) + V N(k) = U S(k) + V N(k)$$
The fact that the higher-order cumulants of any Gaussian process are equal to 0 makes it theoretically possible to completely suppress the impact of Gaussian noise. To separate the wanted signal through the subsequent blind source separation algorithm, it is necessary to construct a fourth-order cumulant matrix of the whitened signal and perform eigenvalue decomposition to obtain the unitary matrix.
Denote the $i$-th column of the unitary matrix as $u_i$, with $U = (u_1, u_2, \dots, u_m)$ and $u_i = (u_{i1}, u_{i2}, \dots, u_{im})^T$. Define $M_i$ as follows:
$$M_i = u_i u_i^T,\quad i = 1, 2, \dots, m$$
where the $kl$-th element of $M_i$ is $m_{i,kl} = u_{ik} u_{il}$.
The fourth-order cumulant matrix is defined as [32]:
$$Q_{\tilde{r}}(M_i) = \sum_{k,l=1}^{n} \mathrm{cum}\left(\tilde{r}_p, \tilde{r}_q, \tilde{r}_k, \tilde{r}_l\right) m_{i,lk},\quad 1 \le p, q \le n$$
Performing eigenvalue decomposition on the fourth-order cumulant matrix $Q_{\tilde{r}}(M)$, the estimate $\tilde{U}$ of the unitary matrix $U$ can be obtained:
$$Q_{\tilde{r}}(M) = \tilde{U} \Lambda \tilde{U}^T$$
$$\Lambda = \mathrm{Diag}\left[\mu_1 u_1 M u_1^T, \mu_2 u_2 M u_2^T, \dots, \mu_m u_m M u_m^T\right]$$
Finally, the estimated signals can be obtained with the estimated matrix:
$$Y(k) = \tilde{U}^{-1} \tilde{r} = \tilde{U}^{-1} V R$$
The specific process of the JADE algorithm is as follows:
(1) Decentralize and whiten the observed signal matrix;
(2) Construct the higher-order cumulant matrix $Q_{\tilde{r}}$ of the whitened matrix;
(3) Perform joint approximate diagonalization on $Q_{\tilde{r}}$ to obtain the estimate $\tilde{U}$ of the unitary matrix $U$;
(4) Estimate the source signals according to Equation (31).
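The whitening stage (step (1) above, with the noise-variance correction for $\sigma^2$ and $V$) can be sketched as follows; the fourth-order cumulant diagonalization is omitted for brevity, and the mixture dimensions in the test are illustrative:

```python
import numpy as np

def whiten(R, m):
    """Noise-adjusted whitening used by JADE: estimate the noise variance from
    the n-m smallest eigenvalues of the covariance matrix, then build V so
    that V @ R has (near-)identity covariance in the m-dim signal subspace."""
    R = R - R.mean(axis=1, keepdims=True)           # decentralize
    Rrr = (R @ R.T) / R.shape[1]                    # covariance E[R R^T]
    eigvals, eigvecs = np.linalg.eigh(Rrr)
    order = np.argsort(eigvals)[::-1]               # descending eigenvalues
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    sigma2 = eigvals[m:].mean() if len(eigvals) > m else 0.0
    V = (eigvecs[:, :m] / np.sqrt(eigvals[:m] - sigma2)).T
    return V @ R, V
```

With five mixtures of two independent sources plus weak noise, the whitened data have a covariance close to the identity, which is the precondition for the subsequent unitary rotation.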

4.3. Heartbeat Signal Extraction

4.3.1. Bandpass Filter

After the above analysis, the interference is mainly concentrated in the low-frequency range, and the heart rate of a normal person lies within 0.85-3.3 Hz. Hence, the lower and upper cutoff frequencies of the bandpass filter are 0.85 Hz and 3.3 Hz, respectively. The processed signal $R_{BP}(t)$ can be represented as:
$$R_{BP}(t) = R_{TP}(t) * H_{BP}(t)$$
where $H_{BP}(t)$ is the impulse response of the bandpass filter.

4.3.2. Respiratory Harmonic Localization

After obtaining the fundamental frequency of the respiratory signal, its third, fourth, and fifth harmonics are located:
$$R_{FLP}(\omega) = \int_{-\infty}^{+\infty} r(t) e^{-j\omega t} dt,\qquad (R_{bx}, R_{by}) = \mathrm{maxvalue}(R_{FLP}),\qquad R_h = k R_{bx}$$
where $r(t)$ is the respiratory signal extracted in Section 4.2.2, $R_{bx}$ is the respiratory fundamental frequency, $R_{by}$ is the amplitude at $R_{bx}$, and $k = 3, 4, 5$.
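Locating the fundamental and its harmonics from the spectrum is straightforward; the test signal is a synthetic 0.3 Hz respiration trace (rate, sampling rate, and duration are assumed):

```python
import numpy as np

def locate_harmonics(resp, fs, orders=(3, 4, 5)):
    """Find the respiratory fundamental R_bx as the spectral peak, then return
    the harmonic frequencies k * R_bx for the requested orders."""
    spec = np.abs(np.fft.rfft(resp))
    freqs = np.fft.rfftfreq(len(resp), d=1.0 / fs)
    fb = freqs[int(np.argmax(spec[1:])) + 1]   # skip the DC bin
    return fb, [k * fb for k in orders]
```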

4.3.3. Feedback Notch Filter

The notch filter is given by the following equation:
$$G(z) = \frac{1 - 2\cos(\omega^*) z^{-1} + z^{-2}}{1 - 2\rho\cos(\omega^*) z^{-1} + \rho^2 z^{-2}}$$
where $\omega^*$ is the notch frequency and $\rho$ is the radius of the poles of $G(z)$.
Introducing a feedback structure into the traditional notch filter [33], the transfer function can be expressed as:
$$H_N(z) = \frac{(1 + \alpha) G(z)}{1 + \alpha G(z)}$$
where $\alpha$ is the feedback coefficient. As shown in Figure 13, the feedback structure brings the overshoot and bandwidth of the notch filter to the desired levels, improving its performance. The feedback notch filter is used to remove the third, fourth, and fifth respiratory harmonics.
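The feedback notch filter can be sketched in polynomial form: multiplying the numerator and denominator of $H_N(z)$ through by the denominator of $G(z)$ gives ordinary IIR coefficients. The values of $\rho$ and $\alpha$ below are illustrative, not those used in the paper:

```python
import numpy as np
from scipy.signal import freqz

def feedback_notch(w_star, rho=0.95, alpha=2.0):
    """Coefficients of H_N(z) = (1+alpha) G(z) / (1 + alpha G(z)) for a notch
    at w_star rad/sample. Algebra: H_N = (1+alpha) b_g / (a_g + alpha b_g)."""
    b_g = np.array([1.0, -2.0 * np.cos(w_star), 1.0])             # numerator of G
    a_g = np.array([1.0, -2.0 * rho * np.cos(w_star), rho ** 2])  # denominator of G
    return (1.0 + alpha) * b_g, a_g + alpha * b_g

# Gain should vanish at the notch frequency and stay near unity at DC.
b, a = feedback_notch(w_star=np.pi / 2)
w, h = freqz(b, a, worN=[0.0, np.pi / 2])
```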

5. Experiment and Results

5.1. Experimental Setup

Two field experiments in different scenarios were carried out to test and verify the proposed method; the experimental setups are shown in Figure 14a. Scenario 1 is grassland, and scenario 2 is wall penetration. The human subject lies in the detection area, breathing normally and facing the radar transceiver antenna. The measurement time is 30 s.
The parameter setup of the X4M200 UWB radar is listed in Table 1. To evaluate the experimental results, the subject also wears an ErgoLab contact bandage sensor, which wirelessly collects respiratory and electrocardiogram (ECG) signals. The physiological signals collected by the contact sensor are regarded as the ground truth. Finally, we adopt the existing radar self-motion cancellation method in [19] for comparison. The reference method compensates for the platform motion by calculating the residual phase between the human subject and the static clutter, and then it decomposes the residual for vital signal extraction. In our experiments, we place a metal reflector plate in the environment as the stationary object.
In addition, the signal-to-noise ratio (SNR) is used to evaluate whether the proposed method can accurately estimate the respiratory rate (RR) and the heart rate (HR). The SNR is defined as:
$$SNR = 10\log_{10}\frac{\left|S(k_{max})\right|^2}{\frac{1}{N-1}\left(\sum_{k=0}^{k_{max}-1}\left|S(k)\right|^2 + \sum_{k=k_{max}+1}^{N}\left|S(k)\right|^2\right)}$$
where $S(k)$ is the signal spectrum, $N$ is the number of spectral sampling points, and $k_{max}$ is the index of the spectral peak. Considering that the respiratory and heartbeat spectra generally lie between 0 Hz and 2 Hz, only this frequency band is examined.
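The SNR definition translates directly into code; the test signal is a pure 0.3 Hz tone, for which the peak-to-residual ratio is necessarily high (the sampling values are assumed):

```python
import numpy as np

def spectral_snr(signal, fs, f_max=2.0):
    """SNR per the definition above: peak spectral power over the mean power
    of the remaining bins, restricted to the 0..f_max Hz band."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec = spec[freqs <= f_max]
    k_max = int(np.argmax(spec))
    noise = (spec.sum() - spec[k_max]) / (len(spec) - 1)
    return 10.0 * np.log10(spec[k_max] / noise)
```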

5.2. Performance in Realistic Grassland Scenario

The experimental scenario setup is shown in Figure 14a, the unprocessed radar echo data in this scenario are shown in Figure 14b, and the target localization result is shown in Figure 15. The echo of the human target is clear and visible, while the echo of the reflector is fuzzy due to range ambiguities. The observed signals of the five extracted range gates are shown in Figure 16; they are similar in waveform but differ in amplitude, and the signal of the middle range unit has the largest energy. The observed signal of each range bin can be considered a linear combination of the cardiac signal, chest motion, platform motion, and noise. In this experiment, the 46th to 50th range bins are identified and selected for extracting the vital signals.
After acquisition of the raw radar data, the signal processing method described in Section 4 is applied. The respiratory signal extraction results are shown in Figure 17. From the corresponding spectra, the RR of the bandage sensor is 0.3113 Hz, the RR of the proposed method is 0.3154 Hz, and the RR of the reference method is also 0.3154 Hz. The accuracy of the UWB radar is 98.68%; the difference is caused by time-interval error and systematic error, which can generally be ignored. Figure 17b,c clearly shows that the respiratory signal extracted by the proposed method is closer to the reference waveform and has a higher SNR than the signal extracted by the reference method. The effectiveness of the reference method depends on the quality of the echo signal of the stationary object. Moreover, that method requires one or more stationary objects as reference signals to recover the vital signals, and this restrictive condition may cause performance degradation in more cluttered environments.
The results of heartbeat signal extraction are shown in Figure 18. Figure 18a shows the reference ECG data, and Figure 18c shows the extracted heartbeat signal after removing the fifth-order respiratory harmonic. The reference HR obtained by the peak detection method is 1.25 Hz, the estimated HR of the radar is 1.295 Hz, and the accuracy is 96.4%. Comparison of the detection results indicates that the proposed method is able to detect the overall movement of the heart, although the attenuation of the morphological amplitude is significant, in line with our expectations.
The results in the first scenario suggest that the proposed method can effectively suppress background clutter and noise interference from the environment, recover the respiratory and heartbeat signals, and improve the SNR.

5.3. Performance in Through-the-Wall Scenario

For comprehensive verification of the proposed method, we also explore the feasibility of using the airborne radar system to monitor the vital signals of a single subject through a brick wall. Figure 19 illustrates the experimental scenario and the collected radar data. The extracted respiratory signal is shown in Figure 20. The captured RR of the bandage sensor is 0.3125 Hz, the RR of the proposed method is 0.332 Hz, and the RR of the reference method is also 0.332 Hz. The accuracy of the UWB radar is 93.76%; the system's performance degrades slightly due to the inevitable attenuation of electromagnetic waves by obstacles. Table 2 presents a quantitative comparison between the proposed method and the reference method in the two scenarios, and the results demonstrate the superior performance and robustness of the proposed method in respiratory waveform recovery.
Under wall-penetration conditions, the bio-radar echo of human physiological motion is weak, and wall attenuation further reduces the echo power and distorts the waveform. In addition, multipath effects severely hinder the detection and separation of effective micro-Doppler features. Lowering the radar center frequency enhances penetration, but radar sensitivity decreases accordingly. Hence, isolating an independent heartbeat signal from the through-wall mixture remains challenging even under otherwise ideal conditions [34,35].
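The penetration-loss argument can be made concrete with simple link-budget arithmetic. The 10 dB one-way brick-wall loss below is an assumed, illustrative value, not a measurement from this work; the point is only that the echo pays the wall loss twice.

```python
def through_wall_snr_db(free_space_snr_db, one_way_wall_loss_db):
    """The radar echo crosses the wall twice, so it pays the one-way loss twice."""
    return free_space_snr_db - 2.0 * one_way_wall_loss_db

# With the ~12.5 dB free-space respiratory SNR of Table 2 and an assumed
# 10 dB one-way loss, the through-wall echo budget drops by 20 dB.
print(through_wall_snr_db(12.5, 10.0))  # -> -7.5
```

This is why the measured through-wall SNR in Table 2 is several dB below the open-air case even before multipath is considered.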

5.4. Impact of Distance Between the System and the Victim

Since the distance between the victim and the airborne radar system is not fixed in practical applications, we also investigate how distance affects system performance in scenario 1, with detection distances ranging from 2 to 5 m. Table 3 shows the RR and HR estimation results at different distances. Estimation performance degrades as the detection distance increases because of the energy attenuation of the radar echo, which can be mitigated by increasing the transmission power.
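The distance trend follows from the point-target radar equation, in which echo power scales as 1/R^4. The sketch below is illustrative only, ignoring clutter, antenna pattern, and processing gain.

```python
import math

def echo_snr_db(snr_ref_db, r_ref, r):
    """Point-target radar equation: echo power falls by 40*log10(r/r_ref) dB."""
    return snr_ref_db - 40.0 * math.log10(r / r_ref)

# Moving from 2 m to 5 m costs about 16 dB of two-way spreading loss,
# which a comparable increase in transmit power would nominally recover.
print(round(40.0 * math.log10(5.0 / 2.0), 1))  # -> 15.9
```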

6. Conclusions

In this paper, an airborne IR-UWB radar system for vital signs detection of unconscious victims has been developed. The communication distance of the developed system is up to 10 km with low latency and a low packet loss rate, which satisfies the real-time requirements of remote vital signs monitoring. In addition, a novel framework based on blind source separation has been proposed for precise respiration and heartbeat extraction. To resolve the signal distortion caused by the background environment, we analyzed the statistical characteristics of measured grass clutter and found that the clutter echo follows a Gaussian distribution. A signal processing framework based on the JADE algorithm was then proposed to estimate the respiratory and heartbeat signals from the mixed observations. Extensive field trials demonstrate the effectiveness and accuracy of the proposed method. This work may provide a new solution for intelligent SAR in harsh environments.

Author Contributions

Conceptualization, G.L. and J.W.; methodology, Y.J. and Y.Y.; software, Y.J. and Z.L.; investigation, F.Q. and T.L.; data curation, Y.Y.; writing—original draft preparation, Y.J.; writing—review and editing, F.Q.; visualization, Y.J.; funding acquisition, G.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Chinese Comprehensive Research Foundation (KJ-2022-A000308), and the Interdisciplinary Research Fund of Fourth Military Medical University (XJ-2023-000201).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wex, F.; Schryen, G.; Feuerriegel, S.; Neumann, D. Emergency Response in Natural Disaster Management: Allocation and Scheduling of Rescue Units. Eur. J. Oper. Res. 2014, 235, 697–708. [Google Scholar] [CrossRef]
  2. Cao, X.; Li, M.; Tao, Y.; Lu, P. HMA-SAR: Multi-Agent Search and Rescue for Unknown Located Dynamic Targets in Completely Unknown Environments. IEEE Robot. Autom. Lett. 2024, 9, 5567–5574. [Google Scholar] [CrossRef]
  3. Nefros, C.; Kitsara, G.; Loupasakis, C. Geographical Information Systems and Remote Sensing Techniques to Reduce the Impact of Natural Disasters in Smart Cities. IFAC-Pap. 2022, 55, 72–77. [Google Scholar] [CrossRef]
  4. Ramírez-Ayala, O.; González-Hernández, I.; Salazar, S.; Flores, J.; Lozano, R. Real-Time Person Detection in Wooded Areas Using Thermal Images from an Aerial Perspective. Sensors 2023, 23, 9216. [Google Scholar] [CrossRef] [PubMed]
  5. Schedl, D.C.; Kurmi, I.; Bimber, O. Search and Rescue with Airborne Optical Sectioning. Nat. Mach. Intell. 2020, 2, 783–790. [Google Scholar] [CrossRef]
  6. Xu, L.; Yang, Q.; Qin, M.; Wu, W.; Kwak, K. Collaborative Human Recognition with Lightweight Models in Drone-based Search and Rescue Operations. IEEE Trans. Veh. Technol. 2023, 73, 1765–1776. [Google Scholar] [CrossRef]
  7. Kucukayan, G.; Karacan, H. YOLO-IHD: Improved Real-Time Human Detection System for Indoor Drones. Sensors 2024, 24, 922. [Google Scholar] [CrossRef]
  8. Martinez-Alpiste, I.; Golcarenarenji, G.; Wang, Q.; Alcaraz-Calero, J.M. Search and Rescue Operation using UAVs: A Case Study. Expert Syst. Appl. 2021, 178, 114937. [Google Scholar] [CrossRef]
  9. Jiang, C.; Ren, H.; Ye, X.; Zhu, J.; Zeng, H.; Nan, Y.; Sun, M.; Ren, X.; Huo, H. Object Detection from UAV Thermal Infrared Images and Videos using YOLO Models. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102912. [Google Scholar] [CrossRef]
  10. Suo, J.; Wang, T.; Zhang, X.; Chen, H.; Zhou, W.; Shi, W. HIT-UAV: A High-altitude Infrared Thermal Dataset for Unmanned Aerial Vehicle-based Object Detection. Sci. Data 2023, 10, 227. [Google Scholar] [CrossRef]
  11. Song, Z.; Yan, Y.; Cao, Y.; Jin, S.; Qi, F.; Li, Z.; Lei, T.; Chen, L.; Jing, Y.; Xia, J.; et al. An infrared dataset for partially occluded person detection in complex environment for search and rescue. Sci. Data 2025, 12, 300. [Google Scholar] [CrossRef] [PubMed]
  12. Cao, Y.; Luo, X.; Yang, J.; Cao, Y.; Yang, M.Y. Locality Guided Cross-modal Feature Aggregation and Pixel-level Fusion for Multispectral Pedestrian Detection. Inf. Fusion 2022, 88, 1–11. [Google Scholar] [CrossRef]
  13. Qi, F.; Zhu, M.; Li, Z.; Lei, T.; Xia, J.; Zhang, L.; Yan, Y.; Wang, J.; Lu, G. Automatic Air-to-ground Recognition of Outdoor Injured Human Targets based on UAV Bimodal Information: The explore study. Appl. Sci. 2022, 12, 3457. [Google Scholar] [CrossRef]
  14. Qi, F.; Xia, J.; Zhu, M.; Jing, Y.; Zhang, L.; Li, Z.; Wang, J.; Lu, G. UAV Multispectral Multi-domain Feature Optimization for the Air-to-ground Recognition of Outdoor Injured Human Targets under Cross-scene Environment. Front. Public Health 2023, 11, 999378. [Google Scholar] [CrossRef]
  15. Xue, W.; Wang, R.; Liu, L.; Wu, D. Accurate Multi-target Vital Signs Detection Method for FMCW Radar. Measurement 2023, 223, 113715. [Google Scholar] [CrossRef]
  16. Aflalo, K.; Zalevsky, Z. Penetrating Barriers: Noncontact Measurement of Vital Bio Signs Using Radio Frequency Technology. Sensors 2024, 24, 5784. [Google Scholar] [CrossRef]
  17. Cao, Y.; Qi, F.; Jing, Y.; Zhu, M.; Lei, T.; Li, Z.; Xia, J.; Wang, J.; Lu, G. Mission Chain Driven Unmanned Aerial Vehicle Swarms Cooperation for the Search and Rescue of Outdoor Injured Human Targets. Drones 2022, 6, 138. [Google Scholar] [CrossRef]
  18. Cardillo, E.; Li, C.; Caddemi, A. Vital Sign Detection and Radar Self-motion Cancellation Through Clutter Identification. IEEE Trans. Microw. Theory Tech. 2021, 69, 1932–1942. [Google Scholar] [CrossRef]
  19. Rong, Y.; Herschfelt, A.; Holtom, J.; Bliss, D.W. Cardiac and Respiratory Sensing from a Hovering UAV Radar Platform. In Proceedings of the 2021 IEEE Statistical Signal Processing Workshop (SSP), Rio de Janeiro, Brazil, 11–14 July 2021; pp. 541–545. [Google Scholar]
  20. Zhang, B.-B.; Zhang, D.; Song, R.; Wang, B.; Hu, Y.; Chen, Y. RF-Search: Searching Unconscious Victim in Smoke Scenes with RF-enabled Drone. In Proceedings of the 29th Annual International Conference on Mobile Computing and Networking, Association for Computing Machinery, Madrid, Spain, 2–6 October 2023; pp. 1–15. [Google Scholar]
  21. Stockel, P.; Wallrath, P.; Herschel, R.; Pohl, N. Detection and Monitoring of People in Collapsed Buildings Using a Rotating Radar on a UAV. IEEE Trans. Radar Syst. 2024, 2, 13–23. [Google Scholar] [CrossRef]
  22. Liu, Q.; Cheng, L.; Jia, A.L.; Liu, C. Deep Reinforcement Learning for Communication Flow Control in Wireless Mesh Networks. IEEE Netw. 2021, 35, 112–119. [Google Scholar] [CrossRef]
  23. Qiao, L.; Li, X.; Xiao, B.; He, M.; Bi, X.; Li, W.; Gao, X. Learning-Refined Integral Null Space Pursuit Algorithm for Noncontact Multisubjects Vital Signs Measurements Using SFCW-UWB and IR-UWB Radar. IEEE Trans. Instrum. Meas. 2022, 71, 8506013. [Google Scholar] [CrossRef]
  24. Antide, E.; Zarudniev, M.; Michel, O.; Pelissier, M. Comparative Study of Radar Architectures for Human Vital Signs Measurement. In Proceedings of the 2020 IEEE Radar Conference (RadarConf20), Florence, Italy, 21–25 September 2020; pp. 1–6. [Google Scholar]
  25. Cardillo, E.; Ferro, L.; Sapienza, G.; Li, C. Reliable eye-blinking detection with millimeter-wave radar glasses. IEEE Trans. Microw. Theory Tech. 2023, 72, 771–779. [Google Scholar] [CrossRef]
  26. Yang, B.; Huang, M.; Xie, Y.; Wang, C.; Rong, Y.; Huang, H.; Duan, T. Classification Method of Uniform Circular Array Radar Ground Clutter Data Based on Chaotic Genetic Algorithm. Sensors 2021, 21, 4596. [Google Scholar] [CrossRef] [PubMed]
  27. Rosenberg, L.; Duk, V. Land Clutter Statistics from an Airborne Passive Bistatic Radar. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5104009. [Google Scholar] [CrossRef]
  28. Cardillo, E.; Li, C.; Caddemi, A. Empowering Blind People Mobility: A Millimeter-Wave Radar Cane. In Proceedings of the 2020 IEEE International Workshop on Metrology for Industry 4.0 & IoT, Roma, Italy, 21–25 September 2020; pp. 213–217. [Google Scholar]
  29. Ning, X.; Selesnick, I.W.; Duval, L. Chromatogram baseline estimation and denoising using sparsity (BEADS). Chemom. Intell. Lab. Syst. 2014, 139, 156–167. [Google Scholar] [CrossRef]
  30. Hyvärinen, A.; Oja, E. Independent component analysis: Algorithms and applications. Neural Netw. 2000, 13, 411–430. [Google Scholar] [CrossRef]
  31. Jin, H.; Pan, J.; Gao, L.; Zhang, C.; Zhang, H. Enhanced blind source separation algorithm for partial discharge signals using Joint Approximate diagonalization of Eigenmatrices. Measurement 2025, 244, 116552. [Google Scholar] [CrossRef]
  32. Li, X.; Adalı, T.; Anderson, M. Joint Blind Source Separation by Generalized Joint Diagonalization of Cumulant Matrices. Signal Process. 2011, 91, 2314–2322. [Google Scholar] [CrossRef]
  33. Pei, S.C.; Guo, B.Y.; Lu, W.Y. Narrowband Notch Filter Using Feedback Structure Tips & Tricks. IEEE Signal Process. Mag. 2016, 33, 115–118. [Google Scholar] [CrossRef]
  34. Liang, F.; Lou, H.; Zhang, Y.; Lv, H.; Yu, X.; An, Q.; Li, Z.; Wang, J. Through-the-wall high-dimensional imaging of human vital signs by combining multiple enhancement algorithms using portable LFMCW-MIMO radar. Measurement 2022, 195, 111074. [Google Scholar] [CrossRef]
  35. Pramudita, A.A.; Lin, D.B.; Hsieh, S.N.; Ali, E.; Ryanu, H.H.; Adiprabowo, T.; Purnomo, A.T. Radar System for Detecting Respiration Vital Sign of Live Victim Behind the Wall. IEEE Sens. J. 2022, 22, 14670–14685. [Google Scholar] [CrossRef]
Figure 1. Two-stage unmanned search and rescue scheme.
Figure 2. Hardware structure of airborne radar system. (1) Onboard computer. (2) Transmission radio station of data collection end. (3) Flight controller. (4) X4M200 IR-UWB radar. (5) High-definition camera. (6) GPS module. (7) Transmission radio station of data reception end. (8) Ground station for visualization.
Figure 3. Pictures of airborne radar system. (a) Front view of the system. (b) Bottom view of the system.
Figure 4. Block diagram and detection principle of X4M200 IR-UWB radar.
Figure 5. Background environments of the human target: (a) Smooth ground. (b) Grassland.
Figure 6. Grass–surface clutter signal. (a) Time domain waveform. (b) Spectrum.
Figure 7. Statistical characteristics of clutter. (a) Frequency distribution histogram. (b) Fitted curves: the empirical probability density function is shown in blue and the normal-distribution fit in red.
Figure 8. The block diagram of the vital signal extraction method.
Figure 9. Schematic diagram of range migration compensation.
Figure 10. Field test result. (a) Raw data. (b) After compensation.
Figure 11. Two-dimensional pseudo-color image of UWB radar. (a) Raw radar data. (b) Data after preprocessing.
Figure 12. Extracted vital signals at the target position. (a) Respiratory signal. (b) Heartbeat signal.
Figure 13. Amplitude–frequency curve of the feedback notch filter.
Figure 14. Experiment in scenario 1. (a) Scenario setup. (b) Raw radar data.
Figure 15. Target location. (a) Range FFT of raw data. (b) Cross-section view.
Figure 16. Observed signals.
Figure 17. Results of scenario 1. (a) Reference respiratory signal. (b) Recovered respiratory signal by proposed method. (c) Recovered respiratory signal by reference method. (d) Spectrum of signal (a). (e) Spectrum of signal (b). (f) Spectrum of signal (c).
Figure 18. Results of scenario 1. (a) Reference heartbeat signal. (b) Spectrum of signal (a). (c) Recovered heartbeat signal. (d) Spectrum of signal (c).
Figure 19. Experiment in scenario 2. (a) Scenario setup. (b) Raw radar data.
Figure 20. Results of scenario 2. (a) Reference respiratory signal. (b) Recovered respiratory signal by proposed method. (c) Recovered respiratory signal by reference method. (d) Spectrum of signal (a). (e) Spectrum of signal (b). (f) Spectrum of signal (c).
Table 1. Radar experimental parameters.
Center Frequency | Bandwidth | Detection Zone | Range Resolution | Frame Rate
7.29 GHz | 1.4 GHz | 0.4~5 m | 0.0514 m | 17 Hz
Table 2. Results of respiratory rate estimation.
Scenario | RR (Hz): Reference | RR (Hz): Proposed Method | RR (Hz): Reference Method | Accuracy (%): Proposed Method | Accuracy (%): Reference Method | SNR (dB): Proposed Method | SNR (dB): Reference Method
Scenario 1 | 0.3113 | 0.3154 | 0.3154 | 98.68 | 98.68 | 12.457 | 11.035
Scenario 2 | 0.3125 | 0.332 | 0.332 | 93.76 | 93.76 | 9.002 | 7.983
Table 3. Results of different distances.
Distance | RR (Hz) | RR Accuracy (%) | HR (Hz) | HR Accuracy (%)
2 m | 0.2366 | 98.50 | 1.017 | 98.44
3 m | 0.3113 | 96.48 | 1.166 | 96.25
4 m | 0.2449 | 95.74 | 1.137 | 95.99
5 m | 0.2813 | 93.15 | 1.148 | 90.22
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Jing, Y.; Yan, Y.; Li, Z.; Qi, F.; Lei, T.; Wang, J.; Lu, G. Advancing Remote Life Sensing for Search and Rescue: A Novel Framework for Precise Vital Signs Detection via Airborne UWB Radar. Sensors 2025, 25, 5232. https://doi.org/10.3390/s25175232

