Article

Near Field 3-D Millimeter-Wave SAR Image Enhancement and Detection with Application of Antenna Pattern Compensation

College of Electronic Science and Technology, National University of Defense Technology, Changsha 410073, China
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(12), 4509; https://doi.org/10.3390/s22124509
Submission received: 23 April 2022 / Revised: 1 June 2022 / Accepted: 10 June 2022 / Published: 14 June 2022
(This article belongs to the Section Radar Sensors)

Abstract

In this paper, a novel near-field high-resolution image focusing technique is proposed. With the emergence of millimeter-wave (mmWave) devices, near-field synthetic aperture radar (SAR) imaging is widely used in automotive-mounted SAR imaging, UAV imaging, concealed threat detection, etc. Current research is mainly confined to the laboratory environment, thus ignoring the adverse effects of non-ideal experimental environments on imaging and subsequent detection in real scenarios. To address this problem, we propose an optimized Back-Projection Algorithm (BPA) that accounts for the path loss of signal propagation through space by converting the amplitude factor in the echo model into a beam-weighting term. The proposed algorithm is an image focusing algorithm for arbitrary and irregular arrays, and it effectively mitigates sparse-array imaging ghosts. We apply the 3DRIED dataset to construct image datasets for target detection, comparing the kappa coefficients of the proposed scheme with those obtained from the classic BPA and the Range Migration Algorithm (RMA) with amplitude loss compensation. The results show that the proposed algorithm attains high-fidelity image reconstruction and focusing.

1. Introduction

With the increased popularity of low-cost commercial millimeter-wave (mmWave) radar sensors, high-resolution 3-D mmWave imaging systems have recently attracted a wide range of attention [1]. Such systems now play an essential role in many applications, including gesture recognition [2], medical imaging [3,4], automotive-mounted SAR imaging [5,6], UAV imaging, non-destructive testing, and concealed threat detection [7]. This success is partly due to mmWave radar sensors, which offer smaller size, higher integration, and broader bandwidth [2]. mmWave signals can penetrate objects such as various composites, wood, and clothing; they are also non-ionizing and are not considered a dangerous radiation source [4]. Although radar hardware is constantly improving, obtaining millimeter-resolution images remains a considerable challenge. An effective way to acquire high-resolution data is through SAR: the mmWave radar forms a synthetic aperture by scanning a planar pattern in space, achieving high-resolution imaging at the expense of acquisition time. In terms of imaging algorithms, the Back-Projection Algorithm (BPA) [8] and the Range Migration Algorithm (RMA) [9,10] are regarded as the most classic algorithms in near-field radar imaging [11,12]. The BPA can be used with any array configuration and offers a time-domain solution that estimates the target scattering coefficient by traversing all imaging grid points to calibrate the echo data; however, large data volumes demand high computational loads, severely reducing efficiency. The RMA, based on the Fourier transform, is the most efficient and widely used method in SAR imaging: it is a fast frequency-domain algorithm that converts the echo signal to the wavenumber domain for phase correction. However, the approach is limited in that it is only applicable to regular spatial sampling intervals.
These algorithms perform image reconstruction in different domains by inverting radar echoes into target scattering factors [13,14,15]. Most imaging algorithm research builds on these two models. Near-field imaging, fast imaging [15], sparse imaging [16,17], MIMO array design [18], and optimization are frequent research topics [19,20,21]. In recent years, the compressed-sensing algorithm (CSA) has emerged as a novel technique [22,23]. Since uniform arrays are costly, and the massive number of array elements often results in complex data processing, the CSA can take full advantage of the sparse characteristics of the target. However, the CSA requires an extremely high signal-to-noise ratio and an extremely accurate physical model to achieve good accuracy at sub-Nyquist sampling. Sun et al. [23] researched NUFFT-accelerated CS radar imaging, proposing two fast Gaussian-gridding NUFFT methods to expedite the CS process for computationally intensive, large-scale, and real-time radar imaging. Kajbaf, H. et al. [24] investigated compressed-sensing processing for 3-D microwave imaging, mainly discussing inhomogeneous grids and optimal-path sparse sampling. Compared with the SISO-SAR imaging algorithm, MIMO-SAR [25,26,27], which separates the transmitting and receiving array elements, must consider the different trajectories of the transmitted and reflected electromagnetic waves instead of directly applying the equivalent phase-center model, which makes the signal processing more complicated [28,29,30,31,32]. Yanik, M.E. [12] from the University of Texas pioneered a near-field imaging system with commercially available, low-cost, industrial-grade mmWave MIMO radar sensors, explored the concept of the virtual antenna array in near-field MIMO-SAR, and incorporated a multistatic-to-monostatic correction factor to improve image formation.
Wang, J. [31] proposed an approach utilizing the NUFFT algorithm to estimate the signal spectra over a rectilinear grid in the frequency-wavenumber (i.e., f-k) domain from the spatial signals measured by a 2-D MIMO array. Finally, Smith, J.W. [32] introduced an efficient 3-D near-field MIMO-SAR imaging method for irregular scanning geometries.
However, the existing literature has not thoroughly discussed suppressing the imaging clutter and ghosts caused by non-ideal electromagnetic environments or sparse arrays during the imaging process [33]. Although practical progress has been made with current imaging optimization algorithms, most are evaluated under darkroom conditions and are not generally applicable to security screening scenarios [34,35,36,37,38,39,40,41,42,43].
In this article, starting from the essence of imaging, we propose a novel image reconstruction and focusing technique for efficient near-field imaging in non-ideal electromagnetic environments or with sparse arrays, such as those present in security screening imaging, concealed object imaging, automotive SAR, and UAV SAR. We examine the SAR signal model and system and develop a method to enhance imaging quality based on the radar equation and the antenna beam. This technique extends the BPA to optimize coherent accumulation. The measured results validate the robustness of the proposed algorithm in complex environments and with sparse arrays, and the subsequent sections provide an intensive evaluation of the parameter metrics of the technique against traditional imaging algorithms. Finally, we use the 3DRIED dataset [35] to construct the image datasets under the three algorithms for practical inspection and validation. The proposed method achieves higher-fidelity focusing than the traditional planar RMA and BPA, even in non-ideal electromagnetic environments.
Within the article, Section 2 introduces the system models, including the signal model, the RMA with amplitude loss compensation, the BPA, a novel enhancement of the planar BPA, and the YOLO detection technique. Section 3 presents and discusses the imaging results and parameter metrics. Section 4 constructs the image datasets and performs target detection to demonstrate the superiority of the proposed algorithm. Finally, Section 5 concludes the paper.

2. Relevant Research Theories

2.1. Signal Model

This paper uses the FMCW radar system to evaluate the performance metrics of multiple imaging algorithms in complex environments, with the specific radar measurement scenario shown in Figure 1. The traditional FMCW signal model is well described in the literature. The FMCW radar emits a linear frequency-modulated signal called a chirp
$$ m(t) = e^{j 2\pi \left( f_c t + 0.5 K t^2 \right)}, \qquad 0 \le t \le T_c \tag{1} $$
where $f_c$ is the instantaneous carrier frequency at time $t = 0$, $K = B/T_c$ is the chirp frequency slope, $B$ is the sweep bandwidth, and $T_c$ is the chirp duration in fast time.
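As a quick numerical illustration of the chirp model above, the following sketch generates one FMCW chirp in NumPy. The parameter values are assumptions chosen for illustration in the 77 GHz band, not the exact configuration of Table 1.

```python
import numpy as np

# Illustrative FMCW chirp m(t) = exp(j*2*pi*(fc*t + 0.5*K*t^2)), 0 <= t <= Tc.
# Parameter values are assumptions, not the paper's Table 1 configuration.
fc = 77e9            # carrier frequency at t = 0 (Hz)
B = 4e9              # sweep bandwidth (Hz)
Tc = 40e-6           # chirp duration, fast time (s)
K = B / Tc           # chirp frequency slope (Hz/s)

fs = 4 * B                        # sampling rate used for this illustration
t = np.arange(0, Tc, 1 / fs)
m = np.exp(1j * 2 * np.pi * (fc * t + 0.5 * K * t**2))

# The instantaneous frequency (1/2pi)*d(phase)/dt sweeps linearly from fc to fc + B.
inst_freq = fc + K * t
```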
The transmitted signal is reflected by the scattering point, and the receiving antenna captures the returned echo as a time-delayed replica of the transmitted signal [37]
$$ m'(t) = \frac{\sigma}{4\pi R_t R_r} \, e^{j 2\pi \left( f_c (t-\tau) + 0.5 K (t-\tau)^2 \right)} \tag{2} $$
where $\tau$ is the round-trip pulse time delay, and $R_t$ and $R_r$ are the path lengths of electromagnetic wave transmission and reflection, which can be described as
$$ R_t = \sqrt{(x - x_t)^2 + (y - y_t)^2 + (z - z_t)^2}, \qquad R_r = \sqrt{(x - x_r)^2 + (y - y_r)^2 + (z - z_r)^2} \tag{3} $$
where $\tau = (R_t + R_r)/c$, as the transmitting and receiving antennas are not co-located. To facilitate subsequent signal processing, this is equated to co-located transmit and receive antennas. Unlike the direct equivalent-phase-center (EPC) model for the far field, this near-field approximation has been discussed and researched in the literature [32].
$$ R_t + R_r \approx 2R + \frac{(d_x^l)^2 + (d_y^l)^2}{4 z_0}, \qquad R = \sqrt{(x - x')^2 + (y - y')^2 + (z - z')^2} \tag{4} $$
where $R$ is the range between the virtual array element and the target scattering point, and $d_x^l$ and $d_y^l$ are the horizontal and vertical distances between the transmitting and receiving antennas.
Therefore, the received signal $m'(t)$ is mixed with $m(t)$ to generate the fundamental IF signal, which can be expressed as
$$ s(t) = \frac{\sigma}{8\pi R^2} \, e^{j 2\pi \left( K \tau t + f_c \tau - 0.5 K \tau^2 \right)} \tag{5} $$
The first phase term of (5) is the beat frequency representing range information, and the second phase term is the Doppler phase, which is the critical factor in SAR imaging. The last phase term, known as the residual video phase (RVP), can be ignored [37]. Finally, the signal model can be represented as
$$ s(k) = \frac{A \sigma \, e^{-j 2 k R}}{R^2}, \qquad \frac{2\pi f_c}{c} \le k \le \frac{2\pi (f_c + B)}{c} \tag{6} $$
where $A$ is a constant term and $k = 2\pi f / c$ is the wavenumber, with $f = f_c + K t$ the sampled frequency of the signal carrier at each sampling point. Extending the signal model in (6) to a distributed target with scattering coefficient function $f(x', y', z')$, we can obtain
$$ s(x, y, k) = \iiint \frac{f(x', y', z')}{R^2} \, e^{-j 2 k R} \, dx' \, dy' \, dz' \tag{7} $$
The mmWave radar scans a 2-D array along a Z-shaped mechanical trajectory to obtain the 3-D raw echo data, providing large bandwidth and high resolution in azimuth and height. To avoid ghost images, the horizontal and vertical spatial sampling intervals $d_x$ and $d_y$ of the synthetic aperture must satisfy the Nyquist criterion, where $\lambda$ is the wavelength. In the worst case, when the target is very close to the radar, the aperture spatial interval should be less than $\lambda/4$; traditionally, an aperture spatial interval between $\lambda/4$ and $\lambda/2$ is the "industry standard" [2]. Furthermore, the range resolution is determined by the system bandwidth: the lower the bandwidth, the coarser the range resolution with which the radar can distinguish two targets. For the 3-D near-field radar imaging system, the spatial resolution [40] along each dimension can be given by
$$ \delta_x \approx \frac{\lambda \sqrt{(D_x + D'_x)^2/4 + (z - z')^2}}{2 (D_x + D'_x)}, \quad d_x \le \delta_x; \qquad \delta_y \approx \frac{\lambda \sqrt{(D_y + D'_y)^2/4 + (z - z')^2}}{2 (D_y + D'_y)}, \quad d_y \le \delta_y; \qquad \delta_z \approx \frac{c}{2 (f - f_c)} \approx \frac{c}{2B} \tag{8} $$
where $D_x$ and $D_y$ are the widths of the aperture along the $x$ and $y$ directions of the coordinate system, respectively, and $D'_x$ and $D'_y$ are the widths of the target along the $x$ and $y$ directions. Furthermore, the spatial resolution [7] along the azimuth and height dimensions can be simply expressed as
$$ \delta_x \approx \frac{\lambda z_0}{2 D_x}, \qquad \delta_y \approx \frac{\lambda z_0}{2 D_y} \tag{9} $$
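The simplified resolution formulas above can be evaluated numerically. In this sketch, the stand-off distance and aperture size follow the scissor experiment of Section 3; the 4 GHz bandwidth of the 77–81 GHz sweep is an assumption used for illustration.

```python
import numpy as np

# Resolutions from the simplified formulas: delta_x ~ lambda*z0/(2*Dx),
# delta_y ~ lambda*z0/(2*Dy), delta_z ~ c/(2*B).  z0 and the aperture width
# follow the scissor experiment; the 4 GHz bandwidth is assumed.
c = 3e8
fc = 79e9                  # centre frequency of the 77-81 GHz sweep (Hz)
lam = c / fc               # wavelength (m)
z0 = 0.28                  # stand-off distance (m), scissor group
Dx = Dy = 0.2              # synthetic aperture width (m), -100 mm to 100 mm
B = 4e9                    # sweep bandwidth (Hz)

delta_x = lam * z0 / (2 * Dx)      # azimuth resolution (m)
delta_y = lam * z0 / (2 * Dy)      # height resolution (m)
delta_z = c / (2 * B)              # range resolution (m)

# Nyquist spatial sampling: the "industry standard" interval lies between
# lambda/4 and lambda/2.
d_min, d_max = lam / 4, lam / 2
```

With these values the cross-range resolutions come out at roughly 2.7 mm, consistent with the millimeter-level sampling intervals chosen in the experiments.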

2.2. Imaging Algorithm

2.2.1. Range Migration Algorithm with Amplitude Loss Compensation

The Fourier-based algorithm in the subsequent analysis is known as the range migration algorithm or $\omega$-$k$ algorithm, which has been widely discussed in detail elsewhere [2,32]. The amplitude factor and amplitude loss are usually ignored in the RMA. Here, we take the amplitude loss into account for a stationary target whose distance $Z_0 = z - z'$ is constant; the image reconstruction retains the phase term while preserving the amplitude to first order, $(Z_0 R)^{-1}$. This amplitude approximation method has been mentioned in the literature [41]. As a result, we can yield
$$ s(x, y, k) = \iiint \frac{f(x', y', z')}{R} \, e^{-j 2 k R} \, dx' \, dy' \, dz' \tag{10} $$
The next derivation is based on the wave equation, decomposing spherical waves into equivalent plane waves [30,43]. We can yield
$$ s(x, y, k) = \frac{j}{2\pi} \iint \iiint \frac{f(x', y', z')}{k_z} \, e^{-j \left[ k_x (x - x') + k_y (y - y') + k_z (z - z') \right]} \, dx' \, dy' \, dz' \, dk_x \, dk_y \tag{11} $$
where $k_x$, $k_y$, and $k_z$ are the azimuth, height, and range components of the wavenumber $k$. It is vital to note that $4k^2 = k_x^2 + k_y^2 + k_z^2$, and $k_x^2 + k_y^2 \le 4k^2$ is the critical constraint for image reconstruction. Leveraging the conjugate symmetry of the spherical wavefront, we can obtain
$$ s^*(x, y, k) = \frac{j}{2\pi} \iint \left[ \iiint f(x', y', z') \, e^{-j (k_x x' + k_y y' + k_z z')} \, dx' \, dy' \, dz' \right] \frac{e^{j k_z z}}{k_z} \, e^{j (k_x x + k_y y)} \, dk_x \, dk_y \tag{12} $$
where $(\cdot)^*$ is the complex conjugate operation. In the Fourier transform operations $\mathrm{FFT}_{2D}$ and $\mathrm{IFFT}_{2D}$, the difference between the primed and unprimed coordinate systems is dropped, as they are coincident. Hence, the expression above becomes
$$ s^*(x, y, k) = \frac{j}{2\pi} \iint F(k_x, k_y, k_z) \, \frac{e^{j k_z z}}{k_z} \, e^{j (k_x x + k_y y)} \, dk_x \, dk_y = \frac{j}{2\pi} \, \mathrm{IFFT}_{2D} \!\left[ F(k_x, k_y, k_z) \, \frac{e^{j k_z z}}{k_z} \right] \tag{13} $$
A time-domain convolution corresponds to a frequency-domain product. Removing constant terms, Equation (13) can be rewritten as
$$ S^*(k_x, k_y, k) = \mathrm{FFT}_{2D} \!\left[ f(x, y, k_z) * h(x, y) \right] = F(k_x, k_y, k_z) \, H(k_x, k_y), \qquad H(k_x, k_y) = \frac{e^{j k_z z}}{k_z} \tag{14} $$
Therefore, interpolation is used to resample the data cube to uniformly spaced positions in $k_z$, and the 3-D image reconstruction can be carried out as [44]
$$ f(x, y, z) = \mathrm{IFFT}_{3D}^{k_x, k_y, k_z} \left\{ \mathrm{Stolt}(k \to k_z) \!\left[ S^*(k_x, k_y, k) \, k_z \, e^{-j k_z z} \right] \right\} \tag{15} $$
where $\mathrm{Stolt}(k \to k_z)$ denotes the interpolation that resamples the $k_z$ domain to uniformly spaced positions.
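The RMA steps above (2-D FFT of the conjugated echo over the aperture, dispersion relation, matched filtering with the $k_z$ amplitude compensation, Stolt interpolation, 3-D inverse FFT) can be sketched as follows. The echo is random synthetic data and the grid sizes, sampling intervals, and $z_0$ are illustrative assumptions, so this shows the data flow only, not a validated imager.

```python
import numpy as np

# Minimal data-flow sketch of the RMA with amplitude compensation;
# synthetic random echo and illustrative grid sizes.
Nx, Ny, Nk = 64, 64, 32
dx = dy = 2e-3                        # aperture sampling intervals (m)
z0 = 0.3                              # reference plane distance (m)
c = 3e8
k = np.linspace(2 * np.pi * 77e9 / c, 2 * np.pi * 81e9 / c, Nk)

rng = np.random.default_rng(0)
s = rng.standard_normal((Nx, Ny, Nk)) + 1j * rng.standard_normal((Nx, Ny, Nk))

# 1) 2-D FFT of the conjugated echo over the aperture -> (kx, ky, k) domain.
S = np.fft.fft2(np.conj(s), axes=(0, 1))
kx = 2 * np.pi * np.fft.fftfreq(Nx, dx)
ky = 2 * np.pi * np.fft.fftfreq(Ny, dy)
KX, KY, KK = np.meshgrid(kx, ky, k, indexing="ij")

# 2) Dispersion relation kz^2 = 4k^2 - kx^2 - ky^2 (evanescent region masked).
kz2 = 4 * KK**2 - KX**2 - KY**2
valid = kz2 > 0
KZ = np.sqrt(np.where(valid, kz2, 0.0))

# 3) Matched filter including the kz amplitude compensation term.
S = np.where(valid, S * KZ * np.exp(1j * KZ * z0), 0.0)

# 4) Stolt interpolation: resample from k to a uniform kz grid.
kz_u = np.linspace(KZ[valid].min(), KZ[valid].max(), Nk)
S_stolt = np.zeros((Nx, Ny, Nk), dtype=complex)
for i in range(Nx):
    for j in range(Ny):
        S_stolt[i, j] = np.interp(kz_u, KZ[i, j], S[i, j], left=0, right=0)

# 5) 3-D inverse FFT recovers the reflectivity estimate f(x, y, z).
f = np.fft.ifftn(S_stolt)
```

A practical implementation adds windowing and zero-padding, and the per-column `np.interp` loop is where most of the Stolt cost sits.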

2.2.2. Back Projection Algorithm

The gold-standard back-projection algorithm performs image reconstruction by coherently accumulating the signals of each transceiver antenna over each frequency in the time domain. This method can be applied to arbitrary array configurations, but at high computational complexity. Ignoring the path loss of the electromagnetic wave round trip, Equation (7) can be expressed as
$$ s(x, y, k) = \iiint f(x', y', z') \, e^{-j 2 k R} \, dx' \, dy' \, dz' \tag{16} $$
where $R$ is given in Equation (4). Equation (16) can be rephrased to recover the scattering coefficient function $f(x, y, z)$ from the collected raw data $s(x, y, k)$ as [8]
$$ f(x, y, z) = \iiint s(x, y, k) \, e^{j 2 k R} \, dx \, dy \, dk \tag{17} $$
Here, $s(x, y, k)$ is the echo data and $e^{j 2 k R}$ is the phase calibration factor. The scattering coefficient function $f(x, y, z)$ is obtained by sequentially phase-calibrating the echo data in the time domain while traversing all image grid points. However, the traditional BPA does not consider the effect of clutter on imaging results under non-ideal conditions. The following subsection provides an efficient solution for planar array imaging in non-ideal situations.
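The back-projection recovery above can be sketched directly: for every image pixel, phase-calibrate the echo with $e^{+j2kR}$ and sum coherently over all aperture positions and wavenumbers. A single point target is simulated here, and the geometry values are illustrative assumptions.

```python
import numpy as np

# Sketch of the gold-standard BPA on a simulated point target.
c = 3e8
k = np.linspace(2 * np.pi * 77e9 / c, 2 * np.pi * 81e9 / c, 32)  # wavenumbers
xa = np.linspace(-0.05, 0.05, 16)          # aperture x positions (m)
ya = np.linspace(-0.05, 0.05, 16)          # aperture y positions (m)
z0 = 0.3                                   # target plane distance (m)

# Monostatic echo of a point target at (0, 0, z0): s = exp(-j*2*k*R).
XA, YA = np.meshgrid(xa, ya, indexing="ij")
R_t = np.sqrt(XA**2 + YA**2 + z0**2)
s = np.exp(-1j * 2 * R_t[:, :, None] * k[None, None, :])

xi = np.linspace(-0.02, 0.02, 21)          # image grid at fixed z = z0
yi = np.linspace(-0.02, 0.02, 21)
image = np.zeros((xi.size, yi.size), dtype=complex)
for ix, x in enumerate(xi):
    for iy, y in enumerate(yi):
        # Phase calibration exp(+j*2*k*R) and coherent accumulation.
        R = np.sqrt((XA - x)**2 + (YA - y)**2 + z0**2)
        image[ix, iy] = np.sum(s * np.exp(1j * 2 * R[:, :, None] * k))
```

At the true target pixel the calibration cancels the echo phase exactly, so the coherent sum peaks there; the nested pixel loops are precisely the computational burden discussed in the text.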

2.2.3. Enhanced Back Projection Algorithm

Compared with traditional 3-D planar SAR image reconstruction using the gold-standard BPA, this section proposes a beam-weighting-based enhanced back-projection algorithm. We replace the amplitude factor and $1/R^2$ amplitude loss in Equation (6) with beam-weighted amplitude compensation, which effectually reduces the adverse effects of a complex electromagnetic environment on image reconstruction and effectively suppresses the ghosts caused by antenna spacing greater than half a wavelength. The formula is derived as follows.
Ignoring spatial noise, for an M-element uniform line array with element spacing $d$ (normally $d = \lambda/2$), the array response vector is expressed as [45]
$$ \mathbf{m}(\theta) = \left[ 1, \; e^{j 2\pi \frac{d}{\lambda} \sin\theta}, \; \ldots, \; e^{j 2\pi (M-1) \frac{d}{\lambda} \sin\theta} \right]^{T} \tag{18} $$
where $\theta$ is the azimuth angle of signal incidence and $\beta = 2\pi d \sin\theta / \lambda$. The array output can be written as
$$ Y = \sum_{l=1}^{M} e^{j \frac{2\pi}{\lambda} (l-1) d \sin\theta} = \sum_{l=1}^{M} e^{j (l-1)\beta} = \frac{\sin(M\beta/2)}{\sin(\beta/2)} \, e^{j (M-1)\beta/2} \tag{19} $$
After normalization,
$$ G = \frac{|Y|}{|Y|_{\max}} = \left| \frac{\sin(M\beta/2)}{M \sin(\beta/2)} \right| \tag{20} $$
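The normalized array factor above is straightforward to evaluate; the sketch below plots-ready samples it over azimuth, with $M = 16$ and $d = \lambda/2$ as illustrative choices. The only subtlety is the broadside limit $\beta \to 0$, where the $0/0$ ratio tends to 1 and must be guarded.

```python
import numpy as np

# Normalized array factor G(theta) = |sin(M*beta/2) / (M*sin(beta/2))|,
# beta = 2*pi*d*sin(theta)/lambda.  M = 16, d = lambda/2 are illustrative.
def array_factor(theta, M=16, d_over_lambda=0.5):
    beta = 2 * np.pi * d_over_lambda * np.sin(np.asarray(theta, dtype=float))
    num = np.sin(M * beta / 2)
    den = M * np.sin(beta / 2)
    # Broadside limit beta -> 0 gives G -> 1; guard the 0/0 division.
    return np.abs(np.divide(num, den, out=np.ones_like(num),
                            where=np.abs(den) > 1e-12))

theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
G = array_factor(theta)
```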
For the virtual array elements shown in Figure 1, each element's beam direction varies across the image grid area. Here, we assume that the number of virtual array elements is $N$. Then, Equations (18) and (20) can be expressed as
$$ \mathbf{m} = \left[ \mathbf{m}_1(\theta), \mathbf{m}_2(\theta), \ldots, \mathbf{m}_N(\theta) \right] = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ e^{j \frac{2\pi}{\lambda} d \sin\theta_1} & e^{j \frac{2\pi}{\lambda} d \sin\theta_2} & \cdots & e^{j \frac{2\pi}{\lambda} d \sin\theta_N} \\ \vdots & \vdots & \ddots & \vdots \\ e^{j \frac{2\pi}{\lambda} (M-1) d \sin\theta_1} & e^{j \frac{2\pi}{\lambda} (M-1) d \sin\theta_2} & \cdots & e^{j \frac{2\pi}{\lambda} (M-1) d \sin\theta_N} \end{bmatrix}, \qquad \mathbf{G} = \left[ G_1, G_2, \ldots, G_N \right] \tag{21} $$
We discretize $s(x, y, k)$, and the corrected phase of the backscattered data corresponds to the distance between each virtual array element and each grid point on the target plane. The critical step is to efficiently convert the coherent accumulation into a beam-weighted accumulation that compensates the signal amplitude at each imaging grid point, yielding the 3-D scattering coefficient function $f(M_i, M_{jk}, z)$. Hence, the enhanced BPA image recovery process can be summarized as
$$ f(M_i, M_{jk}, z) = \sum_{i=1}^{N} \sum_{j=1}^{J} \sum_{k=1}^{K} \mathrm{IFT}_{k_z} \!\left[ s\!\left( M_i, M_{jk}, \mathrm{Stolt}(k \to k_z) \right) \right] e^{j 2 k R_{M_i, M_{jk}}} \, G_{M_i, M_{jk}} \, W_{M_i, M_{jk}} \tag{22} $$
where $\mathrm{Stolt}(k \to k_z)$ denotes the Stolt interpolation, $\mathrm{IFT}_{k_z}$ denotes the 1-D inverse Fourier transform over the $k_z$ domain, $M_i$ ($i = 1, 2, \ldots, N$) denotes the virtual array elements, with $N$ the total number of virtual aperture positions, $M_{jk}$ represents the imaging grid matrix ($j = 1, 2, \ldots, J$; $k = 1, 2, \ldots, K$), with $J$ and $K$ the imaging grid size, and $W_{M_i, M_{jk}}$ is the window function. The electromagnetic waves emitted by each virtual element propagate into space as spherical waves, which map onto the imaging grid as a two-dimensional antenna pattern; the amplitude compensation coefficient corresponds to the discretized, normalized antenna beam.
The presented enhanced method is similar to the gold-standard BPA. It improves image quality by using beam weighting to achieve amplitude compensation in non-ideal electromagnetic environments, delivering high-fidelity image reconstruction and focusing by coherently accumulating the received signal from each transceiver pair. Although its computational complexity is enormous [12,32,43], MATLAB's built-in parfor function or GPU acceleration can be used to speed up the computation.
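A 1-D sketch of this beam-weighted accumulation is given below: each virtual element's contribution to a pixel is scaled by the normalized array pattern $G$ at the element-to-pixel look angle and by a window $W$. The Stolt/IFT stage of the full method is replaced by a direct wavenumber sum for brevity, and the pattern parameters are illustrative assumptions.

```python
import numpy as np

# 1-D sketch of beam-weighted back-projection; M = 16 and d = lambda/2
# in the pattern, and a Hann window for W, are illustrative assumptions.
def array_gain(theta, M=16, d_over_lambda=0.5):
    beta = 2 * np.pi * d_over_lambda * np.sin(theta)
    num, den = np.sin(M * beta / 2), M * np.sin(beta / 2)
    return np.abs(np.divide(num, den, out=np.ones_like(num),
                            where=np.abs(den) > 1e-12))

c = 3e8
k = np.linspace(2 * np.pi * 77e9 / c, 2 * np.pi * 81e9 / c, 32)
xa = np.linspace(-0.05, 0.05, 16)          # 1-D virtual aperture (m)
z0 = 0.3
R_t = np.sqrt(xa**2 + z0**2)               # point target at x' = 0
s = np.exp(-1j * 2 * R_t[:, None] * k[None, :])

W = np.hanning(xa.size)                    # window function W
xi = np.linspace(-0.02, 0.02, 41)          # image grid at z = z0
image = np.zeros(xi.size, dtype=complex)
for ip, x in enumerate(xi):
    R = np.sqrt((xa - x)**2 + z0**2)
    theta = np.arctan2(x - xa, z0)         # element-to-pixel look angle
    G = array_gain(theta)
    # Beam-weighted coherent accumulation: pattern gain G and window W
    # scale each element's phase-calibrated contribution.
    image[ip] = np.sum((G * W)[:, None] * s * np.exp(1j * 2 * R[:, None] * k))
```

The weights are nonnegative, so the coherent peak still lands on the true target position while off-axis (clutter and grating-lobe) contributions are attenuated.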

2.3. YOLO Detection Network

YOLO (You Only Look Once) provides a fast, high-accuracy method for target detection, treating detection as a regression problem that predicts target regions and categories. A single neural network directly predicts the probabilities of target regions and categories, providing end-to-end target detection. Compared with conventional target detection, it offers a simpler pipeline, faster speed, and easier training. YOLO unifies the target recognition process into one neural network that predicts the bounding box and category of the target using the complete image information. This network structure has been investigated in the literature and is employed as the detection method in this paper [44]. The model is shown in Figure 2.
The class probability includes the probability that the prediction box contains a target and the accuracy of the prediction box. Assuming the YOLO algorithm detects targets of $n$ categories, the class probability of a detected target in a cell belonging to category $i$ can be expressed as
$$ P_r(\mathrm{class}_i \mid \mathrm{object}) \cdot P_r(\mathrm{object}) \cdot \mathrm{IOU}_{\mathrm{pred}}^{\mathrm{truth}} = P_r(\mathrm{class}_i) \cdot \mathrm{IOU}_{\mathrm{pred}}^{\mathrm{truth}} \tag{23} $$
where $P_r(\mathrm{class}_i) \cdot \mathrm{IOU}_{\mathrm{pred}}^{\mathrm{truth}}$ is the confidence level and $\mathrm{IOU}_{\mathrm{pred}}^{\mathrm{truth}}$ is the ratio of the intersection to the union of the prediction box and the ground-truth box.
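The IOU term above reduces to simple box arithmetic; the following is a minimal sketch, not the detector's actual implementation.

```python
# Intersection over union (IoU) between a predicted and a ground-truth box.
# Boxes are (x_min, y_min, x_max, y_max) tuples.
def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; an empty overlap clamps to zero width/height.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0
```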

3. Imaging Results and Evaluation

The parameter configuration of the experimental platform is described, and the imaging results are used to evaluate the three imaging algorithms. As shown in Figure 3, the system comprises a three-axis controllable stepper; a 77 GHz 'AWR1843' mmWave radar sensor by Texas Instruments (TI), which generates 77–81 GHz linear frequency-modulated continuous waves; a 'DCA1000' high-performance raw-data acquisition card; a personal computer (PC); and the target.
The PC, as the control center, connects to the mmWave radar sensor "AWR1843", the high-performance raw-data acquisition card "DCA1000", and the three-axis controllable stepper. With the mmWave radar and the stepper operating simultaneously, the "AWR1843" receives the echo signals at each sub-aperture point by point; the raw data are collected at high speed by the DCA1000 and subsequently transferred to the PC over the network.
In this paper, the above algorithm scheme is validated with accurate mmWave FMCW-SAR measurements. The platform scans along a Z-trajectory, and the mmWave radar images a scissor and a wrench in forward- and side-looking modes. The radar system parameters are listed in Table 1.
To verify the superiority of the proposed algorithm in a 60 dBW Gaussian white noise environment, we perform a simulation using the radar parameters of the scissor group described in the previous section. As shown in Figure 4a, a spatial target model with nine points is distributed in a non-ideal space. Figure 4b–d show the 2-D SAR images. Comparing the imaging results of the three algorithms, the proposed algorithm improves the measurement accuracy of the RCS.
Figure 5 presents the scissor image reconstruction results for the three imaging algorithms. The scissor group is surrounded by angular reflectors and scatterers, and the distance between the reference plane and the target is z0 = 280 mm. The spatial sampling locations form an approximate virtual array from −100 mm to 100 mm along the x and y directions. To satisfy the sampling condition, the distance between adjacent sampling points should usually be less than λ/4; here, the horizontal and vertical spatial sampling intervals are set to 0.5 mm and 1 mm, respectively. Figure 5a shows the optical image of the experimental scissor model, Figure 5b–d show the 2-D SAR images, and, after basic image filtering, Figure 5e–g show the 3-D SAR images under the three algorithms. A comparison between Figure 5b,d indicates that the scissor handle is completely submerged in side-lobe clutter under the BPA, with severe loss of detail and visible edge diffraction. The scissors are well imaged under the RMA with amplitude loss compensation, with only some streaks on the object contours. The side-lobe clutter is most effectively suppressed under the enhanced BPA, which yields the most remarkable focusing result.
Figure 6 shows the simulation using the radar parameters of the wrench group. As shown in Figure 6a, a spatial target model with nine points is distributed in an ideal space. Figure 6b–d show the 2-D SAR images. When the vertical spacing is greater than half a wavelength, we can clearly observe that the proposed algorithm suppresses the ghosts more effectively than the others.
The image reconstruction results of the wrench are shown in Figure 7. The wrench group is placed in a darkroom environment, and the distance between the reference plane and the target is z0 = 300 mm. The spatial sampling locations form an approximate virtual array from −150 mm to 150 mm along the x and y directions. The horizontal spatial sampling interval is selected as 0.5 mm, while the vertical sampling interval is chosen as 2 mm, greater than half a wavelength, which leads to ghost images. Figure 7a shows the optical image of the experimental wrench model, Figure 7b–d show the 2-D SAR images, and Figure 7e–g show the 3-D SAR images for the three algorithms. Notice that the images under the BPA and the RMA with amplitude loss compensation have more ghosts and streaks on the object contour. In contrast, the image reconstruction under the enhanced BPA is optimal, effectively suppressing the ghost images.
In fact, the BPA and RMA are essentially the same imaging algorithm: the former operates in the time domain and the latter in the frequency domain. In the time domain, the energy of each imaging grid point is distributed over all synthetic aperture positions, and the BPA compensates the data of all synthetic aperture points onto the imaging grid. The RMA, in contrast, uses the Fourier transform to concentrate the energy of each target scattering point at a point in the wavenumber domain, which enables point-to-point focusing.
In a real scenario under near-field conditions, where the electromagnetic wavefront is spherical, the data in the range cell between a virtual element and an imaging grid point contain the scattered energy of the target as well as of unrelated targets at the same distance. Moreover, under near-field conditions each radar beam cannot fully illuminate the target. The traditional BPA therefore gathers clutter energy at the same distance, degrading image quality. In addition, sparse spatial sampling of the virtual array causes ghost images due to grating lobes in the frequency spectrum.
To address the side-lobe clutter and the ghosts, the proposed algorithm applies a beam-weighting method based on the antenna pattern. Amplitude compensation is applied to the energy of each virtual element over all imaging grid points, and all virtual elements are then traversed. From the perspective of the frequency spectrum, the side lobes and grating lobes are suppressed. In essence, the enhanced BPA is designed to make the data on the imaging grid points approach the true target RCS as closely as possible.
To further evaluate the image quality under the three algorithms, quantitative analyses of the images, the azimuth and height directional magnitude profiles, and the image contrast and entropy are introduced in the following.
To verify the imaging performance of the proposed algorithm, Figure 8 shows the azimuth and height direction amplitude profiles of the scissor and wrench for the three algorithms. Row 1 shows the azimuth and height profiles of the scissor. In the azimuth direction, the enhanced BPA reduces the intensity of the side-lobe clutter by 30–40 dB compared with the BPA. Compared with the RMA with amplitude loss compensation on the scissor, the two principal lobes in the azimuth profile are narrower under the proposed algorithm, indicating higher focusing performance. Row 2 shows the azimuth and height profiles of the wrench. In the azimuth direction, the proposed algorithm slightly reduces the side lobes compared with the gold-standard BPA and achieves the same imaging performance as the RMA with amplitude loss compensation. In the height direction, the proposed algorithm clearly shows a drop of at least 20 dB in the grating lobes. Therefore, compared with the others, the proposed method yields better images, higher-fidelity focusing, and effective ghost suppression in non-ideal electromagnetic environments or with sparse arrays.
As shown in Figure 8, when the environment is non-ideal or the arrays are sparse, the images produced by the traditional BPA and the RMA with amplitude loss compensation contain more grating lobes and side-lobe clutter, giving worse image quality. To further evaluate the image quality, the image contrast and entropy are introduced [45]. The image contrast measures the difference of intensities in an image and indicates its texture characteristics: the higher the contrast, the more visible the image details. Entropy represents the degree of system disorder and indicates the quality of image focusing: the smaller the entropy, the better the focusing. The image contrast and entropy can be expressed as
$$ I_{\mathrm{Contrast}} = \frac{\sqrt{MN \sum_{i=1}^{M} \sum_{j=1}^{N} |\alpha_{ij}|^4}}{\sum_{i=1}^{M} \sum_{j=1}^{N} |\alpha_{ij}|^2}, \qquad I_{\mathrm{Entropy}} = - \sum_{i=1}^{M} \sum_{j=1}^{N} \frac{|\alpha_{ij}|^2}{\|\alpha\|_2^2} \log \frac{|\alpha_{ij}|^2}{\|\alpha\|_2^2}, \qquad \|\alpha\|_2^2 = \sum_{i=1}^{M} \sum_{j=1}^{N} |\alpha_{ij}|^2 \tag{24} $$
The values of the image contrast and entropy for the three algorithms, shown in Table 2, are consistent with the visual results: the enhanced BPA yields higher contrast and lower entropy than the others, meaning that texture detail and image quality recovery are improved with the proposed method in radar imaging.
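The contrast and entropy definitions above translate directly into code; this sketch works on the squared magnitude of a complex image array.

```python
import numpy as np

# Image contrast and entropy for a complex M x N image alpha: higher
# contrast and lower entropy indicate better focusing.
def image_contrast(alpha):
    p = np.abs(alpha) ** 2
    M, N = alpha.shape
    return np.sqrt(M * N * np.sum(p ** 2)) / np.sum(p)

def image_entropy(alpha):
    p = np.abs(alpha) ** 2
    p = p / np.sum(p)            # normalized scattered power
    p = p[p > 0]                 # 0*log(0) -> 0 by convention
    return -np.sum(p * np.log(p))
```

For a uniform image the contrast is 1 and the entropy reaches its maximum log(MN), while a perfectly focused single-point image has zero entropy, matching the interpretation in the text.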

4. Discussion

To further validate the robustness of the proposed algorithm in complex environments, 3DRIED was used to construct image datasets for the three imaging algorithms, and the YOLO algorithm was used for detection and evaluation. Figure 9 shows optical images of the different targets, such as a knife, a concealed pistol, a stiletto, and a multi-target scene. Figure 10 shows the detection results for the different targets, with the green and red boxes being the ground-truth and prediction boxes, respectively; each prediction box carries its category and confidence level in the upper-left corner. It can be observed that YOLO effectively detects the imaged objects. Since the radar antenna spacing in 3DRIED is larger than half a wavelength, the SAR images produced by the conventional algorithms are affected by ghosts, and the neural network often misidentifies these ghost shadows as targets, which reduces detection accuracy.
The kappa coefficient [46] is further introduced to measure the evaluation accuracy; it assesses the accuracy of the classification model. The higher the coefficient, the better the classification accuracy. Its range is [−1, 1], and in practice it usually lies in [0, 1]. The kappa coefficient can be expressed as
$$ \kappa = \frac{P_{\mathrm{accuracy}} - P_{\mathrm{expected\ accuracy}}}{1 - P_{\mathrm{expected\ accuracy}}} \tag{25} $$
where $P_{\mathrm{accuracy}}$ is the observed accuracy of the samples and $P_{\mathrm{expected\ accuracy}}$ is the expected accuracy of the samples. The kappa coefficients of the image classification are shown in Table 3. Evidently, the proposed algorithm efficiently suppresses the ghost images, and its kappa coefficient is improved compared with the other two algorithms.
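The kappa coefficient above can be computed from a confusion matrix as follows; the expected accuracy is the chance agreement implied by the row and column marginals.

```python
import numpy as np

# Kappa coefficient from a confusion matrix C (rows: ground truth,
# columns: prediction): observed vs. chance-expected accuracy.
def kappa(C):
    C = np.asarray(C, dtype=float)
    n = C.sum()
    p_acc = np.trace(C) / n                               # observed accuracy
    p_exp = np.sum(C.sum(axis=0) * C.sum(axis=1)) / n**2  # expected accuracy
    return (p_acc - p_exp) / (1 - p_exp)
```

A perfect diagonal matrix gives kappa = 1, while a classifier at chance level gives kappa = 0, which is why values near 1 in Table 3 indicate better detection.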

5. Conclusions

In this paper, we design a 3-D near-field mmWave SAR imaging platform and propose an image focusing algorithm. Our technique extends the traditional BPA by introducing beam weighting, efficiently suppressing grating lobes and side-lobe clutter in the image and remaining applicable to a diverse set of complex environments. Through data acquisition, we analyze and evaluate the parameters of the imaging results under these algorithms; the results demonstrate the robustness of our approach in the presence of interference from external factors. Furthermore, we use 3DRIED to construct image datasets to validate the proposed algorithm. Our algorithm achieves high-fidelity image reconstruction and focusing in the experimental studies and detection results, providing high-quality images as input for subsequent applications and yielding a higher detection recognition rate. Our aim is to contribute a complete and efficient imaging and detection identification pipeline to facilitate industrial research.

Author Contributions

Conceptualization, S.S. and S.X.; methodology, S.S. and J.L. (Jie Lu); software, S.S.; validation, S.S. and J.L. (Jie Lu); formal analysis, S.S., J.L. (Jie Lu) and J.W.; investigation, S.S. and J.W.; resources, S.X., S.Q. and Y.L.; data curation, S.S. and J.L. (Jing Lian); writing—original draft preparation, S.S.; writing—review and editing, S.S. and S.X.; visualization, S.S. and S.X.; supervision, J.W. and J.L. (Jing Lian); project administration, S.X., S.Q. and Y.L.; funding acquisition, S.X. and Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by The National Natural Science Foundation of China (61971429, 61921001).

Data Availability Statement

https://github.com/zzzc1n/3-D-HPRID.git (accessed on 1 March 2022).

Acknowledgments

The authors would like to thank the anonymous reviewers and editors for their help in improving our manuscript. We also thank Shunjun Wei of the School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu, for providing the 3DRIED dataset.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sheen, D.M.; Hall, T.E.; McMakin, D.L.; Jones, A.M.; Tedeschi, J.R. Three-dimensional radar imaging techniques and systems for near-field applications. In Proceedings of the Radar Sensor Technology XX, Baltimore, MD, USA, 18–21 April 2016; pp. 230–241. [Google Scholar]
  2. Yanik, M.E.; Torlak, M. Near-Field 2-D SAR Imaging by Millimeter-Wave Radar for Concealed Item Detection. In Proceedings of the 2019 IEEE Radio and Wireless Symposium (RWS), Orlando, FL, USA, 20–23 January 2019. [Google Scholar]
  3. Yuan, G.; Zoughi, R. Millimeter Wave Reflectometry and Imaging for Noninvasive Diagnosis of Skin Burn Injuries. IEEE Trans. Instrum. Meas. 2016, 66, 77–84. [Google Scholar]
  4. Chao, L.; Afsar, M.N.; Korolev, K.A. Millimeter wave dielectric spectroscopy and breast cancer imaging. In Proceedings of the Microwave Integrated Circuits Conference (EuMIC), 2012 7th European, Amsterdam, The Netherlands, 29–30 October 2012. [Google Scholar]
  5. Tokoro, S. Automotive application systems of a millimeter-wave radar. In Proceedings of the Conference on Intelligent Vehicles, Tokyo, Japan, 19–20 September 1996; pp. 51–56. [Google Scholar]
  6. Song, S.; Xing, S.; Wang, J.; Li, Y.; Pang, B. Validation of near-field millimeter wave radar-based RD and RMA time-frequency domain imaging algorithms. In Proceedings of the 2022 IEEE 6th Information Technology and Mechatronics Engineering Conference (ITOEC), Chongqing, China, 4–6 March 2022; pp. 1885–1889. [Google Scholar]
  7. Sheen, D.M.; Mcmakin, D.L.; Hall, T.E. Three-dimensional millimeter-wave imaging for concealed weapon detection. IEEE Trans. Microw. Theory Tech. 2001, 49, 1581–1592. [Google Scholar] [CrossRef]
  8. Yegulalp, A.F. Fast backprojection algorithm for synthetic aperture radar. In Proceedings of the IEEE, Waltham, MA, USA, 22 April 1999. [Google Scholar]
  9. Wang, Z.; Guo, Q.; Tian, X.; Chang, T.; Cui, H.L. Near-Field 3-D Millimeter-Wave Imaging Using MIMO RMA With Range Compensation. IEEE Trans. Microw. Theory Tech. 2018, 67, 1157–1166. [Google Scholar] [CrossRef]
  10. Yanik, M.E.; Wang, D.; Torlak, M. 3-D MIMO-SAR Imaging Using Multi-Chip Cascaded Millimeter-Wave Sensors. In Proceedings of the 2019 IEEE Global Conference on Signal and Information Processing (GlobalSIP), Ottawa, ON, Canada, 11–14 November 2019. [Google Scholar]
  11. Mohammadian, N.; Furxhi, O.; Short, R.E.; Driggers, R. SAR millimeter wave imaging systems. In Proceedings of the Passive and Active Millimeter-Wave Imaging XXII, Baltimore, MD, USA, 13 May 2019. [Google Scholar]
  12. Yanik, M.E.; Wang, D.; Torlak, M. Development and Demonstration of MIMO-SAR mmWave Imaging Testbeds. IEEE Access 2020, 8, 126019–126038. [Google Scholar] [CrossRef]
  13. Moll, J.; Schops, P.; Krozer, V. Towards Three-Dimensional Millimeter-Wave Radar With the Bistatic Fast-Factorized Back-Projection Algorithm—Potential and Limitations. IEEE Trans. Terahertz Sci. Technol. 2012, 2, 432–440. [Google Scholar] [CrossRef]
  14. Lopez-Sanchez, J.M.; Fortuny-Guasch, J. 3-D radar imaging using range migration techniques. IEEE Trans. Antennas Propagat. 2000, 48, 728–737. [Google Scholar] [CrossRef]
  15. Tang, K.; Guo, X.; Liang, X.; Lin, Z. Implementation of Real-time Automotive SAR Imaging. In Proceedings of the 2020 IEEE 11th Sensor Array and Multichannel Signal Processing Workshop (SAM), Hangzhou, China, 8–11 June 2020. [Google Scholar]
  16. Wang, M.; Wei, S.; Liang, J.; Liu, S.; Shi, J.; Zhang, X. Lightweight FISTA-Inspired Sparse Reconstruction Network for mmW 3-D Holography. IEEE Trans. Geosci. Remote Sens. 2021, 60, 1–20. [Google Scholar] [CrossRef]
  17. Zhuge, X. Three-Dimensional Near-Field MIMO Array Imaging Using Range Migration Techniques. IEEE Trans. Image Processing 2012, 21, 3026–3033. [Google Scholar] [CrossRef]
  18. Gao, J.; Qin, Y.; Deng, B.; Wang, H.; Li, X. Novel Efficient 3D Short-Range Imaging Algorithms for a Scanning 1D-MIMO Array. IEEE Trans. Image Processing 2018, 27, 3631–3643. [Google Scholar] [CrossRef]
  19. Fan, B.; Gao, J.-K.; Li, H.-J.; Jiang, Z.-J.; He, Y. Near-field 3D SAR imaging using a scanning linear MIMO array with arbitrary topologies. IEEE Access 2019, 8, 6782–6791. [Google Scholar] [CrossRef]
  20. Wang, J.; Cetinkaya, H.; Yarovoy, A. NUFFT based frequency-wavenumber domain focusing under MIMO array configurations. In Proceedings of the 2014 IEEE Radar Conference, Cincinnati, OH, USA, 19–23 May 2014; pp. 1–5. [Google Scholar]
  21. Candès, E.J.; Tao, T. Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies? IEEE Trans. Inf. Theory 2006, 52, 5406–5425. [Google Scholar]
  22. Baraniuk, R.G. Compressive Sensing [Lecture Notes]. IEEE Signal Processing Mag. 2007, 24, 118–121. [Google Scholar] [CrossRef]
  23. Sun, S.; Zhu, G.; Jin, T. Novel methods to accelerate CS radar imaging by NUFFT. IEEE Trans. Geosci. Remote Sens. 2014, 53, 557–566. [Google Scholar]
  24. Kajbaf, H. Compressive Sensing for 3D Microwave Imaging Systems; Missouri University of Science and Technology: Rolla, MO, USA, 2012. [Google Scholar]
  25. Li, J.; Stoica, P. MIMO Radar Signal Processing || Concepts and Applications of a MIMO Radar System with Widely Separated Antennas; Wiley-IEEE Press: Hoboken, NJ, USA, 2008; pp. 365–410. [Google Scholar] [CrossRef]
  26. Bliss, D.W.; Forsythe, K.W. Multiple-input multiple-output (MIMO) radar and imaging: Degrees of freedom and resolution. In Proceedings of the Conference Record of the Thirty-Seventh Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, USA, 9–12 November 2003. [Google Scholar]
  27. Gao, J.; Deng, B.; Qin, Y.; Wang, H.; Li, X. An Efficient Algorithm for MIMO Cylindrical Millimeter-Wave Holographic 3-D Imaging. IEEE Trans. Microw. Theory Tech. 2018, 66, 5065–5074. [Google Scholar] [CrossRef]
  28. Chen, X.; Zeng, Y.; Yang, Q.; Deng, B.; Wang, H. An Active Millimeter-Wave Imager Based on MIMO-SAR Scheme. J. Infrared Millim. Terahertz Waves 2021, 42, 1027–1039. [Google Scholar] [CrossRef]
  29. Smith, J.W.; Yanik, M.E.; Torlak, M. Near-Field MIMO-ISAR Millimeter-Wave Imaging. In Proceedings of the 2020 IEEE Radar Conference (RadarConf20), Florence, Italy, 21–25 September 2020. [Google Scholar]
  30. Yanik, M.E.; Torlak, M. Near-Field MIMO-SAR Millimeter-Wave Imaging with Sparsely Sampled Aperture Data. IEEE Access 2019, 7, 31801–31819. [Google Scholar] [CrossRef]
  31. Wang, J.; Aubry, P.; Yarovoy, A. 3-D Short-Range Imaging With Irregular MIMO Arrays Using NUFFT-Based Range Migration Algorithm. IEEE Trans. Geosci. Remote Sens. 2020, 58, 4730–4742. [Google Scholar] [CrossRef]
  32. Smith, J.W.; Torlak, M. Efficient 3-D Near-Field MIMO-SAR Imaging for Irregular Scanning Geometries. IEEE Access 2022, 10, 10283–10294. [Google Scholar] [CrossRef]
  33. Ren, Z.; Boybay, M.S.; Ramahi, O.M. Near-Field Probes for Subsurface Detection Using Split-Ring Resonators. IEEE Trans. Microw. Theory Tech. 2011, 59, 488–495. [Google Scholar] [CrossRef]
  34. Hao, J.; Li, J.; Pi, Y. Three-dimensional imaging of terahertz circular SAR with sparse linear array. Sensors 2018, 18, 2477. [Google Scholar] [CrossRef] [Green Version]
  35. Wei, S.; Zhou, Z.; Wang, M.; Wei, J.; Liu, S.; Shi, J.; Zhang, X.; Fan, F. 3DRIED: A High-Resolution 3-D Millimeter-Wave Radar Dataset Dedicated to Imaging and Evaluation. Remote Sens. 2021, 13, 3366. [Google Scholar] [CrossRef]
  36. Meta, A.; Hoogeboom, P.; Ligthart, P.L. Signal Processing for FMCW SAR. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3519–3532. [Google Scholar] [CrossRef]
  37. Yanik, M.E. Millimeter-Wave Imaging Using MIMO-SAR Techniques; The University of Texas at Dallas: Richardson, TX, USA, 2020. [Google Scholar]
  38. Wang, G.; Munoz-Ferreras, J.-M.; Gu, C.; Li, C.; Gomez-Garcia, R. Application of linear-frequency-modulated continuous-wave (LFMCW) radars for tracking of vital signs. IEEE Trans. Microw. Theory Tech. 2014, 62, 1387–1399. [Google Scholar] [CrossRef]
  39. Brekhovskikh, L.M.; Godin, O.A. Springer series on wave phenomena, 10. In Acoustics of Layered Media II: Point Sources and Bounded Beams, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 1999. [Google Scholar]
  40. Weyl, H. Ausbreitung elektromagnetischer Wellen über einem ebenen Leiter. Ann. Phys. 2006, 365, 481–500. [Google Scholar] [CrossRef] [Green Version]
  41. Moulder, W.F.; Krieger, J.D.; Majewski, J.J.; Coldwell, C.M.; Herd, J.S. Development of a high-throughput microwave imaging system for concealed weapons detection. In Proceedings of the IEEE International Symposium on Phased Array Systems & Technology, Waltham, MA, USA, 18–21 October 2017. [Google Scholar]
  42. Miller, R. Fundamentals of Radar Signal Processing (Richards, M.A.; 2005) [Book review]. IEEE Signal Processing Mag. 2009, 26, 100–101. [Google Scholar] [CrossRef]
  43. Zhu, R.; Zhou, J.; Jiang, G.; Fu, Q. Range migration algorithm for near-field MIMO-SAR imaging. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2280–2284. [Google Scholar] [CrossRef]
  44. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar]
  45. Zhang, S.; Liu, Y.; Li, X. Fast Entropy Minimization Based Autofocusing Technique for ISAR Imaging. IEEE Trans. Signal Processing 2015, 63, 3425–3434. [Google Scholar] [CrossRef]
  46. Chicco, D.; Warrens, M.J.; Jurman, G. The Matthews Correlation Coefficient (MCC) is More Informative Than Cohen’s Kappa and Brier Score in Binary Classification Assessment. IEEE Access 2021, 9, 78368–78381. [Google Scholar] [CrossRef]
Figure 1. The geometry of the SISO-SAR imaging configuration, where a planar aperture is synthesized by Z-shaped mechanical scanning.
Figure 2. The system of detection models.
Figure 3. The experimental equipment of the SAR system.
Figure 4. Spatial target model of multiple points in a 60 dBW Gaussian white noise environment. (a) The multiple point targets. (b) 2-D imaging result with BPA. (c) 2-D imaging result with RMA with amplitude loss compensation. (d) 2-D imaging result with the enhanced BPA.
Figure 5. Imaging results in a non-ideal environment. (a) Optical image of the scissor. (b–d) 2-D imaging results with BPA, RMA with amplitude loss compensation, and the enhanced BPA, respectively. (e–g) Corresponding 3-D imaging results with BPA, RMA with amplitude loss compensation, and the enhanced BPA.
Figure 6. Spatial target model of multiple points in an ideal space. (a) The multiple point targets. (b) 2-D imaging result with BPA. (c) 2-D imaging result with RMA with amplitude loss compensation. (d) 2-D imaging result with the enhanced BPA.
Figure 7. Imaging results in a dark environment. (a) Optical image of the wrench. (b–d) 2-D imaging results with BPA, RMA with amplitude loss compensation, and the enhanced BPA, respectively. (e–g) Corresponding 3-D imaging results with BPA, RMA with amplitude loss compensation, and the enhanced BPA.
Figure 8. Azimuth and height profiles. Row 1 shows the profiles of the scissor; Row 2 shows the profiles of the wrench. Column 1 gives the azimuth profiles of the targets; Column 2 gives the height profiles.
Figure 9. Optical images of the different targets.
Figure 10. Detection and identification results. Rows 1–4 show the SAR images of the knife, the concealed pistol, the stiletto, and the combined pistol and knife under the three imaging algorithms. Column 1: enhanced BPA detection results; Column 2: BPA detection results; Column 3: RMA with amplitude loss compensation detection results.
Table 1. The parameters of the SAR platform.
| Parameter Type | Numerical Value | Unit |
| --- | --- | --- |
| Centre Carrier Frequency | 79 | GHz |
| Platform Speed | 20 | mm/s |
| Pulse Repetition Period | 25 | ms |
| Bandwidth | 4 | GHz |
| Range Resolution | 3.75 | cm |
| Azimuth Resolution | 0.5 | mm |
| Height Resolution | 0.5 | mm |
| Azimuth Sub-aperture Spacing | 0.5 | mm |
| Scissor Height Sub-aperture Spacing | 1 | mm |
| Wrench Height Sub-aperture Spacing | 2 | mm |
| Scissor Synthetic Aperture Size | 200 × 200 | mm² |
| Wrench Synthetic Aperture Size | 300 × 300 | mm² |
| Vertical Distance between Scissor and Radar | 280 | mm |
| Vertical Distance between Wrench and Radar | 300 | mm |
Table 2. Quality evaluation of image contrast and entropy in different environments.
| Evaluation Systems | Enhanced BPA | BPA | Amplitude Compensation-RMA |
| --- | --- | --- | --- |
| $I_{\mathrm{Contrast}}^{\mathrm{Scissor}}$ | 134.5673 | 89.5611 | 112.6986 |
| $I_{\mathrm{Entropy}}^{\mathrm{Scissor}}$ | 3.8054 | 4.2688 | 3.8837 |
| $I_{\mathrm{Contrast}}^{\mathrm{Wrench}}$ | 220.8771 | 88.3848 | 191.0390 |
| $I_{\mathrm{Entropy}}^{\mathrm{Wrench}}$ | 3.7456 | 3.8230 | 3.8499 |
Table 3. Quality evaluation of kappa coefficient in different objects.
| Kappa Coefficients | Enhanced BPA | BPA | Amplitude Compensation-RMA |
| --- | --- | --- | --- |
| $\kappa_{\mathrm{knife}}$ | 0.92 | 0.87 | 0.85 |
| $\kappa_{\mathrm{pistol}}$ | 0.93 | 0.92 | 0.91 |
| $\kappa_{\mathrm{stiletto}}$ | 0.95 | 0.86 | 0.89 |
| $\kappa_{\mathrm{pistol,knife}}$ | 0.93 | 0.87 | 0.81 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
