In ground-penetrating radar (GPR), background clutter, comprising the signals backscattered from the rough, uneven ground surface together with background noise, impairs the visualization of buried objects and hampers subsurface inspection. In this paper, a clutter mitigation method is proposed for target detection. The removal of background clutter is formulated as a constrained optimization problem that seeks a low-rank matrix and a sparse matrix: the low-rank matrix captures the ground-surface reflections and the background noise, whereas the sparse matrix contains the target reflections. An optimization method based on the split-Bregman algorithm is developed to estimate these two matrices from the input GPR data. Evaluated on real radar data, the proposed method achieves promising results in removing background clutter and enhancing the target signature.
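To make the decomposition concrete, the sketch below solves min ||L||_* + lambda*||S||_1 subject to L + S = D with a generic split-Bregman/ADMM-style iteration (singular-value thresholding for the low-rank part, soft thresholding for the sparse part). This is a minimal illustration under common robust-PCA heuristics, not the authors' exact algorithm; the function names, default values of lam and mu, and the stopping rule are assumptions.

```python
import numpy as np

def svt(X, tau):
    """Singular-value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Elementwise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def lowrank_sparse_split(D, lam=None, mu=None, n_iter=200, tol=1e-7):
    """Split a GPR B-scan D into L (clutter, low rank) + S (targets, sparse).
    Defaults for lam and mu follow standard robust-PCA heuristics (assumptions)."""
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)  # Bregman/dual variable enforcing L + S = D
    for _ in range(n_iter):
        L = svt(D - S + Y / mu, 1.0 / mu)   # low-rank (clutter + background) update
        S = soft(D - L + Y / mu, lam / mu)  # sparse (target) update
        R = D - L - S                       # constraint residual
        Y = Y + mu * R
        if np.linalg.norm(R) <= tol * np.linalg.norm(D):
            break
    return L, S
```

The target map is then read off S (for example, its envelope or per-column energy), while L holds the ground-surface reflections and background.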
This paper addresses the problem of scene reconstruction in conjunction with wall-clutter mitigation for compressed multi-view through-the-wall radar imaging (TWRI). We consider the problem where the scene behind the wall is illuminated from different vantage points using a different set of frequencies at each antenna. First, a joint Bayesian sparse recovery model is employed to estimate the antenna signal coefficients simultaneously, by exploiting the sparsity and inter-signal correlations among antenna signals. Then, a subspace-projection technique is applied to suppress the signal coefficients related to the wall returns. Furthermore, a multi-task linear model is developed to relate the target coefficients to the image of the scene. The composite image is reconstructed using a joint Bayesian sparse framework, taking into account the inter-view dependencies. Experimental results are presented which demonstrate the effectiveness of the proposed approach for multi-view imaging of indoor scenes using a reduced set of measurements at each view.
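The wall-return suppression step can be illustrated with a plain SVD-based subspace projection: wall reflections are strong and nearly identical across antennas, so they concentrate in the dominant singular subspace of the coefficient matrix, which is then projected out. The function name and the choice of a single wall component below are illustrative assumptions; the paper applies the projection to the jointly recovered signal coefficients.

```python
import numpy as np

def project_out_wall(X, n_wall=1):
    """Suppress wall returns by projecting the columns of X onto the
    orthogonal complement of the dominant (wall) singular subspace.
    X: (n_freq, n_antennas) matrix of recovered signal coefficients."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U_wall = U[:, :n_wall]  # dominant subspace ~ wall returns
    P = np.eye(X.shape[0]) - U_wall @ U_wall.conj().T  # orthogonal projector
    return P @ X
```

In practice, the number of wall components would be chosen by inspecting the singular-value spectrum rather than fixed a priori.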
In this paper, a distributed compressive sensing (CS) model is proposed to recover missing data samples along the temporal frequency domain for through-the-wall radar imaging (TWRI). Existing CS-based approaches recover the signal from each antenna independently, without considering the correlations among measurements. The proposed approach, on the other hand, exploits the structure or correlation in the signals received across the array aperture by using a hierarchical Bayesian model to learn a shared prior for the joint reconstruction of the high-resolution radar profiles. A backprojection method is then applied to form the radar image. Experimental results on real TWRI data show that the proposed approach produces better radar images using fewer measurements compared to existing CS-based TWRI methods.
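The final imaging step, backprojection, can be sketched as a simple delay-and-sum over the recovered antenna profiles: each pixel accumulates every antenna's response at the corresponding two-way propagation delay. The monostatic geometry, real-valued profiles, variable names, and linear interpolation below are illustrative assumptions rather than the paper's exact implementation.

```python
import numpy as np

def backproject(profiles, t, ant_xy, grid_xy, c=3e8):
    """Delay-and-sum backprojection.
    profiles: (n_ant, n_t) real time-domain profiles after CS recovery
    t:        (n_t,) time axis (seconds, increasing)
    ant_xy:   (n_ant, 2) antenna positions; grid_xy: (n_pix, 2) pixel positions."""
    image = np.zeros(grid_xy.shape[0])
    for a in range(profiles.shape[0]):
        dist = np.linalg.norm(grid_xy - ant_xy[a], axis=1)  # pixel-antenna range
        delay = 2.0 * dist / c                              # two-way travel time
        # sample this antenna's profile at every pixel's delay
        image += np.interp(delay, t, profiles[a], left=0.0, right=0.0)
    return np.abs(image) / profiles.shape[0]
```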
With the advances in radar technology, there is an increasing interest in automatic radar-based human gait identification, not least because radar signals can penetrate most dielectric materials. In this paper, an image-based approach is proposed for classifying human micro-Doppler radar signatures. The time-varying radar signal is first converted into a time-frequency representation, which is then cast as a two-dimensional image. A descriptor is developed to extract micro-Doppler features from local time-frequency patches centered along the torso Doppler frequency. Experimental results based on real data collected from a 24-GHz Doppler radar show that the proposed approach achieves promising classification performance.
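A simplified front end for this pipeline is sketched below: compute a spectrogram, take the strongest Doppler bin in each frame as the torso line, and crop square time-frequency patches around it. The STFT settings, patch size, and the use of a per-frame energy maximum to track the torso are assumptions standing in for the paper's descriptor, which the abstract does not specify in detail.

```python
import numpy as np
from scipy.signal import spectrogram

def torso_centered_patches(x, fs, patch=16, step=8, nperseg=128, noverlap=96):
    """Extract local time-frequency patches centered on the torso Doppler line.
    x: radar return (real or complex I/Q); fs: sampling rate in Hz."""
    f, t, S = spectrogram(x, fs=fs, nperseg=nperseg, noverlap=noverlap)
    S = 10.0 * np.log10(S + 1e-12)              # log-power time-frequency image
    n_f, n_t = S.shape
    half = patch // 2
    patches = []
    for j in range(0, n_t - patch + 1, step):
        torso = int(np.argmax(S[:, j + half]))  # torso ~ dominant Doppler bin
        lo = int(np.clip(torso - half, 0, n_f - patch))
        patches.append(S[lo:lo + patch, j:j + patch])
    return np.stack(patches)                    # (n_patches, patch, patch)
```

Each patch (or an aggregate of the patches) can then feed a standard classifier.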