The work presented here is a continuation of research first reported by Mahler et al. Our earlier efforts included integrating the Statistical Features algorithm with a Bayesian nonlinear filter, allowing simultaneous determination of target position, velocity, pose, and type via maximum a posteriori estimation. We then considered three alternative classifiers: the first based on a principal component decomposition, the second on a linear discriminant approach, and the third on a wavelet representation. In addition, preliminary results were given regarding the assignment of a measure of confidence to the output of the wavelet-based classifier. In this paper we continue to address the problem of target classification based on high range resolution radar signatures. In particular, we examine the performance of a variant of the principal-component-based classifier as the number of principal components is varied, quantifying that performance in terms of the Bhattacharyya distance. We also present further results regarding the assignment of confidence values to the output of the wavelet-based classifier.
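For reference, the Bhattacharyya distance between two class-conditional densities p_1 and p_2 is

\[ D_B(p_1, p_2) = -\ln \int \sqrt{p_1(x)\, p_2(x)}\, dx , \]

and under the common assumption (not stated in the abstract) that each class is modeled as a Gaussian N(\mu_i, \Sigma_i) in the reduced principal-component feature space, this takes the closed form

\[ D_B = \tfrac{1}{8} (\mu_1 - \mu_2)^{\mathsf{T}} \left( \tfrac{\Sigma_1 + \Sigma_2}{2} \right)^{-1} (\mu_1 - \mu_2) + \tfrac{1}{2} \ln \frac{\det\!\left( \tfrac{\Sigma_1 + \Sigma_2}{2} \right)}{\sqrt{\det \Sigma_1 \, \det \Sigma_2}} , \]

which makes it convenient to track class separability as the number of retained components is varied.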
The work presented here is a continuation of research first reported by Mahler et al. Our goal is a generalization of Bayesian filtering and estimation theory to the problem of multisensor, multitarget, multi-evidence unified joint detection, tracking, and target identification. Our earlier efforts were focused on integrating the Statistical Features (StaF) algorithm with a Bayesian nonlinear filter, allowing simultaneous determination of target position, velocity, pose, and type via maximum a posteriori estimation. In this paper we continue to address the problem of target classification based on high range resolution radar signatures. While we continue to consider feature-based techniques, as in StaF and our earlier work, instead of taking the locations and magnitudes of peaks in a signature as our features, we consider three alternative feature sets, obtained by applying either a Wavelet Decomposition, Principal Component Analysis, or Linear Discriminant Analysis to the signature. We also briefly discuss, in the wavelet decomposition setting, the challenge of assigning a measure of uncertainty to a classification decision.
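As a rough illustration of the principal-component variant of such feature extraction (a minimal sketch; the function name, array shapes, and the use of a plain SVD are assumptions, not details from the paper), features might be obtained from a set of range profiles as follows:

import numpy as np

def pca_features(signatures, n_components):
    # signatures: (N, L) array of N high range resolution profiles of length L.
    mean = signatures.mean(axis=0)
    centered = signatures - mean
    # Right singular vectors of the centered data give the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]            # (n_components, L) leading components
    return centered @ basis.T, mean, basis

# Example: project training profiles onto the 10 leading components, then
# classify test profiles after mapping them into the same reduced space.
# feats, mean, basis = pca_features(train_profiles, 10)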
The problem described in this paper deals with tracking the optical path perturbations introduced by the atmosphere when illuminating a target (missile) with laser light. Due to atmospheric irregularities, the optical path from an observer to an in-flight missile deviates from a straight line and also changes in time. If the goal of the system is to point a laser beam at a specific point (or area) of the missile body for a given period of time, these optical path variations should be tracked and compensated for when pointing the laser beam. The laser beam should be pointed not at the true but at the apparent location of the desired spot. In the actual system, the missile is illuminated with several lasers (forming a broad beam), and an image of the missile (distorted through the atmosphere) is obtained from the backscattered light. This image contains all the information available about the optical path. The purpose of the work presented here is to estimate the apparent locations of five different spots on the missile (distributed evenly along the longitudinal axis, from the nose to mid-body) from the backscattered images and a priori information that includes the size and speed of the missile. The data available are high-fidelity simulated data, and the apparent locations of the desired spots (over time) are known. Two approaches are considered here. The first approach is based on breaking the problem into two parts: a measurement part and an estimation part. For the measurement part, a neural network is used to infer a mapping from the image to the apparent locations of the points of interest (known for the simulated data). Those measured locations are then used by a Kalman filter to estimate the apparent locations. The Kalman filter exploits the fact that the optical paths (from different spots along the longitudinal axis) are correlated in time, a correlation caused by the missile's displacement through the atmosphere. The second approach is to compute the centroids of the images and use the resulting points as estimates of the apparent location of a point on the missile. For the first approach, simulation results show a noticeable decrease in the rms error of the apparent location estimates when compared to using the average location (mean value) as the estimate. The second approach, while simple, was found to perform quite well.
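For the second approach, an intensity-weighted centroid of each backscatter image serves as the location estimate; a minimal sketch is given below (the function name and the assumption of a 2-D intensity array are illustrative, not taken from the paper):

import numpy as np

def image_centroid(image):
    # Intensity-weighted centroid of a 2-D image, in pixel coordinates (row, col).
    image = np.asarray(image, dtype=float)
    total = image.sum()
    rows = np.arange(image.shape[0])
    cols = np.arange(image.shape[1])
    r = (rows * image.sum(axis=1)).sum() / total
    c = (cols * image.sum(axis=0)).sum() / total
    return r, c

In the first approach this single measurement is replaced by the neural-network outputs for the five spots, which the Kalman filter then smooths over time using their temporal correlation.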
The work presented here is part of a generalization of Bayesian filtering and estimation theory to the problem of multisource, multitarget, multi-evidence unified joint detection, tracking, and target ID developed by Lockheed Martin Tactical Defense Systems and Scientific Systems Co., Inc. Our approach to robust joint target identification and tracking was to take the StaF algorithm and integrate it with a Bayesian nonlinear filter, so that target position, velocity, pose, and type could be determined simultaneously via maximum a posteriori estimation. The basis for the integration between the tracker and classifier is 'finite-set statistics' (FISST), whose theoretical development has been an ongoing Lockheed Martin project since 1994. The specific problem addressed in this paper is that of robust joint target identification and tracking via fusion of high range resolution radar (HRRR) signatures, drawn from the automatic radar target identification (ARTI) database, with radar track data. A major problem in HRRR ATR is the computational load created by having to match observations against target models for every feasible pose. If pose could be estimated efficiently by a filtering algorithm from track data, the ATR search space would be greatly reduced. On the other hand, HRRR ATR algorithms produce useful information about pose which could potentially aid the track-filtering process as well. We have successfully demonstrated the former concept, a 'loose integration' of the tracker and classifier, for three different types of targets moving on 2D tracks.
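In rough notation (ours, not the paper's), the joint estimate described above can be written as

\[ (\hat{x}, \hat{v}, \hat{\theta}, \hat{c}) = \arg\max_{x,\, v,\, \theta,\, c} \; p(x, v, \theta, c \mid Z_{1:k}), \]

where x, v, \theta, and c denote target position, velocity, pose, and type, and Z_{1:k} denotes the accumulated HRRR signatures and track measurements; estimating the pose \theta from the track data is what prunes the ATR search over feasible poses.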
It is commonly understood that in active detection systems constant-frequency (CW) pulses provide good Doppler but poor delay resolution, and that linearly swept frequency (FM) pulses have the opposite behavior. Many systems are capable of both types of operation, and hence in this paper the fusion of such pulses is examined. It is found that in many (but not all) situations the two pulse types complement each other in such a way that tracking performance using a combined CW-FM pulse is improved by an order of magnitude when compared to a scheme using only a full CW or FM pulse. Alternating-pulse systems are also investigated; while these are suboptimal, their performance appears robust.
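The tradeoff referred to here can be summarized by standard rules of thumb (textbook relations, not results from the paper): a constant-frequency pulse of duration T offers roughly

\[ \Delta R_{\mathrm{CW}} \approx \frac{cT}{2}, \qquad \Delta f_{d,\mathrm{CW}} \approx \frac{1}{T}, \]

so a long pulse resolves Doppler well but delay poorly, whereas a linear FM pulse sweeping a bandwidth B achieves

\[ \Delta R_{\mathrm{FM}} \approx \frac{c}{2B} \]

independent of its duration, at the cost of a coupled, ambiguous range-Doppler response.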
If the members of a suite of sensors from which fusion is to be carried out are not co-located, it is unreasonable to assume that they share a common resolution cell grid; this issue is generally ignored in the data fusion community. In this paper we explore the effects of such 'noncoincidence', and we find that what at first seems to be a problem can in fact be exploited. The idea is that a target is known to be confined to the intersection of overlapping resolution cells, and this intersection is generally smaller than either cell alone.
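As a simple one-dimensional illustration (ours, not the paper's): if both sensors use range cells of width \Delta but their grids are offset by \delta with 0 < \delta < \Delta, then a target reported in one cell by each sensor is confined to an intersection of width either \delta or \Delta - \delta. For a half-cell offset, \delta = \Delta/2, the target is localized to within \Delta/2 rather than \Delta, so the noncoincidence itself sharpens the joint measurement.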