KEYWORDS: Sensors, Electronic support measures, Data fusion, Reliability, Sensor fusion, Data modeling, Monte Carlo methods, Time metrology, System identification
We address the problem of fusing ESM reports using two evidential reasoning schemes, namely Dempster-Shafer theory
and Dezert-Smarandache theory. These schemes provide results in different frames of discernment, but both are able to fuse
realistic ESM data. We discuss their advantages and disadvantages under varying conditions of sensor data certainty and
fusion reliability, the latter stemming from errors in the association process. A thresholded version of Dempster-Shafer
theory is fine-tuned for performance across a wide range of values for certainty and reliability. The results are presented
first for typical scenarios, and then for Monte-Carlo studies of scenarios under varying sensor certainty and fusion
reliability. The results exhibit complex non-linear behavior, from which clear trends can nevertheless be extracted. A
compromise has to be achieved between stability under occasional mis-associations and reaction latency under a
real change of allegiance. The alternative way of reporting results through Dezert-Smarandache theory is studied under
similar conditions and shown to provide good results, which are, however, more dependent on the unreliability and
slightly less stable. In this case, however, the frame of discernment is larger and permits additional interpretations
that are outside the scope of Dempster-Shafer theory.
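As a rough illustration of the kind of combination at play, here is a minimal sketch of Dempster's rule for two basic belief assignments (BBAs) over a common frame of discernment. The frame and mass values below are invented for illustration and do not come from the paper's ESM scenarios.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BBAs given as {frozenset: mass} dictionaries."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Dempster's normalization: redistribute the conflicting mass
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two ESM-like reports on the frame {friend, hostile, neutral} (values invented)
F, H, N = "friend", "hostile", "neutral"
m1 = {frozenset({H}): 0.7, frozenset({F, H, N}): 0.3}  # fairly certain report
m2 = {frozenset({F}): 0.4, frozenset({F, H, N}): 0.6}  # weakly dissenting report
print(dempster_combine(m1, m2))
```

The dissenting report above produces conflict mass (0.28 here), which the normalization step redistributes; how such conflicts accumulate under mis-associations is what drives the stability/latency trade-off discussed in the abstract.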
KEYWORDS: Fuzzy logic, Probability theory, Electronic support measures, Data fusion, Data modeling, Information fusion, Remote sensing, Radar, Databases, Visualization
In several practical applications of data fusion, and more precisely in object identification problems, we need to combine imperfect information coming from different sources (sensors, humans, etc.), the resulting uncertainty being naturally of different kinds. In particular, one piece of information may naturally be expressed by a membership function while another may best be represented by a belief function. Usually, information modeled in the fuzzy sets formalism (by a membership function) concerns attributes like speed, length, or Radar Cross Section, whose domains of definition are continuous. However, the object identification problem refers to a discrete and finite framework (the number of objects in the database is finite and known). This thus implies a natural but unavoidable change of domain. To respect the intrinsic character of the uncertainty arising from the different sources and fuse it in order to identify an object among a list of possible ones in the database, we need (1) to use a unified framework in which both fuzzy sets and belief functions can be expressed, and (2) to respect the natural discretization of the membership function through the change of domain (from attribute domain to frame of discernment). In this paper, we propose to represent both fuzzy sets and belief functions by random sets. While the link between belief functions and random sets is direct, transforming fuzzy sets into random sets involves the use of α-cuts for the construction of the focal elements. This transformation usually generates a large number of focal elements, often unmanageable in a fusion process. We propose a way to reduce the number of focal elements based on parameters such as the desired number of focal elements, the acceptable distance from the approximated random set to the original discrete one, or the acceptable loss of information.
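A minimal sketch of the α-cut construction mentioned above, assuming a discrete, normalized membership function (maximum membership equal to 1); the object names and membership values are invented:

```python
def fuzzy_to_random_set(mu):
    """mu: {object: membership in [0, 1]} -> consonant random set {frozenset: mass}.

    Assumes a normalized fuzzy set (max membership = 1) so the masses sum to 1.
    """
    levels = sorted(set(mu.values()), reverse=True)  # distinct alpha levels
    levels.append(0.0)                               # sentinel below the lowest cut
    focal = {}
    for alpha, next_alpha in zip(levels, levels[1:]):
        cut = frozenset(x for x, m in mu.items() if m >= alpha)  # the alpha-cut
        focal[cut] = alpha - next_alpha  # mass = drop between consecutive levels
    return focal

# Membership of three database objects to a fuzzy attribute value (illustrative)
mu = {"obj1": 1.0, "obj2": 0.8, "obj3": 0.3}
print(fuzzy_to_random_set(mu))
# {frozenset({'obj1'}): 0.2, frozenset({'obj1','obj2'}): 0.5,
#  frozenset({'obj1','obj2','obj3'}): 0.3}
```

Each distinct membership level yields one nested focal element, so the number of focal elements grows with the number of distinct membership values; this is precisely the proliferation that the reduction method proposed in the paper addresses.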
In the field of pattern recognition, more specifically in the area of supervised and feature-vector-based classification, various classification methods exist, but none of them always returns correct results for any given kind of data. Each classifier behaves differently, having its own strengths and weaknesses. Some are more efficient than others in particular situations. The performance of these individual classifiers can be improved by combining them into one multiple classifier. In order to make more realistic decisions, the multiple classifier can analyze internal values generated by each classifier and can also rely on statistics learned from previous tests, such as reliability rates and confusion matrices. The individual classifiers studied in this project are Bayes, k-nearest neighbors, and neural network classifiers. They are combined using Dempster-Shafer theory. The problem reduces to finding weights that best represent the individual classifiers' evidence. A particular approach has been developed for each of them, and for all of them it has proven better to rely on the classifiers' internal information rather than on statistics. When tested on a database of 8 different kinds of military ships, represented by 11 features extracted from FLIR images, the resulting multiple classifier gave better results than others reported in the literature and tested in this work.
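The paper develops a specific weighting approach per classifier, which the sketch below does not reproduce; it only illustrates one generic way to turn a classifier's scores into Dempster-Shafer evidence, by discounting them with a reliability weight and assigning the withheld mass to the whole frame. The ship class names, scores, and weight are hypothetical.

```python
def scores_to_bba(scores, weight):
    """scores: {class: score}, summing to 1; weight in [0, 1] is the trust
    placed in this classifier (e.g., a learned reliability rate)."""
    frame = frozenset(scores)  # the full frame of discernment
    bba = {frozenset({c}): weight * s for c, s in scores.items()}
    bba[frame] = bba.get(frame, 0.0) + (1.0 - weight)  # withheld mass -> ignorance
    return bba

# A k-NN classifier fairly confident in "frigate", moderately trusted (hypothetical)
knn_scores = {"frigate": 0.7, "destroyer": 0.2, "cruiser": 0.1}
print(scores_to_bba(knn_scores, weight=0.6))
```

The resulting BBAs, one per classifier, can then be combined with Dempster's rule, and a decision taken on the fused mass function.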
KEYWORDS: Detection and tracking algorithms, Monte Carlo methods, Radon, Computer simulations, Algorithms, Target recognition, Data fusion, Process control, Databases, Error analysis
The major drawback of the Dempster-Shafer theory of evidence is its computational burden. Indeed, Dempster's rule of combination involves an exponential number of focal elements, which can be unmanageable in many applications. To avoid this problem, some approximation rules or algorithms have been explored for both reducing the number of focal elements and retaining as much information as possible in the next belief function to be combined. Some studies comparing approximation algorithms have already been done. However, the criteria used always involve pignistic transformations, and thereby a loss of information in both the original belief function and the approximated one. In this paper, we propose to analyze some approximation methods by computing the distance between the original belief function and the approximated one. This real distance then allows us to quantify the quality of the approximation. We also compare this criterion to other error criteria, often based on pignistic transformations. We show results of Monte-Carlo simulations, as well as of an application to target identification.
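The idea of a distance between belief functions can be sketched as follows: mass differences are weighted by the Jaccard index between focal elements, so that masses on strongly overlapping sets are counted as close. Whether this matches the paper's exact metric is an assumption, and the values below are invented.

```python
import math

def bba_distance(m1, m2):
    """A distance between two BBAs ({frozenset: mass}) that accounts for the
    overlap of focal elements via the Jaccard index |A & B| / |A | B|."""
    focals = sorted(set(m1) | set(m2), key=sorted)  # union of focal elements
    diff = [m1.get(f, 0.0) - m2.get(f, 0.0) for f in focals]
    acc = 0.0
    for i, a in enumerate(focals):
        for j, b in enumerate(focals):
            jac = len(a & b) / len(a | b)  # Jaccard similarity of the focal sets
            acc += diff[i] * diff[j] * jac
    return math.sqrt(0.5 * acc)

# Original vs. approximated belief function over two targets (values invented)
m1 = {frozenset({"t1"}): 0.8, frozenset({"t1", "t2"}): 0.2}
m2 = {frozenset({"t1"}): 0.6, frozenset({"t1", "t2"}): 0.4}
print(bba_distance(m1, m2))  # ~0.141
```

Unlike criteria routed through a pignistic transformation, a metric of this kind compares the two mass functions directly, without first projecting them onto probabilities.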
KEYWORDS: Probability theory, Fuzzy logic, Data fusion, Data modeling, Sensors, Information fusion, Data processing, Defense and security, Information theory, Associative arrays
For several years, researchers have explored the unification of the theories enabling the fusion of imperfect data and have finally settled on two candidate frameworks: the theory of random sets and the conditional events algebra. Traditionally, information is modeled and fused in one of the known theories: Bayesian, fuzzy sets, possibilistic, evidential, or rough sets. Previous work has shown what kinds of imperfect data each of these theories can best deal with. So, depending on the quality of the available information (uncertain, vague, imprecise, ...), one particular theory seems to be the preferred choice for fusion. However, in a typical application, the variety of sources provides different kinds of imperfect data. The classical approach is then to model and fuse the incoming data in a single, previously chosen theory. In this paper, we first introduce the various kinds of imperfect data and then the theories that can be used to cope with this imperfection. We also present the existing relationships between the theories and detail the most important properties of each. We finally propose random sets theory as a possible framework for unification, and show how the individual theories can fit within this framework.
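As a small illustration of the unification idea (our own sketch, not the paper's formalism): a Bayesian distribution and a normalized fuzzy set can both be rewritten as random sets, i.e., mass assignments over subsets of the frame, after which a single combination mechanism applies. The names and numbers are invented.

```python
def probability_as_random_set(p):
    """A Bayesian distribution becomes masses on singletons only."""
    return {frozenset({x}): w for x, w in p.items()}

def fuzzy_as_random_set(mu):
    """A normalized fuzzy set becomes nested focal elements via alpha-cuts."""
    levels = sorted(set(mu.values()), reverse=True) + [0.0]
    return {frozenset(x for x, m in mu.items() if m >= a): a - b
            for a, b in zip(levels, levels[1:])}

# Both now share the same {focal set: mass} representation (values invented)
print(probability_as_random_set({"a": 0.5, "b": 0.3, "c": 0.2}))
print(fuzzy_as_random_set({"a": 1.0, "b": 0.4}))
```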
KEYWORDS: Data fusion, Sensors, Information fusion, MATLAB, Visualization, Detection and tracking algorithms, Algorithms, Data processing, Algorithm development, Analytical research
In this paper, we present a software package designed to explore the data fusion area in different contexts. This tool, called CEPfuse (Conceptual Exploration Package for Data Fusion), provides good support for becoming familiar with the concepts and vocabulary of data fusion. Developed with Matlab 5.2, it is also a good tool to test, compare, and analyze algorithms. Although the core of this package is evidential reasoning and identity information fusion, it has been conceived to cover all the parts of interest of a multi-sensor data fusion system. Because we concentrate our research work on identity information fusion, the principal algorithms included are the Dempster-Shafer rules of combination, Shafer-Logan algorithms for hierarchical structures, and several decision rules.