In the field of pattern recognition, and more specifically in supervised, feature-vector-based classification, many classification methods exist, but none of them always returns correct results for every kind of data. Each classifier behaves differently, with its own strengths and weaknesses, and some are more effective than others in particular situations. The performance of these individual classifiers can be improved by combining them into one multiple classifier. To make better-informed decisions, the multiple classifier can analyze internal values generated by each classifier and can also rely on statistics learned from previous tests, such as reliability rates and confusion matrices. The individual classifiers studied in this project are Bayes, k-nearest-neighbor, and neural network classifiers. They are combined using the Dempster-Shafer theory, so the problem reduces to finding weights that best represent the evidence provided by each classifier. A particular approach has been developed for each classifier, and in all cases relying on the classifiers' internal information proved better than relying on statistics. When tested on a database of 8 different kinds of military ships, each represented by 11 features extracted from FLIR images, the resulting multiple classifier gave better results than the other methods reported in the literature and tested in this work.
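The abstract invokes Dempster-Shafer combination of classifier evidence without detailing it. As a purely illustrative sketch, not the paper's implementation, the snippet below shows Dempster's rule of combination applied to mass functions that individual classifiers might produce; the ship class names, the frame of discernment, and the way each classifier's mass is split between its top class and the whole frame (to encode its reliability) are assumptions made only for this example.

```python
from itertools import product


def combine_dempster(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps a frozenset of class labels (a focal element)
    to a belief mass in [0, 1]; the masses of each function sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("Total conflict: the sources are incompatible")
    # Normalize by 1 - K, where K is the total conflicting mass.
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}


if __name__ == "__main__":
    # Hypothetical frame of discernment (the real study uses 8 ship classes).
    theta = frozenset({"frigate", "destroyer", "cruiser"})
    # Each classifier commits part of its mass to its preferred class and
    # leaves the remainder on the whole frame to represent its uncertainty.
    m_bayes = {frozenset({"frigate"}): 0.7, theta: 0.3}
    m_knn = {frozenset({"frigate"}): 0.5, frozenset({"destroyer"}): 0.2, theta: 0.3}
    print(combine_dempster(m_bayes, m_knn))
```

In this toy example the combined mass concentrates on the class both classifiers favor, while the residual mass on the full frame reflects remaining ignorance; how the per-classifier weights are actually derived from internal values or statistics is the subject of the paper itself.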