Detailed Information

Cited 1 time in Web of Science; cited 2 times in Scopus

An Information-Theoretic Filter Method for Feature Weighting in Naive Bayes

Authors
Lee, Chang-Hwan
Issue Date
Aug-2014
Publisher
WORLD SCIENTIFIC PUBL CO PTE LTD
Keywords
Data mining; classification; feature weighting; naive Bayes
Citation
INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, v.28, no.5
Indexed
SCIE
SCOPUS
Journal Title
INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE
Volume
28
Number
5
URI
https://scholarworks.dongguk.edu/handle/sw.dongguk/15096
DOI
10.1142/S0218001414510070
ISSN
0218-0014
1793-6381
Abstract
In spite of its simplicity, naive Bayesian learning has been widely used in many data mining applications. However, its unrealistic assumption that all features are equally important negatively impacts classification performance. In this paper, we propose a new method that uses a Kullback-Leibler measure to calculate the weights of the features used in naive Bayesian learning. Its performance is compared with that of other state-of-the-art methods over a number of datasets.
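The general idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the feature weight is the expected Kullback-Leibler divergence between the class posterior P(C | A = a) and the class prior P(C) (i.e. the mutual information I(A; C)), and that the weights enter naive Bayes as exponents on the likelihood terms. The paper's exact weighting formula may differ.

```python
# Sketch of KL-based feature weighting for naive Bayes (illustrative
# assumption; the paper's exact formula may differ).
import math
from collections import Counter

def kl_feature_weight(xs, ys):
    """Expected KL divergence between P(C | A = a) and P(C),
    averaged over P(a) -- equivalently, the mutual information I(A; C)."""
    n = len(ys)
    p_c = Counter(ys)          # class counts
    p_a = Counter(xs)          # feature-value counts
    p_ac = Counter(zip(xs, ys))  # joint counts
    w = 0.0
    for (a, c), n_ac in p_ac.items():
        p_c_given_a = n_ac / p_a[a]
        w += (p_a[a] / n) * p_c_given_a * math.log(p_c_given_a / (p_c[c] / n))
    return w

def weighted_nb_predict(X, y, x_new, weights, alpha=1.0):
    """Naive Bayes where each feature's log-likelihood is scaled by its
    weight (Laplace smoothing with pseudo-count alpha)."""
    n = len(y)
    best, best_score = None, -math.inf
    for c in sorted(set(y)):
        rows = [X[i] for i in range(n) if y[i] == c]
        score = math.log(len(rows) / n)  # log prior
        for j, (v, w) in enumerate(zip(x_new, weights)):
            vals = {row[j] for row in X}
            count = sum(1 for row in rows if row[j] == v)
            p = (count + alpha) / (len(rows) + alpha * len(vals))
            score += w * math.log(p)     # weighted log-likelihood
        if score > best_score:
            best, best_score = c, score
    return best

# Toy dataset: feature 0 perfectly predicts the class, feature 1 is noisy.
X = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (1, 1)]
y = ["a", "a", "b", "b", "a", "b"]
weights = [kl_feature_weight([row[j] for row in X], y) for j in (0, 1)]
print(weights)  # informative feature 0 receives the larger weight
print(weighted_nb_predict(X, y, (0, 1), weights))
```

In this toy run the informative feature dominates the noisy one, so the weighted classifier follows feature 0 even when feature 1 points the other way.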
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Engineering > Department of Information and Communication Engineering > 1. Journal Articles


