Article

Bayesian network classifiers using ensembles and smoothing

Journal

KNOWLEDGE AND INFORMATION SYSTEMS
Volume 62, Issue 9, Pages 3457-3480

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s10115-020-01458-z

Keywords

Bayesian network classifier; Ensemble learning; Probability smoothing; Hierarchical Dirichlet process; Attribute discretization

Funding

  1. China Scholarship Council [201506300081]
  2. Australian Government through the Australian Research Council [DP190100017, DE170100037]


Bayesian network classifiers are, functionally, an interesting class of models because they can be learnt out-of-core, i.e. without holding the whole training data in main memory. The selective K-dependence Bayesian network classifier (SKDB) is state of the art in this class of models and has been shown to rival random forest (RF) on problems with categorical data. In this paper, we introduce an ensembling technique for SKDB, called ensemble of SKDB (ESKDB). We show that ESKDB significantly outperforms RF on categorical and numerical data, and rivals XGBoost. ESKDB combines three main components: (1) an effective strategy to vary the networks built by the single classifiers (to form an ensemble), (2) a stochastic discretization method that both handles numerical data and further increases the variance between the components of the ensemble, and (3) a superior smoothing technique to ensure proper calibration of ESKDB's probabilities. We conduct a large set of experiments on 72 datasets to study the properties of ESKDB (through a sensitivity analysis) and demonstrate its competitiveness with the state of the art.
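The three components listed in the abstract can be illustrated with a heavily simplified sketch, not the paper's actual method: categorical naive Bayes members stand in for the SKDB base learners, randomly drawn cut points stand in for the stochastic discretization, and Laplace (add-one) smoothing stands in for the hierarchical Dirichlet process smoothing. All function and variable names here are hypothetical.

```python
import random
from collections import defaultdict

def discretize(values, cuts):
    """Map each numeric value to a bin index given sorted cut points."""
    return [sum(v > c for c in cuts) for v in values]

class NaiveBayes:
    """Categorical naive Bayes with Laplace smoothing -- a simple
    stand-in for SKDB with HDP smoothing."""
    def __init__(self, n_bins, alpha=1.0):
        self.n_bins = n_bins
        self.alpha = alpha

    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.class_counts = {c: 0 for c in self.classes}
        # counts[attr][class][bin] -> frequency
        self.counts = [defaultdict(lambda: defaultdict(float))
                       for _ in range(len(X[0]))]
        for row, c in zip(X, y):
            self.class_counts[c] += 1
            for j, v in enumerate(row):
                self.counts[j][c][v] += 1
        return self

    def predict_proba(self, row):
        n = sum(self.class_counts.values())
        probs = {}
        for c in self.classes:
            # smoothed prior times smoothed per-attribute likelihoods
            p = (self.class_counts[c] + self.alpha) / (n + self.alpha * len(self.classes))
            for j, v in enumerate(row):
                num = self.counts[j][c][v] + self.alpha
                den = self.class_counts[c] + self.alpha * self.n_bins
                p *= num / den
            probs[c] = p
        z = sum(probs.values())
        return {c: p / z for c, p in probs.items()}

def ensemble_predict(X_train, y_train, x, n_members=10, n_bins=4, seed=0):
    """Average class posteriors over members trained on randomly drawn
    discretizations: each member sees different cut points, which
    creates the diversity the ensemble relies on."""
    rng = random.Random(seed)
    cols = list(zip(*X_train))
    avg = defaultdict(float)
    for _ in range(n_members):
        cutsets = [sorted(rng.sample(col, min(n_bins - 1, len(set(col)) - 1)))
                   for col in cols]
        Xd = list(zip(*[discretize(col, cuts)
                        for col, cuts in zip(cols, cutsets)]))
        member = NaiveBayes(n_bins).fit(Xd, y_train)
        xd = [sum(v > c for c in cuts) for v, cuts in zip(x, cutsets)]
        for c, pc in member.predict_proba(xd).items():
            avg[c] += pc / n_members
    return dict(avg)
```

The sketch omits everything that makes ESKDB competitive (structure learning for K-dependence networks, attribute selection, and HDP smoothing), but it shows how stochastic discretization and smoothing plug into an averaged ensemble of out-of-core-friendly counting-based classifiers.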
