Article

RFBoost: An improved multi-label boosting algorithm and its application to text categorisation

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 103, Pages 104-117

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2016.03.029

Keywords

RFBoost; Boosting; AdaBoost.MH; Text categorisation; Labeled Latent Dirichlet Allocation; Multi-label classification

Funding

  1. Malaysia Ministry of Education [FRGS/1/2014/ICT02/UKM/01/1]

Abstract

The AdaBoost.MH boosting algorithm is considered to be one of the most accurate algorithms for multi-label classification. AdaBoost.MH works by iteratively building a committee of weak hypotheses of decision stumps. In each round of AdaBoost.MH learning, all features are examined, but only one feature is used to build a new weak hypothesis. This learning mechanism may entail a high degree of computational time complexity, particularly in the case of a large-scale dataset. This paper describes a way to manage the learning complexity and improve the classification performance of AdaBoost.MH. We propose an improved version of AdaBoost.MH, called RFBoost. The weak learning in RFBoost is based on filtering a small fixed number of ranked features in each boosting round rather than using all features, as AdaBoost.MH does. We propose two methods for ranking the features: One Boosting Round and Labeled Latent Dirichlet Allocation (LLDA), a supervised topic model based on Gibbs sampling. Additionally, we investigate the use of LLDA as a feature selection method for reducing the feature space based on the maximal conditional probabilities of words across labels. Our experimental results on eight well-known benchmarks for multi-label text categorisation show that RFBoost is significantly more efficient and effective than the baseline algorithms. Moreover, the LLDA-based feature ranking yields the best performance for RFBoost. (C) 2016 Elsevier B.V. All rights reserved.
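As a rough illustration of the mechanism the abstract describes, the sketch below restricts each boosting round to a small window of pre-ranked features instead of scanning the full vocabulary. This is a minimal sketch under stated assumptions, not the authors' implementation: it assumes binary bag-of-words features X, a {-1, +1} label matrix Y, and a precomputed ranking ranked_feats (e.g. from information gain or Labeled LDA scores); the names rfboost and fit_stump are hypothetical.

```python
# Minimal sketch of the RFBoost idea (NOT the authors' implementation).
# Assumptions: X is a binary bag-of-words matrix (n_docs x n_feats),
# Y is a {-1, +1} label matrix (n_docs x n_labels), and ranked_feats is a
# precomputed feature ranking (e.g. information gain or Labeled LDA scores).
import numpy as np

def fit_stump(X, Y, D, feat, eps=1e-8):
    """Real-valued decision stump on one binary feature, AdaBoost.MH style."""
    present = X[:, feat] == 1                          # docs containing the term
    c = np.zeros((2, Y.shape[1]))                      # per-label outputs for absent/present
    for block, mask in enumerate((~present, present)):
        w_pos = (D[mask] * (Y[mask] > 0)).sum(axis=0)  # weight of positive (doc, label) pairs
        w_neg = (D[mask] * (Y[mask] < 0)).sum(axis=0)  # weight of negative (doc, label) pairs
        c[block] = 0.5 * np.log((w_pos + eps) / (w_neg + eps))
    preds = np.where(present[:, None], c[1], c[0])     # h(x, l) for every (doc, label) pair
    z = (D * np.exp(-Y * preds)).sum()                 # normalisation factor to minimise
    return z, preds, (feat, c)

def rfboost(X, Y, ranked_feats, rounds=50, feats_per_round=10):
    """Boost stumps, examining only a small window of ranked features per round."""
    n, L = Y.shape
    D = np.full((n, L), 1.0 / (n * L))                 # uniform weights over (doc, label) pairs
    model = []
    for t in range(rounds):
        # RFBoost's key change: filter a small fixed number of ranked features
        # this round, instead of scanning all features as AdaBoost.MH does.
        start = (t * feats_per_round) % len(ranked_feats)
        candidates = ranked_feats[start:start + feats_per_round]
        z, preds, stump = min((fit_stump(X, Y, D, f) for f in candidates),
                              key=lambda r: r[0])
        model.append(stump)
        D = D * np.exp(-Y * preds)                     # AdaBoost.MH weight update
        D /= D.sum()
    return model
```

Because each round evaluates only feats_per_round candidate stumps rather than every feature in the vocabulary, the per-round cost drops from a full vocabulary scan to a small constant, which is the source of the efficiency gain the abstract reports.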
