IJMLC 2014 Vol.4(1): 79-84 ISSN: 2010-3700
DOI: 10.7763/IJMLC.2014.V4.390

Comparing Informative Sample Selection Strategies in Classification Ensembles

Hamza Osman İlhan and Mehmet Fatih Amasyalı

Abstract—In machine learning, classification accuracy generally improves as more labeled training data become available. In practice, however, obtaining labeled data is a costly and time-consuming process. Active learning algorithms address this problem: they aim either to maintain the current success rate with fewer samples in the training set or to increase the overall success of the model during training. Active learning is not limited to standard learning methods; with appropriate techniques it can also be applied to ensemble learning algorithms. In this study, two active learning algorithms based on the class probabilities of the samples are tested on the classification of five datasets, with ensemble learning methods used as the classification model. Comparative results are presented graphically and numerically.

Index Terms—Active learning, adaboost, bagging, decision tree, ensemble learning, machine learning.
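The abstract does not spell out the two probability-based selection strategies, so the sketch below is only illustrative: pool-based active learning with a bagging ensemble of decision trees, comparing two common class-probability criteria (least-confidence and margin sampling). The dataset (scikit-learn's digits), batch size, and number of query rounds are assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split


def query_indices(proba, unlabeled, n_query, strategy):
    """Pick the n_query most uncertain pool samples from their class probabilities."""
    if strategy == "least_confidence":
        score = 1.0 - proba.max(axis=1)            # low top-class probability = uncertain
    elif strategy == "margin":
        top2 = np.sort(proba, axis=1)[:, -2:]
        score = -(top2[:, 1] - top2[:, 0])         # small gap between top two classes = uncertain
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return unlabeled[np.argsort(score)[-n_query:]]


X, y = load_digits(return_X_y=True)                # stand-in dataset, not one used in the paper
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for strategy in ("least_confidence", "margin"):
    rng = np.random.RandomState(0)                 # same labeled seed set for both strategies
    labeled = rng.choice(len(X_pool), size=50, replace=False)
    unlabeled = np.setdiff1d(np.arange(len(X_pool)), labeled)
    # BaggingClassifier uses decision trees as its default base learner.
    model = BaggingClassifier(n_estimators=25, random_state=0)
    for _ in range(10):                            # 10 query rounds, 20 samples per round
        model.fit(X_pool[labeled], y_pool[labeled])
        proba = model.predict_proba(X_pool[unlabeled])
        picked = query_indices(proba, unlabeled, n_query=20, strategy=strategy)
        labeled = np.concatenate([labeled, picked])
        unlabeled = np.setdiff1d(unlabeled, picked)
    model.fit(X_pool[labeled], y_pool[labeled])
    print(f"{strategy}: {len(labeled)} labels, test accuracy = {model.score(X_test, y_test):.3f}")
```

Swapping the query rule for a random draw of the same size gives a passive-learning baseline, which is the usual reference point when comparing informative sample selection strategies.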

The authors are with the Department of Computer Engineering, Yildiz Technical University, İstanbul, 34720 TR (e-mail: hoilhan@yildiz.edu.tr, mfatih@ce.yildiz.edu.tr).

Cite: Hamza Osman İlhan and Mehmet Fatih Amasyalı, "Comparing Informative Sample Selection Strategies in Classification Ensembles," International Journal of Machine Learning and Computing, vol. 4, no. 1, pp. 79-84, 2014.
