A comparison of stacking with meta decision trees to bagging, boosting, and stacking with other methods

Abstract

Meta decision trees (MDTs) are a method for combining multiple classifiers. We present an integration of the algorithm MLC4.5 for learning MDTs into the Weka data mining suite. We compare classifier ensembles combined with MDTs to bagged and boosted decision trees, and to classifier ensembles combined with other methods: voting and stacking with three different meta-level classifiers (ordinary decision trees, naive Bayes, and multi-response linear regression, or MLR).
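
As a rough illustration of how such a comparison can be assembled with Weka's Java API, the sketch below cross-validates bagged and boosted J48 trees, a voted ensemble, and stacking with two different meta-level classifiers. The class name EnsembleComparison, the dataset path, and the particular base learners are illustrative assumptions, and the MDT learner (MLC4.5) described in the paper is not part of standard Weka.

```java
import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.bayes.NaiveBayes;
import weka.classifiers.lazy.IBk;
import weka.classifiers.meta.AdaBoostM1;
import weka.classifiers.meta.Bagging;
import weka.classifiers.meta.Stacking;
import weka.classifiers.meta.Vote;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EnsembleComparison {

    public static void main(String[] args) throws Exception {
        // Hypothetical dataset path; any ARFF file with a nominal class works.
        Instances data = DataSource.read("dataset.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Heterogeneous base-level classifiers (illustrative choice).
        Classifier[] baseLearners = {
            new J48(),        // decision tree
            new NaiveBayes(), // naive Bayes
            new IBk()         // nearest neighbour
        };

        // Bagged and boosted decision trees.
        Bagging bagging = new Bagging();
        bagging.setClassifier(new J48());

        AdaBoostM1 boosting = new AdaBoostM1();
        boosting.setClassifier(new J48());

        // Plain voting over the base-level classifiers.
        Vote voting = new Vote();
        voting.setClassifiers(baseLearners);

        // Stacking with different meta-level classifiers.
        Stacking stackJ48 = new Stacking();
        stackJ48.setClassifiers(baseLearners);
        stackJ48.setMetaClassifier(new J48());

        Stacking stackNB = new Stacking();
        stackNB.setClassifiers(baseLearners);
        stackNB.setMetaClassifier(new NaiveBayes());

        // Note: the MDT learner (MLC4.5) from the paper is not shipped with
        // Weka; it would plug in here as another meta-level classifier.

        Classifier[] ensembles = {bagging, boosting, voting, stackJ48, stackNB};
        String[] names = {"Bagging(J48)", "AdaBoostM1(J48)", "Vote",
                          "Stacking(J48)", "Stacking(NB)"};

        // 10-fold cross-validated accuracy for each combining method.
        for (int i = 0; i < ensembles.length; i++) {
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(ensembles[i], data, 10, new Random(1));
            System.out.printf("%-16s accuracy: %.2f%%%n", names[i], eval.pctCorrect());
        }
    }
}
```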

Publication
IEEE International Conference on Data Mining