In this paper, we present an integration of the MLC4.5 algorithm for learning meta decision trees (MDTs) into the Weka data mining suite. MDTs are a method for combining multiple classifiers: instead of giving a prediction directly, an MDT leaf specifies which base-level classifier should be used to obtain the prediction. The algorithm is based on the C4.5 algorithm for learning ordinary decision trees. We perform an extensive performance evaluation of stacking with MDTs on twenty-one data sets, combining base-level classifiers generated by three learning algorithms: an algorithm for learning decision trees, a nearest neighbor algorithm, and a naive Bayes algorithm. We compare MDTs to bagged and boosted decision trees, to classifiers combined by voting, and to three other stacking methods, which use ordinary decision trees, naive Bayes, and multi-response linear regression as the meta-level classifier. In terms of performance, stacking with MDTs outperforms all of these methods except stacking with multi-response linear regression, which is slightly better than MDTs.
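The distinction between ordinary decision trees and MDTs can be sketched in a few lines: an ordinary leaf returns a class label, whereas an MDT leaf names the base-level classifier to trust for the given example. The following Python sketch is purely illustrative (not the MLC4.5 implementation or Weka API); the three stand-in classifiers, the hand-built tree, and all thresholds are hypothetical.

```python
# Hypothetical base-level classifiers over a toy example x = (f1, f2),
# standing in for the three learning algorithms used in the paper.
def tree_clf(x):   # stand-in for a decision-tree classifier
    return "pos" if x[0] > 0.5 else "neg"

def knn_clf(x):    # stand-in for a nearest neighbor classifier
    return "pos" if x[1] > 0.5 else "neg"

def nb_clf(x):     # stand-in for a naive Bayes classifier
    return "neg"

BASE = {"tree": tree_clf, "knn": knn_clf, "nb": nb_clf}

def mdt_leaf(x):
    # A hand-built MDT: internal nodes test (meta-level) attributes,
    # but each leaf names a classifier rather than a class label.
    if x[0] > 0.8:
        return "tree"   # leaf: defer to the decision tree
    if x[1] > 0.8:
        return "knn"    # leaf: defer to the nearest neighbor model
    return "nb"         # leaf: defer to naive Bayes

def mdt_predict(x):
    # The MDT selects a classifier; that classifier gives the prediction.
    return BASE[mdt_leaf(x)](x)

print(mdt_predict((0.9, 0.1)))  # routed to tree_clf
print(mdt_predict((0.1, 0.1)))  # routed to nb_clf
```

In an actual MDT learned by MLC4.5, the internal nodes would test meta-level attributes (properties of the base-level predictions and of the example), but the leaf-selects-a-classifier mechanism is the same.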