Experiments with heterogeneous meta decision trees

Abstract

The work presented here focuses on combining the predictions of base-level classifiers induced by applying different learning algorithms to a single data set. It adopts the stacking framework, in which we learn how to combine the base-level classifiers. We developed an extension of meta decision trees (MDTs) [16] called heterogeneous MDTs (HMDTs). MDTs use properties of the base-level predictions (which are class probability distributions) to decide which of the base-level classifiers to use for each example; these properties are independent of the data set. The first modification of HMDTs, as compared to MDTs, is that the model is induced on a union of different data sets instead of a single one. The second modification is that HMDTs use a set of data set properties to take into account which data set each example originates from. A model built on examples described with these two types of attributes is, at least in principle, applicable to an arbitrary data set.
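
The sketch below illustrates the idea in Python, assuming scikit-learn; the particular base learners, meta-attributes (confidence, margin, entropy), data set properties, and the way the meta-level target is derived are illustrative assumptions, not the induction procedure of the original report.

```python
# Minimal HMDT-style sketch: a decision tree selects which base-level
# classifier to use per example, from attributes of the base-level class
# probability distributions plus data set properties. Illustrative only.
import numpy as np
from scipy.stats import entropy
from sklearn.datasets import load_iris, load_wine
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

base_learners = [GaussianNB(),
                 KNeighborsClassifier(5),
                 DecisionTreeClassifier(max_depth=3)]

def meta_attributes(proba):
    """Data-set-independent properties of one class probability distribution."""
    top = np.sort(proba)[::-1]
    return [top[0],            # maximum probability (confidence)
            top[0] - top[1],   # margin between the two most probable classes
            entropy(proba)]    # entropy of the distribution

def dataset_attributes(X, y):
    """Simple data set properties, shared by all examples of one data set."""
    class_freq = np.bincount(y) / len(y)
    return [X.shape[1], len(np.unique(y)), entropy(class_freq)]

meta_X, meta_y = [], []
for load in (load_iris, load_wine):   # induce on a union of data sets
    X, y = load(return_X_y=True)
    ds_attrs = dataset_attributes(X, y)
    # Out-of-fold class probability distributions from each base learner.
    probas = [cross_val_predict(clf, X, y, cv=5, method="predict_proba")
              for clf in base_learners]
    for i in range(len(y)):
        row = ds_attrs[:]
        for p in probas:
            row.extend(meta_attributes(p[i]))
        meta_X.append(row)
        # Target: a base learner that classifies this example correctly,
        # falling back to the most confident one when none does.
        correct = [j for j, p in enumerate(probas) if p[i].argmax() == y[i]]
        meta_y.append(correct[0] if correct
                      else int(np.argmax([p[i].max() for p in probas])))

# The heterogeneous meta decision tree: an ordinary decision tree over
# both attribute types, choosing a base-level classifier per example.
hmdt = DecisionTreeClassifier(max_depth=4).fit(meta_X, meta_y)
```

Because the data set properties are constant within one data set but vary across the union, the meta-level tree can branch on them to specialize its classifier choice per data set while remaining applicable, in principle, to new data sets.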

Publication
Tech Report, DP-8638, Jožef Stefan Institute