Is ensemble learning many instances of one particular classifier, for example a Decision Tree Classifier, or is it a mixture of several different classifiers such as Neural Networks, Decision Trees, SVMs, and so forth?
I have looked at Wikipedia's description of Bagging, an ensemble learner. It says that:
Bagging leads to "improvements for unstable procedures" (Breiman, 1996), which include, for example, neural nets, classification and regression trees, and subset selection in linear regression (Breiman, 1994).
I am a little confused by this description. I have also looked into MATLAB's implementation of ensemble algorithms, for example this one:
load fisheriris
ens = fitensemble(meas,species,'AdaBoostM2',100,'Tree')
meas and species are the inputs of the fitensemble function. In this example it uses the AdaBoostM2 method with weak learners of type Tree, and it uses 100 of them. How can this simple call to the function be taken to show that ensemble learning combines different classifiers such as Neural Nets, KNN, and Naive Bayes?
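For what it's worth, my current reading of the documentation is that all 100 weak learners in that call are of the same type (trees). As far as I understand it, the call above could also be written with an explicit tree template, something like the sketch below (the final predict line is just for illustration):

load fisheriris
t = templateTree();                                      % every weak learner is a decision tree
ens = fitensemble(meas, species, 'AdaBoostM2', 100, t);  % boost 100 tree learners with AdaBoostM2
labels = predict(ens, meas);                             % predicted classes on the training data

If that reading is right, nothing in this example ever mixes different kinds of classifiers, which is what confuses me.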
Can anybody explain what ensemble learning actually is, and what MATLAB is trying to do in its implementation of the fitensemble function?