I have just started learning about decision trees with AdaBoost, am trying it out in OpenCV, and have some questions.
Boosted Decision Trees
I understand that when I use AdaBoost with decision trees, I repeatedly fit decision trees to reweighted versions of the training data, and classification is done by a weighted majority vote of the trees.
Can I instead use bootstrapping when training the decision trees with AdaBoost? That is, can I draw subsets of the dataset, train a tree on each subset, and then feed those classifiers into AdaBoost?
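To make the question concrete, here is a minimal sketch of the standard setup as I understand it, written with scikit-learn rather than OpenCV (purely my assumed illustration; names like `make_classification` and `AdaBoostClassifier` are from that library, not from my OpenCV code). Here AdaBoost itself reweights the full training set between rounds, with no manual subsetting:

```python
# Minimal sketch (scikit-learn, not OpenCV) of standard AdaBoost with decision trees:
# AdaBoost handles the reweighting internally, so each tree sees the full
# training set with updated sample weights rather than a bootstrap subset.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, n_classes=2, random_state=0)

boosted_trees = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=3),  # shallow trees as weak learners
    n_estimators=50,                                # number of boosting rounds
)
boosted_trees.fit(X, y)
print(boosted_trees.score(X, y))
```

(In older scikit-learn versions the `estimator` argument is called `base_estimator`.) My bootstrapping question is whether each of those 50 trees could instead be trained on its own random subset of `X` before being combined.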
Boosted Decision Stumps
Do I use the same technique for decision stumps? Or can I instead create one stump per feature? I.e. if I have 2 classes and 10 features, I create a total of 10 decision stumps, one for each feature, before feeding the classifiers into AdaBoost.
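Again as an illustration of what I mean by the "same technique" case, here is a hedged scikit-learn sketch (standing in for OpenCV): each weak learner is a depth-1 tree, and every boosting round is free to split on any of the 10 features, rather than me pre-building one stump per feature.

```python
# Sketch (scikit-learn, not OpenCV) of boosted decision stumps:
# each weak learner is a depth-1 tree, and every boosting round
# considers all 10 features when choosing its single split.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, n_classes=2, random_state=0)

boosted_stumps = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # a decision stump
    n_estimators=50,
)
boosted_stumps.fit(X, y)
print(boosted_stumps.score(X, y))
```

My alternative would be to build exactly 10 stumps up front, each restricted to a single feature, and hand that fixed pool to AdaBoost instead.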