Why does the order of feature importances change with the choice of max depth in a decision tree classifier?
I used synthetic data, but I haven't shared the generation code because it is long and not essential to the question. I'm just wondering about the logic behind this: when I change the maximum depth, why does the order of the important features change?
dec_tree_clf = tree.DecisionTreeClassifier(max_depth=4, random_state=23, criterion="entropy")
dec_tree_clf.fit(X_data, y_data)
feature   importance
z 0.267464
n 0.124694
y 0.094134
c 0.090750
i 0.084806
dec_tree_clf = tree.DecisionTreeClassifier(max_depth=3, random_state=23, criterion="entropy")
dec_tree_clf.fit(X_data, y_data)
feature   importance
z 0.350545
n 0.163426
c 0.118939
i 0.111149
b 0.106650