0
votes

We build a data-science model and look at the feature importances. If we drop the least important features and build a new model, will the accuracy improve? The only advantage I see is that a consumer of the model can pass fewer parameters to get a prediction. Are there any other advantages?


2 Answers

0
votes

Yes! Fewer features mean faster training and faster prediction. Done correctly, dropping them also reduces the chance of overfitting your data.
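A minimal sketch of this idea, assuming scikit-learn (the dataset, model, and the choice of keeping the top 5 features are illustrative, not from the question): fit once on all features, then refit on only the features the model itself ranks highest, and compare test accuracy.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic data: 5 informative features plus 15 that carry no signal.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, n_redundant=0,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: train on all 20 features.
full = RandomForestClassifier(random_state=0).fit(X_train, y_train)
acc_full = full.score(X_test, y_test)

# Keep only the 5 features the model itself ranks highest.
top = np.argsort(full.feature_importances_)[-5:]
small = RandomForestClassifier(random_state=0).fit(X_train[:, top], y_train)
acc_small = small.score(X_test[:, top], y_test)

print(f"all 20 features: {acc_full:.3f}, top 5 features: {acc_small:.3f}")
```

Whether accuracy actually improves depends on the data; the reliable gains are the smaller input vector and the faster fit/predict times.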

0
votes

Non-relevant features act as noise, which reduces the model's accuracy.

During training they also make convergence toward the global minimum harder, since the optimizer has to minimize a more complex function.
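A hedged sketch of the noise point, again assuming scikit-learn (the k-nearest-neighbours model and the 45 appended noise columns are my choices for illustration): the same classifier is scored once on the informative features alone and once with irrelevant columns appended, which typically swamps the distance metric and lowers accuracy.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=5,
                           n_informative=5, n_redundant=0,
                           random_state=0)
# Append 45 pure-noise columns to the 5 informative ones.
X_noisy = np.hstack([X, rng.normal(size=(500, 45))])

clf = KNeighborsClassifier()
acc_clean = cross_val_score(clf, X, y, cv=5).mean()
acc_noisy = cross_val_score(clf, X_noisy, y, cv=5).mean()
print(f"informative only: {acc_clean:.3f}, with noise: {acc_noisy:.3f}")
```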