
What are some of the reasons that a Neural Network (Feed Forward) might have a general tendency to overshoot the real output rather than undershoot, and vice versa?

As an example, the figure below (right panel) shows an almost perfect Gaussian distribution of the prediction errors; however, the network seems to have a tendency to predict values higher than the actual value. [Figure: error distribution]

What is the purpose of the left plot? It looks like ground truth and predictions are aligning very well. — gtancev
@gtancev Disregard the left plot; the right plot is of interest and serves as an example for my question. — Marc225

1 Answer


A systematic overshoot or undershoot can happen if there is an offset in the data but your model has no constant term (in NNs called a "bias") to compensate for it. This is not a problem if you center your data first. (What also happens in NNs, especially deep ones, is that the variance grows layer by layer and the distribution widens, since the variance of a variable Y = aX is V(Y) = a^2 * V(X).)
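A minimal sketch of this effect (my own illustration, not from the post): fit a linear model without a bias term to data carrying a constant offset, and the residuals acquire a systematic non-zero mean; centering both variables first removes it.

```python
import numpy as np

# Data with a constant offset of +5 in the target
x = np.linspace(0.0, 10.0, 100)
y = x + 5.0

# No-bias fit: y_hat = w * x, solved by least squares
w = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
mean_err_raw = np.mean(w * x - y)          # systematically non-zero

# Center both variables, then the same no-bias fit works fine
xc, yc = x - x.mean(), y - y.mean()
wc = np.linalg.lstsq(xc[:, None], yc, rcond=None)[0][0]
mean_err_centered = np.mean(wc * xc - yc)  # essentially zero
```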

Outliers on either side of the mean can also lead to a rotation (visible in your left plot), which means that small values are over-/underestimated and larger values are under-/overestimated, depending on the exact position of the outlier(s).
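A hypothetical sketch of the rotation effect: one outlier at the right edge of the data tilts the least-squares line, so the fit under-estimates the small values and over-estimates the non-outlying large ones (the direction flips for an outlier on the other side).

```python
import numpy as np

x = np.arange(10, dtype=float)
y_clean = x.copy()
y_out = x.copy()
y_out[-1] = 30.0                  # a single outlier at the right edge

# Least-squares fit WITH an intercept: design matrix [x, 1]
A = np.column_stack([x, np.ones_like(x)])
slope_clean, _ = np.linalg.lstsq(A, y_clean, rcond=None)[0]
slope_out, intercept_out = np.linalg.lstsq(A, y_out, rcond=None)[0]
# The outlier steepens the slope and pushes the intercept below zero,
# i.e. the fitted line is rotated relative to the clean fit.
```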

However, in your right plot the prediction error is so much smaller than the values of Y that it could simply be a numerical (machine-precision) issue.
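A small sketch of why scale matters here (my own example): in float32, the gap between representable numbers near 1e6 is 0.0625, so a "prediction error" of 0.01 at that scale sits below machine precision and simply vanishes.

```python
import numpy as np

big = np.float32(1.0e6)
tiny_error = np.float32(0.01)

# Adding an error smaller than the local float spacing changes nothing
unchanged = (big + tiny_error == big)

# Gap between big and the next representable float32 value
spacing = np.spacing(big)
```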

I hope this helps.