Regression. Learning rate: 0.001. Ratio of training to test data: 50%/50%.
I trained my neural network, and both the training error and the testing error started out at 0.120, then both decreased steadily until I reached 2,105 epochs. At the end, both the training error and the testing error were 0.006. Is this considered over-fitting, under-fitting, or did I make a mistake somewhere? I would also like to ask: if a good-fitting model has a low validation (testing) error that is still slightly higher than the training loss, how far apart should the two be? For example, validation (testing) loss = 0.012 and training loss = 0.005. Would a good-fitting model look similar to those numbers?
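For context, here is a minimal sketch of the kind of heuristic check I have in mind, using the numbers from my run and from my hypothetical example. The `rel_gap_threshold` and the "high loss" cutoff are arbitrary illustrative values I picked, not standard constants:

```python
def fit_diagnosis(train_loss, val_loss, rel_gap_threshold=1.0):
    """Label the fit from the relative gap between validation and
    training loss. Thresholds here are illustrative, not standard."""
    gap = val_loss - train_loss
    rel_gap = gap / train_loss if train_loss > 0 else float("inf")
    if rel_gap > rel_gap_threshold:
        # Validation loss is much worse than training loss.
        return "possible over-fitting"
    if train_loss > 0.1 and val_loss > 0.1:
        # Both losses are still high; "high" is problem-dependent.
        return "possible under-fitting"
    return "reasonable fit"

# My run: both errors ended at 0.006 after 2,105 epochs.
print(fit_diagnosis(train_loss=0.006, val_loss=0.006))
# My hypothetical pair: 0.005 training vs. 0.012 validation.
print(fit_diagnosis(train_loss=0.005, val_loss=0.012))
```

Is thinking about the train/validation gap in relative terms like this reasonable, or is the absolute difference what matters?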