I've implemented a multilayer perceptron, and at first the training method ran for a fixed number of epochs. I trained the network on the XOR logic gate, and most of the time it learned to solve the problem, but every once in a while it would learn only two of the four training examples and stay stuck on the other two.

At first I considered this unimportant, but later I changed the training method to stop once the error falls below some accepted error value that I choose. Now the training sometimes returns and sometimes gets stuck as described above. Is this normal? Is a multilayer perceptron simply going to fail to learn sometimes, or is this an error in my implementation?

If it matters, the implementation is in C++, and the multilayer perceptron is a standard feed-forward backpropagation network: 2 input neurons, 2 hidden-layer neurons, 1 output neuron.
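
For concreteness, here is a minimal self-contained sketch of the kind of setup I mean (not my actual code): a 2-2-1 sigmoid network trained on XOR with online backpropagation, stopping once the summed squared error drops below a chosen threshold. The sigmoid activation, learning rate, threshold, and epoch cap are all assumptions for illustration.

```cpp
#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <ctime>

// Hypothetical 2-HIDDEN-1 sigmoid network trained on XOR; a sketch of the
// setup described above, not the actual implementation.
const int    HIDDEN     = 2;      // hidden layer width
const double ETA        = 0.5;    // learning rate (assumed)
const double THRESHOLD  = 0.01;   // accepted summed squared error (assumed)
const int    MAX_EPOCHS = 100000; // epoch cap so a stuck run terminates

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }
double randWeight()      { return 2.0 * std::rand() / RAND_MAX - 1.0; }

int main() {
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    const double in[4][2]  = {{0,0},{0,1},{1,0},{1,1}};
    const double target[4] = {0, 1, 1, 0};

    double wh[HIDDEN][3];         // per hidden neuron: 2 input weights + bias
    double wo[HIDDEN + 1];        // output neuron: one weight per hidden + bias
    for (int j = 0; j < HIDDEN; ++j)
        for (int k = 0; k < 3; ++k) wh[j][k] = randWeight();
    for (int j = 0; j <= HIDDEN; ++j) wo[j] = randWeight();

    for (int epoch = 0; epoch < MAX_EPOCHS; ++epoch) {
        double sse = 0.0;
        for (int p = 0; p < 4; ++p) {
            // forward pass
            double h[HIDDEN];
            for (int j = 0; j < HIDDEN; ++j)
                h[j] = sigmoid(wh[j][0]*in[p][0] + wh[j][1]*in[p][1] + wh[j][2]);
            double net = wo[HIDDEN];                  // output bias
            for (int j = 0; j < HIDDEN; ++j) net += wo[j] * h[j];
            double out = sigmoid(net);

            // backward pass: online (per-pattern) gradient step
            double err  = target[p] - out;
            sse += err * err;
            double dOut = err * out * (1.0 - out);
            for (int j = 0; j < HIDDEN; ++j) {
                double dHid = dOut * wo[j] * h[j] * (1.0 - h[j]);
                wh[j][0] += ETA * dHid * in[p][0];
                wh[j][1] += ETA * dHid * in[p][1];
                wh[j][2] += ETA * dHid;               // bias weight
            }
            for (int j = 0; j < HIDDEN; ++j) wo[j] += ETA * dOut * h[j];
            wo[HIDDEN] += ETA * dOut;                 // bias weight
        }
        if (sse < THRESHOLD) {
            std::printf("converged after %d epochs (sse = %f)\n", epoch, sse);
            return 0;
        }
    }
    std::printf("stuck: sse never dropped below %f\n", THRESHOLD);
    return 1;
}
```

With HIDDEN = 2, a sketch like this occasionally hits the epoch cap instead of converging, which matches the behavior I'm describing.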

Should I be using two output neurons, and if so, what would the values be?

1 Answer

It turns out it was neither an error in my implementation nor an inherent property of multilayer perceptrons: I was able to fix the problem by adding two more hidden-layer neurons, even though I had heard a rule of thumb that the number of hidden-layer neurons should be kept under the number of input neurons.
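
In terms of the sketch from the question, that fix is a one-constant change (again, an illustration of the idea rather than my real code):

```cpp
// Widen the hidden layer from 2 to 4 neurons; wh and wo in the sketch are
// sized from HIDDEN, so no other code changes are needed there.
const int HIDDEN = 4;   // was 2
```

The extra hidden units appear to give gradient descent more ways around the bad regions of the XOR error surface, so runs get stuck far less often; an alternative that keeps the 2-2-1 topology is to reinitialize the weights and retrain whenever the epoch cap is hit.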