I am asking if Recurrent Neural Networks are a chain of Neural Networks.
Now, intuitively an RNN is a Neural Network with a feedback loop from its past outputs and, depending on the implementation, a feedback loop from the hidden layers to the next timestep's hidden layers and/or inputs.
Excluding the variant that links the hidden layers across timesteps, is this implementation any different from a chain of Neural Networks?
From my understanding, an implementation like this could be built as a chain of Neural Networks, where the input to each network is the data at the current timestep plus the output from the previous timestep.
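To make that concrete, here is a minimal sketch of what I mean, assuming a single tanh layer and NumPy (the names `W`, `b`, `step`, and `run_chain` are just illustrative, not from any library):

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, output_dim = 3, 2

# Shared weights: the same small network is reused at every timestep,
# which is what I mean by a "chain" of identical Neural Networks.
W = rng.normal(size=(output_dim, input_dim + output_dim))
b = np.zeros(output_dim)

def step(x_t, y_prev):
    """One feedforward pass: input = current data + previous output."""
    z = np.concatenate([x_t, y_prev])
    return np.tanh(W @ z + b)

def run_chain(xs):
    """Unroll the chain over a sequence, feeding each output back in."""
    y = np.zeros(output_dim)      # no past output at the first timestep
    outputs = []
    for x_t in xs:
        y = step(x_t, y)
        outputs.append(y)
    return outputs

sequence = rng.normal(size=(5, input_dim))   # 5 timesteps of toy data
print(run_chain(sequence))
```

So each link in the chain is the same network applied to the current timestep's data concatenated with the last output.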
Would you know if this intuition is correct? Or are there any differences between RNNs and ANNs that I am missing?
I have also asked this question of my professor, who specializes in Machine Learning; if anyone is curious, I can post his response once I get one.