I am developing a model in TensorFlow that attributes a continuous label to each time-step of a time-series. The model is intended to be used on real-time data, so the values of the time-series observed at previous time-steps should influence the label that the LSTM attributes to the current time-step. For this I am using tf.contrib.rnn.LSTMCell.
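For reference, this is roughly how the model is set up; the dimensions and layer sizes below are only illustrative, not my actual configuration:

```python
import tensorflow as tf

# Illustrative dimensions only.
num_units = 64      # LSTM hidden size
seq_len = 1440      # one day at minute resolution
n_features = 1      # one value per minute

# inputs: [batch, time, features]
inputs = tf.placeholder(tf.float32, [None, seq_len, n_features])

cell = tf.contrib.rnn.LSTMCell(num_units)
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

# One continuous label per time-step, squashed into [-1, 1] with tanh.
predictions = tf.layers.dense(outputs, 1, activation=tf.tanh)
```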
My data consists of a daily time-series with minute-to-minute resolution. The total length of the time-series is always the same. Below, in the left plot (in blue), you can find an example of what my data looks like.
I want to label my input time-series with a float value between +1 and -1, such that +1 corresponds to the maximum value of the time-series, -1 to the minimum, and any value in between to something in between. As such, I want the LSTM to predict, based on the values observed at previous time-steps, how likely it is that we are at the maximum or at the minimum of the time-series. An example of the target labels can be seen in the right plot above, in green.
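For concreteness, this is roughly how I generate the target labels; the toy series here is only for illustration:

```python
import numpy as np

def series_to_labels(series):
    """Min-max rescale a 1-D series to [-1, 1]: +1 at its maximum, -1 at its minimum."""
    lo, hi = series.min(), series.max()
    return 2.0 * (series - lo) / (hi - lo) - 1.0

# Toy daily series with 1440 minute-resolution samples.
toy_series = np.sin(np.linspace(0, 2 * np.pi, 1440)) + 0.1 * np.random.randn(1440)
labels = series_to_labels(toy_series)   # labels[t] is the target for time-step t
```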
Now, in order for this to be successful, I need to give the LSTM an estimate of what the initial label is. Otherwise, as can be seen above in red for the results I am currently obtaining, the first predicted labels are far off, since the LSTM has no previous context for the time-series.
Having said this, I am looking for opinions on the best way to initialize the LSTM. The LSTMCell constructor has an initializer argument, but as far as I can tell this only controls how the weight matrices are initialized, not the state, and in any case it does not establish what the initial label should be. So, for example, for the time-series shown above, how do I configure the LSTM so that the initial label it attributes is 0.75 (the value of the initial target label)?
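To show what I mean, the only hook for the state itself that I have found is the initial_state argument of tf.nn.dynamic_rnn, sketched below with illustrative dimensions. What I don't know is what values the state tuple should take, or whether this is even the right mechanism, so that the first prediction comes out at 0.75:

```python
import tensorflow as tf

num_units = 64  # illustrative
cell = tf.contrib.rnn.LSTMCell(num_units)

# The state can be fed explicitly through `initial_state` instead of the
# default zero state, e.g. via placeholders for the (c, h) state tuple:
c = tf.placeholder(tf.float32, [None, num_units])
h = tf.placeholder(tf.float32, [None, num_units])
initial_state = tf.contrib.rnn.LSTMStateTuple(c, h)

inputs = tf.placeholder(tf.float32, [None, None, 1])
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, initial_state=initial_state)
```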