
Proper way to feed time-series data to stateful LSTM?


Let's suppose I have a sequence of integers:

0, 1, 2, ...

and want to predict the next integer given the last 3 integers, e.g.:

[0, 1, 2] -> 3, [3, 4, 5] -> 6, etc.

Suppose I set up my model like so:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

batch_size = 1
time_steps = 3

model = Sequential()
model.add(LSTM(4, batch_input_shape=(batch_size, time_steps, 1), stateful=True))
model.add(Dense(1))
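(Just to make the shape conventions concrete, here is a quick sketch of my own, not from any tutorial: one batch is 1 sample of 3 time steps with 1 feature, and the model emits one scalar per sample.)

import numpy as np

x = np.array([[[0.0], [1.0], [2.0]]])   # shape (batch_size, time_steps, features) == (1, 3, 1)
print(model.predict(x).shape)           # (1, 1): one scalar prediction per sample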

It is my understanding that model has the following structure (please excuse the crude drawing):

[figure: my crude drawing of the LSTM unrolled over 3 time steps, with the previous states C_{t-1} and h_{t-1} entering as inputs]

First Question: is my understanding correct?

Note that I have drawn the previous states C_{t-1} and h_{t-1} entering the picture, since these are exposed when specifying stateful=True. In this simple "next integer prediction" problem, performance should improve by providing this extra information (as long as the previous state results from the previous 3 integers).

This brings me to my main question: it seems the standard practice (see, for example, this blog post and the TimeseriesGenerator Keras preprocessing utility) is to feed a staggered, overlapping set of inputs to the model during training.

For example:

batch0: [[0, 1, 2]]
batch1: [[1, 2, 3]]
batch2: [[2, 3, 4]]
etc.
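For concreteness, this is how I understand TimeseriesGenerator to produce these overlapping windows (my own sketch; the integer sequence is passed as a column vector and also used as its own targets):

import numpy as np
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

data = np.arange(10, dtype=np.float32).reshape(-1, 1)   # the sequence 0, 1, 2, ... as a column vector
gen = TimeseriesGenerator(data, data, length=3, batch_size=1)

for i in range(3):
    x, y = gen[i]
    print(x.reshape(-1), '->', y.reshape(-1))
# [0. 1. 2.] -> [3.]
# [1. 2. 3.] -> [4.]
# [2. 3. 4.] -> [5.]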

This has me confused, because it seems this requires the output of the 1st LSTM cell (the one corresponding to the 1st time step). See this figure:

From the tensorflow docs:

stateful: Boolean (default False). If True, the last state for each sample at index i in a batch will be used as initial state for the sample of index i in the following batch.

it seems this "internal" per-time-step state isn't available, and all that is exposed is the final state. See this figure:
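Put differently, in code (my own sketch, assuming the model defined above): after a predict, the layer's states attribute only seems to hold the h and c from the last time step of the window, one pair per sample in the batch.

import numpy as np
from tensorflow.keras import backend as K

x = np.arange(3, dtype=np.float32).reshape(1, 3, 1)   # one window: [0, 1, 2]
model.predict(x)

h, c = model.layers[0].states                          # final hidden and cell state only
print(K.get_value(h).shape, K.get_value(c).shape)      # (1, 4) and (1, 4): (batch_size, units)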

So, if my understanding is correct (which it clearly isn't), shouldn't we be feeding non-overlapping windows of samples to the model when using stateful=True? E.g.:

batch0: [[0, 1, 2]]
batch1: [[3, 4, 5]]
batch2: [[6, 7, 8]]
etc.
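For what it's worth, here is a minimal sketch of what I mean (my own code, nothing from the linked post): build non-overlapping windows, feed them in order with batch_size=1, and only reset the state between full passes over the sequence.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

batch_size, time_steps = 1, 3
seq = np.arange(100, dtype=np.float32)

# Non-overlapping windows: [0,1,2] -> 3, [3,4,5] -> 6, ...
starts = np.arange(0, len(seq) - time_steps, time_steps)
X = np.stack([seq[s:s + time_steps] for s in starts])[..., None]   # (n_windows, 3, 1)
y = seq[starts + time_steps][:, None]                              # (n_windows, 1)

model = Sequential()
model.add(LSTM(4, batch_input_shape=(batch_size, time_steps, 1), stateful=True))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

for epoch in range(10):
    model.reset_states()                                    # fresh state at the start of each pass
    for xb, yb in zip(X, y):
        model.train_on_batch(xb[None, ...], yb[None, ...])  # windows fed in order, one per batch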
