I am trying to migrate a seq2seq model to the timeserio framework. The model is at https://github.com/JEddy92/TimeSeries_Seq2Seq/blob/master/notebooks/TS_Seq2Seq_Conv_Full_Exog.ipynb.

For this seq2seq model, X_train and y_train have different numbers of rows, and I couldn't find a way to implement a pipeline that describes such a y_train. Is this kind of seq2seq model supported by timeserio?

Thank you in advance.
The underlying assumption in timeserio is that the number of rows in each generated batch is the same between X and y, as each row corresponds to one training example. If you are using MultiNetworkBase, a model wrapper that takes numpy arrays, then X and y can have different shapes after the first (batch) dimension, e.g. X.shape == (batch_size, input_seq_length, num_features) and y.shape == (batch_size, output_seq_length, num_outputs).
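As a rough sketch of that shape constraint (the sizes below are made up for illustration, not taken from the notebook above):

```python
import numpy as np

# Hypothetical sizes, chosen only to illustrate the constraint.
batch_size = 32
input_seq_length, num_features = 60, 4   # encoder input: 60 past timesteps, 4 features
output_seq_length, num_outputs = 14, 1   # decoder target: 14 future timesteps, 1 output

# X and y may differ in every dimension except the first (batch) one:
# one row in each array corresponds to one training example.
X = np.zeros((batch_size, input_seq_length, num_features))
y = np.zeros((batch_size, output_seq_length, num_outputs))

assert X.shape[0] == y.shape[0]
```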
If you are using it with a pandas pipeline in MultiModel, I recommend flattening the time and feature dimensions so that the data can be stored in a pandas DataFrame. We implement some batch generators that support this mode, e.g. SequenceForecastBatchGenerator (see http://tech.octopus.energy/timeserio/examples/SolarGenerationTimeSeries_Part2.html#Auto-regressive-model), but you may want to implement your own. The key point is that the raw data can have a different number of timesteps (e.g. if your y is at a lower sampling rate), but each generated batch must have one example per row in each of X and y.
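For example, flattening a batch of sequences into a one-row-per-example DataFrame could look roughly like this (the column naming scheme is purely illustrative, not a timeserio convention):

```python
import numpy as np
import pandas as pd

# Hypothetical batch: 32 examples, 60 input steps x 4 features, 14 target steps.
X = np.random.rand(32, 60, 4)
y = np.random.rand(32, 14, 1)

# Flatten everything after the batch dimension so each example becomes one row.
X_flat = X.reshape(len(X), -1)  # (32, 240)
y_flat = y.reshape(len(y), -1)  # (32, 14)

# Illustrative column names only.
x_cols = [f"x_t{t}_f{f}" for t in range(60) for f in range(4)]
y_cols = [f"y_t{t}" for t in range(14)]

df = pd.DataFrame(np.hstack([X_flat, y_flat]), columns=x_cols + y_cols)
# df now has one row per training example, as the pandas pipelines require.
```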