Masking time steps in order to use TCN for variable length sequences #240
@fsbashiri thanks for reporting! I'll propose an explanation; I'm not 100% sure, so feel free to challenge me. My hunch is that the TCN works a bit like an RNN, even though it has no internal state the way an LSTM does: the last outputs depend not only on the end of the sequence but also on its beginning.
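That intuition can be made concrete via the TCN's receptive field. A small sketch (the helper below is mine, not part of the keras-tcn API): each causal dilated convolution lets an output look `(kernel_size - 1) * dilation` steps further back, so the last output depends on time steps near the start of a short sequence.

```python
# Hypothetical helper (illustration only, not a keras-tcn function):
# receptive field of a stack of causal dilated convolutions.
def tcn_receptive_field(kernel_size, dilations, stacks=1):
    """Each dilated causal conv adds (kernel_size - 1) * dilation steps."""
    return 1 + stacks * sum((kernel_size - 1) * d for d in dilations)

# With kernel_size=2 and dilations 1, 2, 4, 8, the last output can "see"
# 16 time steps back, i.e. the whole of a short padded sequence.
print(tcn_receptive_field(2, [1, 2, 4, 8]))  # -> 16
```

So for sequences shorter than the receptive field, the final outputs really do depend on the beginning of the input, padding included.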
For your second point: this problem is mentioned in issue #89, where the author states that Keras' Conv1D lacks support for the Masking layer.
I will close this issue because Keras Conv layers can't handle masking well.
```python
import numpy as np
from tensorflow.keras.layers import Masking, Conv1D, TimeDistributed, Dense
from tensorflow.keras.models import Sequential

# Example input with padding at the end (post-padding)
mask_value = 0.0
x = np.random.rand(1, 5, 8)  # 1 sample, 5 time steps, 8 features
x_padded = np.append(x, mask_value * np.ones((1, 3, 8)), axis=1)  # padding at the end

# Build a model with masking
model = Sequential()
model.add(Masking(mask_value=mask_value))  # masking layer
model.add(Conv1D(filters=64, kernel_size=3, padding='causal', activation='relu'))
model.add(TimeDistributed(Dense(1)))  # Dense layer applied to each time step

# Test the model
out = model.predict(x_padded)
print(out)
```
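Since Keras Conv layers won't consume a mask, one common workaround is to skip the Masking layer and zero out padded time steps manually after the last layer. A minimal sketch in plain NumPy (`apply_mask` is a hypothetical helper, not a keras-tcn API):

```python
import numpy as np

# Zero out padded time steps manually instead of relying on a Masking layer.
def apply_mask(outputs, lengths):
    """outputs: (batch, time, features); lengths: true length of each sample."""
    batch, time, _ = outputs.shape
    # Boolean mask: True for real time steps, False for padding.
    mask = np.arange(time)[None, :] < np.asarray(lengths)[:, None]
    return outputs * mask[:, :, None]

out = np.ones((1, 8, 4))      # e.g. model output on a padded batch
masked = apply_mask(out, [5])  # only the first 5 steps are real
print(masked[0, :, 0])  # steps 5-7 are zeroed
```

The same mask can also be used to exclude padded steps from the loss, which is usually what masking is needed for in the first place.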
Describe the bug
In my project, I am using a TCN for sequence-to-sequence analysis of variable-length time series. I have defined a subclass of the Sequence class that pads each batch of data to its maximum sequence length (similar to what is suggested here). As for the model, I use a Masking layer to compute and pass a mask to the TCN (as suggested in issue #234). Supposedly, layers that support masking automatically propagate the mask to the next layer. In its simplest form, my model is a Masking layer, followed by a TCN, followed by a Dense layer with 1 unit.
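For reference, per-batch padding of the kind described above can be sketched as follows (the function and names are illustrative, not the reporter's actual Sequence subclass):

```python
import numpy as np

# Pad each batch only to the length of its own longest sequence,
# using mask_value so a Masking layer can identify the padded steps.
def pad_batch(sequences, mask_value=0.0):
    max_len = max(len(s) for s in sequences)
    n_features = sequences[0].shape[-1]
    out = np.full((len(sequences), max_len, n_features), mask_value)
    for i, s in enumerate(sequences):
        out[i, :len(s)] = s
    return out

batch = [np.ones((3, 2)), np.ones((5, 2))]
padded = pad_batch(batch)
print(padded.shape)  # (2, 5, 2)
```

Inside a `keras.utils.Sequence` subclass, `__getitem__` would call something like this on each batch before returning it.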
Here are two issues that I've got:
- `_keras_mask`

Paste a snippet
Please see the following simple code:
The output of the code:
Dependencies
I am using:
keras 2.4.3
keras-tcn 3.1.1
tensorflow-gpu 2.3.1