Hi! Thank you for sharing your code! I have a question about it. The input to the function `_project_features` has shape (N, L, D), and the output has the same shape. So why is this function needed? What is its purpose?
Hi, I have the same question. Have you figured it out? I found that the projected features are added to the attention term computed from the hidden state (see the function `_attention_layer`). @lcuyh @yunjey
I think `_project_features` transforms the feature dimension into the dimension of the hidden state, which is what the attention layer operates on. The dimension of VGG conv5_3 happens to equal h_dim, so the shapes look unchanged, but if we switch to a different feature extractor, the projection becomes necessary.
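For anyone else landing here, a minimal NumPy sketch of the shape bookkeeping described above (not the repo's actual TF code; the dimension values and variable names are made up for illustration):

```python
import numpy as np

# Hypothetical shapes: N images, L spatial locations, D feature channels,
# H decoder hidden units. In this repo D happens to equal H, so the
# projection preserves the shape, but it is still a learned linear map
# W of shape (D, H), not an identity.
N, L, D, H = 2, 196, 512, 512

rng = np.random.default_rng(0)
features = rng.standard_normal((N, L, D))   # CNN features, e.g. VGG conv5_3
W = rng.standard_normal((D, H)) * 0.01      # learned projection weights

# _project_features (sketch): flatten the spatial axis, apply the linear
# map, and restore the (N, L, H) shape so it can be combined with the
# hidden-state term inside the attention layer.
features_proj = (features.reshape(-1, D) @ W).reshape(N, L, H)

h = rng.standard_normal((N, H))             # decoder hidden state
# The attention layer adds the projected features to a term derived from
# the hidden state before scoring (simplified: the real _attention_layer
# also applies further learned maps and a softmax over L).
scores = np.tanh(features_proj + h[:, None, :])
print(features_proj.shape)  # (2, 196, 512)
```

With D == H the input and output shapes coincide, which is why the function looks redundant at first glance; the learned weights still matter.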