```python
apply_grads = [(grad, var) for (grad, var) in zip(gradients, var_list) if grad is not None]
opt.apply_gradients(apply_grads)
return generator_loss_supervised
```
the generator variables are added to `var_list`. This is unnecessary: `generator_loss_supervised` does not depend on these variables, so their gradients are zero (or `None`) and no update on these parameters takes place. This is probably inherited from the original TimeGAN implementation.
If I am correct, I suggest removing these variables from `var_list` for better code clarity.
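To illustrate why dropping these variables is safe, here is a minimal sketch (not the ydata-synthetic code; variable names are hypothetical): `tf.GradientTape.gradient` returns `None` for variables the loss does not depend on, and the `if grad is not None` filter then excludes them from `apply_gradients`, so they are never updated either way.

```python
import tensorflow as tf

used = tf.Variable(2.0, name="used")
unused = tf.Variable(5.0, name="unused")  # plays the role of the generator variables

with tf.GradientTape() as tape:
    loss = used ** 2  # the loss depends only on `used`

grads = tape.gradient(loss, [used, unused])
print(grads[0].numpy())  # 4.0, i.e. d(loss)/d(used) = 2 * used
print(grads[1])          # None: loss does not depend on `unused`

# Same filtering pattern as in train_supervisor:
opt = tf.keras.optimizers.SGD(learning_rate=0.1)
apply_grads = [(g, v) for (g, v) in zip(grads, [used, unused]) if g is not None]
opt.apply_gradients(apply_grads)

print(used.numpy())    # updated: 2.0 - 0.1 * 4.0 = 1.6
print(unused.numpy())  # unchanged: 5.0
```

Removing `unused` from the variable list produces the exact same updates, which is the point of the suggestion.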
(This is not a problem in the algorithm per se, but rather a matter of code clarity, so I am not tagging it as a bug.)
For reference, the snippet above is from the definition of the `train_supervisor` method, in `ydata-synthetic/src/ydata_synthetic/synthesizers/timeseries/timegan/model.py`, lines 159 to 169 at commit `1279e5e`.