A couple of quick questions, if possible! I'm looking to do a full finetune as an experiment with an i2v dataset I've been working on.
train.sh:
I noticed a 2e-5 learning rate. Was this the learning rate used for the entire full training run, or was a learning rate schedule applied? In other words, was 2e-5 the final learning rate, or the starting rate that was lowered over the course of training? I'm trying to understand whether 2e-5 is a reasonable starting learning rate for a finetune, or whether I should start closer to the 1e-4 seen in the LoRA config and slowly ramp it down to 2e-5 toward the end.
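For context, the kind of schedule I have in mind is a simple cosine decay from the LoRA rate down to 2e-5. This is just a hypothetical sketch of what I mean, not anything taken from this repo's training code:

```python
import math

def cosine_lr(step: int, total_steps: int,
              lr_start: float = 1e-4, lr_end: float = 2e-5) -> float:
    """Cosine-decay the learning rate from lr_start down to lr_end.

    Hypothetical helper for illustration only; lr_start/lr_end here
    are the LoRA-config and train.sh values being asked about.
    """
    progress = min(step / max(total_steps, 1), 1.0)  # fraction of training done
    return lr_end + 0.5 * (lr_start - lr_end) * (1.0 + math.cos(math.pi * progress))

# At step 0 the rate equals lr_start; by the final step it has decayed to lr_end.
```

Happy to use whatever schedule you actually ran instead, if it differs.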
train_batch_size is 1 — does this mean a batch size of 1 was used for the full training run? Should this be raised for a finetune?
Training resources
How many iterations/steps were run?
Roughly how much did the full training run cost, how long did it take, and on what hardware?