Gradient Accumulation with Dual (optimizer, scheduler) Training #897
Answered
celsofranssa asked this question in Q&A
Hello, Lightning community. I am using dual (optimizer, scheduler) training, as shown in the snippet below:

```python
def configure_optimizers(self):
    [...]
    return (
        {"optimizer": optimizer_1,
         "lr_scheduler": {"scheduler": scheduler_1, "interval": "step", "name": "scheduler_1"},
         "frequency": 1},
        {"optimizer": optimizer_2,
         "lr_scheduler": {"scheduler": scheduler_2, "interval": "step", "name": "scheduler_2"},
         "frequency": 1},
    )
```

Given this setup, is there an approach to combine gradient accumulation with dual (optimizer, scheduler) training?
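For context, gradient accumulation is normally configured on the Trainer rather than in `configure_optimizers`. A minimal example with a single optimizer (the window of 4 batches is illustrative):

```python
import pytorch_lightning as pl

# Accumulate gradients over 4 batches before each optimizer step
# (standard single-optimizer, automatic-optimization case).
trainer = pl.Trainer(accumulate_grad_batches=4)
```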
Answered by celsofranssa on Oct 4, 2022
Replies: 1 comment
Moved to PL discussion.
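Since the accepted answer only points to the Lightning discussion board, no solution is recorded in this thread. One commonly suggested direction is to switch to manual optimization and accumulate gradients by hand. Below is a minimal sketch, not an official answer: `compute_loss` is a hypothetical helper, the `accumulate_grad_batches` hyperparameter is illustrative, and the schedulers are stepped explicitly because Lightning does not step them automatically under manual optimization.

```python
import pytorch_lightning as pl


class DualOptimizerModule(pl.LightningModule):
    def __init__(self, accumulate_grad_batches: int = 4):
        super().__init__()
        # Manual optimization gives full control over when each optimizer steps.
        self.automatic_optimization = False
        self.accumulate_grad_batches = accumulate_grad_batches

    def training_step(self, batch, batch_idx):
        opt_1, opt_2 = self.optimizers()
        sch_1, sch_2 = self.lr_schedulers()

        loss = self.compute_loss(batch)  # hypothetical loss helper
        # Scale the loss so the accumulated gradient matches one full effective batch.
        self.manual_backward(loss / self.accumulate_grad_batches)

        # Step both optimizers and their "step"-interval schedulers every N batches.
        if (batch_idx + 1) % self.accumulate_grad_batches == 0:
            opt_1.step()
            opt_2.step()
            opt_1.zero_grad()
            opt_2.zero_grad()
            sch_1.step()
            sch_2.step()

        return loss
```

Note that the `frequency` keys from `configure_optimizers` only take effect under automatic optimization, so any alternation between the two optimizers would also have to be handled inside `training_step`.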
Answer selected by celsofranssa