Enable Gradient Accumulation fix across all models + trainer fully in forward() #34283

Open

wants to merge 8 commits into base: main

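For context, the gradient accumulation issue referenced in the title is that taking the mean loss per micro-batch and then accumulating does not match training on one large batch when micro-batches contain different numbers of labeled tokens; normalizing every micro-batch by the token count of the whole accumulation window avoids that mismatch. The sketch below is only an illustration of that idea, not this PR's implementation; the helper name `accumulation_step` and the data layout are hypothetical, and causal-LM label shifting is omitted.

```python
# Minimal sketch of loss normalization under gradient accumulation.
# Hypothetical helper; assumes each micro-batch dict has "input_ids" and
# "labels" tensors with -100 marking ignored positions, and an HF-style
# model output with a .logits attribute.
import torch
import torch.nn.functional as F


def accumulation_step(model, optimizer, micro_batches):
    # Count labeled tokens across the whole accumulation window first, so
    # every micro-batch is divided by the same denominator.
    num_items_in_batch = sum((mb["labels"] != -100).sum() for mb in micro_batches)

    optimizer.zero_grad()
    for mb in micro_batches:
        logits = model(mb["input_ids"]).logits
        # Sum (not mean) the per-token loss, then normalize by the global count;
        # gradients from each micro-batch add up via repeated backward().
        loss = F.cross_entropy(
            logits.view(-1, logits.size(-1)),
            mb["labels"].view(-1),
            ignore_index=-100,
            reduction="sum",
        ) / num_items_in_batch
        loss.backward()
    optimizer.step()
```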
Commits on Oct 22, 2024

  1. 0bfce6e
  2. handle peft case (muellerzr) 8c12bf4
  3. 058fe34
  4. Use accelerator state (muellerzr) fc6d674
  5. Quality (muellerzr) 0aeb5ac
  6. Guard (muellerzr) 49b29d2
  7. 4f3f86d
  8. Fairseq only (muellerzr) 58ee680