Support for seq2seq models #29
Comments
Hi, I also want to apply xLoRA to T5. Did you try to do that?

@yuxiang-guo Yeah, I did, but I hit a tensor-related error during the forward pass. I tried debugging it but couldn't pin down exactly what the problem was.
I was trying to use xLoRA to combine Flan-T5 LoRAs and ran into an error inside apply_scalings_to_x. Does xLoRA support seq2seq models such as Flan-T5 and BART?
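Without the full traceback it is hard to say for sure, but one plausible failure mode in an encoder-decoder model is a sequence-length mismatch: xLoRA-style gating computes per-token scalings, and in a seq2seq model the cross-attention key/value projections see encoder-length activations while the scalings may have been computed over the decoder length. The sketch below is hypothetical and not xlora's actual implementation; apply_scalings_to_x here is a simplified stand-in that assumes a (batch, seq_len, n_adapters) scalings layout, purely to illustrate the kind of shape error that can surface.

```python
import numpy as np

# Hypothetical stand-in for xlora's scaling application (NOT the real code):
# multiply one adapter's output by its learned per-token scaling.
def apply_scalings_to_x(x, scalings, adapter_idx):
    # scalings assumed to be (batch, seq_len, n_adapters)
    return x * scalings[:, :, adapter_idx:adapter_idx + 1]

batch, n_adapters, dim = 2, 3, 4
dec_len, enc_len = 5, 7  # seq2seq: decoder and encoder lengths differ

scalings = np.ones((batch, dec_len, n_adapters))  # computed on the decoder side
self_attn_out = np.ones((batch, dec_len, dim))    # decoder self-attention LoRA output

# Same sequence length everywhere (the decoder-only case): broadcasts fine.
ok = apply_scalings_to_x(self_attn_out, scalings, 0)
assert ok.shape == (batch, dec_len, dim)

# Cross-attention k/v projections see encoder-length activations, so the
# decoder-length scalings no longer broadcast against them.
cross_attn_out = np.ones((batch, enc_len, dim))
try:
    apply_scalings_to_x(cross_attn_out, scalings, 0)
except ValueError as e:
    print("shape mismatch:", e)
```

If this is what is happening, a fix would need the scaling head to be aware of which sequence (encoder or decoder) a given LoRA layer operates on, rather than assuming a single shared length.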