TypeError: LoraConfig.__init__() got an unexpected keyword argument 'exclude_modules' #2208
Comments
This is a little weird, since exclude_modules shouldn't exist in v0.13.2.
I tried to load another model and it seemed to work fine
Did you happen to train the LoRA by installing PEFT from source? exclude_modules is only available when you install PEFT from the main branch; it's not in 0.13.2 yet. Can you double check once?
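For reference, a quick way to check which PEFT version is actually installed:

```python
# exclude_modules support is only on the main branch, not in the
# 0.13.2 release, so the installed version is the first thing to verify.
import peft

print(peft.__version__)  # prints "0.13.2" for the release in question
```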
@JINO-ROHIT what do you mean?
The issue, I think, is here: it's 'embed_token', not 'embed_tokens', right?
Have you tried passing exclude_modules anywhere during LoRA fine-tuning with a from-source PEFT installation?
@JINO-ROHIT I didn't install from source, I just removed this part.
Removing that part works?
@JINO-ROHIT Yes.
Ahh okay.
But the Qwen model is showing this:
Not quite sure what you mean here, mind sharing the whole code?
@JINO-ROHIT is there any way to handle and save this part for each model?
I'm not quite following. Are you having trouble using the fine-tuned LoRA adapter during inference, or are you having trouble during fine-tuning?
Talking about this:
And check this:
So these two are different.
Facing issues with inference, getting this error.
Can you try installing from source and see if that fixes the issue?
Let me try.
@JINO-ROHIT Now facing this issue:
Can you make sure you did training and inference using the same version?
Yeah, but the issue is related to this, am I right?
If I remove this line, the fine-tuned LoRA works with PEFT for inference.
I'm really unsure at this point, sorry. Give me some time, I'll try and debug this.
OK, sure. But try a small Qwen2.5 model with 10 steps.
This error occurs because the LoRA adapter that you're trying to load was trained with a from-source install of PEFT, which already includes the feature: https://huggingface.co/FINGU-AI/Qwen2.5-7b-lora-e-6/blob/main/adapter_config.json#L6 You have two options: 1. you upgrade your PEFT version to a from-source install too, as @JINO-ROHIT suggested, or 2. you edit the adapter_config.json and remove the exclude_modules entry.
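A minimal sketch of option 2, assuming the adapter has been downloaded locally (the path below is hypothetical):

```python
import json

# Hypothetical path to the locally downloaded adapter
config_path = "Qwen2.5-7b-lora-e-6/adapter_config.json"

with open(config_path) as f:
    config = json.load(f)

# Drop the key that the 0.13.2 release does not recognize yet
config.pop("exclude_modules", None)

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)
```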
This error most likely occurs because the checkpoint was trained with a resized vocabulary, probably to add more special tokens. This is why we need modules_to_save here.
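For context, a training setup with a resized vocabulary looks roughly like this sketch; the model name, special token, and target modules are illustrative assumptions, not taken from this thread:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "Qwen/Qwen2.5-7B"  # illustrative; any causal LM works the same way
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Adding special tokens resizes the vocabulary ...
tokenizer.add_special_tokens({"additional_special_tokens": ["<my_token>"]})
model.resize_token_embeddings(len(tokenizer))

# ... so the embedding and output layers must be stored in full alongside
# the LoRA weights, via modules_to_save:
config = LoraConfig(
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "v_proj"],  # assumed targets for illustration
    modules_to_save=["embed_tokens", "lm_head"],
)
model = get_peft_model(model, config)
```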
@BenjaminBossan That modules_to_save is not working for the Qwen2.5 model.
I don't understand why you think so. Do you get an error specifically related to modules_to_save?
No,
I'm not really sure why you define a LoRA config. When loading the adapter, that step is not necessary.
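Loading then looks roughly like this; the configuration is read from the adapter's own adapter_config.json, so no LoraConfig has to be constructed by hand (the base model name is an assumption):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B")  # assumed base model
# adapter_config.json is read from the adapter repo; no LoraConfig needed here
model = PeftModel.from_pretrained(base, "FINGU-AI/Qwen2.5-7b-lora-e-6")
```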
System Info
Name: peft
Version: 0.13.2
Who can help?
When I try to load the adapter for inference, it shows the following error:
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'exclude_modules'
Information
Tasks
An officially supported task in the examples folder
Reproduction
Expected behavior
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'exclude_modules'