System Info

`transformers` version: 4.45.2

Who can help?

@BenjaminBossan

Reproduction

```python
from peft import LoraConfig

LoraConfig(init_lora_weights="loftq")
```

According to the code in `peft.tuners.lora.config`, this should raise a `ValueError`:

```python
if self.loftq_config is None:
    raise ValueError("`loftq_config` must be specified when `init_lora_weights` is 'loftq'.")
```

However, the `loftq_config` field of `LoraConfig` is defined with a dict as its default value, so it is never `None` and the check never fires.

Expected behavior

A `ValueError` should be raised when `init_lora_weights="loftq"` but `loftq_config` is not specified.
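A minimal, self-contained sketch of the pattern described above (illustrative only, not PEFT's actual code): when a dict-valued field gets an empty-dict default, an `is None` guard is unreachable, while a truthiness check catches the unspecified case.

```python
from dataclasses import dataclass, field


@dataclass
class SketchLoraConfig:
    """Illustrative stand-in for LoraConfig; names mirror the issue, not the library."""
    init_lora_weights: str = "true"
    # Defaults to {} rather than None, just like the field described in the issue.
    loftq_config: dict = field(default_factory=dict)

    def __post_init__(self):
        if self.init_lora_weights == "loftq":
            # Buggy guard from the issue: loftq_config is {} by default,
            # never None, so this branch can never be taken.
            if self.loftq_config is None:
                raise ValueError("unreachable")
            # One possible fix: test for emptiness instead of None.
            if not self.loftq_config:
                raise ValueError(
                    "`loftq_config` must be specified when "
                    "`init_lora_weights` is 'loftq'."
                )
```

With this change, `SketchLoraConfig(init_lora_weights="loftq")` raises, while passing a non-empty `loftq_config` dict constructs the config normally.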