Can't run the provided llama2 example #59
Comments
Hi there -- what version of transformers are you running? This looks like an error inside the transformers library. Are you able to run llama without unlimiformer?
The transformers version is 4.36.0, and this also happens with the latest version, 4.36.2.
Downgrading to 4.27.0 should (likely) fix this for now; I'll look into the issue with the newer version.
Thanks, it works with version 4.28.0.
Hi, is it possible to have a quick fix for this issue in the new version? It seems that downgrading transformers to a version before 4.35.0 causes several Rust compiler errors, so it would be more convenient to use the most recent release. Thank you.
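Until the newer transformers releases are supported, one way to avoid the crash is to check the installed version up front. The following is only a minimal sketch of such a guard (an assumed workaround, not part of unlimiformer); the 4.35.0 cutoff and the 4.28.0 pin are taken from the discussion above.

```python
# Minimal sketch of a version guard (assumed workaround, not an official fix):
# refuse to run the example if the installed transformers release postdates the
# cache-format change discussed in this thread.
import transformers
from packaging import version  # packaging ships as a transformers dependency

installed = version.parse(transformers.__version__)
if installed >= version.parse("4.35.0"):
    raise RuntimeError(
        f"transformers {installed} is too new for this unlimiformer example; "
        "install an older release, e.g. pip install 'transformers==4.28.0'"
    )
```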
I got this error message while trying to run the prompt in the README file:

```
File "python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 402, in forward
    kv_seq_len += past_key_value.get_usable_length(kv_seq_len, self.layer_idx)
AttributeError: 'list' object has no attribute 'get_usable_length'
```
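The AttributeError comes from the cache-format change in recent transformers: since 4.36, LlamaAttention expects `past_key_values` to be a `Cache` object exposing `get_usable_length`, while older callers (including the code path that triggers this error) still pass the legacy list of per-layer `(key, value)` tuples. The snippet below only illustrates that mismatch, assuming transformers >= 4.36; it is not unlimiformer's fix, and the tensor shapes are made-up dummies.

```python
# Illustration of the legacy-list vs. Cache mismatch, assuming transformers >= 4.36.
import torch
from transformers.cache_utils import DynamicCache

# Legacy format: one (key, value) pair per layer, shaped (batch, heads, seq, head_dim).
# The shapes here are arbitrary dummies for a single layer.
legacy_past = [(torch.zeros(1, 32, 8, 128), torch.zeros(1, 32, 8, 128))]

# Calling get_usable_length on the plain list reproduces the AttributeError above;
# wrapping the list in the new Cache API is what modeling_llama.py now expects.
cache = DynamicCache.from_legacy_cache(legacy_past)
print(cache.get_usable_length(8, layer_idx=0))  # -> 8, instead of an AttributeError
```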