I followed the instructions to set up the environment, but encountered an error when trying to load the model:
```
Traceback (most recent call last):
  File "/root/llm-project/TinyLLaVA_Factory/mycode/inference.py", line 10, in <module>
    model = AutoModelForCausalLM.from_pretrained(hf_path, trust_remote_code=True)
  File "/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 550, in from_pretrained
    model_class = get_class_from_dynamic_module(
  File "/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 501, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module)
  File "/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 201, in get_class_in_module
    module = importlib.machinery.SourceFileLoader(name, module_path).load_module()
  File "<frozen importlib._bootstrap_external>", line 548, in _check_name_wrapper
  File "<frozen importlib._bootstrap_external>", line 1063, in load_module
  File "<frozen importlib._bootstrap_external>", line 888, in load_module
  File "<frozen importlib._bootstrap>", line 290, in _load_module_shim
  File "<frozen importlib._bootstrap>", line 719, in _load
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/root/llm-project/utils/models/modules/transformers_modules/tinyllava/TinyLLaVA-Phi-2-SigLIP-3.1B/a98601f69e72442f71721aefcfbcdce26db8982a/modeling_tinyllava_phi.py", line 27, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, PhiForCausalLM
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1501, in __getattr__
    value = getattr(module, name)
  File "/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1500, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1512, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.phi.modeling_phi because of the following error (look up to see its traceback):
/root/anaconda3/envs/tinyllava_factory/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZNK3c105Error4whatEv
```
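For context, the environment was set up following the repo instructions. The exact commands are not shown here; the block below is only a sketch of the usual TinyLLaVA_Factory README pattern (repo URL and steps assumed):

```bash
# Assumed setup sketch, following the typical TinyLLaVA_Factory README steps.
git clone https://github.com/TinyLLaVA/TinyLLaVA_Factory.git
cd TinyLLaVA_Factory
conda create -n tinyllava_factory python=3.10 -y
conda activate tinyllava_factory
pip install -e .
# An unpinned flash-attn install pulls the newest prebuilt wheel, which is the
# likely source of the undefined-symbol error above.
pip install flash-attn --no-build-isolation
```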
The code I use:
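The original code block is not shown here; the following is a minimal sketch reconstructed from line 10 of the traceback, with `hf_path` assumed to be the Hugging Face checkpoint id taken from the cached module path:

```python
# inference.py — minimal reconstruction from the traceback; not the full script.
from transformers import AutoModelForCausalLM

# Checkpoint id assumed from transformers_modules/tinyllava/... in the traceback.
hf_path = "tinyllava/TinyLLaVA-Phi-2-SigLIP-3.1B"

# trust_remote_code=True makes transformers download and import the checkpoint's
# modeling_tinyllava_phi.py, whose PhiForCausalLM import triggers the failing
# flash-attn load.
model = AutoModelForCausalLM.from_pretrained(hf_path, trust_remote_code=True)
```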
How can I solve this?

Specifying the version of flash-attn to install can resolve this issue. The undefined symbol `_ZNK3c105Error4whatEv` demangles to `c10::Error::what() const`, which suggests the prebuilt flash-attn binary was compiled against a different libtorch ABI: it seems the latest version of flash-attn is incompatible with the torch==2.0.1 specified in pyproject.toml. I was able to resolve the problem using the following version:
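(The exact version pin is missing above; `flash-attn==2.3.6` below is only an illustrative example, not the confirmed version from this thread. Check the flash-attn releases for a wheel built against torch 2.0.1 and your CUDA toolkit.)

```bash
pip uninstall -y flash-attn
# Illustrative pin only; --no-build-isolation compiles the extension against
# the torch already installed in the environment.
pip install flash-attn==2.3.6 --no-build-isolation
```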