
Hum-Works/lodestone-base-4096-v1 fails #1658

Open
Muennighoff opened this issue Jan 1, 2025 · 1 comment
Comments

@Muennighoff
Contributor

2025-01-01 04:36:02.380968: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:485] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2025-01-01 04:36:02.394677: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:8454] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2025-01-01 04:36:02.398548: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1452] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
INFO:mteb.cli:Running with parameters: Namespace(model='Hum-Works/lodestone-base-4096-v1', task_types=None, categories=None, tasks=['ClimateFEVERHardNegatives'], languages=None, benchmarks=None, device=None, output_folder='/data/niklas/results/results', verbosity=2, co2_tracker=True, eval_splits=None, model_revision=None, batch_size=32, overwrite=False, save_predictions=False, func=<function run at 0x7fb929fa4f70>)
WARNING:mteb.model_meta:Loader not specified for model Hum-Works/lodestone-base-4096-v1, loading using sentence transformers.
Traceback (most recent call last):
  File "/env/lib/conda/gritkto/bin/mteb", line 8, in <module>
    sys.exit(main())
  File "/data/niklas/mteb/mteb/cli.py", line 387, in main
    args.func(args)
  File "/data/niklas/mteb/mteb/cli.py", line 123, in run
    model = mteb.get_model(args.model, args.model_revision, device=device, trust_remote_code=True)
  File "/data/niklas/mteb/mteb/models/overview.py", line 150, in get_model
    model = meta.load_model(**kwargs)
  File "/data/niklas/mteb/mteb/model_meta.py", line 128, in load_model
    model: Encoder = loader(**kwargs) # type: ignore
  File "/data/niklas/mteb/mteb/model_meta.py", line 40, in sentence_transformers_loader
    return SentenceTransformerWrapper(model=model_name, revision=revision, **kwargs)
  File "/data/niklas/mteb/mteb/models/sentence_transformer_wrapper.py", line 38, in __init__
    self.model = SentenceTransformer(model, revision=revision, **kwargs)
  File "/env/lib/conda/gritkto/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 306, in __init__
    modules, self.module_kwargs = self._load_sbert_model(
  File "/env/lib/conda/gritkto/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 1722, in _load_sbert_model
    module = module_class(model_name_or_path, cache_dir=cache_folder, backend=self.backend, **kwargs)
  File "/env/lib/conda/gritkto/lib/python3.10/site-packages/sentence_transformers/models/Transformer.py", line 76, in __init__
    self._load_model(model_name_or_path, config, cache_dir, backend, **model_args)
  File "/env/lib/conda/gritkto/lib/python3.10/site-packages/sentence_transformers/models/Transformer.py", line 108, in _load_model
    self.auto_model = AutoModel.from_pretrained(
  File "/env/lib/conda/gritkto/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 557, in from_pretrained
    cls.register(config.__class__, model_class, exist_ok=True)
  File "/env/lib/conda/gritkto/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 584, in register
    raise ValueError(
ValueError: The model class you are passing has a config_class attribute that is not consistent with the config class you passed (model has <class 'transformers.models.bert.configuration_bert.BertConfig'> and you passed <class 'transformers_modules.Hum-Works.lodestone-base-4096-v1.9bbc2d0b57dd2198aea029404b0f976712a7d966.configuration_bert.BertConfig'>. Fix one of those so they match!
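
For reference, the check that raises this error lives in the register() path of transformers' auto_factory. A minimal sketch of why it trips (an approximation of the library's logic, not its exact source): BertModel.config_class is the stock transformers BertConfig, while the config resolved from the repo's remote code is a different class object that merely shares the name BertConfig, so the consistency check fails.

# Hedged approximation of the consistency check that produces the ValueError
# above; RemoteBertConfig is a hypothetical stand-in for the repo's remote
# configuration_bert.BertConfig.
from transformers import BertConfig, BertModel


class RemoteBertConfig(BertConfig):
    """Stand-in for the model repo's remote configuration_bert.BertConfig."""


def register(config_class, model_class):
    # The two classes share the name "BertConfig" but are different class
    # objects, so the comparison fails.
    if hasattr(model_class, "config_class") and model_class.config_class != config_class:
        raise ValueError(
            "The model class you are passing has a config_class attribute that is not "
            f"consistent with the config class you passed (model has {model_class.config_class} "
            f"and you passed {config_class}. Fix one of those so they match!"
        )


register(RemoteBertConfig, BertModel)  # raises ValueError, mirroring the traceback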

@isaac-chung
Collaborator

I am able to reproduce this error by running the model's example script (below) with:

  • python3.12
  • torch==2.5.1
  • torchvision==0.20.1
  • sentence-transformers==3.3.1

from sentence_transformers import SentenceTransformer

model = SentenceTransformer('Hum-Works/lodestone-base-4096-v1', trust_remote_code=True, revision='v1.0.0')
sentences = ["This is an example sentence", "Each sentence is converted"]
embeddings = model.encode(sentences)
print(embeddings)
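
Since the failing check sits inside transformers' auto_factory and the installed transformers version is not listed above, it may be worth recording the relevant package versions alongside the repro, for example:

# Record the versions that matter for this failure; the check that raises the
# ValueError lives in transformers, whose behavior varies across releases.
import sentence_transformers
import torch
import transformers

print("torch:", torch.__version__)
print("sentence-transformers:", sentence_transformers.__version__)
print("transformers:", transformers.__version__)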
