Fix: minicpmv2 #1705

Merged

merged 3 commits into main from fix_minicpmv2 on Jan 6, 2025

Conversation

@Samoed (Collaborator) commented on Jan 4, 2025

I haven't checked performance, because the model was only evaluated on CMTEB, which includes very large tasks. Additionally, the authors probably ran it with instructions, but this needs to be verified. I also commented out flash_attention_2, because I tested the model on Kaggle and flash_attention_2 does not support the T4 GPU.
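As an illustration of the kind of loading logic this implies (not the PR's actual diff), here is a minimal sketch that only requests flash_attention_2 when the flash-attn package is importable and otherwise falls back to "sdpa"; the model name and the trust_remote_code flag are assumptions based on openbmb/MiniCPM-Embedding.

```python
# Hedged sketch: prefer flash_attention_2 only when the flash-attn package is
# available (it is not usable on a T4), otherwise fall back to "sdpa".
import importlib.util

from transformers import AutoModel

model_name = "openbmb/MiniCPM-Embedding"  # model from the linked issue (#1696)

attn_impl = (
    "flash_attention_2"
    if importlib.util.find_spec("flash_attn") is not None
    else "sdpa"
)

model = AutoModel.from_pretrained(
    model_name,
    trust_remote_code=True,        # assumption: the model ships custom modeling code
    attn_implementation=attn_impl,
)
```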

Closes #1696

Checklist

  • Run tests locally to make sure nothing is broken using make test.
  • Run the formatter to format the code using make lint.

Adding a model checklist

  • I have ensured that my model can be loaded using mteb.get_model(model_name) (a minimal load check is sketched below).
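A minimal sketch of that load check, assuming the model is registered in mteb under its Hugging Face name (the exact registry name is an assumption here):

```python
import mteb

# Load the model through mteb's registry; this is the check from the checklist above.
model = mteb.get_model("openbmb/MiniCPM-Embedding")
print(type(model))
```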

@KennethEnevoldsen merged commit 222bb35 into main on Jan 6, 2025
10 checks passed
@KennethEnevoldsen deleted the fix_minicpmv2 branch on January 6, 2025 at 15:43

Successfully merging this pull request may close these issues.

openbmb/MiniCPM-Embedding still fails