Response generated with Mistral 7B Instruct for NER task contains only one named entity at the most. #380
Replies: 2 comments
-
Hi @siddharthtumre, thanks for reporting this. I've converted this to a discussion, as this is not a bug per se. As you can see from the IO you included in your post, the underlying problem is that Mistral just doesn't respond with more than one entity (I could reproduce this). It's possible that a differently written prompt might yield better results with Mistral, but maintaining a separate prompt version for each model isn't sustainable on our side. Unfortunately, the only things I can recommend are to a) experiment with the prompt or b) use a different model. In the long term we'd like to guarantee that all our tasks/prompts work properly with all models, but at the moment the differences in model capabilities are too significant for that.
-
Mistral works better with NER v2 than NER v3.
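Switching between task versions is a one-line change in the task block of the config. A minimal sketch, assuming the registered task names `spacy.NER.v2`/`spacy.NER.v3` from spacy-llm's task registry and the entity labels mentioned later in this thread:

```ini
[components.llm.task]
# Swap "spacy.NER.v3" for "spacy.NER.v2" to try the older prompt version
@llm_tasks = "spacy.NER.v2"
labels = ["ORG", "PRODUCT"]
```

The two versions use differently structured prompts under the hood, which is why some models respond better to one than the other.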
-
I have run spacy-llm on an NER task to identify organisation and product entities with the Mistral 7B Instruct model.
In the LLM response I can see that at most a single named entity is identified. I have checked this with multiple examples and the result is the same.
Also, the explanation generated by the LLM is trimmed in every output I have seen.
Here is my config.cfg
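(The original config.cfg was not captured in this export. For readers reproducing the setup, a minimal sketch of what such a config might look like follows; the model registry name `spacy.Mistral.v1` and the HuggingFace model name `Mistral-7B-Instruct-v0.1` are assumptions based on spacy-llm's documented model registry, not the poster's actual settings.)

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
# NER v3 task with the entity labels described in the post
@llm_tasks = "spacy.NER.v3"
labels = ["ORG", "PRODUCT"]

[components.llm.model]
# Assumed registry entry for a locally run Mistral instruct model
@llm_models = "spacy.Mistral.v1"
name = "Mistral-7B-Instruct-v0.1"
```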