Langchain integration #281

batrasrishti asked this question in Q&A
Answered by rmitsch
-
Hello,
Answered by rmitsch on Sep 7, 2023
-
Hi @batrasrishti!
Answer selected by rmitsch
-
Hello batrasrishti,

Yes, you can use such models with spacy_llm by leveraging LlamaCpp from LangChain! Below I'll show how to set this up, with an example of using it for Named Entity Recognition (NER).

Here's the `langchain_llm_ner.cfg` configuration file that loads the model and sets up the NER task:

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["PERSON", "ORGANISATION", "LOCATION"]

[components.llm.model]
@llm_models = "langchain.LlamaCpp.v1"
name = "mistralai_mistral-7b-instruct-v0.1"
config = {"model_path": "models/mistral-7b-instruct-v0.1.Q4_0.gguf", "temperature": 0.0}
```

And the Python code to assemble and run the pipeline:

```python
from spacy_llm.util import assemble

nlp = assemble("langchain_llm_ner.cfg")
content = "Jack and Jill rode up the hill in Les Deux Alpes"
doc = nlp(content)
print([(ent.text, ent.label_) for ent in doc.ents])
```
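As a quick sanity check before assembling the pipeline, note that the cfg file is INI-style, so you can inspect its sections with Python's stdlib `configparser`. This is a rough check only; spaCy's own config system does the real parsing and validation, and the model path and name below are just the ones from the example above:

```python
from configparser import ConfigParser

# A copy of langchain_llm_ner.cfg (paths/names taken from the example above).
CFG = """
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["PERSON", "ORGANISATION", "LOCATION"]

[components.llm.model]
@llm_models = "langchain.LlamaCpp.v1"
name = "mistralai_mistral-7b-instruct-v0.1"
config = {"model_path": "models/mistral-7b-instruct-v0.1.Q4_0.gguf", "temperature": 0.0}
"""

parser = ConfigParser()
parser.read_string(CFG)
print(parser.sections())
# Values come back as raw strings; spaCy interprets the quoting/JSON itself.
print(parser["components.llm.model"]["@llm_models"])
```

This catches typos like a misspelled section header or a missing `=` before you pay the cost of loading the model.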
Hi @batrasrishti!

You can pass any `config` parameter to the `langchain` model via the `spacy-llm` config. I've never tried .ggml models via `langchain`, but I don't see a reason why this shouldn't work. If you encounter a particular problem, let us know.

`@` is used for registries. See the docs on the spaCy config system. For running Llama 2 without `langchain`, there is also `spacy.Llama2.v1`. See the docs.