[Question]: How can I use huggingface model to complete function calling? #17553
Comments
Hello, @kxu-prosera! I'm here to assist you with any bugs or questions you have while waiting for a human maintainer. Let's tackle this issue together!

I found a similar unsolved discussion titled "llama-index tools just for open LLMs?" which might be relevant to your question [1].

Regarding your question: LlamaIndex does support function calling with LLMs other than OpenAI, but there are some limitations and specific requirements when using Hugging Face models. Currently, Hugging Face models do not support function-calling messages directly within the LlamaIndex framework [2]. This means that while you can use Hugging Face models for tasks like text generation, function calling is not supported in the same way as with OpenAI models. If you are looking to implement function calling with Hugging Face models, you might need to explore alternative approaches or wait for future updates that include this feature.
I don't think it's supported directly in the inference API class, although it's implemented in many other classes (local HuggingFaceLLM, Ollama, Anthropic, Bedrock Converse, OpenAI, etc.). But isn't Hugging Face OpenAI-compatible these days? Pretty sure you can use the OpenAI client directly (I might be confusing their serverless vs. dedicated endpoints, though).
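For context on the OpenAI-compatibility suggestion above: function calling in the OpenAI-style chat-completions API is driven by a `tools` array in the request body. The sketch below builds that standard payload shape with plain Python, so you can see what an OpenAI-compatible endpoint would have to accept. The `multiply` tool and the model name are illustrative placeholders; whether a given Hugging Face endpoint actually honors `tools` depends on the deployment, as discussed in this thread.

```python
def build_function_calling_request(model: str, user_message: str) -> dict:
    """Build a minimal OpenAI-compatible chat-completions request body
    that advertises one tool. Placeholder tool for illustration only."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "multiply",
                    "description": "Multiply two integers.",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "a": {"type": "integer"},
                            "b": {"type": "integer"},
                        },
                        "required": ["a", "b"],
                    },
                },
            }
        ],
    }

request_body = build_function_calling_request(
    "NousResearch/Hermes-3-Llama-3.2-3B", "What is 6 times 7?"
)
```

If the endpoint supports tool use, the response will contain `tool_calls` entries instead of plain text; if it does not, you typically get a plain completion or an error.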
Thank you for the answers. I was not aware of that before. Would it be possible to be more specific? Do you mean code like this: call HuggingFaceInferenceAPI for the model, then pass it to the OpenAILike class, like below?
Or directly assign the model to it?
I tried both, but it seems I missed some keys; neither of them works. So I changed to the format below, which returns a 404 error.
So, do you have any idea what I missed or misconfigured? Thank you!
It was just a guess. This might only work for Hugging Face's actually deployed/paid endpoints (their docs aren't clear). Sounds like this won't work with the free serverless endpoints without a PR to update the inference API LLM class.
Yes, you're correct. They do not allow free/trial accounts to use endpoints, as I confirmed by testing.
Besides this Hugging Face route, are there other ways to use a Llama-3.x model via LlamaIndex for function calling? If so, any documents or instructions? Thank you!
@kxu-prosera check out Ollama, it will work fine in this case.
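The Ollama suggestion above could be wired up roughly like this. This is a sketch, not a verified implementation: it assumes the `llama-index-llms-ollama` integration package is installed, an Ollama server is running locally, and a tool-capable model (here `llama3.1`, an assumption) has been pulled. The `multiply` tool is a hypothetical example.

```python
# Sketch only: requires a running Ollama server, the llama-index-llms-ollama
# package, and a tool-capable model pulled locally. Names below are assumptions.
from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama

def multiply(a: int, b: int) -> int:
    """Multiply two integers (example tool)."""
    return a * b

llm_model = Ollama(model="llama3.1", request_timeout=120.0)
tools = [FunctionTool.from_defaults(fn=multiply)]

agent_worker = FunctionCallingAgentWorker.from_tools(tools, llm=llm_model, verbose=True)
agent = agent_worker.as_agent()
# response = agent.chat("What is 6 times 7?")
```

The key point is that `Ollama` implements LlamaIndex's function-calling LLM interface, so it can be dropped into `FunctionCallingAgentWorker` where `HuggingFaceInferenceAPI` cannot.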
Thank you so much! I will give it a shot.
Question Validation
Question
How can I use function calling with a Hugging Face model?
This code works.
```python
from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.llms.openai import OpenAI

llm_model = OpenAI(
    api_key=OPEN_AI_KEY,
    model="gpt-3.5-turbo",
    temperature=0,
)

def setup_agent(self):
    tools = self.registry.get_tools_for_analysis()
    self.agent_worker = FunctionCallingAgentWorker.from_tools(tools, llm=llm_model, verbose=True)
    self.agent = self.agent_worker.as_agent()
```
But this does not work.
```python
from llama_index.core.agent import FunctionCallingAgentWorker
# In older llama-index versions this import may live in llama_index.llms.huggingface
from llama_index.llms.huggingface_api import HuggingFaceInferenceAPI

llm_model = HuggingFaceInferenceAPI(
    model_name="NousResearch/Hermes-3-Llama-3.2-3B",
    api_key=HF_TOKEN,
)

def setup_agent(self):
    tools = self.registry.get_tools_for_analysis()
    self.agent_worker = FunctionCallingAgentWorker.from_tools(tools, llm=llm_model, verbose=True)
    self.agent = self.agent_worker.as_agent()
```
I tried to wrap HuggingFaceInferenceAPI with FunctionCallingLLM, but it still does not work as I expected.
So, my question:
Thank you.