
run llama3:8b error #1123

Closed

ZQFORWARD opened this issue Sep 13, 2024 · 2 comments

Comments

@ZQFORWARD

Hi, when I used the example code to test the assistant, it raised an error.

```python
rag_assistant = get_rag_assistant(llm_model='llama3:8b')
response = ""
for delta in rag_assistant.run('who are you?'):
    response += delta  # type: ignore
```

```
    for delta in rag_assistant.run('who are you?'):
  File "C:\Users\admin\miniconda3\envs\agent\lib\site-packages\phi\assistant\assistant.py", line 890, in _run
    for response_chunk in self.llm.response_stream(messages=llm_messages):
  File "C:\Users\admin\miniconda3\envs\agent\lib\site-packages\phi\llm\ollama\chat.py", line 229, in response_stream
    for response in self.invoke_stream(messages=messages):
  File "C:\Users\admin\miniconda3\envs\agent\lib\site-packages\phi\llm\ollama\chat.py", line 95, in invoke_stream
    yield from self.client.chat(
  File "C:\Users\admin\miniconda3\envs\agent\lib\site-packages\ollama\_client.py", line 88, in _stream
    partial = json.loads(line)
  File "C:\Users\admin\miniconda3\envs\agent\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Users\admin\miniconda3\envs\agent\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Users\admin\miniconda3\envs\agent\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```
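The exception itself only says that the very first character the client tried to parse was not JSON (line 1, column 1, char 0), which typically means the server answered with something other than a streaming JSON response — for example an HTML error page. A minimal reproduction of the same message:

```python
import json

# json.loads on a non-JSON body (e.g. an HTML error page) fails at char 0,
# producing the exact message seen in the traceback above.
try:
    json.loads("<html>404 page not found</html>")
except json.JSONDecodeError as exc:
    print(exc)  # Expecting value: line 1 column 1 (char 0)
```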

@jacobweiss2305
Contributor

@ZQFORWARD can you share the get_rag_assistant() function? I would like to check the args you have set for Assistant().

@ZQFORWARD
Author

Of course, here is the code:

```python
from phi.assistant import Assistant
from phi.llm.ollama import Ollama

assistant = Assistant(
    name="test_RAG",
    llm=Ollama(model='llama3:8b', host='http://localhost:11434/v1')
)

response = ''

for d in assistant.run("who are you?"):
    response += d
print(response)
```
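A likely culprit (my assumption — not confirmed in this thread) is the `host` argument: phi's `Ollama` wrapper talks to Ollama's native streaming API through the `ollama` Python client, which expects the bare server URL. The `/v1` path is Ollama's OpenAI-compatible endpoint, so pointing the native client at it would get back a non-JSON error body, matching the `JSONDecodeError` above. A sketch of the idea, where `normalize_ollama_host` is a hypothetical helper, not part of phi or ollama:

```python
# Hypothetical helper illustrating the suspected fix: the ollama Python
# client wants the bare server URL; "/v1" is Ollama's OpenAI-compatible
# endpoint and should not be passed as the native client's host.
def normalize_ollama_host(host: str) -> str:
    """Strip a trailing OpenAI-compat "/v1" segment (and trailing slashes)."""
    host = host.rstrip("/")
    if host.endswith("/v1"):
        host = host[: -len("/v1")]
    return host

print(normalize_ollama_host("http://localhost:11434/v1"))  # http://localhost:11434
```

In other words, trying `Ollama(model='llama3:8b', host='http://localhost:11434')` (no `/v1` suffix) would be a reasonable first step.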
