Add a simple notebook for llama stack #244
Conversation
Check out this pull request on ReviewNB. See visual diffs & provide feedback on Jupyter Notebooks. Powered by ReviewNB.
sorry, one last comment, I think you need to remove the |
just one small comment, merge it after
Thanks @Amnah199! I left my comments
Co-authored-by: Bilge Yücel <[email protected]>
Let's add more info here to explain the relation between the two: "Now, let's create a `LlamaStackChatGenerator` and pass it to the `Agent`. The `Agent` component will use the model running with `LlamaStackChatGenerator` to reason and make decisions."
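For reference, a minimal sketch of what this suggested text describes could look like the following. The import path, server URL, and model name are assumptions about the Llama Stack integration, not taken from this PR's notebook.

```python
from haystack.components.agents import Agent

# Assumed import path for the Haystack Llama Stack integration.
from haystack_integrations.components.generators.llama_stack import LlamaStackChatGenerator

# The generator talks to a locally running Llama Stack server
# (URL and model name below are placeholders).
chat_generator = LlamaStackChatGenerator(
    model="llama3.2:3b",
    api_base_url="http://localhost:8321/v1/openai/v1",
)

# The Agent uses the model served through the chat generator
# to reason and decide when to call its tools.
agent = Agent(chat_generator=chat_generator, tools=[])
```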
"Now, when we ask questions, the Agent
will utilize both the provided tool and the LlamaStackChatGenerator
to generate answers. We enable the streaming in Agent through streaming_callback
, so you can observe the tool calls and results in real time."
Reply via ReviewNB
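A hedged sketch of that step, continuing from the generator above: the `get_weather` tool is a hypothetical stand-in for the notebook's actual tool, and passing `streaming_callback` to the `Agent` constructor is assumed from recent Haystack releases.

```python
from haystack.components.agents import Agent
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool


# Hypothetical tool used only for illustration.
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."


weather_tool = Tool(
    name="get_weather",
    description="Returns the current weather for a given city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string", "description": "City name"}},
        "required": ["city"],
    },
    function=get_weather,
)

# Streaming the chunks lets you watch tool calls and tool results as they arrive.
agent = Agent(
    chat_generator=chat_generator,  # the LlamaStackChatGenerator created earlier
    tools=[weather_tool],
    streaming_callback=print_streaming_chunk,
)

result = agent.run(messages=[ChatMessage.from_user("What's the weather in Berlin?")])
print(result["messages"][-1].text)
```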
The markdown here needs to be fixed: "[Llama Stack docs](https://llama-stack.readthedocs.io/en/latest/providers/inference/index.html)."
No description provided.