add support for tools
javierluraschi committed Aug 6, 2024
1 parent ea11e59 commit d0e1da8
Showing 14 changed files with 163 additions and 8 deletions.
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,9 @@
# Changelog

## 2.6.1

- Add `complete()` and `describe()` to support handling tools

## 2.5.7

- `save()` supports creating subfolders
2 changes: 1 addition & 1 deletion README.md
@@ -6,7 +6,7 @@
[![GitHub star chart](https://img.shields.io/github/stars/hal9ai/hal9?style=flat-square)](https://star-history.com/#hal9ai/hal9)

Create and deploy generative ([LLMs](https://github.com/Hannibal046/Awesome-LLM) and [diffusers](https://github.com/huggingface/diffusers)) applications (chatbots and APIs) in seconds.
- **Open:** Use any model ([OpenAI](https://platform.openai.com/docs/api-reference/introduction), [Llama](https://ai.meta.com/blog/5-steps-to-getting-started-with-llama-2/), [Groq](https://docs.api.groq.com/md/tutorials/python.groqapi.html), [MidJourney](https://docs.imagineapi.dev/en)) and any library like ([LangChainl](https://python.langchain.com/v0.1/docs/get_started/quickstart/), [DSPy](https://dspy-docs.vercel.app/docs/quick-start/installation)).
- **Open:** Use any model ([OpenAI](https://platform.openai.com/docs/api-reference/introduction), [Llama](https://ai.meta.com/blog/5-steps-to-getting-started-with-llama-2/), [Groq](https://docs.api.groq.com/md/tutorials/python.groqapi.html), [MidJourney](https://docs.imagineapi.dev/en)) and any library like ([LangChain](https://python.langchain.com/v0.1/docs/get_started/quickstart/), [DSPy](https://dspy-docs.vercel.app/docs/quick-start/installation)).
- **Intuitive:** No need to learn app frameworks ([Flask](https://flask.palletsprojects.com/en/3.0.x/quickstart/)), simply use `input()` and `print()`, or write file to disk.
- **Scalable:** Engineers can integrate your app with scalable technologies ([Docker](https://www.docker.com/), [Kubernetes](https://kubernetes.io/), etc)
- **Powerful:** Using an OS process (stdin, stdout, files) as our app contract, enables long-running agents, multiple programming languages, and complex system dependencies.
2 changes: 1 addition & 1 deletion javascript/README.md
@@ -6,7 +6,7 @@
[![GitHub star chart](https://img.shields.io/github/stars/hal9ai/hal9?style=flat-square)](https://star-history.com/#hal9ai/hal9)

Create and deploy generative ([LLMs](https://github.com/Hannibal046/Awesome-LLM) and [diffusers](https://github.com/huggingface/diffusers)) applications (chatbots and APIs) in seconds.
- **Open:** Use any model ([OpenAI](https://platform.openai.com/docs/api-reference/introduction), [Llama](https://ai.meta.com/blog/5-steps-to-getting-started-with-llama-2/), [Groq](https://docs.api.groq.com/md/tutorials/python.groqapi.html), [MidJourney](https://docs.imagineapi.dev/en)) and any library like ([LangChainl](https://python.langchain.com/v0.1/docs/get_started/quickstart/), [DSPy](https://dspy-docs.vercel.app/docs/quick-start/installation)).
- **Open:** Use any model ([OpenAI](https://platform.openai.com/docs/api-reference/introduction), [Llama](https://ai.meta.com/blog/5-steps-to-getting-started-with-llama-2/), [Groq](https://docs.api.groq.com/md/tutorials/python.groqapi.html), [MidJourney](https://docs.imagineapi.dev/en)) and any library like ([LangChain](https://python.langchain.com/v0.1/docs/get_started/quickstart/), [DSPy](https://dspy-docs.vercel.app/docs/quick-start/installation)).
- **Intuitive:** No need to learn app frameworks ([Flask](https://flask.palletsprojects.com/en/3.0.x/quickstart/)), simply use `input()` and `print()`, or write file to disk.
- **Scalable:** Engineers can integrate your app with scalable technologies ([Docker](https://www.docker.com/), [Kubernetes](https://kubernetes.io/), etc)
- **Powerful:** Using an OS process (stdin, stdout, files) as our app contract, enables long-running agents, multiple programming languages, and complex system dependencies.
1 change: 1 addition & 0 deletions python/hal9/__init__.py
@@ -5,3 +5,4 @@
from hal9.code import extract
from hal9.urls import is_url, url_contents
from hal9.events import event
from hal9.complete import complete, describe
91 changes: 91 additions & 0 deletions python/hal9/complete.py
@@ -0,0 +1,91 @@
import inspect
import json

type_mapping = {
    int: "integer",
    str: "string",
    float: "number",
    bool: "boolean",
    list: "array",
    dict: "object"
}

def describe_single(func):
    """
    Takes a function and returns its metadata as a dictionary in the
    OpenAI function-calling format.
    """
    signature = inspect.signature(func)
    params = signature.parameters

    # Collect function metadata
    func_name = func.__name__
    func_doc = inspect.getdoc(func) or ""

    properties = {}
    for name, param in params.items():
        param_type = param.annotation
        if param_type in type_mapping:
            json_type = type_mapping[param_type]
        else:
            json_type = "string"  # default to string if type is not mapped
        properties[name] = {"type": json_type}

    result = {
        "name": func_name,
        "description": func_doc,
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties.keys())
        }
    }

    return result

def describe(functions):
    return [describe_single(func) for func in functions]
def complete(completion, messages = [], tools = [], show = True):
    tools = {func.__name__: func for func in tools}
    content = result = ""
    tool_name = tool_text = ""
    tool_args = None

    if 'stream' not in str(type(completion)).lower():
        content = completion.choices[0].message.content or ""
        if completion.choices[0].message.function_call is not None:
            tool_name = completion.choices[0].message.function_call.name
            tool_args = json.loads(completion.choices[0].message.function_call.arguments)
        if show:
            print(content)
    else:
        for chunk in completion:
            if chunk.choices and len(chunk.choices) > 0 and chunk.choices[0].delta:
                if chunk.choices[0].delta.content:
                    if show:
                        print(chunk.choices[0].delta.content, end="")
                    content += chunk.choices[0].delta.content
                if chunk.choices[0].delta.function_call is not None:
                    tool_text += chunk.choices[0].delta.function_call.arguments
                    if chunk.choices[0].delta.function_call.name:
                        tool_name = chunk.choices[0].delta.function_call.name
                    try:
                        tool_args = json.loads(tool_text)
                    except Exception:
                        pass
        if show:
            print()

    if len(content) > 0:
        messages.append({"role": "assistant", "content": content})

    if tool_args and tool_name in tools:
        try:
            result = str(tools[tool_name](**tool_args))
        except Exception as e:
            result = str(e)
        print(result)
        messages.append({"role": "function", "name": tool_name, "content": result})

    return content + result
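
The schema construction above can be exercised in isolation. The following is a condensed, self-contained sketch of `describe_single` applied to a sample `multiply` function (the `multiply` function and variable names are illustrative, not part of the commit):

```python
import inspect

# Mirror of the type mapping added in complete.py
type_mapping = {int: "integer", str: "string", float: "number",
                bool: "boolean", list: "array", dict: "object"}

def describe_single(func):
    """Build an OpenAI-style function schema from a Python function."""
    properties = {}
    for name, param in inspect.signature(func).parameters.items():
        # Fall back to "string" for unannotated or unmapped parameters
        properties[name] = {"type": type_mapping.get(param.annotation, "string")}
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func) or "",
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties.keys())
        }
    }

def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

schema = describe_single(multiply)
print(schema["name"])                      # multiply
print(schema["parameters"]["properties"])  # {'a': {'type': 'integer'}, 'b': {'type': 'integer'}}
```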
11 changes: 7 additions & 4 deletions python/hal9/iobind.py
@@ -99,9 +99,12 @@ def save(name, contents = None, hidden = False, files = None):
}, indent=2)
Path(name + '.asset').write_text(asset_definition)

- def input(prompt = "", extract = False):
+ def input(prompt = "", extract = False, messages = []):
      print(prompt, end="")
-     text = sys.stdin.read()
+     prompt = sys.stdin.read()

      if extract:
-         text = url_contents(text)
-         return text
+         prompt = url_contents(prompt)
+
+     messages.append({"role": "user", "content": prompt})
+     return prompt
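
The new `messages` parameter makes `input()` record the user's turn in the conversation. A minimal sketch of that behavior (a simplified stand-in named `input_` so it does not shadow the builtin, with `sys.stdin` replaced by a `StringIO` so the example runs without a terminal):

```python
import io
import sys

def input_(prompt = "", extract = False, messages = []):
    # Simplified version of hal9's input(): read stdin, record a user message
    print(prompt, end="")
    text = sys.stdin.read()
    messages.append({"role": "user", "content": text})
    return text

sys.stdin = io.StringIO("What is 6 x 7?")  # stand-in for interactive input
messages = []
reply = input_(messages = messages)
print(messages)  # [{'role': 'user', 'content': 'What is 6 x 7?'}]
```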
1 change: 1 addition & 0 deletions python/messages.json
@@ -0,0 +1 @@
""
1 change: 1 addition & 0 deletions python/messages.pkl
@@ -0,0 +1 @@
(binary file)
2 changes: 1 addition & 1 deletion python/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "hal9"
version = "2.6.0"
version = "2.6.1"
description = ""
authors = ["Javier Luraschi <[email protected]>"]
readme = "README.md"
File renamed without changes.
5 changes: 5 additions & 0 deletions python/tests/test_iobind.py
@@ -0,0 +1,5 @@
import hal9 as h9

def test_save_empty_error():
    h9.save("messages", None)
    assert True
27 changes: 27 additions & 0 deletions website/learn/genapps/llmapps/tools.md
@@ -0,0 +1,27 @@
# Tools

This section shows how to add tools (Python functions the model can call) to your LLM application.

```python
from openai import OpenAI
import hal9 as h9

def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

messages = h9.load("messages", [])
prompt = h9.input(messages = messages)

completion = OpenAI().chat.completions.create(
    model = "gpt-4",
    messages = messages,
    functions = h9.describe([multiply]),
    function_call = "auto",
    stream = True
)

h9.complete(completion, messages = messages, tools = [multiply])
h9.save("messages", messages, hidden = True)
```
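
When the model decides to call the tool, `complete()` parses the `function_call` reply, runs the matching function, and appends the result as a `function` message. A minimal sketch of that dispatch step (no API call; `tool_name` and `tool_args` are hard-coded stand-ins for values parsed from the model's reply):

```python
import json

def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

# complete() indexes the tools it receives by function name
tools = {func.__name__: func for func in [multiply]}

# Stand-ins for the name and arguments parsed from the model's function_call
tool_name = "multiply"
tool_args = json.loads('{"a": 6, "b": 7}')

messages = []
result = str(tools[tool_name](**tool_args))
messages.append({"role": "function", "name": tool_name, "content": result})
print(messages[-1])  # {'role': 'function', 'name': 'multiply', 'content': '42'}
```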
22 changes: 22 additions & 0 deletions website/reference/complete.md
@@ -0,0 +1,22 @@
# Complete

Convenience functions for handling LLM completions.

## Complete
`complete(completion, messages, tools, show)` <br/><br/>
Processes a completion by printing it, appending replies to `messages`, and calling any tools the model requested.

| Param | Type | Description |
| --- | --- | --- |
| completion | <code>Object</code> | The completion (or stream) returned by the LLM. |
| messages | <code>Array</code> | Messages to append replies to, defaults to `[]`. |
| tools | <code>Array</code> | An array of functions to use as tools, defaults to `[]`. |
| show | <code>Bool</code> | Print the completions? Defaults to `True`. |
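
For the non-streaming case, a minimal sketch of what `complete()` does with the reply (a simplified reimplementation, not the packaged function; `SimpleNamespace` stands in for the OpenAI response object):

```python
from types import SimpleNamespace

def complete(completion, messages = [], show = True):
    # Simplified non-streaming path of complete()
    content = completion.choices[0].message.content or ""
    if show:
        print(content)
    if content:
        messages.append({"role": "assistant", "content": content})
    return content

# Stand-in for a non-streaming chat completion response
fake = SimpleNamespace(choices=[SimpleNamespace(
    message=SimpleNamespace(content="Hello!", function_call=None))])

messages = [{"role": "user", "content": "Say hi"}]
complete(fake, messages = messages)
print(messages[-1])  # {'role': 'assistant', 'content': 'Hello!'}
```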

## Describe
`describe(functions)` <br/><br/>
Describes an array of functions with their descriptions, parameters, and types. Useful when completing chats.

| Param | Type | Description |
| --- | --- | --- |
| functions | <code>Array</code> | An array of functions to describe. |
2 changes: 1 addition & 1 deletion website/src/pages/index.md
@@ -9,7 +9,7 @@ import { Floating } from '../components/floating.jsx';
Create and deploy generative (LLMs and diffusers) applications (chatbots and APIs) in seconds. Focus on AI (RAG, fine-tuning, alignment, training) and skip engineering tasks (frontend development, backend integration, deployment, operations).

<div class="FloatingWrapper">
<Floating title="Open"><b>Open</b> to any model (OpenAI, Llama, Groq, Midjourney) and any library like (LangChainl, DSPy).</Floating>
<Floating title="Open"><b>Open</b> to any model (OpenAI, Llama, Groq, Midjourney) and any library like (LangChain, DSPy).</Floating>
<Floating title="Intuitive">No need to learn app frameworks (flask), <b>intuitively</b> use `input()` and `print()`, or write file to disk.</Floating>
<Floating title="Scalable">Engineers can integrate your app with <b>scalable</b> technologies (Docker, Kubernetes, etc).</Floating>
<Floating title="Powerful"><b>Powerful</b> architecture for agents, multiple programming languages, and complex dependencies.</Floating>
