Commit

Merge branch 'main' into test-pull-request-2
Including the repository name in the README should make onboarding easier for future users and minimize the need for modifications.
unclecode committed Mar 13, 2024
2 parents ba7e67a + 1919a15 commit 18d903f
Showing 4 changed files with 13 additions and 10 deletions.
9 changes: 6 additions & 3 deletions README.md
@@ -34,8 +34,11 @@ To run this proxy locally on your own machine, follow these steps:
4. Set your Groq API token key as an environment variable:
```export GROQ_API_KEY=your-api-key```

-5. Run the FastAPI server:
-```uvicorn main:app --reload```
+5. Create uvicorn logs folder:
+```mkdir .logs```
+
+6. Run the FastAPI server:
+```uvicorn --app-dir app/ main:app --reload```
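
Once the server is running, a quick smoke test confirms the proxy responds. This is only a minimal sketch: the port, route, and model name below are assumptions about a default setup, so check `app/routes/proxy.py` for the routes your checkout actually exposes.

```python
# Minimal smoke test for a locally running FunckyCall proxy.
# ASSUMPTIONS: default uvicorn port 8000, an OpenAI-style chat-completions
# route under /proxy/groq, and a Groq model name your account can use.
import os
import requests

response = requests.post(
    "http://localhost:8000/proxy/groq/v1/chat/completions",  # assumed route
    headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
    json={
        "model": "mixtral-8x7b-32768",  # assumed model name
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```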


## Using the Pre-built Server 🌐
@@ -49,7 +52,7 @@ To use the pre-built server, simply make requests to the following base URL:
This README is organized into three main sections, each showcasing different aspects of FunckyCall.ai:

- **Sending POST Requests**: Here, I explore the functionality of sending direct POST requests to LLMs using FunckyCall.ai. This section highlights the flexibility and control offered by the library when interacting with LLMs; a rough sketch of such a request follows this list.
-- **FunckyHub**: The third section introduces the concept of FunckyHub, a usefull feature that simplifies the process of executing functions. With FunckyHub, there is no need to send the function JSON schema explicitly, as the functions are already hosted on the proxy server. This approach streamlines the workflow, allowing developers to obtain results with a single call without having to handle function call is production server.
+- **FunckyHub**: The second section introduces the concept of FunckyHub, a useful feature that simplifies the process of executing functions. With FunckyHub, there is no need to send the function JSON schema explicitly, as the functions are already hosted on the proxy server. This approach streamlines the workflow, allowing developers to obtain results with a single call without having to handle function calls in a production server.
- **Using FunckyCall with PhiData**: In this section, I demonstrate how FunckyCall.ai can be seamlessly integrated with other libraries such as my favorite one, the PhiData library, leveraging its built-in tools to connect to LLMs and perform external tool requests.
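
As a rough illustration of the direct-POST flow described in the first bullet, the sketch below sends a chat-completion request that carries an explicit function schema. The base URL, route, model name, and the `get_weather` tool are all assumptions (the tool is purely hypothetical, and the payload is modelled on the OpenAI chat-completions format); the repository's examples show the exact payload FunckyCall expects.

```python
# Hedged sketch of a direct POST request carrying an explicit function schema.
# ASSUMPTIONS: OpenAI-style payload, the base URL below, and a hypothetical
# get_weather tool used purely for illustration.
import os
import requests

payload = {
    "model": "mixtral-8x7b-32768",  # assumed model name
    "messages": [{"role": "user", "content": "What is the weather in Lisbon?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical function
                "description": "Return the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

response = requests.post(
    "https://funckycall.ai/proxy/groq/v1/chat/completions",  # assumed base URL and route
    headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
    json=payload,
    timeout=60,
)
print(response.json())
```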


6 changes: 3 additions & 3 deletions app/libs/chains.py
@@ -5,9 +5,9 @@
import uuid
from fastapi import Request
from fastapi.responses import JSONResponse
-from app.providers import BaseProvider
-from app.prompts import SYSTEM_MESSAGE, SUFFIX, get_func_result_guide
-from app.providers import GroqProvider
+from providers import BaseProvider
+from prompts import SYSTEM_MESSAGE, SUFFIX, get_func_result_guide
+from providers import GroqProvider
import importlib


6 changes: 3 additions & 3 deletions app/main.py
@@ -3,11 +3,11 @@
from fastapi.templating import Jinja2Templates
from fastapi.staticfiles import StaticFiles
from starlette.requests import Request
-from app.routes import proxy
-from app.routes import examples
+from routes import proxy
+from routes import examples


-from app.utils import create_logger
+from utils import create_logger
import os
from dotenv import load_dotenv
load_dotenv()
2 changes: 1 addition & 1 deletion app/routes/proxy.py
@@ -1,6 +1,6 @@
from fastapi import APIRouter, Response, Request, Path, Query
from fastapi.responses import JSONResponse
-from app.libs.chains import (
+from libs.chains import (
Context,
ProviderSelectionHandler,
ToolExtractionHandler,

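The import changes across the three Python files above pair with the new `--app-dir app/` flag in the README: uvicorn inserts that directory into the module search path, so modules inside `app/` can be imported without the `app.` prefix. A rough Python-only equivalent, shown purely for illustration (this is not how the project is started):

```python
# Rough equivalent of `uvicorn --app-dir app/ main:app`, for illustration only:
# --app-dir prepends the given directory to sys.path before importing the app,
# so `from providers import BaseProvider` inside app/ resolves without the
# `app.` prefix.
import sys

sys.path.insert(0, "app/")

from main import app  # imports like `from routes import proxy` now resolve
```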