Feat/fireworks integration #2089

Open · wants to merge 22 commits into zylon-ai:main from feat/fireworks-integration
Commits (22)
f1ad995
FEAT: Added Fireworks Integration
somashekhar161 Sep 20, 2024
a38675b
FEAT: Added Fireworks Integration
somashekhar161 Sep 20, 2024
2d5865e
FEAT: Added fixed UI mode name
somashekhar161 Sep 20, 2024
c5c1f9b
fixed at ui height overlow
somashekhar161 Sep 20, 2024
241637f
FEAT: Added Fireworks integration
somashekhar161 Sep 20, 2024
4d22546
Added dockerfile and docker-compose for fireworks
somashekhar161 Sep 21, 2024
519c48b
Merge branch 'main' of github.com:somashekhar161/private-gpt into fea…
somashekhar161 Sep 24, 2024
9c3590e
Added embedded model option for fireworks \n Added documentation for …
somashekhar161 Sep 24, 2024
cecec30
fixed test black error
somashekhar161 Sep 24, 2024
80f15a1
fixed ruff chekc
somashekhar161 Sep 24, 2024
b807e50
fixed mypy private_gpt for llama-index
somashekhar161 Sep 24, 2024
6a46060
fixed mypy ignored mypy-llama-index-embeddings-fireworks
somashekhar161 Sep 24, 2024
03e8809
fixed mypy ignored llama-index-embeddings-fireworks
somashekhar161 Sep 24, 2024
0ff7a06
fixed mypy ignored tool.mypy-llama_index.embeddings.fireworks
somashekhar161 Sep 24, 2024
b2ffe5b
fixed mypy
somashekhar161 Sep 24, 2024
16d1f60
fixed mypy ignored tool.mypy-llama_index.embeddings.fireworks
somashekhar161 Sep 24, 2024
b8cb49a
updated dependencies poetry lock
somashekhar161 Sep 24, 2024
2052ff4
added # type: ignore for embeddings.fireworks
somashekhar161 Sep 24, 2024
5334dda
fixed ruff and black
somashekhar161 Sep 24, 2024
c846b3f
revert back to main branch's dependecy version
somashekhar161 Sep 25, 2024
62985df
resolved dependecies
somashekhar161 Sep 26, 2024
c4be3f8
Merge branch 'zylon-ai:main' into feat/fireworks-integration
somashekhar161 Nov 4, 2024
54 changes: 54 additions & 0 deletions Dockerfile.fireworks
Collaborator:
I would say that the Dockerfile is useless in this case. If someone wants to use Fireworks, it's a better idea for them to make the necessary modifications themselves.

Author:
Yes, the Dockerfile is only for running it. Even if users make the modifications themselves, the Dockerfile is still useful because of dependency errors, Python version mismatches, etc.

Collaborator:
I don't like this premise. Adding another Docker/docker-compose profile means more code to maintain, when it is not the optimal PGPT use case. If Fireworks were a fully local setup, it would probably be worth it.

@@ -0,0 +1,54 @@
FROM python:3.11.6-slim-bookworm as base

# Install poetry
RUN pip install pipx
RUN python3 -m pipx ensurepath
RUN pipx install poetry==1.8.3
ENV PATH="/root/.local/bin:$PATH"
ENV PATH=".venv/bin/:$PATH"

RUN apt update && apt install -y \
build-essential

# https://python-poetry.org/docs/configuration/#virtualenvsin-project
ENV POETRY_VIRTUALENVS_IN_PROJECT=true

FROM base as dependencies
WORKDIR /home/worker/app
COPY pyproject.toml poetry.lock ./

ARG POETRY_EXTRAS="ui llms-fireworks embeddings-fireworks vector-stores-qdrant embeddings-openai"
RUN poetry install --no-root --extras "${POETRY_EXTRAS}"

FROM base as app
ENV PYTHONUNBUFFERED=1
ENV PORT=8080
ENV APP_ENV=prod
ENV PYTHONPATH="$PYTHONPATH:/home/worker/app/private_gpt/"
EXPOSE 8080

# Prepare a non-root user
# More info about how to configure UIDs and GIDs in Docker:
# https://github.com/systemd/systemd/blob/main/docs/UIDS-GIDS.md

# Define the User ID (UID) for the non-root user
# UID 100 is chosen to avoid conflicts with existing system users
ARG UID=100

# Define the Group ID (GID) for the non-root user
# GID 65534 is often used for the 'nogroup' or 'nobody' group
ARG GID=65534

RUN adduser --system --gid ${GID} --uid ${UID} --home /home/worker worker
WORKDIR /home/worker/app

RUN chown worker /home/worker/app
RUN mkdir local_data && chown worker local_data
RUN mkdir models && chown worker models
COPY --chown=worker --from=dependencies /home/worker/app/.venv/ .venv
COPY --chown=worker private_gpt/ private_gpt
COPY --chown=worker *.yaml .
COPY --chown=worker scripts/ scripts

USER worker
ENTRYPOINT python -m private_gpt
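For anyone trying the branch locally, the multi-stage image above can be built and run directly. This is a usage sketch, not part of the PR; the image tag and port mapping are illustrative assumptions:

```shell
# Build the Fireworks image from the new Dockerfile (tag name is an assumption)
docker build -f Dockerfile.fireworks -t private-gpt:fireworks .

# Run it with the Fireworks profile; the app listens on 8080 (see EXPOSE above)
docker run --rm -p 8080:8080 \
  -e PGPT_PROFILES=fireworks \
  -e FIREWORKS_API_KEY="$FIREWORKS_API_KEY" \
  private-gpt:fireworks
```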
23 changes: 20 additions & 3 deletions docker-compose.yaml
@@ -1,13 +1,12 @@
services:

#-----------------------------------
#---- Private-GPT services ---------
#-----------------------------------

# Private-GPT service for the Ollama CPU and GPU modes
# This service builds from an external Dockerfile and runs the Ollama mode.
private-gpt-ollama:
image: ${PGPT_IMAGE:-zylonai/private-gpt}:${PGPT_TAG:-0.6.2}-ollama # x-release-please-version
image: ${PGPT_IMAGE:-zylonai/private-gpt}:${PGPT_TAG:-0.6.2}-ollama # x-release-please-version
user: root
build:
context: .
@@ -84,7 +83,7 @@ services:
ollama-cpu:
image: ollama/ollama:latest
volumes:
- ./models:/root/.ollama
- ./local_data:/root/.ollama
profiles:
- ""
- ollama-cpu
@@ -103,3 +102,21 @@ services:
capabilities: [gpu]
profiles:
- ollama-cuda

# fireworks service
private-gpt-fireworks:
build:
context: .
dockerfile: Dockerfile.fireworks
volumes:
- ./local_data/:/home/worker/app/local_data
ports:
- "3001:8080"
environment:
PORT: 8080
PGPT_PROFILES: fireworks
FIREWORKS_API_KEY: ${FIREWORKS_API_KEY}
env_file:
- .env
profiles:
- fireworks
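As a sanity check for the `FIREWORKS_API_KEY` wired through the compose environment: it is the bearer token for Fireworks' OpenAI-compatible API. A hedged sketch for verifying the key before bringing the profile up — the endpoint URL and model name come from Fireworks' public docs, not from this PR:

```shell
# Send a one-message chat completion; a JSON response means the key is valid
# (the model ID below is just an example)
curl -s https://api.fireworks.ai/inference/v1/chat/completions \
  -H "Authorization: Bearer $FIREWORKS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "accounts/fireworks/models/llama-v3p1-8b-instruct",
       "messages": [{"role": "user", "content": "ping"}]}'
```

With the key exported, `docker compose --profile fireworks up` would then start the service on port 3001 as mapped above.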
Collaborator:

Same as the Dockerfile.

Author:

Here I followed how the other Dockerfiles and docker-compose files are written, using the same template so this one won't feel like an outsider :)

Collaborator:

I would say the same. It is not the core of PGPT; therefore, we should not provide this kind of support. Our goal is to offer a 100% private solution, and in that area our two main providers are Ollama and Llama-CPP. Of course, this PR adds value for other people with the same needs, but it doesn't make sense to maintain it in docker-compose and a Dockerfile.
