
langchain-research-assistant-docker


This repo provides a Docker setup for running the LangChain research-assistant template with LangServe.

Quickstart

Using Docker

docker run -d --name langchain-research-assistant-docker \
  -e OPENAI_API_KEY=sk-... \
  -e TAVILY_API_KEY=tvly-... \
  -e LANGCHAIN_API_KEY=ls__... \
  -e LANGCHAIN_TRACING_V2=true \
  -e LANGCHAIN_PROJECT=langchain-research-assistant-docker \
  -p 8000:8000 \
  joshuasundance/langchain-research-assistant-docker:latest

Using Docker Compose

version: '3.8'

services:
  langchain-research-assistant-docker:
    image: joshuasundance/langchain-research-assistant-docker:latest
    container_name: langchain-research-assistant-docker
    environment:  # use values from .env
      - "OPENAI_API_KEY=${OPENAI_API_KEY:?OPENAI_API_KEY is not set}"  # required
      - "TAVILY_API_KEY=${TAVILY_API_KEY}"  # optional
      - "LANGCHAIN_API_KEY=${LANGCHAIN_API_KEY}"  # optional
      - "LANGCHAIN_TRACING_V2=${LANGCHAIN_TRACING_V2:-false}"  # false by default
      - "LANGCHAIN_PROJECT=${LANGCHAIN_PROJECT:-langchain-research-assistant-docker}"
    ports:
      - "${APP_PORT:-8000}:8000"
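
The environment values above are read from a .env file placed next to the compose file. A placeholder sketch of such a file (substitute your own keys; the values shown are not real):

```
OPENAI_API_KEY=sk-...
TAVILY_API_KEY=tvly-...
LANGCHAIN_API_KEY=ls__...
LANGCHAIN_TRACING_V2=true
LANGCHAIN_PROJECT=langchain-research-assistant-docker
APP_PORT=8000
```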

Kubernetes

The following assumes you have a .env file, a running Kubernetes cluster, and kubectl configured to access it.

The commands below create a secret called research-assistant-secret. To use a different name, edit ./kubernetes/resources.yaml accordingly.

You can also edit that file and uncomment certain lines to deploy on private endpoints, with a predefined IP, and so on.

kubectl create secret generic research-assistant-secret --from-env-file=.env
kubectl apply -f ./kubernetes/resources.yaml

Each deployment option can be customized through the environment variables shown above.

Usage

  • The service will be available at http://localhost:8000.

  • The OpenAPI documentation is available at http://localhost:8000/docs and http://localhost:8000/openapi.json.

  • Access the research-assistant playground at http://localhost:8000/research-assistant/playground/.

  • You can also use the RemoteRunnable class to interact with the service:

from langserve.client import RemoteRunnable

runnable = RemoteRunnable("http://localhost:8000/research-assistant")
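
Alternatively, the LangServe endpoints can be called directly over HTTP. A minimal sketch using only the Python standard library; the {"question": ...} input shape is an assumption here, so check http://localhost:8000/docs on your deployment for the actual schema:

```python
import json
from urllib.request import Request, urlopen

BASE_URL = "http://localhost:8000/research-assistant"


def build_invoke_payload(question: str) -> dict:
    # LangServe's /invoke endpoint wraps the chain input under an "input" key.
    # The {"question": ...} shape is an assumption; verify it against /docs.
    return {"input": {"question": question}}


def invoke(question: str) -> dict:
    req = Request(
        f"{BASE_URL}/invoke",
        data=json.dumps(build_invoke_payload(question)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # Requires the container to be running and reachable on port 8000.
    with urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(invoke("What are the latest developments in battery technology?"))
```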

See the LangChain docs for more information.