Commit

Merge pull request #49 from dannyl1u/feat/streamline-processes
Feat/streamline processes
dannyl1u authored Nov 11, 2024
2 parents 80484b6 + 0ccd627 commit a244566
Showing 5 changed files with 130 additions and 4 deletions.
3 changes: 2 additions & 1 deletion .env.example
@@ -1,3 +1,4 @@
APP_ID=your_app_id_here
WEBHOOK_SECRET=your_webhook_secret_here
OLLAMA_MODEL=your_chosen_llm_model_here
NGROK_DOMAIN=your_ngrok_domain_here
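Several steps below assume these variables are present; a minimal, hedged pre-flight check might look like the following (the `check_env_vars` helper is hypothetical and not part of the repository):

```shell
#!/bin/bash
# Hypothetical helper (not in the repo): report which of the given
# environment variable names are unset or empty.
check_env_vars() {
    local var
    local missing=()
    for var in "$@"; do
        # ${!var} is bash indirection: the value of the variable named by $var
        if [ -z "${!var:-}" ]; then
            missing+=("$var")
        fi
    done
    if [ "${#missing[@]}" -gt 0 ]; then
        echo "Missing: ${missing[*]}"
        return 1
    fi
    return 0
}

# HOME is normally set, so only the made-up name is reported.
check_env_vars HOME SOME_UNSET_VAR || true   # prints "Missing: SOME_UNSET_VAR"
```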
1 change: 1 addition & 0 deletions .gitignore
@@ -14,3 +14,4 @@ volumes/
venv/
.env
*.pem
tmp/
19 changes: 16 additions & 3 deletions README.md
@@ -65,7 +65,7 @@ Open the newly created `.env` file and update the following variables with your
* `APP_ID`: Replace `your_app_id_here` with your actual app ID.
* `WEBHOOK_SECRET`: Replace `your_webhook_secret_here` with your actual webhook secret.
* `OLLAMA_MODEL`: Replace `your_chosen_llm_model_here` with your chosen LLM model (e.g. "llama3.2"). Note: it must be an Ollama-supported model (see https://ollama.com/library for supported models).
* `NGROK_DOMAIN`: Replace `your_ngrok_domain_here` with your ngrok domain, if you have one.
4. Place the downloaded private key in the project root and name it `rsa.pem`.

5. Run the application locally
@@ -77,10 +77,11 @@ Start the Flask application:

The application will start running on http://localhost:4000

### 3. Prepare Dependencies and Deploy (ngrok and Ollama instructions)

We will use ngrok for its simplicity.

**Option 1: Generated public URL**

In a new terminal window, start ngrok to create a secure tunnel to your local server:

```bash
ngrok http 4000
```

@@ -91,6 +92,18 @@
ngrok will generate a public URL (e.g., https://abc123.ngrok.io)

Append `/webhook` to the URL, e.g. https://abc123.ngrok.io -> https://abc123.ngrok.io/webhook
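The transformation is plain string concatenation; as a sketch (the URL is the placeholder from above):

```shell
# Build the webhook URL from ngrok's public URL (placeholder value).
base_url="https://abc123.ngrok.io"
webhook_url="${base_url%/}/webhook"   # strip any trailing slash, then append
echo "$webhook_url"                   # prints https://abc123.ngrok.io/webhook
```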

In another terminal window, start Ollama:

```bash
ollama run <your Ollama model>
```

**Option 2: Using a shell script with your own ngrok domain**

Ensure all environment variables are set, then run:
```bash
./run-dev.sh
```
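The variables can also be exported directly in the shell instead of via `.env`; a sketch with placeholder values (substitute your own credentials and domain):

```shell
# Placeholder values only; replace with your real credentials before use.
export APP_ID="your_app_id_here"
export WEBHOOK_SECRET="your_webhook_secret_here"
export OLLAMA_MODEL="llama3.2"
export NGROK_DOMAIN="your_ngrok_domain_here"
echo "Model: $OLLAMA_MODEL"
```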

### 4. Update GitHub App Settings

1. Go back to your GitHub App settings.
1 change: 1 addition & 0 deletions requirements.txt
@@ -10,3 +10,4 @@ requests~=2.32.3
ruff~=0.6.7
black~=24.4.2
ollama~=0.3.3
pyngrok~=7.2.0
110 changes: 110 additions & 0 deletions run-dev.sh
@@ -0,0 +1,110 @@
#!/bin/bash
set -E   # let functions inherit the ERR trap
set -m   # enable job control so background jobs get their own process groups

# Array to store subshell PIDs
declare -a subshell_pids=()

# Log file for debugging
LOG_FILE="tmp/run-dev.log"
mkdir -p "$(dirname "$LOG_FILE")"

# Log a timestamped message to stdout and the log file
log_message() {
    local message="$1"
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $message" | tee -a "$LOG_FILE"
}

# Check whether a process with the given PID is still running
is_process_running() {
    local pid=$1
    kill -0 "$pid" 2>/dev/null
}

# Handle cleanup on exit
cleanup() {
    log_message "Current processes:"
    ps >> "$LOG_FILE"

    log_message "Initiating cleanup..."

    # Kill all registered subshells by process group ID
    # (a negative PID targets the whole group; see `set -m` above)
    for pid in "${subshell_pids[@]}"; do
        if is_process_running "$pid"; then
            log_message "Terminating process $pid"
            kill -- -"$pid" >>"$LOG_FILE" 2>&1
            wait "$pid" >/dev/null 2>&1
        fi
    done

    log_message "Cleanup complete"
    exit 0
}

trap cleanup SIGINT SIGTERM EXIT

# Start a command in a background subshell and register its PID
start_process() {
    local process_name="$1"
    local command="$2"

    # Start the process in a subshell
    (
        eval "$command"
    ) > /dev/null 2>&1 &

    local pid=$!

    # Wait briefly to check if the process started successfully
    sleep 1
    if is_process_running "$pid"; then
        subshell_pids+=("$pid")
        log_message "Started $process_name (PID: $pid)"
        return 0
    else
        log_message "Failed to start $process_name"
        return 1
    fi
}

# Main process
main() {
    # Clear the log file
    : > "$LOG_FILE"

    log_message "Starting run-dev.sh..."

    # Source environment variables
    if [[ -f .env ]]; then
        source .env
    else
        log_message "Warning: .env file not found"
    fi

    # Start Ollama if configured
    if [[ -n "${OLLAMA_MODEL:-}" ]]; then
        start_process "Ollama" "ollama run '${OLLAMA_MODEL}'" || \
            log_message "Failed to start Ollama"
    fi

    # Start ngrok if configured; port 4000 matches app.py
    if [[ -n "${NGROK_DOMAIN:-}" ]]; then
        start_process "Ngrok" "ngrok http 4000 --url '${NGROK_DOMAIN}'" || \
            log_message "Failed to start Ngrok"
    fi

    # Wait for the exit command. NOTE: if the console gets stuck, close the terminal
    log_message "All processes started. Type 'exit' or press CTRL+C to quit"
    while read -r -p "> " input; do
        if [[ "${input,,}" == "exit" ]]; then
            log_message "Exit command received"
            break
        fi
    done
}

# Run main function
main
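The cleanup logic above leans on `set -m`: with job control enabled, each background subshell becomes its own process-group leader, so `kill -- -PID` signals the entire group. A minimal sketch to observe this, assuming a system whose `ps` supports `-o pgid=` (not part of the repository):

```shell
#!/bin/bash
set -m                       # job control: background jobs lead their own process groups

( sleep 30 ) &               # background subshell, as in start_process above
pid=$!

# With set -m, the subshell's PGID equals its PID
pgid=$(ps -o pgid= -p "$pid" | tr -d ' ')
echo "pid=$pid pgid=$pgid"

kill -- -"$pid" 2>/dev/null  # negative PID: signal the whole group
```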
