Langchain migration (#28)
* initial version for langchain migration

* format and sort some imports

* format the rest of the files

* disable some tests

* add a test mocking ollama server

* clean up the test

* add diff to black check

* upgrade black version

* clean up history.py

* fix bug and improve settings

* add missing file ollama.py

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* improve output format

* add openai support

* fix type error

* install types-requests

* fix type errors

* fix tests

* add prompt-toolkit as dependency

* add simple-term-menu as dependency

* update gif

* Update README.md

* update gif

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* update versioning options

* remove comments
ricopinazo authored May 22, 2024
1 parent e4e5566 commit d1d838c
Showing 19 changed files with 530 additions and 424 deletions.
18 changes: 8 additions & 10 deletions .github/workflows/release.yml
@@ -1,25 +1,23 @@
name: Release new version

on:
on:
workflow_dispatch:
inputs:
type:
description: 'Release type'
description: "Release type"
required: true
default: 'preview'
default: "preview"
type: choice
options:
- major
- minor
- fix
- preview
- release
- fix
- minor
- minor,preview
- major,preview
- release

jobs:
test:
uses: ./.github/workflows/tests.yml
secrets:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
release:
needs: test
runs-on: ubuntu-latest
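The reworked `type` choices (e.g. `minor,preview`) arrive as a single `workflow_dispatch` input. A minimal sketch of kicking off a release from the command line, assuming the GitHub CLI is installed and authenticated against this repository:

```shell
# Trigger the release workflow with a combined version bump (sketch; requires the gh CLI)
gh workflow run release.yml -f type=minor,preview

# List recent runs of the workflow to confirm it started
gh run list --workflow=release.yml
```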
27 changes: 5 additions & 22 deletions .github/workflows/tests.yml
@@ -1,11 +1,8 @@
name: tests

on:
on:
push:
workflow_call:
secrets:
OPENAI_API_KEY:
required: true

jobs:
test:
@@ -22,24 +19,10 @@ jobs:
with:
python-version: "3.x"
- name: Install dependencies
run: python -m pip install -e .[test]
- name: Set environment variable using secret
run: echo "OPENAI_API_KEY=${{ secrets.OPENAI_API_KEY }}" >> $GITHUB_ENV
- name: Run test suite
run: pytest -v
code-checks:
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- name: Check out repository code
uses: actions/checkout@v3
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: "3.x"
- name: Install black
run: python -m pip install black=="23.*" mypy=="1.*"
run: python -m pip install -e .[dev]
- name: Check style
run: black --check .
run: black --check --diff .
- name: check types
run: mypy .
- name: Run test suite
run: pytest -v
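With the separate `code-checks` job folded into `test` and the OpenAI secret no longer required (the suite now mocks the ollama server), the whole CI job can be reproduced locally from the `dev` extra. A rough local equivalent, run from a checkout of the repository root:

```shell
# Install the package plus the dev extra (pytest, black, mypy, types-requests)
python -m pip install -e .[dev]

# Same checks the workflow runs
black --check --diff .
mypy .
pytest -v
```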
4 changes: 3 additions & 1 deletion .gitignore
@@ -3,4 +3,6 @@ __pycache__
.python-version
dist
.env
.DS_Store
.DS_Store
.ipynb_checkpoints
experiments
109 changes: 44 additions & 65 deletions README.md
@@ -1,127 +1,106 @@
<div align="center">
<img src="https://raw.githubusercontent.com/ricopinazo/comai/main/logo.svg" alt="comai" width="200"/>

**The AI powered terminal assistant**

[![Tests](https://github.com/ricopinazo/comai/actions/workflows/tests.yml/badge.svg)](https://github.com/ricopinazo/comai/actions/workflows/tests.yml)
[![Latest release](https://img.shields.io/github/v/release/ricopinazo/comai?color=brightgreen&include_prereleases)](https://github.com/ricopinazo/comai/releases)
[![PyPI](https://img.shields.io/pypi/v/comai)](https://pypi.org/project/comai/)
[![Issues](https://img.shields.io/github/issues/ricopinazo/comai?color=brightgreen)](https://github.com/ricopinazo/comai/issues)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/comai)](https://pypi.org/project/comai/)
[![License GPLv3](https://img.shields.io/badge/license-GPLv3-blue.svg)](./LICENSE)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Checked with mypy](http://www.mypy-lang.org/static/mypy_badge.svg)](http://mypy-lang.org/)

**The AI powered terminal assistant**

[![Tests](https://github.com/ricopinazo/comai/actions/workflows/tests.yml/badge.svg)](https://github.com/ricopinazo/comai/actions/workflows/tests.yml)
[![Latest release](https://img.shields.io/github/v/release/ricopinazo/comai?color=brightgreen&include_prereleases)](https://github.com/ricopinazo/comai/releases)
[![PyPI](https://img.shields.io/pypi/v/comai)](https://pypi.org/project/comai/)
[![Issues](https://img.shields.io/github/issues/ricopinazo/comai?color=brightgreen)](https://github.com/ricopinazo/comai/issues)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/comai)](https://pypi.org/project/comai/)
[![License GPLv3](https://img.shields.io/badge/license-GPLv3-blue.svg)](./LICENSE)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Checked with mypy](http://www.mypy-lang.org/static/mypy_badge.svg)](http://mypy-lang.org/)

</div>

## What is comai? 🎯

`comai` is an open source terminal assistant powered by OpenAI API that enables you to interact with your command line interface using natural language instructions. It simplifies your workflow by converting natural language queries into executable commands. No more memorizing complex syntax. Just chat with `comai` using plain English!
`comai` is an open source CLI utility that translates natural language instructions into executable commands.

<div align="left">
<img src="https://raw.githubusercontent.com/ricopinazo/comai/main/demo.gif" alt="demo" width="350"/>
<img src="https://github.com/ricopinazo/comai/blob/956aa235f3950fe5b057d2e6d50032702a579b0c/demo.gif" alt="demo" width="350"/>
</div>

## Installation 🚀

Getting `comai` up and running is a breeze. You can simply use [`pip`](https://pip.pypa.io/en/stable/) to install the latest version:
`comai` is available as a python package. We recommend using [`pipx`](https://pypa.github.io/pipx/) to install it:

```shell
pip install comai
pipx install comai
```

However, if you usually work with python environments, it is recommended to use [`pipx`](https://pypa.github.io/pipx/) instead:
By default, `comai` is setup to use [ollama](https://ollama.com) under the hood, which allows you to host any popular open source LLM locally. If you are happy with this, make sure to install and have ollama running. You can find the install instructions [here](https://ollama.com/download).

Once installed, make sure to download the `llama3` model, since comai has been optimised for it:

```shell
pipx install comai
ollama pull llama3
```

The first time you run it, it'll ask you for an OpenAI API key. You can create a developer account [here](https://platform.openai.com/overview). Once in your account, go to `API Keys` section and `Create new secret key`. We recommend setting a usage limit under `Billing`/`Usage limits`.
Otherwise, you can set up any other model available in the ollama service via:

```shell
comai --config
```
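For example, switching to another locally hosted model might look like this (the model name is illustrative; any model served by your ollama instance should work):

```shell
ollama pull mistral   # download an alternative model (example name)
comai --config        # then select it via comai's configuration
```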

> **_NOTE:_** `comai` uses the environment variable `TERM_SESSION_ID` to maintain context between calls so you don't need to repeat yourself giving instructions to it. You can check if it is available in your default terminal checking the output of `echo $TERM_SESSION_ID`, which should return some type of UUID. If the output is empty, you can simply add the following to your `.zshrc`/`.bashrc` file:
>
> ```shell
> export TERM_SESSION_ID=$(uuidgen)
> ```
## Usage Examples 🎉
## Usage examples 🎉
Using `comai` is straightforward. Simply invoke the `comai` command followed by your natural language instruction. `comai` will provide you with the corresponding executable command, which you can execute by pressing Enter or ignore by pressing any other key.
Let's dive into some exciting examples of how you can harness the power of `comai`:
1. Manage your system like a pro:
```shell
$ comai print my private ip address
❯ ifconfig | grep "inet " | grep -v 127.0.0.1 | awk '{print $2}'
192.168.0.2
1. Access network details:
$ comai and my public one
❯ curl ifconfig.me
```
$ comai print my public ip address
❯ curl -s4 ifconfig.co
92.234.58.146
```
2. Master the intricacies of `git`:
2. Manage `git` like a pro:
```shell
$ comai squash the last 3 commits into a single commit
❯ git rebase -i HEAD~3
$ comai rename the current branch to awesome-branch
❯ git branch -m $(git rev-parse --abbrev-ref HEAD) awesome-branch
$ comai show me all the branches having commit c4c0d2d in common
❯ git branch --contains c4c0d2d
chat-api
configparser
* main
❯ git branch -a --contains c4c0d2d
main
fix/terrible-bug
* awesome-branch
```
3. Check the weather forecast for your location:
```shell
$ comai show me the weather forecast
❯ curl wttr.in
```
3. Find the annoying process using the port 8080:
4. Find the annoying process using the port 8080:
```shell
$ comai show me the process using the port 8080
❯ lsof -i :8080
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
node 36350 pedrorico 18u IPv4 0xe0d28ea918e376b 0t0 TCP *:http-alt (LISTEN)
$ comai show me only the PID
❯ lsof -t -i :8080
36350
$ comai kill it
❯ kill $(lsof -t -i :8080)
```
5. Swiftly get rid of all your docker containers:
4. Get rid of all your docker containers:
```shell
$ comai stop and remove all running docker containers
❯ docker stop $(docker ps -aq) && docker rm $(docker ps -aq)
```
These are just a few examples of how `comai` can help you harness the power of the command line and provide you with useful and entertaining commands. Feel free to explore and experiment with the commands generated by `comai` to discover more exciting possibilities!
## Contributions welcome! 🤝
If you're interested in joining the development of new features for `comai`, here's all you need to get started:
1. Clone the [repository](https://github.com/ricopinazo/comai) and navigate to the root folder.
2. Install the package in editable mode by running `pip install -e .`.
3. Run the tests using `pytest`. Make sure you have the `OPENAI_API_KEY` environment variable set up with your OpenAI API key. Alternatively, you can create a file named `.env` and define the variable there.
This project utilizes black for code formatting. To ensure your changes adhere to this format, simply follow these steps:
```shell
pip install black
black .
```
For users of VS Code, you can configure the following options after installing `black`:
```json
"editor.formatOnSave": true,
"python.formatting.provider": "black"
```
3. Run the tests using `pytest`.
4. Format your code using [black](https://github.com/psf/black) before submitting any change.
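Taken together, the contribution steps above amount to something like the following sketch, assuming a fresh clone of the repository:

```shell
git clone https://github.com/ricopinazo/comai.git
cd comai
pip install -e .   # editable install
pytest             # run the test suite
black .            # format before submitting any change
```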
## License 📜
Binary file modified demo.gif
22 changes: 13 additions & 9 deletions pyproject.toml
@@ -5,12 +5,10 @@ build-backend = "hatchling.build"
[project]
name = "comai"
dynamic = ["version"]
authors = [
{ name="Pedro Rico", email="[email protected]" },
]
authors = [{ name = "Pedro Rico", email = "[email protected]" }]
description = "AI powered console assistant"
readme = "README.md"
license = {file = "LICENSE"}
license = { file = "LICENSE" }
requires-python = ">=3.7"
classifiers = [
"Programming Language :: Python :: 3",
@@ -20,9 +18,13 @@ classifiers = [
"Operating System :: Unix",
]
dependencies = [
"typer[all]==0.9.0",
"openai==0.27.5",
"cryptography==40.0.2",
"typer-slim==0.12.3",
"rich==13.7.1",
"prompt-toolkit==3.0.43",
"simple-term-menu==1.6.4",
"langchain==0.1.17",
"langchain-openai==0.1.6",
"ollama==0.1.9",
]

[project.urls]
@@ -34,10 +36,12 @@ issues = "https://github.com/ricopinazo/comai/issues"
comai = "comai.cli:app"

[project.optional-dependencies]
test = [
dev = [
"pytest",
"hatchling",
"python-dotenv"
"types-requests==2.31.0.20240406",
"mypy==1.9.0",
"black==24.4.0",
]

[tool.hatch.version]
29 changes: 24 additions & 5 deletions src/comai/animations.py
@@ -3,6 +3,10 @@
from contextlib import contextmanager
from typing import Generator, Iterator
from rich import print
import prompt_toolkit
from prompt_toolkit.styles import Style

from comai.prompt import prompt_str

LEFT = "\033[D"
CLEAR_LINE = "\033[K"
@@ -42,12 +46,27 @@ def query_animation() -> Generator[None, None, None]:
t.join()


def print_answer(command_chunks: Iterator[str]):
def start_printing_command():
print(f"[{ANSWER_PROMPT_COLOR}]{ANSWER_PROMPT}", end="", flush=True)
first_chunk = next(command_chunks)
print(f"[{COMMAND_COLOR}]{first_chunk}", end="", flush=True)
for chunk in command_chunks:
print(f"[{COMMAND_COLOR}]{chunk}", end="", flush=True)


def print_command_token(chunk: str):
print(f"[{COMMAND_COLOR}]{chunk}", end="", flush=True)


def print_command_prompt(command: str):
sys.stdout.write(f"\r{CLEAR_LINE}")
style = Style.from_dict(
{
# User input (default text)
"": "ansicyan",
"mark": "ansimagenta",
}
)
message = [
("class:mark", ANSWER_PROMPT),
]
return prompt_toolkit.prompt(message, default="%s" % command, style=style) # type: ignore


def hide_cursor() -> None: