Fix Python deps, base64 impl, add examples (#426)
* Fix python api deps

* Fix deps and add examples

* Update readmes with credit

* Fix links

* Fix base64 examples and configure higher body limit

* Cleaner way to load from file, base64, http

* Update example

* Add loading from local img docs

* Update docs

* Bump version to 0.1.18

* Clippy format and docs

* Default profile for py is release

* Update readme to build in verbose
EricLBuehler authored Jun 12, 2024
1 parent 922d94e commit 0728b33
Showing 19 changed files with 306 additions and 42 deletions.
2 changes: 1 addition & 1 deletion Cargo.toml
@@ -10,7 +10,7 @@ members = [
resolver = "2"

[workspace.package]
-version = "0.1.17"
+version = "0.1.18"
edition = "2021"
description = "Fast and easy LLM serving."
homepage = "https://github.com/EricLBuehler/mistral.rs"
6 changes: 4 additions & 2 deletions README.md
@@ -46,6 +46,7 @@ Please submit requests for new models [here](https://github.com/EricLBuehler/mis
- φ³ 📷 Run the Phi 3 vision model: [documentation and guide here](docs/PHI3V.md)

<img src="https://www.nhmagazine.com/content/uploads/2019/05/mtwashingtonFranconia-2-19-18-108-Edit-Edit.jpg" alt="Mount Washington" width = "400" height = "267">
<h6><a href = "https://www.nhmagazine.com/mount-washington/">Credit</a></h6>

*After following installation instructions*

@@ -197,10 +198,11 @@ Please submit more benchmarks via raising an issue!
## Installation and Build

1) Install required packages
-   - `openssl` (ex., `sudo apt install libssl-dev`)
-   - `pkg-config` (ex., `sudo apt install pkg-config`)
+   - `openssl` (ex. on Ubuntu, `sudo apt install libssl-dev`)
+   - `pkg-config` (ex. on Ubuntu, `sudo apt install pkg-config`)

2) Install Rust: https://rustup.rs/
*Example on Ubuntu:*
```bash
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source $HOME/.cargo/env
18 changes: 16 additions & 2 deletions docs/PHI3V.md
@@ -2,10 +2,17 @@

The Phi 3 Vision Model is supported in the Rust, Python, and HTTP APIs, and supports ISQ for increased performance.

The Python and HTTP APIs support sending images as:
- URL
- Path to a local image
- [Base64](https://en.wikipedia.org/wiki/Base64) encoded string

The Rust API takes an image from the [image](https://docs.rs/image/latest/image/index.html) crate.

> Note: The Phi 3 Vision model works best with one image, although sending multiple images is supported.
> Note: when sending multiple images, they will be resized to the minimum dimension by which all will fit without cropping.
-> Aspect ratio is not preserved.
+> Aspect ratio is not preserved in that case.
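One plausible reading of this resize rule can be sketched in Python. The helper name and the element-wise-minimum semantics are our assumptions for illustration, not the project's actual implementation:

```python
def common_target_size(sizes):
    """Given (width, height) pairs for a batch of images, return the
    element-wise minimum size: every image can be resized down to this
    box without cropping. Because width and height shrink independently,
    aspect ratio is not preserved."""
    widths, heights = zip(*sizes)
    return (min(widths), min(heights))

# Example: a 1000x666 photo batched with a 640x480 one is stretched to 640x480.
print(common_target_size([(1000, 666), (640, 480)]))
```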
## HTTP server
You can find this example [here](../examples/server/phi3v.py).
@@ -18,6 +25,7 @@ We support an OpenAI compatible HTTP API for vision models. This example demonst

**Image:**
<img src="https://www.nhmagazine.com/content/uploads/2019/05/mtwashingtonFranconia-2-19-18-108-Edit-Edit.jpg" alt="Mount Washington" width = "1000" height = "666">
<h6><a href = "https://www.nhmagazine.com/mount-washington/">Credit</a></h6>

**Prompt:**
```
@@ -73,6 +81,9 @@ print(resp)

```

- You can find an example of encoding an [image as base64 here](../examples/server/phi3v_base64.py).
- You can find an example of loading an [image from a local file here](../examples/server/phi3v_local_img.py).

---

## Rust
@@ -201,4 +212,7 @@ res = runner.send_chat_completion_request(
)
print(res.choices[0].message.content)
print(res.usage)
```

- You can find an example of encoding an [image as base64 here](../examples/python/phi3v_base64.py).
- You can find an example of loading an [image from a local file here](../examples/python/phi3v_local_img.py).
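The base64 flow shared by the server and Python examples reduces to a small helper. This is a sketch; `encode_image` is a name we introduce for illustration, not part of the mistralrs API:

```python
import base64

def encode_image(path: str) -> str:
    # Read the raw image bytes and return them as a UTF-8 base64 string,
    # which the examples above pass directly as the `image_url` value.
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")
```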
2 changes: 1 addition & 1 deletion examples/python/cookbook.ipynb
@@ -14,7 +14,7 @@
"outputs": [],
"source": [
"# First, install Rust: https://rustup.rs/\n",
-"%pip install mistralrs-cuda"
+"%pip install mistralrs-cuda -v"
]
},
{
44 changes: 44 additions & 0 deletions examples/python/phi3v_base64.py
@@ -0,0 +1,44 @@
from mistralrs import Runner, Which, ChatCompletionRequest, VisionArchitecture
import base64

runner = Runner(
which=Which.VisionPlain(
model_id="microsoft/Phi-3-vision-128k-instruct",
tokenizer_json=None,
repeat_last_n=64,
arch=VisionArchitecture.Phi3V,
),
)

FILENAME = "picture.jpg"
with open(FILENAME, "rb") as image_file:
encoded_string = base64.b64encode(image_file.read()).decode("utf-8")

res = runner.send_chat_completion_request(
ChatCompletionRequest(
model="phi3v",
messages=[
{
"role": "user",
"content": [
{
"type": "image_url",
"image_url": {
"url": str(encoded_string),
},
},
{
"type": "text",
"text": "<|image_1|>\nWhat is shown in this image?",
},
],
}
],
max_tokens=256,
presence_penalty=1.0,
top_p=0.1,
temperature=0.1,
)
)
print(res.choices[0].message.content)
print(res.usage)
42 changes: 42 additions & 0 deletions examples/python/phi3v_local_img.py
@@ -0,0 +1,42 @@
from mistralrs import Runner, Which, ChatCompletionRequest, VisionArchitecture
import base64

runner = Runner(
which=Which.VisionPlain(
model_id="microsoft/Phi-3-vision-128k-instruct",
tokenizer_json=None,
repeat_last_n=64,
arch=VisionArchitecture.Phi3V,
),
)

FILENAME = "picture.jpg"

res = runner.send_chat_completion_request(
ChatCompletionRequest(
model="phi3v",
messages=[
{
"role": "user",
"content": [
{
"type": "image_url",
"image_url": {
"url": FILENAME,
},
},
{
"type": "text",
"text": "<|image_1|>\nWhat is shown in this image?",
},
],
}
],
max_tokens=256,
presence_penalty=1.0,
top_p=0.1,
temperature=0.1,
)
)
print(res.choices[0].message.content)
print(res.usage)
69 changes: 69 additions & 0 deletions examples/server/phi3v_base64.py
@@ -0,0 +1,69 @@
import requests
import httpx
import textwrap, json
import base64


def log_response(response: httpx.Response):
request = response.request
print(f"Request: {request.method} {request.url}")
print(" Headers:")
for key, value in request.headers.items():
if key.lower() == "authorization":
value = "[...]"
if key.lower() == "cookie":
value = value.split("=")[0] + "=..."
print(f" {key}: {value}")
print(" Body:")
try:
request_body = json.loads(request.content)
print(textwrap.indent(json.dumps(request_body, indent=2), " "))
except json.JSONDecodeError:
print(textwrap.indent(request.content.decode(), " "))
print(f"Response: status_code={response.status_code}")
print(" Headers:")
for key, value in response.headers.items():
if key.lower() == "set-cookie":
value = value.split("=")[0] + "=..."
print(f" {key}: {value}")


BASE_URL = "http://localhost:1234/v1"

# Enable this to log requests and responses
# openai.http_client = httpx.Client(
# event_hooks={"request": [print], "response": [log_response]}
# )

FILENAME = "picture.jpg"
with open(FILENAME, "rb") as image_file:
encoded_string = base64.b64encode(image_file.read()).decode("utf-8")

headers = {
"Content-Type": "application/json",
}

payload = {
"model": "phi3v",
"messages": [
{
"role": "user",
"content": [
{
"type": "image_url",
"image_url": {
"url": str(encoded_string),
},
},
{
"type": "text",
"text": "<|image_1|>\nWhat is shown in this image?",
},
],
}
],
"max_tokens": 300,
}

response = requests.post(f"{BASE_URL}/chat/completions", headers=headers, json=payload)
print(response.json())
67 changes: 67 additions & 0 deletions examples/server/phi3v_local_img.py
@@ -0,0 +1,67 @@
import requests
import httpx
import textwrap, json
import base64


def log_response(response: httpx.Response):
request = response.request
print(f"Request: {request.method} {request.url}")
print(" Headers:")
for key, value in request.headers.items():
if key.lower() == "authorization":
value = "[...]"
if key.lower() == "cookie":
value = value.split("=")[0] + "=..."
print(f" {key}: {value}")
print(" Body:")
try:
request_body = json.loads(request.content)
print(textwrap.indent(json.dumps(request_body, indent=2), " "))
except json.JSONDecodeError:
print(textwrap.indent(request.content.decode(), " "))
print(f"Response: status_code={response.status_code}")
print(" Headers:")
for key, value in response.headers.items():
if key.lower() == "set-cookie":
value = value.split("=")[0] + "=..."
print(f" {key}: {value}")


BASE_URL = "http://localhost:1234/v1"

# Enable this to log requests and responses
# openai.http_client = httpx.Client(
# event_hooks={"request": [print], "response": [log_response]}
# )

FILENAME = "picture.jpg"

headers = {
"Content-Type": "application/json",
}

payload = {
"model": "phi3v",
"messages": [
{
"role": "user",
"content": [
{
"type": "image_url",
"image_url": {
"url": FILENAME,
},
},
{
"type": "text",
"text": "<|image_1|>\nWhat is shown in this image?",
},
],
}
],
"max_tokens": 300,
}

response = requests.post(f"{BASE_URL}/chat/completions", headers=headers, json=payload)
print(response.json())
2 changes: 1 addition & 1 deletion mistralrs-bench/Cargo.toml
@@ -17,7 +17,7 @@ candle-core.workspace = true
serde.workspace = true
serde_json.workspace = true
clap.workspace = true
-mistralrs-core = { version = "0.1.17", path = "../mistralrs-core" }
+mistralrs-core = { version = "0.1.18", path = "../mistralrs-core" }
tracing.workspace = true
either.workspace = true
tokio.workspace = true
2 changes: 1 addition & 1 deletion mistralrs-pyo3/Cargo.toml
@@ -17,7 +17,7 @@ doc = false

[dependencies]
pyo3.workspace = true
-mistralrs-core = { version = "0.1.17", path = "../mistralrs-core", features = ["pyo3_macros"] }
+mistralrs-core = { version = "0.1.18", path = "../mistralrs-core", features = ["pyo3_macros"] }
serde.workspace = true
serde_json.workspace = true
candle-core.workspace = true
5 changes: 4 additions & 1 deletion mistralrs-pyo3/Cargo_template.toml
@@ -17,7 +17,7 @@ doc = false

[dependencies]
pyo3.workspace = true
-mistralrs-core = { version = "0.1.17", path = "../mistralrs-core", features=["pyo3_macros","$feature_name"] }
+mistralrs-core = { version = "0.1.18", path = "../mistralrs-core", features=["pyo3_macros","$feature_name"] }
serde.workspace = true
serde_json.workspace = true
candle-core = { git = "https://github.com/EricLBuehler/candle.git", version = "0.6.0", rev = "f52e2347b6237d19ffd7af26315f543c22f9f286", features=["$feature_name"] }
@@ -27,6 +27,9 @@ intel-mkl-src = { workspace = true, optional = true }
either.workspace = true
futures.workspace = true
tokio.workspace = true
image.workspace = true
reqwest.workspace = true
base64.workspace = true

[build-dependencies]
pyo3-build-config = "0.21"
10 changes: 5 additions & 5 deletions mistralrs-pyo3/README.md
@@ -21,19 +21,19 @@ sudo apt install pkg-config
- CUDA
-`pip install mistralrs-cuda`
+`pip install mistralrs-cuda -v`
- Metal
-`pip install mistralrs-metal`
+`pip install mistralrs-metal -v`
- Apple Accelerate
-`pip install mistralrs-accelerate`
+`pip install mistralrs-accelerate -v`
- Intel MKL
-`pip install mistralrs-mkl`
+`pip install mistralrs-mkl -v`
- Without accelerators
-`pip install mistralrs`
+`pip install mistralrs -v`
All variants install the same `mistralrs` package; the suffix on the package installed by `pip` only controls which accelerator features are activated.
3 changes: 2 additions & 1 deletion mistralrs-pyo3/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "maturin"

[project]
name = "mistralrs"
-version = "0.1.17"
+version = "0.1.18"
requires-python = ">=3.8"
classifiers = [
"Programming Language :: Rust",
@@ -18,3 +18,4 @@ dynamic = ["description"]

[tool.maturin]
features = ["pyo3/extension-module"]
profile = "release"