Commit 0935b77: fix vision chat
Parent: f2a7e98

32 files changed: +150, -211 lines

README.md

Lines changed: 11 additions & 13 deletions

````diff
@@ -11,7 +11,7 @@ fluent-ai is a lightweight, type-safe AI toolkit that seamlessly integrates mult
 ## Installation
 
 ```sh
-npm install fluent-ai zod
+npm install fluent-ai zod@next
 ```
 
 ## AI Service provider support
@@ -48,11 +48,11 @@ Each request to AI providers is wrapped in a `Job`. which can also serialized an
 ### Method chaining
 
 ```ts
-import { openai, userPrompt } from "fluent-ai";
+import { openai, user } from "fluent-ai";
 
 const job = openai()
   .chat("gpt-4o-mini")
-  .messages([userPrompt("Hi")])
+  .messages([user("Hi")])
   .temperature(0.5)
   .maxTokens(1024);
 ```
@@ -101,11 +101,11 @@ Chat completion, such as ChatGPT, is the most common AI service. It generates re
 ### Text generation
 
 ```ts
-import { openai, systemPrompt, userPrompt } from "fluent-ai";
+import { openai, system, user } from "fluent-ai";
 
 const job = openai()
   .chat("gpt-4o-mini")
-  .messages([systemPrompt("You are a helpful assistant"), userPrompt("Hi")]);
+  .messages([system("You are a helpful assistant"), user("Hi")]);
 
 const { text } = await job.run();
 ```
@@ -120,7 +120,7 @@ fluent-ai provides a consistent `jsonSchema()` function for all providers to gen
 
 ```ts
 import { z } from "zod";
-import { openai, userPrompt } from "fluent-ai";
+import { openai, user } from "fluent-ai";
 
 const personSchema = z.object({
   name: z.string(),
@@ -129,7 +129,7 @@ const personSchema = z.object({
 
 const job = openai()
   .chat("gpt-4o-mini")
-  .messages([userPrompt("generate a person with name and age in json format")])
+  .messages([user("generate a person with name and age in json format")])
   .jsonSchema(personSchema, "person");
 
 const { object } = await job.run();
@@ -161,7 +161,7 @@ To use the tool, add it to a chat job with a function-calling-enabled model, suc
 const job = openai().chat("gpt-4o-mini").tool(weatherTool);
 
 const { toolCalls } = await job
-  .messages([userPrompt("What is the weather in San Francisco?")])
+  .messages([user("What is the weather in San Francisco?")])
   .run();
 ```
@@ -172,7 +172,7 @@ Rather than waiting for the complete response, streaming enables the model to re
 ```ts
 const job = openai()
   .chat("gpt-4o-mini")
-  .messages([systemPrompt("You are a helpful assistant"), userPrompt("Hi")])
+  .messages([system("You are a helpful assistant"), user("Hi")])
   .stream();
 
 const { stream } = await job.run();
@@ -188,13 +188,11 @@ fluent-ai supports streaming text, object and tool calls on demand. For more det
 You can leverage chat models with vision capabilities by including an image URL in your prompt.
 
 ```ts
-import { openai, systemPrompt, userPrompt } from "fluent-ai";
+import { openai, system, user } from "fluent-ai";
 
 openai()
   .chat("gpt-4o-mini")
-  .messages([
-    userPrompt("Describe the image", { image: { url: "<image_url>" } }),
-  ]);
+  .messages([user("Describe the image", { image: { url: "<image_url>" } })]);
 ```
 
 ## Embedding
````
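The vision fix above collapses the message array into a single `user(...)` call that carries an image URL alongside the text. As a rough sketch of the kind of multimodal message such a helper can produce, here is a hypothetical `user()` that emits OpenAI-style content parts; the field names are an assumption for illustration, not fluent-ai's actual internals:

```typescript
// Hypothetical user() helper sketch. The content-part shapes follow the
// OpenAI chat-completions format; fluent-ai's real implementation may differ.
type TextPart = { type: "text"; text: string };
type ImagePart = { type: "image_url"; image_url: { url: string } };

interface UserMessage {
  role: "user";
  content: string | (TextPart | ImagePart)[];
}

function user(text: string, extra?: { image?: { url: string } }): UserMessage {
  if (!extra?.image) {
    // Plain text prompt: content stays a bare string.
    return { role: "user", content: text };
  }
  // Vision prompt: content becomes an array mixing text and image_url parts.
  return {
    role: "user",
    content: [
      { type: "text", text },
      { type: "image_url", image_url: { url: extra.image.url } },
    ],
  };
}

const msg = user("Describe the image", {
  image: { url: "https://example.com/cat.png" },
});
console.log(JSON.stringify(msg, null, 2));
```

The same helper covers both the text-only and vision cases, which is why the commit can drop the multi-line message array in favor of one call.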

docs/chat-streaming.md

Lines changed: 4 additions & 6 deletions

````diff
@@ -13,7 +13,7 @@ export interface StreamOptions {
 ```ts
 const { textStream } = await openai()
   .chat("gpt-4o-mini")
-  .messages([userPrompt("hi")])
+  .messages([user("hi")])
   .stream()
   .run();
 
@@ -28,9 +28,7 @@ for await (const text of textStream) {
 const { toolCallStream } = await openai()
   .chat("gpt-4o-mini")
   .tool(weatherTool)
-  .messages([
-    userPrompt("What's the weather like in Boston, Beijing, Tokyo today?"),
-  ])
+  .messages([user("What's the weather like in Boston, Beijing, Tokyo today?")])
   .stream()
   .run();
 
@@ -44,7 +42,7 @@ for await (const toolCalls of toolCallStream) {
 ```ts
 const { objectStream } = await openai()
   .chat("gpt-4o-mini")
-  .messages([userPrompt("generate a person with name and age in json format")])
+  .messages([user("generate a person with name and age in json format")])
   .responseSchema(personSchema)
   .objectStream()
   .run();
@@ -61,7 +59,7 @@ The original chunk object from providers
 ```ts
 const { stream } = await openai()
   .chat("gpt-4o-mini")
-  .messages([userPrompt("hi")]);
+  .messages([user("hi")]);
 
 for await (const chunk of stream) {
   console.log(chunk);
````
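Every stream in these docs (`textStream`, `toolCallStream`, `objectStream`, `stream`) is consumed the same way: a `for await` loop over an async iterable. A self-contained sketch of that consumption pattern, using a mock async generator in place of a provider-backed stream (the mock is an assumption for illustration only):

```typescript
// Mock text stream standing in for a provider-backed textStream.
async function* mockTextStream(): AsyncGenerator<string> {
  yield "Hel";
  yield "lo";
  yield "!";
}

// Accumulate streamed chunks the same way the docs' for-await loops do.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk; // each chunk is handled as soon as it arrives
  }
  return text;
}

collect(mockTextStream()).then((full) => console.log(full)); // logs "Hello!"
```

Because the loop only depends on the `AsyncIterable` protocol, the same consumer code works unchanged across providers.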

examples/anthropic-chat.ts

Lines changed: 2 additions & 2 deletions

````diff
@@ -1,8 +1,8 @@
-import { anthropic, userPrompt } from "../src";
+import { anthropic, user } from "../src";
 
 const job = anthropic()
   .chat("claude-3-5-sonnet-20241022")
   .maxTokens(1024)
-  .messages([userPrompt("Hello, world")]);
+  .messages([user("Hello, world")]);
 const result = await job.run();
 console.log(result.raw);
````

examples/deepseek-chat.ts

Lines changed: 2 additions & 5 deletions

````diff
@@ -1,11 +1,8 @@
-import { deepseek, systemPrompt, userPrompt } from "../src";
+import { deepseek, system, user } from "../src";
 
 const job = deepseek()
   .chat("deepseek-chat")
-  .messages([
-    systemPrompt("you are a helpful assistant"),
-    userPrompt("who are you"),
-  ]);
+  .messages([system("you are a helpful assistant"), user("who are you")]);
 const result = await job.run();
 
 console.log(result);
````

examples/fal-image-stream.ts

Lines changed: 0 additions & 2 deletions

````diff
@@ -6,6 +6,4 @@ const stream = await job.run();
 for await (const event of stream) {
   console.log(event);
 }
-
-const result = await stream.done();
 console.log(result);
````

examples/fireworks-chat.ts

Lines changed: 2 additions & 2 deletions

````diff
@@ -1,8 +1,8 @@
-import { fireworks, systemPrompt, userPrompt } from "../src";
+import { fireworks, system, user } from "../src";
 
 const job = await fireworks()
   .chat("accounts/fireworks/models/llama-v3p1-70b-instruct")
-  .messages([systemPrompt("you are a helpful assistant"), userPrompt("hi")]);
+  .messages([system("you are a helpful assistant"), user("hi")]);
 const result = await job.run();
 
 console.log(result);
````

examples/google-chat.ts

Lines changed: 2 additions & 2 deletions

````diff
@@ -1,8 +1,8 @@
-import { google, systemPrompt, userPrompt } from "../src";
+import { google, system, user } from "../src";
 
 const job = google()
   .chat("gemini-1.5-flash")
-  .messages([systemPrompt("you are a helpful assistant"), userPrompt("hi")]);
+  .messages([system("you are a helpful assistant"), user("hi")]);
 const result = await job.run();
 
 console.log(result);
````

examples/groq-chat.ts

Lines changed: 3 additions & 6 deletions

````diff
@@ -1,14 +1,11 @@
-import { openai, systemPrompt, userPrompt } from "../src";
+import { openai, system, user } from "../src";
 
 const job = openai({
   apiKey: process.env.GROQ_API_KEY,
-  baseURL: "https://api.groq.com/openai/v1"
+  baseURL: "https://api.groq.com/openai/v1",
 })
   .chat("meta-llama/llama-4-scout-17b-16e-instruct")
-  .messages([
-    systemPrompt("you are a helpful assistant"),
-    userPrompt("who are you"),
-  ]);
+  .messages([system("you are a helpful assistant"), user("who are you")]);
 const result = await job.run();
 
 console.log(result);
````

examples/ollama-chat.ts

Lines changed: 2 additions & 2 deletions

````diff
@@ -1,7 +1,7 @@
-import { ollama, systemPrompt, userPrompt } from "../src";
+import { ollama, system, user } from "../src";
 
 const job = ollama()
   .chat("llama3.2")
-  .messages([systemPrompt("you are a helpful assistant"), userPrompt("hi")]);
+  .messages([system("you are a helpful assistant"), user("hi")]);
 const result = await job.run();
 console.log(result);
````

examples/ollama-embedding.ts

Lines changed: 2 additions & 3 deletions

````diff
@@ -1,6 +1,5 @@
 import { ollama } from "../src";
 
-const job = ollama().embedding("nomic-embed-text").input("hello");
+const job = ollama().embedding("nomic-embed-text").value("hello");
 const result = await job.run();
-
-console.log(JSON.stringify(result, null, 2));
+console.log(result);
````
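The `.input()` to `.value()` change above renames a setter on the embedding job builder. A minimal sketch of that fluent-setter pattern, using a hypothetical `EmbeddingJob` class (illustrative names only, not fluent-ai's actual internals):

```typescript
// Hypothetical fluent builder illustrating the renamed .value() setter.
class EmbeddingJob {
  private text?: string;

  constructor(private model: string) {}

  // Each setter returns `this` so calls chain: .embedding(...).value(...)
  value(text: string): this {
    this.text = text;
    return this;
  }

  // Assemble the provider request body from the collected options.
  toRequest(): { model: string; input: string } {
    if (this.text === undefined) throw new Error("value() was not called");
    return { model: this.model, input: this.text };
  }
}

const req = new EmbeddingJob("nomic-embed-text").value("hello").toRequest();
console.log(req); // { model: "nomic-embed-text", input: "hello" }
```

Renaming such a setter is source-compatible with the rest of the chain, which is why the example only needed a one-line change.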
