24 changes: 21 additions & 3 deletions README.md
@@ -189,10 +189,28 @@ export default async function Home() {

A working example can be found in the `examples/nextjs` directory.

#### Next.js client-side instrumentation

The `@vercel/otel` package does not support client-side instrumentation, so a few additional steps are necessary to send spans from and/or instrument the client side.
For a working example, refer to the `examples/nextjs-client-side-instrumentation` directory, which instruments client-side `fetch` calls.

### AI SDK / Vercel

To observe the [Vercel AI SDK](https://ai-sdk.dev/), integrate [Logfire's span processor](/packages/logfire-vercel-ai-span-processor) with your OpenTelemetry setup.

Add the `LogfireVercelAISpanProcessor` to your span processors when registering OpenTelemetry in your application:


```bash
npm install @pydantic/logfire-vercel-ai-span-processor
```

```ts
// instrumentation.ts
import { registerOTel } from '@vercel/otel';
import { LogfireVercelAISpanProcessor } from '@pydantic/logfire-vercel-ai-span-processor';

registerOTel({
serviceName: 'your-service-name',
autoDetectResources: true,
spanProcessors: [new LogfireVercelAISpanProcessor()],
});
```

### Express, generic Node instrumentation

43 changes: 43 additions & 0 deletions examples/vercel-ai/README.md
@@ -0,0 +1,43 @@
# Vercel AI Example with Node.js TypeScript Native Support

This example demonstrates how to send OpenTelemetry (OTel) traces and spans from the Vercel AI SDK to Logfire, producing a detailed, presentable panel for each AI action. Integrating Logfire with the Vercel AI SDK gives you deep insight into every AI-driven operation, making it easy to analyze, debug, and present the behavior of your AI workflows.

## Features
- **Weather analysis**: Uses AI tools to analyze and report weather for given locations.
- **OpenTelemetry integration**: Traces are exported using Vercel's OTEL SDK and Logfire span processor.

## Requirements
- **Node.js >= 22.6.0**
- This version introduces the `--experimental-strip-types` flag, allowing you to run `.ts` files directly.
- [Download Node.js 22.6.0+](https://nodejs.org/en/download/current)
- **npm** (comes with Node.js)

## Getting Started

1. **Install dependencies**

```bash
npm install
```

2. **Run the example**

```bash
npm start
```

This runs:
```bash
node --experimental-strip-types --disable-warning=ExperimentalWarning --import ./instrumentation.ts app.ts
```
- `--experimental-strip-types` enables native TypeScript support (see [Node.js docs](https://nodejs.org/api/typescript.html)).
- `--import ./instrumentation.ts` sets up OpenTelemetry and Logfire tracing before your app starts.

3. **What it does**
- The script will analyze weather conditions for London and New York using AI tools and print the results to the console.
- Tracing and telemetry are enabled and sent to Logfire (see `instrumentation.ts`).
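Before running `npm start`, the example needs credentials in the environment. The values below are assumptions based on standard OTLP exporter conventions (which `@vercel/otel` honors); check the Logfire documentation for your project's actual endpoint and write token:

```shell
# API key read by the @ai-sdk/openai provider
export OPENAI_API_KEY="sk-..."

# Assumed OTLP settings for Logfire; replace with the values from your Logfire project
export OTEL_EXPORTER_OTLP_ENDPOINT="https://logfire-api.pydantic.dev"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=<your-write-token>"
```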

## Project Structure

- `app.ts` — Main example script (weather analysis logic)
- `instrumentation.ts` — Sets up OpenTelemetry and Logfire tracing
110 changes: 110 additions & 0 deletions examples/vercel-ai/app.ts
@@ -0,0 +1,110 @@
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const getLatLng = tool({
description: 'Get the latitude and longitude of a location',
parameters: z.object({
location_description: z.string().describe('A description of a location')
}),
execute: async ({ location_description }) => {
const locations: Record<string, { lat: number; lng: number; timezone: string }> = {
'london': { lat: 51.5074, lng: -0.1278, timezone: 'Europe/London' },
'wiltshire': { lat: 51.3492, lng: -1.9927, timezone: 'Europe/London' },
'new york': { lat: 40.7128, lng: -74.0060, timezone: 'America/New_York' },
'tokyo': { lat: 35.6762, lng: 139.6503, timezone: 'Asia/Tokyo' }
};

const location = locations[location_description.toLowerCase()];
if (!location) {
throw new Error('Location not found');
}
return location;
}
});

const getWeather = tool({
description: 'Get comprehensive weather data for a location',
parameters: z.object({
lat: z.number().describe('Latitude of the location'),
lng: z.number().describe('Longitude of the location'),
include_historical: z.boolean().optional().describe('Include historical weather data')
}),
execute: async ({ lat, lng, include_historical = false }) => {
const current = {
temperature: '21°C',
description: 'Sunny',
humidity: '65%',
wind_speed: '12 km/h',
pressure: '1013 hPa',
visibility: '10 km',
uv_index: '5'
};

const forecast = [
{ date: '2024-03-20', temperature: '22°C', description: 'Partly Cloudy' },
{ date: '2024-03-21', temperature: '20°C', description: 'Light Rain' },
{ date: '2024-03-22', temperature: '19°C', description: 'Cloudy' },
{ date: '2024-03-23', temperature: '21°C', description: 'Sunny' },
{ date: '2024-03-24', temperature: '23°C', description: 'Clear' }
];

const historical = include_historical ? [
{ date: '2024-03-13', temperature: '18°C', description: 'Rainy' },
{ date: '2024-03-14', temperature: '17°C', description: 'Cloudy' },
{ date: '2024-03-15', temperature: '19°C', description: 'Partly Cloudy' }
] : [];

return {
current,
forecast,
historical
};
}
});

const tools = {
get_lat_lng: getLatLng,
get_weather: getWeather
} as const;

async function getWeatherInfo() {

try {
const { text, toolResults } = await generateText({
model: openai('gpt-4'),
maxSteps: 15,
messages: [
{
role: 'system',
content: `You are a sophisticated weather analysis system.
Use the available tools to:
1. Get location coordinates
2. Generate detailed weather reports
Be thorough and include all relevant data.`
},
{
role: 'user',
content: [
{
type: 'text',
text: 'Analyze the weather conditions in London and New York. '
}
]
}
],
tools,
experimental_telemetry: {
isEnabled: true,
functionId: 'weather-function',
},
});

console.log('\nWeather Analysis:', text);
console.log('\nTool Results:', JSON.stringify(toolResults, null, 2));
} catch (error) {
console.error('Error:', error);
}
}

getWeatherInfo();
10 changes: 10 additions & 0 deletions examples/vercel-ai/instrumentation.ts
@@ -0,0 +1,10 @@
import { registerOTel } from '@vercel/otel';
import { LogfireVercelAISpanProcessor } from '@pydantic/logfire-vercel-ai-span-processor';

export function register() {
registerOTel({
serviceName: 'logfire-vercel-ai-app',
autoDetectResources: true,
spanProcessors: [new LogfireVercelAISpanProcessor()],
});
}
21 changes: 21 additions & 0 deletions examples/vercel-ai/package.json
@@ -0,0 +1,21 @@
{
"name": "@pydantic/logfire-vercel-ai-example",
"version": "1.0.0",
"main": "index.js",
"scripts": {
"start": "node --experimental-strip-types --disable-warning=ExperimentalWarning --import ./instrumentation.ts app.ts"
},
"author": "",
"license": "ISC",
"description": "",
"dependencies": {
"@ai-sdk/openai": "^1.3.22",
"@pydantic/logfire-vercel-ai-span-processor": "*",
"@vercel/otel": "^1.10.3"
},
"devDependencies": {
"@types/node": "^22",
"@types/react": "^19",
"typescript": "^5"
}
}
6 changes: 6 additions & 0 deletions examples/vercel-ai/tsconfig.json
@@ -0,0 +1,6 @@
{
"extends": "@tsconfig/node20/tsconfig.json",
"compilerOptions": {
"moduleDetection": "force"
}
}