Where to get tracer output? #58

Open · sonicviz opened this issue Aug 2, 2024 · 1 comment

Comments


sonicviz commented Aug 2, 2024

  • I'm submitting a ...
    [ ] bug report
    [ ] feature request
    [ ] question about the decisions made in the repository
    [x] question about how to use this project

  • Summary

Hi.

In your OpenTelemetry example code you show some setup and then the resulting output, but you don't show where the trace below is being acquired from. How do you access it?

I tried some example code with cot.getTraces(), but I only get the actual answer trace, not the statistical data from the ai call shown below. I also don't see a getTraces method on the ai object or on ai.tracer.

{
  "traceId": "ddc7405e9848c8c884e53b823e120845",
  "name": "Chat Request",
  "id": "d376daad21da7a3c",
  "kind": "SERVER",
  "timestamp": 1716622997025000,
  "duration": 14190456.542,
  "attributes": {
    "gen_ai.system": "Ollama",
    "gen_ai.request.model": "nous-hermes2",
    "gen_ai.request.max_tokens": 500,
    "gen_ai.request.temperature": 0.1,
    "gen_ai.request.top_p": 0.9,
    "gen_ai.request.frequency_penalty": 0.5,
    "gen_ai.request.llm_is_streaming": false,
    "http.request.method": "POST",
    "url.full": "http://localhost:11434/v1/chat/completions",
    "gen_ai.usage.completion_tokens": 160,
    "gen_ai.usage.prompt_tokens": 290
  }
}

  • Other information (e.g. detailed explanation, stack traces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
dosco (Collaborator) commented Aug 4, 2024

In the example code in the README, a tracing provider creates a tracer, which is then passed in via options: { tracer }. That tracer handles all the span collection and export, so the output comes from the exporter attached to the provider, not from a method on the ai object. You can get tracers for Google Cloud, AWS, and other cloud or LLM observability products.

import { AxAI, AxChainOfThought } from '@ax-llm/ax'; // package name assumed; adjust to your install
import { trace } from '@opentelemetry/api';
import {
  BasicTracerProvider,
  ConsoleSpanExporter,
  SimpleSpanProcessor
} from '@opentelemetry/sdk-trace-base';

// Every finished span is exported to the console; this is where the
// trace JSON above comes from.
const provider = new BasicTracerProvider();
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));
trace.setGlobalTracerProvider(provider);

const tracer = trace.getTracer('test');

// Passing the tracer makes Ax create a span for each LLM call.
const ai = new AxAI({
  name: 'ollama',
  config: { model: 'nous-hermes2' },
  options: { tracer }
});

const gen = new AxChainOfThought(
  ai,
  `text -> shortSummary "summarize in 5 to 10 words"`
);

const text = 'Some text to summarize...';
const res = await gen.forward({ text });
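
If you want to read the spans programmatically rather than printing them, a minimal sketch using OpenTelemetry's InMemorySpanExporter (mainly intended for testing, but it works here) would look like this; the loop at the end is where you'd pick up the gen_ai.* attributes shown in your question:

import { trace } from '@opentelemetry/api';
import {
  BasicTracerProvider,
  InMemorySpanExporter,
  SimpleSpanProcessor
} from '@opentelemetry/sdk-trace-base';

// Collect finished spans in memory instead of writing them to the console.
const exporter = new InMemorySpanExporter();
const provider = new BasicTracerProvider();
provider.addSpanProcessor(new SimpleSpanProcessor(exporter));
trace.setGlobalTracerProvider(provider);

const tracer = trace.getTracer('test');
// ... construct AxAI with options: { tracer } and run gen.forward(...) as above ...

// Each finished span carries the gen_ai.* attributes from the trace JSON above.
for (const span of exporter.getFinishedSpans()) {
  console.log(span.name, span.attributes);
}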
