
[Bug]: Traces of agent.astream_chat() on Arize Phoenix #17571

Open
MohammedShokr opened this issue Jan 21, 2025 · 1 comment
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@MohammedShokr

Bug Description

I have an OpenAI agent implemented with LlamaIndex, and I am using Arize Phoenix for observability.
Traces work fine with agent.chat(), but they do not show up as expected with agent.astream_chat(). Here's an image of a faulty trace for reference.

P.S. I already raised an issue on the Arize Phoenix side first, but they said they consume the spans that LlamaIndex produces as-is.

(Image: screenshot of the faulty trace)

Version

llama-index==0.12.7

Steps to Reproduce

  • Create an OpenAI agent with LlamaIndex.
  • Install Arize Phoenix with versions: arize-phoenix==5.7.0, arize-phoenix-otel==0.6.1
  • Set up observability with:

from phoenix.otel import register
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Register a Phoenix OTel tracer provider for this project
tracer_provider = register(
    project_name="my-app",
)
# Instrument LlamaIndex so its spans are exported to Phoenix
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

  • Invoke the agent with await agent.astream_chat("") (note that astream_chat is a coroutine and must be awaited)
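For reference, the consumption pattern for the streaming call looks like the sketch below. This is a minimal runnable illustration using stand-in classes (FakeAgent and FakeStreamingResponse are hypothetical, not LlamaIndex code); the async_response_gen() name mirrors the generator method on LlamaIndex's StreamingAgentChatResponse, and the key point is that the call is awaited first and the stream drained afterwards:

```python
import asyncio

class FakeStreamingResponse:
    """Stand-in for LlamaIndex's StreamingAgentChatResponse."""
    def __init__(self, tokens):
        self._tokens = tokens

    async def async_response_gen(self):
        # Yield tokens one at a time, as a streaming LLM response would.
        for t in self._tokens:
            await asyncio.sleep(0)  # simulate asynchronous token arrival
            yield t

class FakeAgent:
    """Stand-in for an OpenAI agent built with LlamaIndex."""
    async def astream_chat(self, message):
        return FakeStreamingResponse(["Hi", " there"])

async def main():
    agent = FakeAgent()
    response = await agent.astream_chat("hello")  # note the await
    text = ""
    async for token in response.async_response_gen():
        text += token
    return text

print(asyncio.run(main()))
```

With the real agent, the spans that instrumentation emits depend on when this generator is drained, which is why the streaming variant can trace differently from agent.chat().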

Relevant Logs/Tracebacks

@MohammedShokr MohammedShokr added bug Something isn't working triage Issue needs to be triaged/prioritized labels Jan 21, 2025

dosubot bot commented Jan 21, 2025

The issue you're experiencing with agent.astream_chat() traces not showing as expected in Arize Phoenix might be related to how spans are generated and handled in the LlamaIndex library. Here are a few points to consider:

  1. Asynchronous Span Generation: The agent.astream_chat() method is asynchronous, which can sometimes lead to differences in how spans are generated and reported compared to synchronous methods like agent.chat(). This might affect how the traces are displayed in observability tools like Arize Phoenix [1].

  2. StreamingAgentChatResponse.await_ Delay: The delay you observed in StreamingAgentChatResponse.await_ taking 3.06 seconds could be contributing to the issue. This method is responsible for handling asynchronous chat responses and writing them to history, which might introduce latency if there are inefficiencies or bottlenecks in the process [2].

  3. Integration with Arize Phoenix: Since Arize Phoenix consumes spans as they are produced by LlamaIndex, any issues with span generation or reporting in LlamaIndex will directly affect what you see in Arize Phoenix. Ensure that the integration setup with phoenix.otel and LlamaIndexInstrumentor is correctly configured, as any misconfiguration might lead to incomplete or missing traces.
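Point 1 can be made concrete with a small sketch. This is not LlamaIndex or OpenTelemetry code; the "span" is simulated with a plain event list to show why naive instrumentation around an async streaming call can close a span before the tokens have actually arrived:

```python
import asyncio

events = []  # stands in for span start/end records

async def astream_chat():
    """Returns quickly with an async generator; the real work
    happens later, as the generator is consumed."""
    async def gen():
        for token in ["Hel", "lo"]:
            await asyncio.sleep(0)  # simulate async token arrival
            yield token
    return gen()

async def traced_call():
    events.append("span.start")
    stream = await astream_chat()        # returns almost immediately
    events.append("span.end")            # naive wrapper ends the span here
    result = [t async for t in stream]   # tokens arrive AFTER the span closed
    events.append("stream.done")
    return result

tokens = asyncio.run(traced_call())
print(events)  # span ends before the stream is consumed
print(tokens)
```

If LlamaIndex (or the instrumentor) ends a span when astream_chat() returns rather than when the stream is exhausted, the trace Phoenix displays will be truncated in exactly this way.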

To address the issue, you might want to:

  • Investigate the StreamingAgentChatResponse.await_ method to identify any potential bottlenecks or inefficiencies that could be causing delays.
  • Verify that the integration with Arize Phoenix is correctly set up and that all necessary configurations are in place.
  • Consider reaching out to the LlamaIndex maintainers for further insights into how spans are generated for asynchronous operations and any known issues with agent.astream_chat().

These steps might help in diagnosing and resolving the trace visibility issue with agent.astream_chat().

