Bug Description
I have an OpenAI agent implemented with LlamaIndex, and I am using Arize Phoenix for observability.
Traces work fine with agent.chat(), but they do not show up as expected with agent.astream_chat(). Here's an image of a faulty trace for reference.
P.S. I already raised an issue on the Arize Phoenix side first, but they said that they consume the spans LlamaIndex produces as they are.
Version
llama-index==0.12.7
Steps to Reproduce
Create an OpenAI agent with LlamaIndex.
Install Arize Phoenix with versions: arize-phoenix==5.7.0, arize-phoenix-otel==0.6.1.
Set up observability with:
from phoenix.otel import register
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
tracer_provider = register(
    project_name="my-app",
)
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)
Invoke the agent with agent.astream_chat("").
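Since astream_chat() is a coroutine, the last step has to be awaited and the returned token generator fully consumed. A minimal sketch of the invocation pattern follows; StubAgent is a hypothetical stand-in for the real OpenAIAgent (so the snippet runs without an API key), and async_response_gen() mirrors the method on LlamaIndex's StreamingAgentChatResponse:

```python
import asyncio

# StubAgent is a stand-in for a real OpenAIAgent (assumption: the real
# agent needs an API key, so we fake the streaming interface here).
class StubAgent:
    async def astream_chat(self, message: str):
        async def tokens():
            for t in ["Hello", " ", "world"]:
                yield t

        class Response:
            # Mirrors StreamingAgentChatResponse.async_response_gen()
            def async_response_gen(self):
                return tokens()

        return Response()

async def main():
    agent = StubAgent()  # replace with your real OpenAIAgent
    response = await agent.astream_chat("Hi")
    chunks = []
    # The trace only completes once the generator is exhausted,
    # so consume the stream fully.
    async for token in response.async_response_gen():
        chunks.append(token)
    return "".join(chunks)

result = asyncio.run(main())
print(result)  # Hello world
```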
Relevant Logs/Tracebacks
The issue you're experiencing with agent.astream_chat() traces not showing as expected in Arize Phoenix might be related to how spans are generated and handled in the LlamaIndex library. Here are a few points to consider:
Asynchronous Span Generation: The agent.astream_chat() method is asynchronous, which can sometimes lead to differences in how spans are generated and reported compared to synchronous methods like agent.chat(). This might affect how the traces are displayed in observability tools like Arize Phoenix [1].
StreamingAgentChatResponse.await_ Delay: The 3.06-second delay you observed in StreamingAgentChatResponse.await_ could be contributing to the issue. This method is responsible for handling asynchronous chat responses and writing them to history, which might introduce latency if there are inefficiencies or bottlenecks in the process [2].
Integration with Arize Phoenix: Since Arize Phoenix consumes spans as they are produced by LlamaIndex, any issues with span generation or reporting in LlamaIndex will directly affect what you see in Arize Phoenix. Ensure that the integration setup with phoenix.otel and LlamaIndexInstrumentor is correctly configured, as any misconfiguration might lead to incomplete or missing traces.
To address the issue, you might want to:
Investigate the StreamingAgentChatResponse.await_ method to identify any potential bottlenecks or inefficiencies that could be causing delays.
Verify that the integration with Arize Phoenix is correctly set up and that all necessary configurations are in place.
Consider reaching out to the LlamaIndex maintainers for further insights into how spans are generated for asynchronous operations and any known issues with agent.astream_chat().
These steps might help in diagnosing and resolving the trace visibility issue with agent.astream_chat().
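To illustrate the asynchronous span-generation point above, here is a minimal, library-free sketch (all names hypothetical, not LlamaIndex or Phoenix APIs): a span that ends when the streaming call returns closes before any token is produced, because the generator is only consumed afterwards — which is one way an async streaming trace can look truncated.

```python
import asyncio

events = []

async def fake_astream_chat():
    """Stand-in for an async streaming call: returns a token generator
    immediately; the tokens are produced only when it is iterated."""
    async def token_gen():
        for token in ["Hello", ", ", "world"]:
            events.append(f"token:{token}")
            yield token
    return token_gen()

async def main():
    events.append("span_start")
    gen = await fake_astream_chat()  # returns as soon as the generator exists
    events.append("span_end")        # a naive span ends here...
    async for _ in gen:              # ...but the tokens stream afterwards
        pass

asyncio.run(main())
print(events)
# ['span_start', 'span_end', 'token:Hello', 'token:, ', 'token:world']
```

Note that "span_end" is recorded before any token event: if the instrumentation closes its span when the coroutine returns rather than when the stream is exhausted, the streamed work falls outside the trace.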