2.14.0
Deprecation Notes
- Tracing
  - Deprecates the `DD_TRACE_SPAN_AGGREGATOR_RLOCK` environment variable. It will be removed in v3.0.0.
  - Deprecates support for APM Legacy App Analytics. This feature and its associated configuration options are deprecated and will be removed in v3.0.0.
  - `DD_HTTP_CLIENT_TAG_QUERY_STRING` configuration is deprecated and will be removed in v3.0.0. Use `DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING` instead.
New Features
- DSM
  - Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
  - Introduces new tracing and datastreams monitoring functionality for Google Protobuf.
- LLM Observability
  - Adds support for automatically submitting Gemini Python SDK calls to LLM Observability.
  - The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
  - The LangChain integration now submits tool spans to LLM Observability.
  - LLM Observability spans generated by the OpenAI integration now have updated span name and `model_provider` values. Span names are now prefixed with the OpenAI client name (possible values: `OpenAI`/`AzureOpenAI`) instead of the default `openai` prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The `model_provider` field also now corresponds to `openai` or `azure_openai` based on the OpenAI client.
  - The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To capture accurate token data in the traced streamed operation, set the `stream_options={"include_usage": True}` option on the completion or chat completion call.
  - Introduces the `LLMObs.annotation_context()` context manager method, which allows modifying the tags of integration-generated LLM Observability spans created while the context manager is active.
  - Introduces prompt template annotation, which can be passed as an argument to `LLMObs.annotate(prompt={...})` for LLM span kinds. For more information on prompt annotations, see the docs.
  - google_generativeai: Introduces tracing support for Google Gemini API `generate_content` calls. See the docs for more information.
  - openai: The OpenAI integration now includes a new `openai.request.client` tag with the possible values `OpenAI`/`AzureOpenAI` to help differentiate whether the request was made to Azure OpenAI or OpenAI.
  - openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To capture accurate token data in the traced streamed operation, set the `stream_options={"include_usage": True}` option on the completion or chat completion call, as in the sketch below.
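  A minimal sketch of a streamed chat completion with usage reporting enabled, assuming the `openai` v1.x Python client running in an application traced by ddtrace (the model name and prompt are illustrative):

  ```python
  from openai import OpenAI  # instrumented automatically when ddtrace tracing is active

  client = OpenAI()

  # Request usage chunks in the stream so the integration can record
  # accurate token counts for the traced streamed operation.
  stream = client.chat.completions.create(
      model="gpt-4o-mini",  # illustrative model name
      messages=[{"role": "user", "content": "Say hello"}],
      stream=True,
      stream_options={"include_usage": True},
  )

  for chunk in stream:
      pass  # consume the stream as usual; the final chunk carries the usage data
  ```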
- Profiling
  - Captures `asyncio.Lock` usages with `with` context managers.
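  A small illustration of the pattern the profiler now attributes, using `asyncio.Lock` through its `async with` context manager (plain asyncio, nothing ddtrace-specific):

  ```python
  import asyncio

  lock = asyncio.Lock()

  async def worker():
      # Acquisitions made through the context manager protocol are now
      # captured and attributed in lock profiles.
      async with lock:
          await asyncio.sleep(0.1)

  asyncio.run(worker())
  ```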
- Other
  - botocore: Adds span pointers to some successful AWS botocore spans. Currently only supports S3 PutObject.
  - pymongo: Adds support for pymongo>=4.9.0.
Bug Fixes
- Code Security (ASM)
  - Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
  - Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
- LLM Observability
  - Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled when agentless mode is enabled via `LLMObs.enable(agentless_enabled=True)` or by setting `DD_LLMOBS_AGENTLESS_ENABLED=1` (see the sketch after this list).
  - Resolves an issue in the `LLMObs.annotate()` method where non-JSON-serializable arguments were discarded entirely. The `LLMObs.annotate()` method now safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
  - Resolves an issue where attempting to tag non-JSON-serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
  - anthropic: Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags non-JSON-serializable arguments with a default placeholder text.
  - langchain: Resolves an issue where attempting to tag non-JSON-serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags non-JSON-serializable arguments with a default placeholder text.
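  A minimal sketch of opting into agentless mode programmatically, using the call named above; depending on the setup, additional arguments (for example an API key or app name) may also be needed:

  ```python
  from ddtrace.llmobs import LLMObs

  # Equivalent to exporting DD_LLMOBS_AGENTLESS_ENABLED=1 before startup.
  # With agentless mode enabled, the OpenAI and LangChain integrations no
  # longer submit their integration metrics.
  LLMObs.enable(agentless_enabled=True)
  ```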
- Other
  - SSI: Ensures the injection denylist is included in the published OCI package.
  - postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
  - pymongo: Ensures instances of `pymongo.MongoClient` can be patched after pymongo is imported.
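  A small sketch of the ordering this fix supports, patching manually with `ddtrace.patch` after `pymongo` has already been imported (the connection string is illustrative):

  ```python
  import pymongo  # imported before instrumentation is applied

  from ddtrace import patch

  # Patching after the import now also instruments MongoClient
  # instances created afterwards.
  patch(pymongo=True)

  client = pymongo.MongoClient("mongodb://localhost:27017")
  ```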