Conversation

@dd-octo-sts (bot) commented Jan 21, 2026

Backport 4c65feb from #16037 to 4.3.

In litellm>=1.74.15, router streaming responses are wrapped in `FallbackStreamWrapper` (for mid-stream fallback support), which does not expose the `.handler` attribute the integration expects.

This change adds defensive handling that checks for the handler attribute before accessing it. When the handler is not available, the response is wrapped in our own `TracedStream` so that spans are still properly finished.

Also reported here: BerriAI/litellm#13725
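
A minimal sketch of the defensive pattern described above. Only `FallbackStreamWrapper`, `TracedStream`, and the `.handler` attribute come from this PR; the function name, the span object (assumed to expose `.finish()`), and the rest of the plumbing are illustrative assumptions, not the actual patch:

```python
# Hedged sketch, not the real integration code.
class TracedStream:
    """Stand-in wrapper: iterate the underlying stream, then finish the span."""

    def __init__(self, stream, span):
        self._stream = stream
        self._span = span

    def __iter__(self):
        try:
            yield from self._stream
        finally:
            # Guarantee the span is closed even when litellm's
            # FallbackStreamWrapper hides the handler from us.
            self._span.finish()


def wrap_router_stream(span, response):
    # Defensive check: older litellm exposes `.handler` on the stream,
    # and the existing instrumentation path (elided here) finishes the
    # span through it.
    if getattr(response, "handler", None) is not None:
        return response
    # litellm>=1.74.15: no handler attribute, so wrap the stream ourselves.
    return TracedStream(response, span)
```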

emmettbutler and others added 19 commits January 6, 2026 07:37
Backport c40e537 from #15883 to 4.2.

Aspect benchmarks are very fast, but with too few samples the results show high variability.

This PR increases the number of values for IAST aspect benchmarks to obtain a more stable measurement.

APPSEC-60435
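
As a rough illustration of why more values help (not part of the change), the spread of a benchmark's measured mean shrinks roughly with the square root of the sample count:

```python
# Illustrative only: simulate noisy timings of a very fast aspect and
# show how the measured mean stabilizes as the sample count grows.
import random
import statistics

random.seed(0)

def mean_of_sample(n: int) -> float:
    # n noisy timings with true mean 1.0 and standard deviation 0.2.
    return statistics.mean(random.gauss(1.0, 0.2) for _ in range(n))

for n in (10, 100, 1000):
    runs = [mean_of_sample(n) for _ in range(200)]
    print(f"n={n:>5}: spread of measured means = {statistics.stdev(runs):.4f}")
```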

Signed-off-by: Alberto Vara <alberto.vara@datadoghq.com>
Co-authored-by: Alberto Vara <alberto.vara@datadoghq.com>
## Description

This is a backport of #15962.
This issue was introduced in 4.1.
… [backport 4.2] (#15967)

Backport 99fe656 from #15961 to 4.2.

## Description


CI run for #15723


Co-authored-by: Alexandre Choura <42672104+PROFeNoM@users.noreply.github.com>
Co-authored-by: Kian Jones <kian@letta.com>
Co-authored-by: Kyle Verhoog <kyle@verhoog.ca>
Co-authored-by: Louis Tricot <75956635+dubloom@users.noreply.github.com>
…ackport 4.2] (#16012)

## Description

Backport of #15989 to 4.2.
…ckport 4.2] (#16036)

Backport a51047c from #16035 to 4.2.

## Description

This PR removes a check added in #11182 that recolored langchain-openai spans as LLM kind when certain kwarg key/value pairs were detected.

That check led to duplicate LLM spans representing the same LLM call, and therefore duplicated LLM span counts, cost/token metrics, and status/error metrics. While we acknowledge that some langchain-openai traces may now be missing LLM spans when the downstream openai integration is disabled, we've decided that avoiding duplicate LLM spans is more important because of the billing implications.
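
Purely as a conceptual illustration of the duplication (plain Python, not ddtrace internals): when both the framework integration and the downstream provider integration emit an LLM span for the same call, per-call span counts and token metrics double.

```python
# Hypothetical span records for a single LLM call.
spans = [
    # Recolored langchain-openai span (the kind this PR stops emitting).
    {"integration": "langchain-openai", "kind": "llm", "tokens": 120},
    # Span from the downstream openai integration for the same call.
    {"integration": "openai", "kind": "llm", "tokens": 120},
]

llm_spans = [s for s in spans if s["kind"] == "llm"]
print(len(llm_spans))                        # 2 LLM spans for 1 call
print(sum(s["tokens"] for s in llm_spans))   # 240 tokens counted for 120 used
```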



Co-authored-by: Yun Kim <35776586+Yun-Kim@users.noreply.github.com>
…2] (#16063)

Backport a20b073 from #16049 to 4.2.

## Description

As title says!

Co-authored-by: Thomas Kowalski <thomas.kowalski@datadoghq.com>
… 4.2] (#16084)

Backport 742811e from #16064 to 4.2.

## Description

We noticed that our memory profiler flamegraphs are upside down. This PR fixes that by removing the `set_reverse_locations(true)` call from the memory profiler traceback. A simple regression test is included that fails without the change and passes with it.
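
As a conceptual illustration of the symptom (plain Python, not the profiler's actual code): flamegraph stacks are rendered root-first, so emitting frame locations leaf-first flips the rendered graph.

```python
# A collapsed-stack line as flamegraph tools expect it: root frame first.
stack_root_first = ["main", "handle_request", "allocate_buffer"]

# Reversed (leaf-first) locations produce the upside-down rendering.
stack_reversed = list(reversed(stack_root_first))

print(";".join(stack_root_first))  # main;handle_request;allocate_buffer
print(";".join(stack_reversed))    # allocate_buffer;handle_request;main
```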


Signed-off-by: Taegyun Kim <taegyun.kim@datadoghq.com>
Co-authored-by: Taegyun Kim <taegyun.kim@datadoghq.com>
@cit-pr-commenter-54b7da

Codeowners resolved as

.riot/requirements/109a45b.txt                                          @DataDog/apm-python
.riot/requirements/10a8045.txt                                          @DataDog/apm-python
.riot/requirements/1229e9a.txt                                          @DataDog/apm-python
.riot/requirements/12b8cfa.txt                                          @DataDog/apm-python
.riot/requirements/1747447.txt                                          @DataDog/apm-python
.riot/requirements/1e893b9.txt                                          @DataDog/apm-python
.riot/requirements/27afe82.txt                                          @DataDog/apm-python
.riot/requirements/4061c90.txt                                          @DataDog/apm-python
.riot/requirements/8d10412.txt                                          @DataDog/apm-python
.riot/requirements/a417cc8.txt                                          @DataDog/apm-python
.riot/requirements/a971ee3.txt                                          @DataDog/apm-python
.riot/requirements/c88628f.txt                                          @DataDog/apm-python
.riot/requirements/d728b27.txt                                          @DataDog/apm-python
.riot/requirements/d8bb960.txt                                          @DataDog/apm-python
.riot/requirements/fc54849.txt                                          @DataDog/apm-python
ddtrace/contrib/integration_registry/registry.yaml                      @DataDog/apm-core-python @DataDog/apm-idm-python
ddtrace/contrib/internal/litellm/patch.py                               @DataDog/ml-observability
releasenotes/notes/llm-obs-litellm-add-router-span-info-10d6e1d5149e8407.yaml  @DataDog/apm-python
riotfile.py                                                             @DataDog/apm-python
tests/contrib/litellm/cassettes/completion_anthropic.yaml               @DataDog/ml-observability
tests/contrib/litellm/cassettes/completion_anthropic_v1_74_15.yaml      @DataDog/ml-observability
tests/contrib/litellm/test_litellm.py                                   @DataDog/ml-observability

…ses (#16037)

In litellm>=1.74.15, router streaming responses are wrapped in `FallbackStreamWrapper` (for mid-stream fallback support), which does not expose the `.handler` attribute the integration expects.

This change adds defensive handling that checks for the handler attribute before accessing it. When the handler is not available, the response is wrapped in our own `TracedStream` so that spans are still properly finished.

Also reported here: BerriAI/litellm#13725

---------

Co-authored-by: Yun Kim <yun.kim@datadoghq.com>
(cherry picked from commit 4c65feb)
@Yun-Kim force-pushed the backport-16037-to-4.3 branch from d2e8b9a to 6bfcf15 on January 21, 2026 at 21:34
@emmettbutler deleted the branch 4.3 on January 26, 2026 at 18:20
@Yun-Kim reopened this Jan 27, 2026
@Yun-Kim requested review from a team as code owners on January 27, 2026 at 19:54
@Yun-Kim closed this Jan 27, 2026