fix(llmobs): fix token extraction for chat completion streams [backport 2.20] (#12091)

Backport 75179ef from #12070 to 2.20.

Fixes token chunk extraction to account for the `choices` field in a
chunk being an empty list

#### Before
```
Error generating LLMObs span event for span <Span(id=16151817411339450163,trace_id=137677390470467884790869841527646927357,parent_id=None,name=openai.request)>, likely due to malformed span
Traceback (most recent call last):
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 118, in __aiter__
    await self._extract_token_chunk(chunk)
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 157, in _extract_token_chunk
    choice = getattr(chunk, "choices", [None])[0]
IndexError: list index out of range
```

#### After
Traced successfully
<img width="904" alt="image"
src="https://github.com/user-attachments/assets/43c68edd-03f7-4105-a3d3-213eeb5fb0ab"
/>
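The guard added by this commit can be sketched outside the diff as a standalone snippet. This is a minimal illustration, not the ddtrace code: `extract_choice` is a hypothetical helper, and `SimpleNamespace` stands in for OpenAI chunk objects. An empty `choices` list typically comes from the final usage-only chunk emitted when `stream_options={"include_usage": True}` is set on a streamed chat completion.

```python
from types import SimpleNamespace

def extract_choice(chunk):
    # Guard against chunks whose `choices` list is empty (e.g. the
    # final usage-only chunk of a stream). The old code indexed
    # getattr(chunk, "choices", [None])[0] directly, which raises
    # IndexError when choices == [].
    choices = getattr(chunk, "choices", None)
    if not choices:
        return None
    return choices[0]

# A usage-only chunk: no choices, only token counts.
usage_only_chunk = SimpleNamespace(choices=[], usage=SimpleNamespace(total_tokens=42))
print(extract_choice(usage_only_chunk))  # prints None instead of raising IndexError
```

Note that `if not choices` also covers the case where the attribute is missing or `None`, so a single check replaces both the old default-value trick and the unsafe `[0]` index.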


Co-authored-by: lievan <[email protected]>
Co-authored-by: kyle <[email protected]>
3 people authored Jan 27, 2025
1 parent 3ff00cc commit 07baf4c
Showing 2 changed files with 12 additions and 2 deletions.
10 changes: 8 additions & 2 deletions ddtrace/contrib/internal/openai/utils.py
```diff
@@ -89,7 +89,10 @@ def _extract_token_chunk(self, chunk):
         """Attempt to extract the token chunk (last chunk in the stream) from the streamed response."""
         if not self._dd_span._get_ctx_item("_dd.auto_extract_token_chunk"):
             return
-        choice = getattr(chunk, "choices", [None])[0]
+        choices = getattr(chunk, "choices")
+        if not choices:
+            return
+        choice = choices[0]
         if not getattr(choice, "finish_reason", None):
             # Only the second-last chunk in the stream with token usage enabled will have finish_reason set
             return
```
```diff
@@ -152,7 +155,10 @@ async def _extract_token_chunk(self, chunk):
         """Attempt to extract the token chunk (last chunk in the stream) from the streamed response."""
         if not self._dd_span._get_ctx_item("_dd.auto_extract_token_chunk"):
            return
-        choice = getattr(chunk, "choices", [None])[0]
+        choices = getattr(chunk, "choices")
+        if not choices:
+            return
+        choice = choices[0]
         if not getattr(choice, "finish_reason", None):
             return
         try:
```
4 changes: 4 additions & 0 deletions releasenotes/notes/fix-token-extraction-0133808742374ef4.yaml
```diff
@@ -0,0 +1,4 @@
+---
+fixes:
+  - |
+    LLM Observability: This fix resolves an issue where extracting token metadata from openai streamed chat completion token chunks caused an IndexError.
```
