fix(llmobs): fix token extraction for chat completion streams [backport 2.20] (#12091)

Backport 75179ef from #12070 to 2.20.

Fixes token chunk extraction to account for the `choices` field in a chunk being an empty list.

#### Before

```
Error generating LLMObs span event for span <Span(id=16151817411339450163,trace_id=137677390470467884790869841527646927357,parent_id=None,name=openai.request)>, likely due to malformed span
Traceback (most recent call last):
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 118, in __aiter__
    await self._extract_token_chunk(chunk)
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 157, in _extract_token_chunk
    choice = getattr(chunk, "choices", [None])[0]
IndexError: list index out of range
```

#### After

Traced successfully

<img width="904" alt="image" src="https://github.com/user-attachments/assets/43c68edd-03f7-4105-a3d3-213eeb5fb0ab" />

Co-authored-by: lievan <[email protected]>
Co-authored-by: kyle <[email protected]>
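The traceback shows the failure mode: `getattr(chunk, "choices", [None])[0]` only protects against a *missing* `choices` attribute, not against `choices` being present but empty (as in the final usage-only chunk of a stream). A minimal standalone sketch of the kind of guard the fix describes, with a hypothetical helper name (the actual change lives in `ddtrace/contrib/internal/openai/utils.py`):

```python
from types import SimpleNamespace


def extract_first_choice(chunk):
    """Return the first choice from a stream chunk, or None if the
    `choices` field is missing or an empty list.

    Hypothetical illustration of the guard; not the real ddtrace code.
    """
    choices = getattr(chunk, "choices", None)
    if not choices:  # handles both a missing attribute and an empty list
        return None
    return choices[0]


# A usage-only chunk at the end of a stream carries an empty `choices` list;
# indexing it with [0] would raise IndexError, the guard returns None instead.
usage_only_chunk = SimpleNamespace(choices=[], usage={"total_tokens": 42})
print(extract_first_choice(usage_only_chunk))  # None, no IndexError
```

The key difference from the buggy line is that the default in `getattr` never reached the empty-list case: `[None][0]` is only evaluated when the attribute is absent, so an existing empty list still hit `[][0]`.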