Many LLM platforms support the OpenAI SDK. This means systems such as the following:
* - Name
- gen_ai.system
* - `Azure OpenAI <https://github.com/openai/openai-python?tab=readme-ov-file#microsoft-azure-openai>`_
- ``azure.ai.openai``
* - `Gemini <https://developers.googleblog.com/en/gemini-is-now-accessible-from-the-openai-library/>`_
- ``gemini``
* - `Perplexity <https://docs.perplexity.ai/api-reference/chat-completions>`_
Enabling message content
************************

Message content such as the contents of the prompt, completion, function arguments and return values
is not captured by default. To capture message content, set the environment variable
``OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT`` to one of the following values:

- ``true`` - Legacy. Used to enable content capturing on ``gen_ai.{role}.message`` and ``gen_ai.choice`` events when
`latest experimental features <#enabling-the-latest-experimental-features>`_ are *not* enabled.
- ``span`` - Used to enable content capturing on *span* attributes when
`latest experimental features <#enabling-the-latest-experimental-features>`_ are enabled.
- ``event`` - Used to enable content capturing on *event* attributes when
`latest experimental features <#enabling-the-latest-experimental-features>`_ are enabled.

Enabling the latest experimental features
***********************************************

To enable the latest experimental features, set the environment variable
``OTEL_SEMCONV_STABILITY_OPT_IN`` to ``gen_ai_latest_experimental``. Or, if you use
``OTEL_SEMCONV_STABILITY_OPT_IN`` to enable other features, append ``,gen_ai_latest_experimental`` to its value.

Without this setting, OpenAI instrumentation aligns with `Semantic Conventions v1.28.0 <https://github.com/open-telemetry/semantic-conventions/tree/v1.28.0/docs/gen-ai>`_
and does not capture additional details introduced in later versions.

.. note:: Generative AI semantic conventions are still evolving. The latest experimental features will introduce breaking changes in future releases.
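The opt-in check described above amounts to looking for ``gen_ai_latest_experimental`` in a comma-separated list. A minimal sketch using only the standard library (an illustrative helper, not the instrumentation's actual code):

```python
import os


def latest_experimental_opted_in() -> bool:
    """Illustrative check: True when gen_ai_latest_experimental appears in
    the comma-separated OTEL_SEMCONV_STABILITY_OPT_IN value."""
    opt_in = os.environ.get("OTEL_SEMCONV_STABILITY_OPT_IN", "")
    return "gen_ai_latest_experimental" in [
        value.strip() for value in opt_in.split(",")
    ]
```

Because the variable is a list, appending the token (rather than overwriting the variable) preserves any other opt-ins already set.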

Uninstrument
************
OPENAI_API_KEY=sk-YOUR_API_KEY

OTEL_SERVICE_NAME=opentelemetry-python-openai

# Remove or change to 'none' to hide prompt and completion content
# Possible values (case insensitive):
# - `span` - record content on span attributes
# - `event` - record content on event attributes
# - `true` - only used for backward compatibility when
#   `gen_ai_latest_experimental` is not set in the
#   `OTEL_SEMCONV_STABILITY_OPT_IN` environment variable.
# - everything else - don't record content on any signal
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=span

# Enables the latest experimental features available in the GenAI semantic conventions.
# Note: these conventions are still in development, so enabling this flag is
# likely to bring breaking changes in future releases.
#
# Comment out to stay on semantic conventions version 1.36.0.
OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental
your OpenAI requests.

Note: the `.env <.env>`_ file configures additional environment variables:

- ``OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=span`` configures OpenAI instrumentation to capture prompt and completion contents on *span* attributes.
- ``OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental`` enables latest experimental features.
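If editing the `.env <.env>`_ file is inconvenient, the same variables can also be set from Python, provided this runs before the SDK and instrumentation are initialized; a minimal sketch:

```python
import os

# Programmatic equivalent of the .env settings above (illustrative):
# capture prompt/completion content on span attributes, and opt in to the
# latest experimental GenAI semantic conventions.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "span"
os.environ["OTEL_SEMCONV_STABILITY_OPT_IN"] = "gen_ai_latest_experimental"
```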

Setup
-----
OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
# Uncomment if your OTLP endpoint doesn't support logs
# OTEL_LOGS_EXPORTER=console

# Remove or change to 'none' to hide prompt and completion content
# Possible values (case insensitive):
# - `span` - record content on span attributes
# - `event` - record content on event attributes
# - `true` - only used for backward compatibility when
#   `gen_ai_latest_experimental` is not set in the
#   `OTEL_SEMCONV_STABILITY_OPT_IN` environment variable.
# - everything else - don't record content on any signal
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=span

# Enables the latest experimental features available in the GenAI semantic conventions.
# Note: these conventions are still in development, so enabling this flag is
# likely to bring breaking changes in future releases.
#
# Comment out to stay on semantic conventions version 1.36.0.
OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental
your OpenAI requests.
Note: the `.env <.env>`_ file configures additional environment variables:

- ``OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true`` configures OpenTelemetry SDK to export logs and events.
- ``OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=span`` configures OpenAI instrumentation to capture prompt and completion contents on *span* attributes.
- ``OTEL_LOGS_EXPORTER=otlp`` specifies the exporter type.
- ``OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental`` enables latest experimental features.

Setup
-----
from opentelemetry._events import get_event_logger
from opentelemetry.instrumentation.instrumentor import BaseInstrumentor
from opentelemetry.instrumentation.openai_v2.package import _instruments
from opentelemetry.instrumentation.openai_v2.utils import (
    get_content_mode,
    is_latest_experimental_enabled,
)
from opentelemetry.instrumentation.utils import unwrap
from opentelemetry.metrics import get_meter
from opentelemetry.semconv.schemas import Schemas

    def _instrument(self, **kwargs):
        tracer_provider = kwargs.get("tracer_provider")
        tracer = get_tracer(
            __name__,
            "",
            tracer_provider,
            schema_url="https://opentelemetry.io/schemas/1.37.0",  # TODO: Schemas.V1_37_0.value
        )
        event_logger_provider = kwargs.get("event_logger_provider")
        event_logger = get_event_logger(
            __name__,
            "",
            schema_url="https://opentelemetry.io/schemas/1.37.0",  # TODO: Schemas.V1_37_0.value
            event_logger_provider=event_logger_provider,
        )
        meter_provider = kwargs.get("meter_provider")
        self._meter = get_meter(
            __name__,
            "",
            meter_provider,
            schema_url="https://opentelemetry.io/schemas/1.37.0",  # TODO: Schemas.V1_37_0.value
        )

        instruments = Instruments(self._meter)

        latest_experimental_enabled = is_latest_experimental_enabled()
        wrap_function_wrapper(
            module="openai.resources.chat.completions",
            name="Completions.create",
            wrapper=chat_completions_create(
                tracer,
                event_logger,
                instruments,
                get_content_mode(latest_experimental_enabled),
                latest_experimental_enabled,
            ),
        )

        wrap_function_wrapper(
            module="openai.resources.chat.completions",
            name="AsyncCompletions.create",
            wrapper=async_chat_completions_create(
                tracer,
                event_logger,
                instruments,
                get_content_mode(latest_experimental_enabled),
                latest_experimental_enabled,
            ),
        )
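The imported helpers are not shown in this diff. A rough sketch of how ``get_content_mode`` might behave, assuming it simply maps the environment variable values documented above; names and the enum are hypothetical, and the real ``utils`` module may differ:

```python
import os
from enum import Enum


class ContentMode(Enum):
    # Hypothetical enum for illustration only.
    NONE = "none"
    SPAN = "span"
    EVENT = "event"
    LEGACY_EVENT = "true"


def get_content_mode(latest_experimental_enabled: bool) -> ContentMode:
    """Sketch of resolving OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT
    per the documented semantics (not the library's actual code)."""
    value = (
        os.environ.get("OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", "")
        .strip()
        .lower()
    )
    if latest_experimental_enabled:
        # New semantics: only "span" and "event" enable capture.
        if value == "span":
            return ContentMode.SPAN
        if value == "event":
            return ContentMode.EVENT
        return ContentMode.NONE
    # Legacy semantics: only "true" enables capture, on events.
    return ContentMode.LEGACY_EVENT if value == "true" else ContentMode.NONE
```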
