packages/openai-compatible/src/completion/openai-compatible-completion-language-model.ts

@@ -38,6 +38,7 @@ type OpenAICompatibleCompletionConfig = {
   url: (options: { modelId: string; path: string }) => string;
   fetch?: FetchFunction;
   errorStructure?: ProviderErrorStructure<any>;
+  supportsStructuredOutputs?: boolean;
 
   /**
    * The supported URLs for the model.
7 changes: 7 additions & 0 deletions packages/openai-compatible/src/openai-compatible-provider.ts
@@ -68,6 +68,11 @@ or to provide a custom fetch implementation for e.g. testing.
 Include usage information in streaming responses.
    */
   includeUsage?: boolean;
+
+  /**
+Whether the provider supports structured outputs (JSON schema).
+   */
+  supportsStructuredOutputs?: boolean;
 }
 
 /**
@@ -121,12 +126,14 @@ export function createOpenAICompatible<
     new OpenAICompatibleChatLanguageModel(modelId, {
       ...getCommonModelConfig('chat'),
       includeUsage: options.includeUsage,
+      supportsStructuredOutputs: options.supportsStructuredOutputs,

Contributor review comment:

The supportsStructuredOutputs option is not passed to completion models, creating inconsistent behavior where only chat models will respect this setting.

📝 Patch Details
diff --git a/packages/openai-compatible/src/completion/openai-compatible-completion-language-model.ts b/packages/openai-compatible/src/completion/openai-compatible-completion-language-model.ts
index cbc7a84d4..57e4b347a 100644
--- a/packages/openai-compatible/src/completion/openai-compatible-completion-language-model.ts
+++ b/packages/openai-compatible/src/completion/openai-compatible-completion-language-model.ts
@@ -39,6 +39,11 @@ type OpenAICompatibleCompletionConfig = {
   fetch?: FetchFunction;
   errorStructure?: ProviderErrorStructure<any>;
 
+  /**
+   * Whether the model supports structured outputs.
+   */
+  supportsStructuredOutputs?: boolean;
+
   /**
    * The supported URLs for the model.
    */
@@ -50,6 +55,8 @@ export class OpenAICompatibleCompletionLanguageModel
 {
   readonly specificationVersion = 'v2';
 
+  readonly supportsStructuredOutputs: boolean;
+
   readonly modelId: OpenAICompatibleCompletionModelId;
   private readonly config: OpenAICompatibleCompletionConfig;
   private readonly failedResponseHandler: ResponseHandler<APICallError>;
@@ -69,6 +76,8 @@ export class OpenAICompatibleCompletionLanguageModel
       errorStructure.errorSchema,
     );
     this.failedResponseHandler = createJsonErrorResponseHandler(errorStructure);
+
+    this.supportsStructuredOutputs = config.supportsStructuredOutputs ?? false;
   }
 
   get provider(): string {
diff --git a/packages/openai-compatible/src/openai-compatible-provider.ts b/packages/openai-compatible/src/openai-compatible-provider.ts
index c8cc84298..f373adb65 100644
--- a/packages/openai-compatible/src/openai-compatible-provider.ts
+++ b/packages/openai-compatible/src/openai-compatible-provider.ts
@@ -133,6 +133,7 @@ export function createOpenAICompatible<
     new OpenAICompatibleCompletionLanguageModel(modelId, {
       ...getCommonModelConfig('completion'),
       includeUsage: options.includeUsage,
+      supportsStructuredOutputs: options.supportsStructuredOutputs,
     });
 
   const createEmbeddingModel = (modelId: EMBEDDING_MODEL_IDS) =>

OpenAI Compatible Provider Structured Outputs Inconsistency

Analysis

The OpenAI Compatible provider exhibited an inconsistency where the supportsStructuredOutputs configuration option was only passed to chat models but not to completion models. This created an asymmetric API where provider-level configuration didn't apply uniformly to all model types created by the same provider instance.

Bug Manifestation

When creating an OpenAI Compatible provider with supportsStructuredOutputs: true:

import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

const provider = createOpenAICompatible({
  baseURL: 'https://api.example.com',
  name: 'my-provider',
  supportsStructuredOutputs: true,
});

const chatModel = provider.chatModel('gpt-4');
const completionModel = provider.completionModel('text-davinci-003');

console.log(chatModel.supportsStructuredOutputs);      // true
console.log(completionModel.supportsStructuredOutputs); // undefined (property missing)

Chat models correctly received the supportsStructuredOutputs property and used it for the following (see the sketch after this list):

  • Issuing warnings when JSON schema is used without structured outputs support
  • Determining whether to use json_schema or json_object response format types
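
The sketch below approximates that branching logic in isolation. It is an illustrative assumption rather than the SDK's actual implementation: the helper name pickResponseFormat, the JsonResponseFormat shape, and the exact warning text are made up for this example; only the json_schema / json_object distinction and the warning when a schema is supplied without structured outputs support come from the behavior described above.

// Simplified sketch (assumed names, not SDK source) of how a chat model can
// branch on supportsStructuredOutputs when a JSON response format is requested.
type JsonResponseFormat = { type: 'json'; name?: string; schema?: unknown };

function pickResponseFormat(
  responseFormat: JsonResponseFormat,
  supportsStructuredOutputs: boolean,
): { responseFormatArg: Record<string, unknown>; warnings: string[] } {
  const warnings: string[] = [];

  if (responseFormat.schema != null && supportsStructuredOutputs) {
    // Structured outputs available: forward the schema via json_schema.
    return {
      responseFormatArg: {
        type: 'json_schema',
        json_schema: {
          name: responseFormat.name ?? 'response',
          schema: responseFormat.schema,
        },
      },
      warnings,
    };
  }

  if (responseFormat.schema != null) {
    // Schema cannot be enforced without structured outputs support: warn and fall back.
    warnings.push(
      'JSON response format schema is only supported when structured outputs are enabled',
    );
  }

  // Fallback: plain JSON mode without schema enforcement.
  return { responseFormatArg: { type: 'json_object' }, warnings };
}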

Completion models completely ignored this provider setting:

  • The property was not passed during model construction
  • The supportsStructuredOutputs property was absent from the model instance
  • Provider-level configuration had no effect on completion model behavior

Impact Assessment

While completion models in the current implementation don't actively use structured outputs for request formatting (they issue a blanket warning for non-text response formats), this inconsistency creates several problems:

  1. API Inconsistency: Users expect provider-level settings to apply to all models created from that provider
  2. Future Extensibility Issues: If structured outputs are added to completion models later, they won't inherit the provider configuration
  3. Debugging Confusion: Developers can't inspect the supportsStructuredOutputs property on completion models to understand the provider's capabilities
  4. Behavioral Divergence: Chat and completion models from the same provider handle structured output configuration differently

Root Cause

The issue originated in the provider factory function (packages/openai-compatible/src/openai-compatible-provider.ts). The createChatModel function correctly passed the supportsStructuredOutputs option:

const createChatModel = (modelId: CHAT_MODEL_IDS) =>
  new OpenAICompatibleChatLanguageModel(modelId, {
    ...getCommonModelConfig('chat'),
    includeUsage: options.includeUsage,
    supportsStructuredOutputs: options.supportsStructuredOutputs, // ✓ Present
  });

But the createCompletionModel function was missing this parameter:

const createCompletionModel = (modelId: COMPLETION_MODEL_IDS) =>
  new OpenAICompatibleCompletionLanguageModel(modelId, {
    ...getCommonModelConfig('completion'),
    includeUsage: options.includeUsage,
    // ✗ Missing: supportsStructuredOutputs: options.supportsStructuredOutputs,
  });

Additionally, the OpenAICompatibleCompletionLanguageModel class lacked the infrastructure to handle this property, missing both the configuration type definition and the instance property.

Solution Implementation

The fix required changes to both the completion model class and the provider:

1. Enhanced Completion Model Configuration Type

Added supportsStructuredOutputs to the OpenAICompatibleCompletionConfig type:

type OpenAICompatibleCompletionConfig = {
  // ... existing properties
  /**
   * Whether the model supports structured outputs.
   */
  supportsStructuredOutputs?: boolean;
  // ... remaining properties
};

2. Added Instance Property

Added the supportsStructuredOutputs readonly property to the completion model class:

export class OpenAICompatibleCompletionLanguageModel implements LanguageModelV2 {
  readonly specificationVersion = 'v2';
  readonly supportsStructuredOutputs: boolean;
  // ... other properties
}

3. Constructor Initialization

Modified the constructor to initialize the property with the same default behavior as chat models:

constructor(
  modelId: OpenAICompatibleCompletionModelId,
  config: OpenAICompatibleCompletionConfig,
) {
  // ... existing initialization
  this.supportsStructuredOutputs = config.supportsStructuredOutputs ?? false;
}

4. Provider Configuration Passing

Updated the provider to pass the supportsStructuredOutputs option to completion models:

const createCompletionModel = (modelId: COMPLETION_MODEL_IDS) =>
  new OpenAICompatibleCompletionLanguageModel(modelId, {
    ...getCommonModelConfig('completion'),
    includeUsage: options.includeUsage,
    supportsStructuredOutputs: options.supportsStructuredOutputs,
  });

Verification

The fix was validated through comprehensive testing showing that both model types now consistently receive the provider's supportsStructuredOutputs setting:

// Provider with supportsStructuredOutputs: true
const provider = createOpenAICompatible({
  supportsStructuredOutputs: true,
  // ... other options
});

const chatModel = provider.chatModel('gpt-4');
const completionModel = provider.completionModel('text-davinci-003');

console.log(chatModel.supportsStructuredOutputs);      // true
console.log(completionModel.supportsStructuredOutputs); // true ✓ Fixed

// Provider with supportsStructuredOutputs: false  
const providerFalse = createOpenAICompatible({
  supportsStructuredOutputs: false,
  // ... other options
});

const chatModelFalse = providerFalse.chatModel('gpt-4');
const completionModelFalse = providerFalse.completionModel('text-davinci-003');

console.log(chatModelFalse.supportsStructuredOutputs);      // false
console.log(completionModelFalse.supportsStructuredOutputs); // false ✓ Fixed

All existing tests continue to pass, confirming the fix doesn't introduce regressions. The change maintains backward compatibility since the property defaults to false when not specified, matching the previous implicit behavior.
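
As a minimal illustration of that default, a hedged test sketch along these lines could assert it (assuming vitest and the @ai-sdk/openai-compatible package; the base URL, provider name, and model ID are placeholders, and the type assertion is there only because the provider's declared return type may not expose the property):

import { describe, expect, it } from 'vitest';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

describe('supportsStructuredOutputs default', () => {
  it('defaults to false on completion models when the option is omitted', () => {
    const provider = createOpenAICompatible({
      baseURL: 'https://api.example.com/v1', // placeholder URL
      name: 'example-provider', // placeholder provider name
    });

    const completionModel = provider.completionModel('example-completion-model');

    // With the fix applied, the instance property exists and defaults to false.
    expect(
      (completionModel as unknown as { supportsStructuredOutputs: boolean })
        .supportsStructuredOutputs,
    ).toBe(false);
  });
});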

     });
 
   const createCompletionModel = (modelId: COMPLETION_MODEL_IDS) =>
     new OpenAICompatibleCompletionLanguageModel(modelId, {
       ...getCommonModelConfig('completion'),
       includeUsage: options.includeUsage,
+      supportsStructuredOutputs: options.supportsStructuredOutputs,
     });
 
   const createEmbeddingModel = (modelId: EMBEDDING_MODEL_IDS) =>