Basic checks
- I searched existing issues - this hasn't been reported
- I can reproduce this consistently
- This is a RubyLLM bug, not my application code
What's broken?
Structured schema responses cause a BadRequestError (Invalid request) in multi-turn conversations when messages are persisted: the OpenAI API receives a Hash object instead of a JSON string as the assistant message's content.
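In other words, any Hash produced by a structured response needs to be re-serialized before it is replayed to the provider. A minimal illustration of that serialization step in plain Ruby (serialize_message_content is a hypothetical helper for illustration, not part of RubyLLM):

require "json"

# Hypothetical helper: parsed structured output (a Hash) must go back over
# the wire as a JSON string; plain string content passes through untouched.
def serialize_message_content(content)
  content.is_a?(Hash) ? JSON.generate(content) : content
end

serialize_message_content({ "name" => "Sophie Laurent", "age" => 29, "city" => "Paris" })
# => "{\"name\":\"Sophie Laurent\",\"age\":29,\"city\":\"Paris\"}"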
How to reproduce
class PersonSchema < RubyLLM::Schema
  string :name, description: "Person's full name"
  integer :age, description: "Person's age in years"
  string :city, description: "City where they live"
end

chat_record = Chat.create!(model: 'gpt-4.1-nano')
chat_record.with_schema(PersonSchema)
response = chat_record.ask("Generate a person from Paris")

chat_record.with_schema(nil)
chat_record.ask("Generate a fictional bio for this person")
Expected behavior
The structured response from the first message should be serialized as a JSON string when it is included in the message history for subsequent API calls.
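For comparison, a sketch of the messages array the API should receive on the second call (assistant content as a JSON string rather than a Hash; values taken from the request log below):

expected_messages = [
  { role: "user", content: "Generate a person from Paris" },
  { role: "assistant",
    content: '{"name":"Sophie Laurent","age":29,"city":"Paris"}' },
  { role: "user", content: "Generate a fictional bio for this person" }
]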
What actually happened
The structured response is instead sent as a Hash:
request -- POST https://api.openai.com/v1/chat/completions
request -- {:model=>"gpt-4.1-nano",
:messages=>
[{:role=>"user", :content=>"Generate a person from Paris"},
{:role=>"assistant",
:content=>{"name"=>"Sophie Laurent", "age"=>29, "city"=>"Paris"}},
{:role=>"user", :content=>"Generate a fictional bio for this person"}],
:stream=>false}
RubyLLM: API call failed, destroying message: 2322
Invalid type for 'messages[1].content': expected one of a string or array of objects, but got an object instead. (RubyLLM::BadRequestError)
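As a stopgap, the structured content can be re-serialized in application code before the follow-up ask. A rough sketch, assuming the persisted assistant message's content comes back as a Hash and its content column is writable; whether this actually sidesteps the bug depends on how the message history is rebuilt for the API call, so treat it as an untested workaround:

# Hypothetical workaround: convert Hash content on the last persisted
# assistant message back into a JSON string before the next turn.
last_message = chat_record.messages.order(:created_at).last
if last_message.content.is_a?(Hash)
  last_message.update!(content: last_message.content.to_json)
end

chat_record.with_schema(nil)
chat_record.ask("Generate a fictional bio for this person")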
Environment
- RubyLLM version: 1.9.1 (works correctly in 1.8.2)
- Provider: OpenAI