feat: add GPT-5.3-Codex model for GitHub Copilot provider #954
emvnuel wants to merge 2 commits into anomalyco:dev
Conversation
Pull request overview
This PR adds configuration for the new GPT-5.3-Codex model to the GitHub Copilot provider. The model is part of OpenAI's GPT Codex family and is designed for code generation with enhanced capabilities including reasoning, tool calls, structured outputs, and vision support.
Changes:
- Add TOML configuration file for GPT-5.3-Codex model with specifications including 400K token context window, 128K max output, zero-cost pricing, and multi-modal support (text and image inputs)
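Based on the fields quoted in the review below and the capabilities payload in the PR description, the new TOML file would look roughly like this. This is a sketch only: the exact key names and file layout are assumptions modeled on the existing gpt-5.x-codex configs, not the actual diff.

```toml
# Sketch of the gpt-5.3-codex model config (key names assumed to follow
# the existing gpt-5.x-codex files for the GitHub Copilot provider)
name = "GPT-5.3-Codex"
last_updated = "2026-02-05"
attachment = false
reasoning = true
temperature = false
tool_call = true

[cost]          # zero-cost pricing per the PR description
input = 0.0
output = 0.0

[limit]         # from the capabilities payload
context = 400_000
output = 128_000

[modalities]    # multi-modal: text and image inputs
input = ["text", "image"]
output = ["text"]
```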
```toml
last_updated = "2026-02-05"
attachment = false
reasoning = true
temperature = false
```
The 'knowledge' field, present in almost all other GitHub Copilot model configurations, is missing. This field specifies the knowledge cutoff date for the model's training data and should be added for consistency. For reference, gpt-5.2-codex has 'knowledge = "2025-08-31"' and gpt-5.1-codex has 'knowledge = "2024-09-30"'. The field should be placed between the 'temperature' and 'tool_call' fields, following the established convention.
```diff
 temperature = false
+knowledge = "2026-02-05"
```
Not yet supported in the opencode app (this is the 3rd or 4th PR at this point, so if you want to know why, please read one of the closed ones).

sry
Summary
Model Details
Capabilities
Source
```json
{
  "capabilities": {
    "family": "gpt-5.3-codex",
    "limits": {
      "max_context_window_tokens": 400000,
      "max_output_tokens": 128000,
      "max_prompt_tokens": 272000,
      "vision": {
        "max_prompt_image_size": 3145728,
        "max_prompt_images": 1,
        "supported_media_types": ["image/jpeg", "image/png", "image/webp", "image/gif"]
      }
    },
    "object": "model_capabilities",
    "supports": {
      "parallel_tool_calls": true,
      "streaming": true,
      "structured_outputs": true,
      "tool_calls": true,
      "vision": true
    },
    "tokenizer": "o200k_base",
    "type": "chat"
  },
  "id": "gpt-5.3-codex",
  "model_picker_category": "powerful",
  "model_picker_enabled": true,
  "name": "GPT-5.3-Codex",
  "object": "model",
  "policy": {
    "state": "enabled",
    "terms": "Enable access to the latest GPT-5.3-Codex model from OpenAI. [Learn more about how GitHub Copilot serves GPT-5.3-Codex](https://gh.io/copilot-openai)."
  },
  "preview": false,
  "supported_endpoints": ["/responses"],
  "vendor": "OpenAI",
  "version": "gpt-5.3-codex"
}
```