
Model Parameter Validation & Temperature Error Flooding #192

@gltanaka

Description


Problem

When running pdd generate with a single API key, users see a flood of temperature-related error messages. This is the most common user experience (single key) and creates a poor first impression.

From Dec 13 Benchmarking Meeting:

"When I ran PDD generate after doing setup with no changes at all manually, it gave me a bunch of error messages, a flood. So I got a bunch of temperature messages... 4 out of 5 models giving me some sort of temperature error"

Root Cause

  • The system iterates through models based on the strength parameter
  • Models have different parameter constraints (temperature ranges, required values)
  • Anthropic thinking models require temperature=1.0
  • Some models do not accept a temperature parameter at all
  • Parameters are not validated before the API call is attempted, so each mismatch surfaces as a provider error

Proposed Solution

Add model parameter constraint metadata:

# Option A: Add columns to llm_model.csv
temperature_min,temperature_max,temperature_required_value
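Option A could be parsed alongside the existing model rows. A minimal sketch, assuming the three proposed columns are appended to llm_model.csv and an empty cell means "no constraint" (the sample rows and column names are illustrative, not the current file format):

```python
import csv
import io

# Hypothetical rows with the three proposed constraint columns appended.
# Empty cells mean the model has no constraint on that field.
SAMPLE_CSV = """model,temperature_min,temperature_max,temperature_required_value
claude-sonnet-4-5,0.0,1.0,
o1-mini,,,1.0
"""

def load_constraints(csv_text):
    """Map model name -> dict of float constraints (None when unconstrained)."""
    constraints = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        constraints[row["model"]] = {
            key: float(row[key]) if row[key] else None
            for key in ("temperature_min", "temperature_max",
                        "temperature_required_value")
        }
    return constraints

constraints = load_constraints(SAMPLE_CSV)
print(constraints["o1-mini"]["temperature_required_value"])  # 1.0
```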

# Option B: Separate model_constraints.yaml
claude-sonnet-4-5:
  temperature:
    min: 0.0
    max: 1.0
    required_with_thinking: 1.0
  thinking:
    type: budget
    max_tokens: 64000
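Whichever format is chosen, the validation step could correct the temperature before the API call instead of letting the provider reject it. A minimal sketch, where the constraint dict shape follows Option B and the helper name, the thinking_enabled flag, and the None-means-unsupported convention are all assumptions for illustration:

```python
def resolve_temperature(requested, constraints, thinking_enabled=False):
    """Adjust a requested temperature to satisfy a model's constraints.

    Returns the corrected temperature, or None when the model does not
    accept a temperature parameter at all (caller should omit the key).
    """
    if constraints is None:
        return None  # model takes no temperature parameter
    if thinking_enabled and "required_with_thinking" in constraints:
        # e.g. Anthropic thinking models require temperature=1.0
        return constraints["required_with_thinking"]
    lo = constraints.get("min", float("-inf"))
    hi = constraints.get("max", float("inf"))
    return min(max(requested, lo), hi)

# Thinking mode forces the required value:
print(resolve_temperature(
    0.2, {"min": 0.0, "max": 1.0, "required_with_thinking": 1.0},
    thinking_enabled=True))  # 1.0
# Out-of-range values are clamped instead of erroring:
print(resolve_temperature(1.7, {"min": 0.0, "max": 1.0}))  # 1.0
```

Silently clamping vs. warning once is a design choice; either way the per-model error flood disappears because invalid values never reach the provider.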

Files to Modify

  • pdd/llm_invoke.py (lines ~1645-1705) - add constraint validation
  • .pdd/llm_model.csv - add constraint columns

Acceptance Criteria

  • No temperature errors printed when scanning models with a single valid key
  • Model constraints documented in CSV or separate config
  • Auto-correct parameters when possible (e.g., force temp=1.0 for thinking models)
  • Unit tests for parameter validation
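The last criterion could look like the following sketch, which tests the three correction behaviors (omit, force, clamp) against a hypothetical kwargs-building helper; build_call_kwargs and its constraint-dict shape are illustrative, not existing pdd code:

```python
import unittest

def build_call_kwargs(model_constraints, requested_temp):
    """Build provider kwargs: drop temperature when unsupported,
    force a declared required value, and clamp to the allowed range."""
    if model_constraints is None:
        return {}  # model rejects the temperature parameter entirely
    required = model_constraints.get("required_value")
    temp = required if required is not None else requested_temp
    temp = min(max(temp, model_constraints.get("min", 0.0)),
               model_constraints.get("max", 2.0))
    return {"temperature": temp}

class TestParameterValidation(unittest.TestCase):
    def test_unsupported_temperature_is_omitted(self):
        self.assertEqual(build_call_kwargs(None, 0.7), {})

    def test_required_value_overrides_request(self):
        kwargs = build_call_kwargs({"required_value": 1.0}, 0.2)
        self.assertEqual(kwargs["temperature"], 1.0)

    def test_out_of_range_is_clamped(self):
        kwargs = build_call_kwargs({"min": 0.0, "max": 1.0}, 1.8)
        self.assertEqual(kwargs["temperature"], 1.0)

unittest.main(argv=["pdd-constraints"], exit=False, verbosity=0)
```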
