
Using litellm proxy model (OpenAI-compatible endpoint) #170

@pradhyumna85

Description


Could you please help with configuring Morphik to use models served through a litellm proxy server (OpenAI-compatible endpoint)?
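Not an official answer, but since litellm treats any OpenAI-compatible endpoint as a standard provider, one likely approach is to register the proxy's models in `morphik.toml` using litellm's `openai/<model>` prefix with `api_base` pointed at the proxy. A minimal sketch, assuming Morphik's `[registered_models]` section accepts litellm-style parameters (the section layout, key names, and model identifiers below are assumptions, not verified against Morphik's schema):

```toml
# Hypothetical morphik.toml fragment -- section and key names are
# assumptions based on litellm's standard parameters; check Morphik's
# configuration docs for the exact schema.

[registered_models]
# litellm's "openai/<name>" prefix routes the call through its
# OpenAI-compatible client; api_base points at the litellm proxy.
proxy_gpt4 = { model = "openai/gpt-4", api_base = "http://localhost:4000", api_key = "sk-your-proxy-key" }

[completion]
# Reference the registered model by its key.
model = "proxy_gpt4"
```

The key detail is the `api_base` override: litellm sends OpenAI-format requests to that URL instead of `api.openai.com`, so any model the proxy exposes under its `/v1/chat/completions` route should work.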
