Using the application inference profile in Bedrock results in failed model invocations. #740
Comments
Thanks for the report. Do you know what the expected HTTP path is? Which endpoint is being hit?
I tried outputting a debug log.
I hope this helps.
Amazon Bedrock has added a new feature called "application inference profiles".
Using application inference profiles is like adding an alias to a base model.
For Bedrock's InvokeModel API, you can specify the application inference profile's ARN as the modelId.
However, when using the Anthropic SDK, specifying the application inference profile as the model results in an error.
This is likely because the model parameter does not expect an ARN.
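A minimal sketch of what that looks like with a boto3-style InvokeModel request. The region, account ID, and profile ID in the ARN below are made up for illustration; only the request construction is shown, since the actual call needs AWS credentials:

```python
import json

# Hypothetical application inference profile ARN (account and profile IDs
# are placeholders, not real resources).
profile_arn = (
    "arn:aws:bedrock:us-east-1:123456789012:"
    "application-inference-profile/abcdef123456"
)

# The same InvokeModel request shape accepts either a base model ID
# (e.g. an Anthropic Claude model ID) or the profile ARN as modelId.
request = {
    "modelId": profile_arn,
    "contentType": "application/json",
    "body": json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}],
    }),
}

# With credentials configured, this would be sent as:
#   boto3.client("bedrock-runtime").invoke_model(**request)
print(request["modelId"])
```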
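One plausible reading of why an ARN breaks here (an assumption, not confirmed against the SDK source): InvokeModel embeds the modelId directly in the HTTP path (`/model/{modelId}/invoke`), and an ARN contains `:` and `/` characters, so it must be percent-encoded into a single path segment rather than inserted verbatim. A sketch of the difference:

```python
from urllib.parse import quote

# Hypothetical application inference profile ARN (placeholder IDs).
profile_arn = (
    "arn:aws:bedrock:us-east-1:123456789012:"
    "application-inference-profile/abcdef123456"
)

# Inserted verbatim, the ARN's ':' and '/' characters split the path
# into extra segments, so the request no longer matches the route.
naive_path = f"/model/{profile_arn}/invoke"

# Percent-encoding everything (safe="" encodes ':' and '/' as well)
# keeps the ARN a single path segment.
encoded_path = f"/model/{quote(profile_arn, safe='')}/invoke"

print(naive_path)
print(encoded_path)
```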
Please let me know if you have any further questions regarding this.