Support for ONNX Runtime? #11
Hi there, thanks for the amazing collection!
Are there any plans to include support for the ONNX Runtime with either CUDA or TensorRT execution providers?

Comments
Good idea to add ONNX RT. :) I have created a quick PR #12 that additionally installs ONNX RT into the … We are planning a new release before the end of the year anyway, and would then integrate this PR into that one.
@lreiher thank you for your quick response. With your instructions and my use case, I receive the following message when creating an inference session:
…
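The error output itself is not preserved above. As a minimal first check in such cases, assuming the Python onnxruntime (or onnxruntime-gpu) package is what is being used, one can list which execution providers the installed build actually exposes:

```python
import onnxruntime as ort

# Lists the execution providers compiled into the installed build,
# e.g. ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'].
print(ort.get_available_providers())

# Reports whether the installed build targets 'GPU' or 'CPU'.
print(ort.get_device())
```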
Okay, I guess we'll have to come up with a quick sample script ourselves to test ONNX RT (or perhaps you could even share yours?) and try to fix this issue.
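A sketch of such a quick test script is shown below. It is only a sketch, assuming the Python onnxruntime-gpu package and an arbitrary exported ONNX model at the placeholder path model.onnx; it is not the script referred to in this thread.

```python
import numpy as np
import onnxruntime as ort

# Placeholder path; any exported ONNX model can be used for this smoke test.
MODEL_PATH = "model.onnx"

# Providers are tried in the order given; which ones are usable depends on the
# installed onnxruntime build and the CUDA/TensorRT libraries on the system.
PROVIDERS = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CPUExecutionProvider",
]

session = ort.InferenceSession(MODEL_PATH, providers=PROVIDERS)
print("Providers in use:", session.get_providers())

# Build a random input matching the model's first input; dynamic dimensions
# are replaced with 1, and a float32 input tensor is assumed for simplicity.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {inp.name: dummy})
print("Output shapes:", [getattr(o, "shape", None) for o in outputs])
```

Running such a script with the providers listed one at a time would help narrow down whether a failure is specific to the TensorRT or the CUDA execution provider.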
See #12 (comment); further discussion should take place there.