Inference by Model Converter vs SDK #2864
ayush-AkulaTech asked this question in Q&A · Unanswered · 0 replies
Hi,
From what I understand, after model conversion we can run inference in two ways: through the "Model Converter" using the Python `mmdeploy.apis` module, or through the "Inference SDK" using the `mmdeploy_runtime` module in Python (as well as in other languages). What is the difference between the two, and which is better for which use case? Thanks!
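To make the comparison concrete, here is a minimal sketch of the two inference paths being asked about. All file paths, config names, and the model type (a detector) are placeholders chosen for illustration, not taken from the question; the imports are deferred inside functions so neither path runs until called.

```python
# Hypothetical sketch of the two MMDeploy inference paths.
# All paths/configs below are placeholders, not real files.

def converter_inference():
    """Path 1: Model Converter API (mmdeploy.apis).

    Runs the full pipeline through the Python codebase, so it needs
    the mmdeploy repo, the deploy/model configs, and the training
    framework installed. Typically used to verify a conversion.
    """
    from mmdeploy.apis import inference_model
    return inference_model(
        model_cfg='path/to/model_config.py',       # placeholder
        deploy_cfg='path/to/deploy_config.py',     # placeholder
        backend_files=['work_dir/end2end.onnx'],   # placeholder
        img='demo.jpg',
        device='cpu')

def sdk_inference():
    """Path 2: Inference SDK (mmdeploy_runtime).

    Loads the exported SDK model directory directly; no training
    framework or configs needed at runtime, which is why this path
    also has C/C++/C#/Java bindings for deployment.
    """
    import cv2
    from mmdeploy_runtime import Detector  # task-specific class, e.g. Detector
    detector = Detector(model_path='work_dir', device_name='cpu', device_id=0)
    return detector(cv2.imread('demo.jpg'))
```

In short, the Converter path is convenient for development and for checking that a converted model still behaves like the original, while the SDK path is the lighter-weight option intended for production deployment.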