TensorRT inference speed slow in 1.3.0 #2502
Unanswered · wentout2007 asked this question in Q&A · Replies: 0 comments
I am using an RTMDet model to segment small objects in a large image. The model has been converted for TensorRT use with mmdeploy.
In a previous version of mmdeploy, the returned mask was the same size as the detected object; in version 1.3.0, the returned mask is the same size as the original full image.
What is worse, inference in 1.3.0 is extremely slow, as if TensorRT were not being used at all: it takes about 30 times longer.
Can anybody tell me why?
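For reference, here is a minimal sketch of a workaround for the mask-size change: cropping the full-image mask back down to the detection box. This assumes the masks come out as NumPy arrays of shape (H, W) and that each detection has an `(x1, y1, x2, y2)` pixel-coordinate bbox; the function name is hypothetical and not part of the mmdeploy API.

```python
import numpy as np

def crop_mask_to_bbox(mask: np.ndarray, bbox) -> np.ndarray:
    """Crop a full-image instance mask (as returned in 1.3.0)
    down to its detection bounding box.

    mask: (H, W) array covering the whole input image
    bbox: (x1, y1, x2, y2) in pixel coordinates
    """
    x1, y1, x2, y2 = map(int, bbox)
    return mask[y1:y2, x1:x2]

# Example: a 1080x1920 full-image mask with one object at (300,100)-(400,200)
full_mask = np.zeros((1080, 1920), dtype=np.uint8)
full_mask[100:200, 300:400] = 1
obj_mask = crop_mask_to_bbox(full_mask, (300, 100, 400, 200))
```

This only recovers the old per-object mask shape; it does not explain or fix the 30x slowdown.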