-
I had this idea, but I've just been too busy lately to test how well MNN inference performs.
-
MNN inference easily beats available inference pipelines such as onnxruntime and ncnn, with a 40 to 60% speed boost. Are there any plans for a RapidOCR implementation with MNN?
I see a PaddleOCR v2 model implementation with MNN here: https://github.com/DayBreak-u/chineseocr_lite/tree/onnx/cpp_projects/OcrLiteMnn