Run inference in react native #13372
Eyal-Ben-Chaim asked this question in Q&A · Unanswered
Replies: 0
Hey everyone :)
I have been trying for several weeks now to run inference in React Native, and I keep running into problems.
I have failed to run inference with .tfjs, .tflite, and .onnx models in React Native, so now I am trying ExecuTorch from PyTorch, but I can't even convert the model to the .pte file needed to run it on mobile.
I'd appreciate any help or ideas for solutions.
I trained my model with YOLOv5 and got best.pt and last.pt, and I have converted it to .tflite, .tfjs, .onnx, and saved_model formats, if that helps.
Thanks in advance