Core dump when calling create_inference_graph #60
Have you found a solution? I am also facing the same problem. @lannyyip
I followed the code example to convert an ssd_mobilenet2 model to a TRT model on a Jetson Nano, as below:
The model was definitely trained on an amd64 platform.
While executing "create_inference_graph", the following core dump was generated.
I am not sure whether the problem is related to the following warning.
Could anyone give me a hand with it? Thank you.
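The original code snippet did not survive extraction. For context, a typical TF-TRT 1.x conversion wraps a frozen graph with `create_inference_graph` (from `tensorflow.contrib.tensorrt`). The sketch below shows the usual shape of that call; the parameter values and the idea of shrinking `max_workspace_size_bytes` for the Nano's limited shared memory are assumptions, not the reporter's actual settings:

```python
def convert_to_trt(frozen_graph_def, output_names):
    """Sketch of a TF-TRT 1.x conversion call (assumes TF 1.x with
    contrib TensorRT installed, e.g. the JetPack TensorFlow build).

    frozen_graph_def: a tf.GraphDef loaded from a frozen .pb file.
    output_names: list of output node names, e.g. the SSD heads.
    """
    import tensorflow.contrib.tensorrt as trt  # TF 1.x only

    return trt.create_inference_graph(
        input_graph_def=frozen_graph_def,
        outputs=output_names,
        max_batch_size=1,
        # A small workspace is often safer on the Nano's 4 GB of
        # shared CPU/GPU memory; OOM here can surface as a crash.
        max_workspace_size_bytes=1 << 25,
        precision_mode='FP16',
        minimum_segment_size=50)
```

Since the graph is trained on amd64 but converted on the Nano, out-of-memory during engine building is a common cause of hard crashes at this step, which is why the workspace size is worth experimenting with.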