Add --builtins_only Flag to CLI for generating TFLITE_BUILTINS-Only TensorFlow Lite Models #707
Comments
Any untrained model will do; just provide us with an ONNX file whose structure we can understand.
Apologies for not providing a clear explanation earlier. In my case, to prevent an increase in app size, I intended to use the TensorFlow Lite runtime built into Android (GMS). This requires converting the model using only built-in operations, which is why I am suggesting this feature. Thank you for considering this proposal!
Is it your intention that a model will be generated whether or not you specify this option? I'm not rejecting your suggestion; I just don't really understand the benefit of adding this new conversion option. It might make sense if you want the conversion to abort with an error when your model contains non-built-in operations. I don't know the specifications of GMS, and I'm currently out and about, so I can't check what difference the option makes to the definition information in the .tflite file.
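For reference, one way to check whether a converted .tflite file actually depends on Flex (SELECT_TF_OPS) operators is TensorFlow's model analyzer. This is a minimal sketch, assuming a TF 2.x environment; "model.tflite" is a placeholder path, not a file from this issue:

```python
import tensorflow as tf

# Print the operators contained in the converted model.
# Flex-delegated (SELECT_TF_OPS) operators typically appear in the report
# as custom ops with a "Flex" prefix, so their presence indicates the model
# will not run on a TFLITE_BUILTINS-only runtime such as the one in GMS.
tf.lite.experimental.Analyzer.analyze(model_path="model.tflite")
```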
Issue Type: Feature Request
OS: Linux
onnx2tf version number: 1.26.0
onnx version number: 1.16.1
onnxruntime version number: 1.18.1
onnxsim (onnx_simplifier) version number: 0.4.33
tensorflow version number: 2.17.0
Download URL for ONNX: N/A
Parameter Replacement JSON: N/A
Description
Hello!
I have been using onnx2tf very effectively; it's extremely useful.
After converting a model from ONNX to TensorFlow Lite, I use TensorFlow Lite via GMS (Google Mobile Services) to minimize the additional size the app requires. Specifically, I use:
com.google.android.gms:play-services-tflite-java:16.1.0
com.google.android.gms:play-services-tflite-support:16.1.0
However, GMS only supports models built with tf.lite.OpsSet.TFLITE_BUILTINS. To work around this, I manually comment out the parts that include tf.lite.OpsSet.SELECT_TF_OPS (including other code that selects TensorFlow ops) in the following lines: https://github.com/PINTO0309/onnx2tf/blob/main/onnx2tf/onnx2tf.py#L1425-L1428

It would be very useful if a flag like --builtins_only could be supported in the CLI to enable this functionality easily.

Have a great weekend!
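For illustration, this is roughly the converter-level change the flag would correspond to. It is a minimal sketch, assuming a SavedModel directory produced by onnx2tf and a hypothetical builtins_only parameter; it is not the actual onnx2tf implementation:

```python
import tensorflow as tf


def convert_to_tflite(saved_model_dir: str, builtins_only: bool = False) -> bytes:
    """Convert a SavedModel to TFLite, optionally restricting to built-in ops.

    `builtins_only` is a hypothetical parameter mirroring the proposed
    --builtins_only CLI flag; it is not part of onnx2tf today.
    """
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    if builtins_only:
        # Only TFLite built-in kernels are allowed; conversion fails if the
        # graph needs ops that would otherwise fall back to SELECT_TF_OPS.
        converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
    else:
        # Default behaviour: allow Flex fallback to full TensorFlow ops.
        converter.target_spec.supported_ops = [
            tf.lite.OpsSet.TFLITE_BUILTINS,
            tf.lite.OpsSet.SELECT_TF_OPS,
        ]
    return converter.convert()
```

With builtins_only=True the converter raises an error instead of emitting a Flex-dependent model, which is the behaviour the GMS built-in TensorFlow Lite runtime requires.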