Fixed docs bugs (#1409)
* Update VERSION_NUMBER

* update
Zheng-Bicheng authored Oct 16, 2024
1 parent b2c486a commit 2146244
Showing 3 changed files with 3 additions and 6 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -23,6 +23,7 @@ protobuf-*
# Temporary files automatically generated when executing Paddle2ONNX unit tests
*.pdmodel
*.pdiparams
+*.pdiparams.info
*.onnx
*.temptxt
tests/__pycache_*
6 changes: 1 addition & 5 deletions README.md
@@ -45,10 +45,6 @@ paddle2onnx --model_dir saved_inference_model \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --save_file model.onnx
-paddle2onnx --model_dir ch_ppstructure_mobile_v2.0_SLANet_infer \
-            --model_filename inference.pdmodel \
-            --params_filename inference.pdiparams \
-            --save_file inference.onnx
```

The adjustable conversion parameters are listed in the table below:
@@ -60,7 +56,7 @@ paddle2onnx --model_dir ch_ppstructure_mobile_v2.0_SLANet_infer \
| --params_filename | **[Optional]** Name of the file under `--model_dir` that stores the model parameters |
| --save_file | Directory path where the converted model is saved |
| --opset_version | **[Optional]** OpSet version of the exported ONNX model; multiple versions from 7 to 16 are currently supported; defaults to 9 |
-| --enable_onnx_checker | **[Optional]** Whether to verify the correctness of the exported ONNX model; enabling this check is recommended; defaults to False |
+| --enable_onnx_checker | **[Optional]** Whether to verify the correctness of the exported ONNX model; enabling this check is recommended; defaults to True |
| --enable_auto_update_opset | **[Optional]** Whether to enable automatic opset version upgrading: when a lower opset version cannot complete the conversion, a higher version is selected automatically; defaults to True |
| --deploy_backend | **[Optional]** Inference engine for deploying the quantized model; supports onnxruntime/rknn/tensorrt; defaults to onnxruntime |
| --save_calibration_file | **[Optional]** Path to save the cache file that TensorRT 8.X needs to read when deploying a quantized model; defaults to calibration.cache |
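
For reference, a minimal invocation combining these flags might look like the sketch below (illustrative only, not part of this commit; the model directory and file names are reused from the README example above, and the specific flag values are assumptions):

```bash
# Convert a Paddle inference model to ONNX, pinning the opset version
# and enabling the exporter's correctness check (the default per this docs fix).
paddle2onnx --model_dir saved_inference_model \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --save_file model.onnx \
            --opset_version 11 \
            --enable_onnx_checker True
```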
2 changes: 1 addition & 1 deletion VERSION_NUMBER
@@ -1 +1 @@
-1.2.9
+1.2.10
