[Python] Inference code based on ONNXRuntime, OpenVINO and ocrweb
v1.0.0
- Because using the inference code directly from the existing repository involves tedious steps, this release packages the resources directory together with each inference scenario as an independent bundle.
- The contents of the resources directory are shown below. The models are the same as those in the download links given in the repository and are the current best model combination.
```
resources
├── fonts
│   └── msyh.ttc
├── models
│   ├── ch_ppocr_mobile_v2.0_cls_infer.onnx
│   ├── ch_PP-OCRv3_det_infer.onnx
│   └── ch_PP-OCRv3_rec_infer.onnx
└── rec_dict
    └── ppocr_keys_v1.txt
```
- You can download other models from BaiduNetDisk | Google Drive as needed.
- The zip files attached below contain the complete runtime code and models; download them directly and refer to the README to run the sample demo (two quick sanity-check sketches follow this list). If the download is slow, you can also get the files from Gitee.
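To confirm that the packaged models are readable once the zip is extracted, here is a minimal sanity-check sketch using ONNXRuntime. It only assumes the resources directory from this release sits next to the script; the model file names come from the tree above, and the session configuration is ONNXRuntime's default.

```python
# Minimal sanity check: load each packaged ONNX model and print its input signature.
# Assumes the `resources` directory from this release sits next to this script.
from pathlib import Path

import onnxruntime as ort

MODEL_DIR = Path("resources") / "models"
MODEL_FILES = [
    "ch_PP-OCRv3_det_infer.onnx",           # text detection
    "ch_ppocr_mobile_v2.0_cls_infer.onnx",  # text direction classification
    "ch_PP-OCRv3_rec_infer.onnx",           # text recognition
]

for name in MODEL_FILES:
    session = ort.InferenceSession(str(MODEL_DIR / name),
                                   providers=["CPUExecutionProvider"])
    for inp in session.get_inputs():
        print(f"{name}: input '{inp.name}' shape={inp.shape} dtype={inp.type}")
```

If all three models print their input shapes without errors, the resources layout is intact and the ONNXRuntime demo should run.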
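Since the release also targets OpenVINO, here is a comparable sketch that compiles one of the ONNX models with the OpenVINO runtime. It assumes OpenVINO 2022.1 or newer (the `openvino.runtime` API) is installed via `pip install openvino`; the model path again comes from the resources tree above.

```python
# Minimal sketch: compile the detection model with OpenVINO and inspect its inputs.
# Assumes OpenVINO 2022.1+ and the resources layout shown above.
from openvino.runtime import Core

core = Core()
model = core.read_model("resources/models/ch_PP-OCRv3_det_infer.onnx")
compiled = core.compile_model(model, device_name="CPU")

for inp in compiled.inputs:
    # partial_shape is used because the OCR models have dynamic input dimensions
    print(f"input '{inp.get_any_name()}' shape={inp.partial_shape}")
```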