PaddleX/latest/pipeline_deploy/serving #3524
Replies: 45 comments 133 replies
-
Hello, after starting the service with paddlex --serve --pipeline image_classification, what is the command to shut it down?
-
When running paddlex --serve --pipeline {pipeline name or pipeline config file path} [{other command-line options}], if I am deploying multiple pipelines, do I need to start a separate service for each one?
-
Does serving deployment support high-performance inference on real-time data?
-
I am testing high-stability serving deployment with the general OCR SDK. The local Docker container starts successfully, client.py passes against GRPCInferenceService, and the Metrics Service is reachable, but every HTTPService request fails with 400 Bad Request. Is there API documentation for HTTPService? What are the request parameters, and how do I call HTTPService correctly?
-
The SDK download links are all broken; please fix them.
-
Hello.
-
Deployed with Docker. Why does λ localhost ~/PaddleX paddlex --serve --pipeline OCR
-
How can high-stability serving deployment be called over HTTP?
-
Hello, this SDK can no longer be downloaded.
-
Hello, if my CUDA version is 12.4, how can I get a matching build of ccr-2vdh3abv-pub.cnc.bj.baidubce.com/paddlex/hps:paddlex3.1-gpu? This image only supports CUDA 11.8, right?
-
How do I enable high-performance inference for serving deployment?
-
Running step 1.1, paddlex --install serving, keeps failing. The OS is AlmaLinux 9.6. On Linux you strongly recommend installing PaddleX with Docker, so why is there no tutorial here for serving deployment from a Docker-installed PaddleX? Please advise, thanks!
Using cached future-1.0.0-py3-none-any.whl (491 kB)
[notice] A new release of pip is available: 25.0.1 -> 25.1.1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
-
Regarding https://paddlepaddle.github.io/PaddleX/latest/pipeline_deploy/serving.html#23, the adjusted pipeline_config.yaml:
pipeline_name: OCR
text_type: general
use_doc_preprocessor: True
SubPipelines:
SubModules:
-
For pipelines whose models still need to be downloaded after the service starts, what can I do if the download fails due to network issues? Can the models be mounted manually?
I0718 07:56:17.887946 7 grpc_server.cc:4117] Started GRPCInferenceService at 0.0.0.0:8001
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
-
In section 2.4.2, "manually constructing an HTTP request", the request body shown is built in the wrong format: the JSON inside "data" should be of type String.
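To illustrate the point above, here is a minimal, hedged sketch of building a request body in which the pipeline arguments are serialized to a single JSON string rather than embedded as a nested JSON object. It assumes the HTTPService follows Triton's generic KServe v2 HTTP protocol with one BYTES input tensor; the tensor names ("input", "output") are assumptions for illustration, not taken from the SDK.

```python
import json


def build_infer_payload(pipeline_input: dict) -> dict:
    """Build a KServe-v2-style HTTP inference request body.

    The pipeline-level arguments are serialized with json.dumps into a
    single string, because a BYTES input tensor carries strings; the
    "data" field must NOT contain a raw nested JSON object.
    The tensor names below are assumptions, not confirmed SDK values.
    """
    return {
        "inputs": [
            {
                "name": "input",  # assumed input tensor name
                "shape": [1, 1],
                "datatype": "BYTES",
                # JSON as a *string*, per the correction above:
                "data": [json.dumps(pipeline_input)],
            }
        ],
        "outputs": [{"name": "output"}],  # assumed output tensor name
    }


payload = build_infer_payload({"file": "<base64-encoded image>", "fileType": 1})
```

The body could then be POSTed to the service's infer endpoint; verify the exact model name and tensor names against the SDK's client.py before relying on this shape.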
-
Image ccr-2vdh3abv-pub.cnc.bj.baidubce.com/paddlex/hps:paddlex3.2-gpu
-
Hello, I followed the tutorial and installed hpip-gpu and paddle2onnx in the CUDA 11.8 image. However, paddlex --serve --pipeline OCR --port 8118 --use_hpip prints: The Paddle Inference backend is selected with the default configuration. This may not provide optimal performance. And paddlex --serve --pipeline OCR --port 8118 --use_hpip --hpi_config '{"backend": "onnxruntime"}' fails with: No inference backend and configuration could be suggested. Reason: 'onnxruntime' is not a supported inference backend. Did I miss a step?
-
Is there documentation for the HTTP requests?
-
I downloaded the General Layout Parsing v3 SDK and started it with docker run. It fails with:
E1020 10:19:27.707704 7 model_repository_manager.cc:1186] failed to load 'layout-parsing' version 1: Internal: UnboundLocalError: local variable 'transpose_weight_keys' referenced before assignment
At:
I1020 10:19:27.707830 7 server.cc:522]
I1020 10:19:27.707855 7 server.cc:549]
I1020 10:19:27.707892 7 server.cc:592]
I1020 10:19:27.707954 7 tritonserver.cc:1920]
I1020 10:19:27.707966 7 server.cc:252] Waiting for in-flight requests to complete.
-
Hello, using the paddlex3.0.1 image, installing the serving plugin with paddlex --install serving fails:
Traceback (most recent call last):
-
Is there a corresponding API document?
-
In CPU mode, requests return a 404 response code. Has the request API changed?
-
The documentation is hard to describe politely; everything you need is missing.
-
When using high-stability serving deployment (Docker) under highly concurrent gRPC requests, roughly 10% of the requests return empty results even though the images do contain text. When I resubmit those same images over gRPC, some of them then produce normal output. Checking the Docker container logs, I found that the following error occurs intermittently:
[ ERROR] [2025-11-03 06:21:12,130] [60098771ab20491380763bb02c35a93b] [b70b0af0-16c0-4e41-a52c-53d81ae7b4ea] - Unhandled exception
Traceback (most recent call last):
  File "/paddlex/py310/lib/python3.10/site-packages/paddlex_hps_server/base_model.py", line 88, in execute
    result_or_output = self.run(input_, log_id)
  File "/paddlex/var/paddlex_model_repo/ocr/1/model.py", line 80, in run
    images, data_info = utils.file_to_images(
  File "/paddlex/py310/lib/python3.10/site-packages/paddlex/inference/serving/infra/utils.py", line 252, in file_to_images
    data_info = get_image_info(images[0])
  File "/paddlex/py310/lib/python3.10/site-packages/paddlex/inference/serving/infra/utils.py", line 261, in get_image_info
    return ImageInfo(width=image.shape[1], height=image.shape[0])
AttributeError: 'NoneType' object has no attribute 'shape'
What causes this, and how can it be resolved? Thank you!
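The AttributeError above is consistent with an image decoder that signals failure by returning None on a bad or truncated buffer (cv2.imdecode behaves this way) instead of raising, so the problem only surfaces later as None.shape. Whether truncated payloads under high concurrency are the actual cause here is an assumption. As a purely illustrative, hypothetical guard (not PaddleX code), a caller could turn the silent None into an explicit error:

```python
class ImageDecodeError(ValueError):
    """Raised when an image buffer cannot be decoded."""


def checked_decode(decode_fn, buf):
    """Wrap a cv2.imdecode-style decoder that returns None on failure,
    so the failure is reported at the decode site rather than as an
    AttributeError on `.shape` further down the call chain."""
    image = decode_fn(buf)
    if image is None:
        raise ImageDecodeError(
            f"failed to decode image buffer of {len(buf)} bytes"
        )
    return image
```

The wrapper name and call site are hypothetical; if this is indeed the failure mode, the real fix likely belongs in, or upstream of, file_to_images.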
-
Could you provide detailed HTTP request documentation? How do I set the path when testing with a local image on this machine, is batch inference supported, and how should images be passed?
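On passing a local image: one common pattern for serving APIs is to base64-encode the file into the JSON body rather than sending a filesystem path, since the server cannot read the client's disk. A minimal sketch follows; the "file" field name mirrors the style of PaddleX serving examples but is an assumption here and should be checked against the actual request schema of the pipeline in use.

```python
import base64
from pathlib import Path


def image_to_payload(path):
    """Base64-encode a local image file for a JSON request body.

    The "file" key is an assumption modeled on PaddleX serving
    examples; confirm the field name for your specific pipeline.
    """
    encoded = base64.b64encode(Path(path).read_bytes()).decode("ascii")
    return {"file": encoded}
```

The returned dict can be serialized with json.dumps and POSTed to the pipeline endpoint; batching, if supported, would be a property of the specific pipeline's API, not of this encoding step.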
-
The image ccr-2vdh3abv-pub.cnc.bj.baidubce.com/paddlex/paddlex:paddlex3.3.4-paddlepaddle3.2.0-gpu-cuda12.9-cudnn9.9 is too large, nearly 60 GB. Is there a smaller one suitable for deployment? I only need the OCR-related pipelines.
-
Hello, today I tested paddlex_hps_PaddleOCR-VL_sdk. I used client.py on a PDF file of several dozen pages, and running client.py raised an error. How can this be resolved?
-
High-stability serving deployment runs on Triton, but the OCR-VL model does not seem to support acceleration there. What is the fastest option in that case? Is the original vLLM approach the fastest?
-
Does high-stability deployment automatically convert models to ONNX?
-
I deployed PP-ChatOCRv4. How do I change the chat-bot LLM in the server's pipeline_config.yaml to a different API, or can I only use the Baidu AI Cloud API?