WARNING: [TRT]: Calibration Profile is not defined. Calibrating with Profile 0

I use DeepStream 7.1 to run YOLOv8 inference normally in FP32 and FP16, but an error occurs with INT8 inference:
root@myai:/data/DeepStream-Yolo# export INT8_CALIB_IMG_PATH=/data/DeepStream-Yolo/calibration.txt
root@myai:/data/DeepStream-Yolo# export INT8_CALIB_BATCH_SIZE=4
root@myai:/data/DeepStream-Yolo# deepstream-app -c deepstream_app_config.txt
*** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***
Failed to query video capabilities: Invalid argument
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1152 Deserialize engine failed because file path: /data/DeepStream-Yolo/model_b4_gpu0_int8.engine open error
0:00:00.201818454 603 0x593cf8dffe60 WARN nvinfer gstnvinfer.cpp:681:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:2080> [UID = 1]: deserialize engine from file :/data/DeepStream-Yolo/model_b4_gpu0_int8.engine failed
0:00:00.201856666 603 0x593cf8dffe60 WARN nvinfer gstnvinfer.cpp:681:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2185> [UID = 1]: deserialize backend context from engine from file :/data/DeepStream-Yolo/model_b4_gpu0_int8.engine failed, try rebuild
0:00:00.201867746 603 0x593cf8dffe60 INFO nvinfer gstnvinfer.cpp:684:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:2106> [UID = 1]: Trying to create engine from model files
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:382 INT8 calibration file not specified/accessible. INT8 calibration can be done through setDynamicRange API in 'NvDsInferCreateNetwork' implementation
Building the TensorRT Engine
File does not exist: /data/DeepStream-Yolo/calib.table
WARNING: [TRT]: Calibration Profile is not defined. Calibrating with Profile 0
ERROR: [TRT]: [checkSanity.cpp::checkLinks::218] Error Code 2: Internal Error (Assertion item.second != nullptr failed. region should have been removed from Graph::regions)
Segmentation fault (core dumped)
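For context, the calibration list and environment variables were prepared following the DeepStream-Yolo INT8 instructions. This is a minimal sketch of those steps, assuming the calibration images are JPEGs under a hypothetical /data/val2017 directory:

# Collect a subset of images for calibration (source directory is an assumption)
mkdir -p /data/DeepStream-Yolo/calibration
for jpg in $(ls -1 /data/val2017/*.jpg | head -1000); do
    cp "$jpg" /data/DeepStream-Yolo/calibration/
done

# Write the absolute image paths into calibration.txt
realpath /data/DeepStream-Yolo/calibration/*.jpg > /data/DeepStream-Yolo/calibration.txt

# Point the custom engine builder at the image list and set the calibration batch size
export INT8_CALIB_IMG_PATH=/data/DeepStream-Yolo/calibration.txt
export INT8_CALIB_BATCH_SIZE=4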
My configuration is as follows:
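The INT8-relevant keys of the nvinfer config (config_infer_primary_yoloV8.txt in DeepStream-Yolo) are sketched below; this is a minimal sketch, not my exact file, and the ONNX model and label file names are placeholders:

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
# Placeholder model and label files
onnx-file=yolov8s.onnx
labelfile-path=labels.txt
# Engine name matches the one the log tries to deserialize
model-engine-file=model_b4_gpu0_int8.engine
# Calibration table that TensorRT writes/reads during INT8 calibration
int8-calib-file=calib.table
batch-size=4
# 0=FP32, 1=INT8, 2=FP16
network-mode=1
num-detected-classes=80
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet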
I am using the latest official NVIDIA Docker image.
NVIDIA-SMI 560.35.03, Driver Version 560.35.03, CUDA Version 12.6.
The GPU is an RTX 4070 Ti SUPER.
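The versions above can be confirmed inside the container with standard commands (nothing project-specific assumed):

nvidia-smi                     # driver 560.35.03, CUDA 12.6
deepstream-app --version-all   # DeepStream 7.1 and the TensorRT/cuDNN versions it was built against
dpkg -l | grep -i tensorrt     # installed TensorRT packages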