fix cert when querying modelmesh / kserve #2081

Open · wants to merge 4 commits into master
ods_ci/libs/Helpers.py (17 additions, 18 deletions)
@@ -223,7 +223,14 @@ def send_random_inference_request(
         value_range=[0, 255],
         shape={"B": 1, "C": 3, "H": 512, "W": 512},
         no_requests=100,
+        deployment_type=None,
     ):
+        supported_deployment_types = ("kserve", "modelmesh")
+        if deployment_type and deployment_type not in supported_deployment_types:
+            self.BuiltIn.fail(
+                f"Unsupported deployment type: {deployment_type}. Supported types: {supported_deployment_types}"
+            )
+
         for _ in range(no_requests):
             data_img = [
                 random.randrange(value_range[0], value_range[1]) for _ in range(shape["C"] * shape["H"] * shape["W"])
@@ -243,25 +250,17 @@ def send_random_inference_request(
             + " }]}"
         )
 
+        request_kwargs = {"url": endpoint, "headers": headers, "data": data}
+
         # This file only exists when running on self-managed clusters
-        ca_bundle = Path("openshift_ca.crt")
-        knative_ca_bundle = Path("openshift_ca_istio_knative.crt")
-        if ca_bundle.is_file():
-            response = requests.post(
-                endpoint,
-                headers=headers,
-                data=data,
-                verify="openshift_ca.crt",
-            )
-        elif knative_ca_bundle.is_file():
-            response = requests.post(
-                endpoint,
-                headers=headers,
-                data=data,
-                verify="openshift_ca_istio_knative.crt",
-            )
-        else:
-            response = requests.post(endpoint, headers=headers, data=data)
+        if deployment_type == "modelmesh" and (ca_bundle := Path("openshift_ca.crt")).is_file():
+            request_kwargs["verify"] = ca_bundle
+        elif deployment_type == "kserve" and (knative_ca_bundle := Path("openshift_ca_istio_knative.crt")).is_file():
+            request_kwargs["verify"] = knative_ca_bundle
+
+        response = requests.post(**request_kwargs)
 
         return response.status_code, response.text  # pyright: ignore [reportPossiblyUnboundVariable]
 
     @keyword
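For context, here is a minimal standalone sketch of the request pattern this diff introduces. It is an illustration, not the helper itself: `endpoint`, `headers`, and `data` are placeholders, while the CA bundle filenames are the ones the helper checks for. Calls naming an unsupported deployment type never reach this point, since the new validation block fails fast.

from pathlib import Path

import requests


def post_inference(endpoint, headers, data, deployment_type=None):
    # Start from the kwargs every request needs, then add "verify" only
    # when the matching CA bundle exists (self-managed clusters only).
    request_kwargs = {"url": endpoint, "headers": headers, "data": data}
    if deployment_type == "modelmesh" and (ca := Path("openshift_ca.crt")).is_file():
        request_kwargs["verify"] = str(ca)
    elif deployment_type == "kserve" and (ca := Path("openshift_ca_istio_knative.crt")).is_file():
        request_kwargs["verify"] = str(ca)
    # With no "verify" key, requests falls back to its default CA store.
    return requests.post(**request_kwargs)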
(ModelMesh serving test suite; file path not shown in this view)
@@ -87,6 +87,7 @@
     ${url}=    Get Model Route Via UI    ${MODEL_NAME}
     ${status_code}    ${response_text}=    Send Random Inference Request    endpoint=${url}    name=input:0
     ...    shape={"B": 1, "H": 299, "W": 299, "C": 3}    no_requests=1
+    ...    deployment_type=modelmesh
     Should Be Equal As Strings    ${status_code}    200
     [Teardown]    Run Keywords    Run Keyword If Test Failed    Get Modelmesh Events And Logs
     ...    server_name=${RUNTIME_NAME}    project_title=${namespace}
@@ -163,7 +164,7 @@
     Recreate S3 Data Connection    project_title=${namespace}    dc_name=model-serving-connection
     ...    aws_access_key=${S3.AWS_ACCESS_KEY_ID}    aws_secret_access=${S3.AWS_SECRET_ACCESS_KEY}
     ...    aws_bucket_name=ods-ci-s3
-    Create Model Server    token=${FALSE}    server_name=${RUNTIME_NAME}    existing_server=${TRUE}
+    Create Model Server    token=${FALSE}    server_name=${RUNTIME_NAME}    existing_server=${FALSE}
     Serve Model    project_name=${namespace}    model_name=${MODEL_NAME}    framework=tensorflow
     ...    existing_data_connection=${TRUE}    data_connection_name=model-serving-connection
     ...    model_path=inception_resnet_v2.pb
@@ -174,8 +175,9 @@
     Verify Model Status    ${MODEL_NAME}    success
     Set Suite Variable    ${MODEL_CREATED}    ${TRUE}
     ${url}=    Get Model Route Via UI    ${MODEL_NAME}
-    ${status_code}    ${response_text}=    Send Random Inference Request    endpoint=${url}    name=input
+    ${status_code}    ${response_text}=    Send Random Inference Request    endpoint=${url}    name=input:0

Check notice (Code scanning / Robocop): Variable '${response_text}' is assigned but not used.

     ...    shape={"B": 1, "H": 299, "W": 299, "C": 3}    no_requests=1
+    ...    deployment_type=modelmesh
     Should Be Equal As Strings    ${status_code}    200
     Serve Model    project_name=${namespace}    model_name=${MODEL_NAME}    framework=openvino_ir
     ...    existing_data_connection=${TRUE}    data_connection_name=model-serving-connection
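The `Send Random Inference Request` calls above delegate to the Python helper, which builds a random-image JSON payload from the `shape` and `value_range` arguments. A rough sketch of that construction, assuming a KServe-v2-style `inputs` schema (the payload-building lines in `Helpers.py` are largely elided from this diff, so the exact field layout is an assumption):

import json
import random


def random_inference_payload(name="input:0", shape=None, value_range=(0, 255)):
    # One random value per pixel/channel, matching the helper's
    # shape["C"] * shape["H"] * shape["W"] loop.
    shape = shape or {"B": 1, "H": 299, "W": 299, "C": 3}
    data_img = [
        random.randrange(value_range[0], value_range[1])
        for _ in range(shape["C"] * shape["H"] * shape["W"])
    ]
    return json.dumps(
        {
            "inputs": [
                {
                    "name": name,  # e.g. "input:0" for the TensorFlow model
                    "shape": [shape["B"], shape["H"], shape["W"], shape["C"]],
                    "datatype": "FP32",  # assumption; the diff does not show it
                    "data": data_img,
                }
            ]
        }
    )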
(KServe serving test suite; file path not shown in this view)
@@ -66,6 +66,7 @@ Verify Tensorflow Model Via UI (OVMS on Kserve)    # robocop: off=too-long-test-
     ${url}=    Get Model Route Via UI    ${MODEL_NAME}
     ${status_code}    ${response_text}=    Send Random Inference Request    endpoint=${url}    name=input:0
     ...    shape={"B": 1, "H": 299, "W": 299, "C": 3}    no_requests=1
+    ...    deployment_type=kserve
     Should Be Equal As Strings    ${status_code}    200
     [Teardown]    Run Keywords    Delete Project Via CLI By Display Name    displayed_name=ALL    AND
     ...    Run Keyword If Test Failed    Get Kserve Events And Logs
@@ -131,7 +132,7 @@ Verify GPU Model Deployment Via UI (OVMS on Kserve)    # robocop: off=too-long-t
     Verify Model Status    ${MODEL_NAME_GPU}    success
     Set Suite Variable    ${MODEL_CREATED}    True
     ${url}=    Get Model Route Via UI    ${MODEL_NAME_GPU}
-    Send Random Inference Request    endpoint=${url}    no_requests=100
+    Send Random Inference Request    endpoint=${url}    no_requests=100    deployment_type=kserve
     # Verify metric DCGM_FI_PROF_GR_ENGINE_ACTIVE goes over 0
     ${prometheus_route}=    Get OpenShift Prometheus Route
     ${sa_token}=    Get OpenShift Prometheus Service Account Token
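The GPU test then asserts that DCGM_FI_PROF_GR_ENGINE_ACTIVE rises above 0 while the 100 requests run. A sketch of that check against the standard Prometheus HTTP API, using the route and service-account token fetched above (the function name and `ca_bundle` parameter are illustrative, not from the suite):

import requests


def gpu_engine_active(prometheus_route, sa_token, ca_bundle=None):
    response = requests.get(
        f"{prometheus_route}/api/v1/query",
        params={"query": "DCGM_FI_PROF_GR_ENGINE_ACTIVE"},
        headers={"Authorization": f"Bearer {sa_token}"},
        # Skip TLS verification when no bundle is provided (test-only shortcut).
        verify=ca_bundle if ca_bundle else False,
    )
    response.raise_for_status()
    samples = response.json()["data"]["result"]
    # Prometheus instant-query samples look like {"value": [timestamp, "0.42"]}.
    return any(float(sample["value"][1]) > 0 for sample in samples)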