[Bug] tvm.cuda().exist returns False while torch.cuda.is_available() returns True #17558
Labels
needs-triage
type: bug
As the title says, tvm.cuda().exist returns an unexpected value.
Expected behavior
tvm.cuda().exist returns True
Actual behavior
tvm.cuda().exist returns False
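For reference, a minimal check along these lines shows the mismatch (a sketch, assuming both packages import from the environment listed below):

```python
# Minimal sketch of the mismatch (assumes CUDA drivers are installed
# and the environment described below): torch sees the GPU, tvm does not.
import torch
import tvm

print(torch.cuda.is_available())  # True on this machine
print(tvm.cuda().exist)           # False, the unexpected result
```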
Environment
a) GPU
b) TVM: apache-tvm 0.11.1
c) MLC-AI: mlc-ai-nightly-cu122-0.1
Steps to reproduce
pip install apache-tvm
python3 -m pip install --pre mlc-ai-nightly-cu122 -- don't fix
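A diagnostic sketch for narrowing this down (the assumption here is that the CPU-only apache-tvm wheel may shadow the CUDA-enabled mlc-ai nightly when both are installed; checking which package is imported and its build flags should confirm or rule that out):

```python
# Diagnostic sketch: check which TVM installation is actually imported and
# whether that build was compiled with CUDA support. If a CPU-only wheel
# shadows the cu122 nightly, USE_CUDA reports "OFF" and tvm.cuda().exist
# is False even though the machine has a working GPU.
import tvm
import tvm.support

print(tvm.__file__)                            # which installation is on sys.path
print(tvm.support.libinfo().get("USE_CUDA"))   # build-time CUDA flag of that wheel
```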