
[Bug] tvm.cuda().exist return false while torch.cuda.is_available() return true #17558

Open
vfdff opened this issue Dec 14, 2024 · 0 comments
Labels
needs-triage PRs or issues that need to be investigated by maintainers to find the right assignees to address it type: bug

Comments

vfdff commented Dec 14, 2024

As the title says, tvm.cuda().exist returns an unexpected value even though torch.cuda.is_available() returns True.

Expected behavior

tvm.cuda().exist returns True

Actual behavior

tvm.cuda().exist returns False
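A minimal sketch of the reported divergence, with imports guarded so it still runs in environments where either package is missing (the guard via importlib is my addition, not part of the original report):

```python
import importlib.util

def cuda_status():
    """Report CUDA availability as seen by torch and by tvm.

    Each import is guarded so the sketch degrades gracefully
    when a package is not installed.
    """
    status = {}
    if importlib.util.find_spec("torch") is not None:
        import torch
        status["torch"] = torch.cuda.is_available()
    if importlib.util.find_spec("tvm") is not None:
        import tvm
        # The reported bug: this comes back False on the machine above.
        status["tvm"] = tvm.cuda().exist
    return status

print(cuda_status())
```

On the reporter's machine this would print torch as True and tvm as False.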

Environment

a) gpu

(venv) root@d00469708debug6-7c75445547-frh8j:/usr1/project/zhongyunde/source/osdi22_artifact/artifacts/roller# nvidia-smi   
Wed Dec 11 19:48:21 2024       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.129.03             Driver Version: 535.129.03   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA L40S                    Off | 00000000:9A:00.0 Off |                    0 |
| N/A   27C    P8              34W / 350W |     13MiB / 46068MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
                                                                                         
+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|

b) tvm: apache-tvm 0.11.1
c) mlc-ai: mlc-ai-nightly-cu122-0.1

Steps to reproduce

pip install apache-tvm
python3 -m pip install --pre mlc-ai-nightly-cu122 (does not fix the issue)
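A diagnostic sketch that may help triage: it separates "the installed TVM wheel was built without CUDA support" from "no CUDA device was detected", using tvm.runtime.enabled("cuda"). The guarded import and function name are mine; whether this explains the report is an assumption, not a confirmed diagnosis:

```python
import importlib.util

def diagnose_tvm_cuda():
    """Distinguish a CUDA-less TVM build from a missing device.

    Returns None when tvm itself cannot be imported.
    """
    if importlib.util.find_spec("tvm") is None:
        return None
    import tvm
    return {
        # False here means the wheel was compiled without CUDA support,
        # so tvm.cuda().exist can never be True regardless of the driver.
        "runtime_built_with_cuda": tvm.runtime.enabled("cuda"),
        # False here, with runtime support enabled, means no device was found.
        "cuda_device_exists": tvm.cuda().exist,
    }

print(diagnose_tvm_cuda())
```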

Triage

Please refer to the list of label tags here to find the relevant tags and add them below in a bullet format (example below).

  • needs-triage
@vfdff vfdff added needs-triage PRs or issues that need to be investigated by maintainers to find the right assignees to address it type: bug labels Dec 14, 2024