Issues: vllm-project/llm-compressor

Issues list

vLLM does not support W8A8_Int8 for DeepSeek-V2 [bug]
#859 opened Oct 21, 2024 by jli943
Why does the speed not increase after compressing the model? [bug]
#852 opened Oct 18, 2024 by liho00
When will multi-node quantization be supported? [enhancement]
#831 opened Oct 9, 2024 by IEI-mjx
SmoothQuant doesn't respect ignored modules for VLMs [bug]
#687 opened Sep 26, 2024 by mgoin
KV cache quantization example causes a problem [bug]
#660 opened Sep 25, 2024 by weicheng59
[USAGE] FP8 W8A8 (+KV) with LoRA adapters [enhancement]
#164 opened Sep 11, 2024 by paulliwog
Error in the file 2:4_w4a16_group-128_recipe.yaml [bug]
#154 opened Sep 10, 2024 by carrot-o0o
[Bug]: IndexError: tuple out of range [bug]
#106 opened Aug 23, 2024 by SeanIsYoung
Layers not skipped with ignore=["re:.*"] [bug] (see the ignore-pattern sketch after this list)
#91 opened Aug 15, 2024 by horheynm
LLaVA model quantization does not seem to be supported [bug]
#73 opened Aug 10, 2024 by caojinpei
MODEL REQUESTS [enhancement]
#69 opened Aug 8, 2024 by robertgshaw2-neuralmagic
Q3 ROADMAP [roadmap]
#30 opened Jul 22, 2024 by robertgshaw2-neuralmagic (12 of 21 tasks complete)
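
Two of the bug reports above (#91 and #687) concern the `ignore` list used to exclude modules from quantization. For orientation, here is a minimal sketch in the style of the project's quickstart one-shot flow; the model id, output directory, and the vision-tower regex are illustrative assumptions, not taken from the issues themselves.

```python
# Minimal sketch of an llm-compressor one-shot quantization call using the
# `ignore` list; quickstart-style API assumed, names below are illustrative.
from llmcompressor.transformers import oneshot
from llmcompressor.modifiers.quantization import QuantizationModifier

# Quantize every Linear layer to dynamic FP8, but leave the output head and any
# module matching the "re:"-prefixed regex (the syntax referenced in #91) unquantized.
recipe = QuantizationModifier(
    targets="Linear",
    scheme="FP8_DYNAMIC",
    ignore=["lm_head", "re:.*vision_tower.*"],  # hypothetical regex, e.g. to skip VLM vision layers (#687)
)

# Apply the recipe in one shot (FP8_DYNAMIC needs no calibration data) and save the result.
oneshot(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",         # illustrative model id
    recipe=recipe,
    output_dir="TinyLlama-1.1B-Chat-v1.0-FP8-Dynamic",  # illustrative save path
)
```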