Issues: vllm-project/production-stack
#50: Helm Chart Lacks Clear Support for Multi-Node vLLM Deployment (opened Jan 31, 2025 by shohamyamin)
#47: feat: Offline batched inference based on OpenAI offline batching API (opened Jan 31, 2025 by gaocegege)
#26: [Roadmap] vLLM production stack roadmap for 2025 Q1 (opened Jan 27, 2025 by ApostaC; 3 of 13 tasks complete)
#16: Does the routing logic depend only on the QPS of the endpoint? (opened Jan 24, 2025 by pandyamarut)