I have tried adding custom OpenAI-compatible endpoints from both llama.cpp and vLLM; both throw errors in Mux, even though they work correctly with other tools. It would be great to improve support for them to better handle non-cloud environments.
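For context, both llama.cpp's server and vLLM expose the same OpenAI-compatible /v1/chat/completions route, which is what other tools use successfully. A minimal sketch of such a request (the port, base URL, and model name here are assumptions for a local llama.cpp server, not taken from Mux):

```python
import json
import urllib.request

# Assumption: llama.cpp's llama-server running locally on its default port 8080;
# vLLM defaults to port 8000 but serves the same route.
BASE_URL = "http://localhost:8080/v1"

payload = {
    # Placeholder model name; many local servers ignore or echo this field.
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Build the request without sending it, to show the shape a client produces.
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Local servers typically accept any key, but OpenAI-style clients send one.
        "Authorization": "Bearer not-needed",
    },
    method="POST",
)

print(req.full_url)
```

A tool that can speak this protocol against api.openai.com should, in principle, only need a configurable base URL to work with these local servers.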
It would also be nice to be able to add more than one OpenAI-compatible endpoint.