I'm using chat-ui with a Mixtral vLLM endpoint (using the OpenAI endpoint type) and trying to debug some errors I'm getting from the server ("Conversation roles must alternate user/assistant/user/assistant/..."; probably something to do with the chat prompt template?). Is there any way to dump the requests chat-ui makes, so I can see the sequence of messages it's creating? I tried starting the server with the environment variable
In case it's helpful, here's my
Just throwing a `console.log` into the source worked fine as a workaround here. The underlying issue was that the summarization feature was including a system prompt, and Mixtral doesn't support system prompts.
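For anyone hitting the same error: since Mixtral's chat template rejects system messages and requires strictly alternating user/assistant roles, one workaround is to fold any leading system prompt into the first user message before the request goes out. A minimal sketch (the `foldSystemPrompt` helper and `ChatMessage` shape are hypothetical, not part of chat-ui's actual API):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Merge a leading system message into the first user turn so the
// resulting message list alternates user/assistant as Mixtral expects.
function foldSystemPrompt(messages: ChatMessage[]): ChatMessage[] {
  if (messages.length === 0 || messages[0].role !== "system") {
    return messages; // nothing to fold
  }
  const [system, ...rest] = messages;
  if (rest.length > 0 && rest[0].role === "user") {
    // Prepend the system text to the first user message.
    return [
      { role: "user", content: `${system.content}\n\n${rest[0].content}` },
      ...rest.slice(1),
    ];
  }
  // No user turn to merge into; re-emit the system text as a user turn.
  return [{ role: "user", content: system.content }, ...rest];
}
```

Dropping a `console.log(JSON.stringify(messages, null, 2))` next to wherever the request body is built is still the quickest way to confirm what's actually being sent.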