Issues: heshengtao/comfyui_LLM_party
Error message occurred while importing the 'comfyui_LLM_party' module.
#162 opened Mar 3, 2025 by PigNoEnvy
Using the user's API, "/chat/completions" is automatically appended to the request URL, but that is not the service's API endpoint.
#160 opened Mar 2, 2025 by XieJunchen
Help!! Loading /root/AI/models/Embedding_Tools/glm-4-9b-chat requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.
#140 opened Jan 3, 2025 by cenzijing
It seems LLM Party and Reactor Nodes can't co-exist in the same env.
#105 opened Oct 12, 2024 by boricuapab