Running with a local API server? ControlNet depth instead of using checkpoint model? #798
-
Can I use this with my local API servers and have the depth obtained with ControlNet while using a specific model for the actual texture generation? I use Stability Matrix to manage and unify my A1111, Forge, and ComfyUI models so I don't have duplicates of the different diffusion models, which makes it a bit harder to work with Blender and Krita at the same time when one server is already tied up serving Krita and the AI Render plugin for Blender.
-
We have an API for creating custom backends. The addon comes with a HuggingFace Diffusers backend for running models on the local machine, but you can create a custom backend to connect Dream Textures to an API. You can see an example of a custom backend here: https://github.com/carson-katri/dream-textures/blob/main/community_backends/test.py
Check the documentation comments on the `Backend` class for more details: https://github.com/carson-katri/dream-textures/blob/main/api/backend/backend.py#L11
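For what it's worth, here is a minimal sketch of the HTTP side such a custom backend could wrap: sending a txt2img request, optionally with a precomputed depth map for ControlNet and a specific checkpoint, to a local A1111/Forge server launched with `--api`. Everything here is illustrative rather than the addon's actual `Backend` implementation: the function name, the default server address, the ControlNet `alwayson_scripts` keys, and the checkpoint/model names are assumptions, so follow `test.py` and `backend.py` above for the real hooks to implement.

```python
# Illustrative only: the HTTP call a custom backend might make to a local
# A1111/Forge server started with the --api flag. ControlNet payload keys
# depend on the ControlNet extension and its version; treat them as assumptions.
import base64
import requests

A1111_URL = "http://127.0.0.1:7860"  # assumed default address of the local server


def txt2img(prompt: str, depth_map_png: bytes = None, checkpoint: str = None,
            steps: int = 20, width: int = 512, height: int = 512,
            seed: int = -1) -> bytes:
    """Request one image from the local server and return it as PNG bytes."""
    payload = {
        "prompt": prompt,
        "steps": steps,
        "width": width,
        "height": height,
        "seed": seed,
    }
    if checkpoint:
        # Ask the server to use a specific checkpoint for this request.
        payload["override_settings"] = {"sd_model_checkpoint": checkpoint}
    if depth_map_png is not None:
        # Feed an already-rendered depth map to the ControlNet extension instead
        # of letting it run a preprocessor. Module/model names are placeholders.
        payload["alwayson_scripts"] = {
            "controlnet": {
                "args": [{
                    "input_image": base64.b64encode(depth_map_png).decode(),
                    "module": "none",  # depth already computed (e.g. in Blender)
                    "model": "control_v11f1p_sd15_depth",  # placeholder name
                }]
            }
        }
    response = requests.post(f"{A1111_URL}/sdapi/v1/txt2img", json=payload, timeout=600)
    response.raise_for_status()
    # The server returns base64-encoded PNGs in the "images" list.
    return base64.b64decode(response.json()["images"][0])


if __name__ == "__main__":
    image = txt2img("a weathered stone wall, PBR texture")
    with open("result.png", "wb") as f:
        f.write(image)
```

A real Dream Textures backend would wrap a call like this inside whichever generation method the `Backend` class defines and report progress through its callbacks; the linked `community_backends/test.py` shows the actual interface.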
-
Groovy, thank you much!