Auto-model installer for LocalAI users #19
Anto79-ops started this conversation in Ideas
Replies: 2 comments 1 reply
-
hey @acon96, what are your thoughts on hosting your models in the auto-installer by Midori-AI for LocalAI?
-
thanks! A PR can be submitted pointing users to the LocalAI how-tos, which give them the option to set up a LocalAI backend (from scratch) with Home-LLM models (and other general LLM models) of their choice.
-
hey,
@lunamidori5 creates the how-to docs (and other helpful things) for newcomers to LocalAI.
Since your integration now fully works with LocalAI (thanks for that, btw!), would you object to having your model included as an option when newcomers want to install models and LocalAI on their computers?
So for example, right now if a new person would like to install a model in their LocalAI instance, they would simply run this program, which asks them questions about which model size they would like (7b, 13b, 43b, etc.). Having Home-LLM as an option would automate the entire process, including creating the model.yaml file and the template files, and even downloading the .gguf file automatically.
https://io.midori-ai.xyz/howtos/easy-model-installer/
https://github.com/lunamidori5/Midori-AI/tree/master/other_files/model_installer
Your Hugging Face model card will be linked on the sites above.
When I added your model to LocalAI, I had to create the files manually, so this would be a good option for those who are using (or want to use) LocalAI as the backend.
Please feel free to comment.
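For context, the model definition the installer would generate is a small YAML file. The sketch below is illustrative only, based on LocalAI's general model-config shape; the model name, .gguf filename, context size, and template names are hypothetical placeholders, not the actual Home-LLM values the installer would emit:

```yaml
# Hypothetical sketch of a LocalAI model.yaml for a Home-LLM model.
# All names and values below are placeholders for illustration.
name: home-llm                       # name exposed by the LocalAI API
parameters:
  model: home-llm-example.gguf       # the .gguf file the installer would download
context_size: 2048                   # illustrative; depends on the chosen model
template:
  chat: home-llm-chat                # refers to a generated chat template file
  completion: home-llm-completion    # refers to a generated completion template file
```

Generating this file, the referenced template files, and the .gguf download is the manual work the installer would automate.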