Some QoL upgrade idea #998
EL-File4138 started this conversation in Ideas
Replies: 0 comments
As a long-term user of distros featuring the GNOME DE and an active advocate for native, local-first GTK apps, I'm very glad to have such a well-designed app as a daily-driven LLM chat portal. After some initial trial, I believe a few QoL upgrades could make the app even better with minimal effort.
I've used ChatGPT and a developer-oriented client extensively in the past, and over that time I have accumulated an extensive collection of chat logs. It's safe to say that those logs, as raw knowledge built collectively by me and the LLM, are a treasure trove for my future self, and I believe many others feel the same. However, if they are not within close reach, I cannot use them effectively. Being able to migrate that data into Alpaca would make switching feel more coherent, and it could prove useful if you plan to support long-term memory and retrieval over past chats like many platforms do.
If supporting an import format for every platform is not something you would like to do, opening an interface for developing import deserializer plugins could still be useful to many. I would love to support your effort by contributing an OpenAI ChatGPT importer.
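To make the idea concrete, here is a minimal sketch of what such an importer could look like. It parses the `conversations.json` file from a ChatGPT data export into flat per-conversation message lists; the `mapping` tree structure and field names below reflect exports I have seen and may differ between export versions, so treat this as an assumption rather than a spec.

```python
import json

def import_chatgpt_export(path):
    """Convert a ChatGPT conversations.json export into flat
    (role, text) message lists, one dict per conversation.

    Assumes each conversation has a 'title' and a 'mapping' of
    node id -> {message, parent, children} (field names may vary
    between export versions).
    """
    with open(path, encoding="utf-8") as f:
        conversations = json.load(f)

    result = []
    for conv in conversations:
        messages = []
        # Nodes form a tree; walking them in insertion order is a
        # simplification that ignores branched regenerations.
        for node in conv.get("mapping", {}).values():
            msg = node.get("message")
            if not msg:
                continue
            role = msg.get("author", {}).get("role")
            parts = msg.get("content", {}).get("parts") or []
            text = "\n".join(p for p in parts if isinstance(p, str))
            if role in ("user", "assistant") and text:
                messages.append((role, text))
        result.append({"title": conv.get("title"), "messages": messages})
    return result
```

A plugin interface would only need to standardize the output shape (title plus role/text pairs); each deserializer could then handle its platform's quirks internally.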
I'm not an active user of local LLM deployment, so I won't discuss the technical implementation of interfacing with Ollama. However, with a wide array of platform support comes the expectation of customizability. I would be grateful if all parameters of the OpenAI-compatible API call were exposed for custom configuration.
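For reference, these are the sampling parameters I have in mind; the names follow the OpenAI chat completions API, and the values shown are only illustrative defaults, not what Alpaca actually sends:

```python
# Per-request parameters a chat client could expose in its settings UI;
# names follow the OpenAI chat completions API reference.
request_body = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7,        # sampling randomness
    "top_p": 1.0,              # nucleus sampling cutoff
    "max_tokens": 1024,        # response length cap
    "frequency_penalty": 0.0,  # discourage verbatim repetition
    "presence_penalty": 0.0,   # encourage new topics
    "stop": None,              # optional stop sequences
    "seed": None,              # best-effort deterministic sampling
}
```

Even just exposing these as optional per-provider overrides would cover most customization needs.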
Although LLMs are not perfect for every task, and Alpaca is not suitable for some (I wouldn't expect agentic coding ability), LLMs excel at repetitive text tasks: cleaning up formatting, classifying, summarizing or expanding, and rewriting template correspondence. Many of these are a combination of the same prompt template and a varying input. It would be immensely useful if these usages could be streamlined like the Quick Ask functionality: storing a list of custom prompts (ideally with a simple format directive supporting multiple parameters) and adding a quick-call panel that runs a chosen prompt on pasted input would suffice.
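As a sketch of the stored-prompt idea: templates with named placeholders, filled in from the quick-call panel. The directive syntax (`$input`, `$tone`) and the prompt names are hypothetical, using Python's stdlib `string.Template` for illustration.

```python
from string import Template

# Hypothetical stored prompts; $input and $tone are the "format
# directive" parameters mentioned above.
PROMPTS = {
    "clean-format": Template(
        "Reformat the following text, fixing whitespace and "
        "punctuation only:\n$input"),
    "rewrite": Template(
        "Rewrite the following in a $tone tone, preserving "
        "meaning:\n$input"),
}

def render_prompt(name, **params):
    """Fill a stored template with the panel's parameter values.

    Raises KeyError if a required parameter is missing, so the UI
    can prompt for it instead of sending a broken request.
    """
    return PROMPTS[name].substitute(**params)
```

The quick-call panel would then only need to collect the parameter values and send `render_prompt(...)` as the user message.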
This one is trivial: some providers expose model names in a rather ugly way, and being able to rename them would be a nice touch.
That said, I'm already very satisfied with the integrated experience of using Alpaca, and I would like to sincerely thank you for your commitment to developing such a great app.