A lot has happened in the LLaMA / Alpaca / gpt4all world in the past two weeks. Many models have been released, each much better than the previous one.
With Vicunia 2.0, I want to provide a good out-of-the-box experience for one of them. Here's the plan:
- Vicunia comes with a pre-compiled .cpp binary for all platforms (with an option to bring your own)
- The "setup" will just be a button to download a model that works with the binaries
- Markdown rendering in the chat
- More fixes, of course
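The download-a-model setup step above could look something like the sketch below: check whether a complete copy of the model file already exists in the app's data directory, and fetch it only if needed. The URL, file name, and size are placeholders for illustration, not the real distribution details.

```python
import os
import urllib.request

# Placeholder values -- the actual model URL and size would come from
# whatever gpt4all-compatible model the setup button points at.
MODEL_URL = "https://example.com/gpt4all-model.bin"
MODEL_SIZE = 4_000_000_000  # expected size in bytes (placeholder)


def needs_download(path: str, expected_size: int) -> bool:
    """Return True unless a file of the expected size already exists."""
    return not (os.path.exists(path) and os.path.getsize(path) == expected_size)


def fetch_model(dest_dir: str) -> str:
    """Download the model into dest_dir if it is missing or incomplete."""
    os.makedirs(dest_dir, exist_ok=True)
    path = os.path.join(dest_dir, "model.bin")
    if needs_download(path, MODEL_SIZE):
        # One-shot download; a real setup flow would report progress
        # and verify a checksum before trusting the file.
        urllib.request.urlretrieve(MODEL_URL, path)
    return path
```

The size check makes the button idempotent: clicking it again after a successful download (or after an interrupted one that left a partial file) does the right thing.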
Right now, I think gpt4all might be the best to do this with. Let me know if a better one comes out in the meantime!