Releases: mgonzs13/llama_ros
3.1.1
- ggml renamed to llama_ggml
- This avoids mixing the libllama.so libraries built by whisper.cpp and llama.cpp
3.1.0
- ROS 2 Jazzy support added
- llama_cli setup.py fixed
- llama.cpp updated
3.0.1
- llama.cpp updated
- llama-3 model updated to 3.1
- lora_base replaced by lora_adapter
3.0.0
Package versions set and new packages added for llama_ros:
- llama_cpp_vendor: vendor package to download and build llama.cpp
- llama_demos: package that contains the original demos from llama_ros
2.6.0
- stop added to GenerateResponse action
- LangChain wrapper supports VLMs
- llama.cpp updated
2.5.4
Streaming feature added to llama_client_node and the LangChain wrapper
2.5.3
llama.cpp updated (new detokenize function)
2.5.1
llama.cpp updated and minor fixes for llama_cli
2.5.0
llama_cli added, with two commands:
- launch: command to launch LLMs
- prompt: command to generate responses using a prompt
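As a rough usage sketch of the two commands above (the exact subcommand names and flags are assumptions; the model file path is a placeholder, and the real invocation may differ, so check the llama_ros README or the CLI's `--help` output):

```shell
# Launch an LLM from a YAML config file (hypothetical path)
ros2 llama launch my_model.yaml

# In another terminal, generate a response from a prompt
ros2 llama prompt "What is ROS 2?"
```

Both commands talk to the running llama_ros node, so a ROS 2 environment must be sourced in each terminal first.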