Releases: mgonzs13/llama_ros

3.1.1

26 Jul 09:03
  • ggml renamed to llama_ggml
  • This avoids mixing the shared libraries built from whisper.cpp and llama.cpp

3.1.0

26 Jul 08:48
  • Support for ROS 2 Jazzy added
  • llama_cli setup.py fixed
  • llama.cpp updated

3.0.1

24 Jul 07:56
  • llama.cpp updated
  • llama-3 updated to 3.1
  • lora_base replaced by lora_adapter

3.0.0

22 Jul 11:28

Package versions set and new packages added for llama_ros

  • llama_cpp_vendor: vendor package to download and build llama.cpp
  • llama_demos: package that contains the original demos from llama_ros

2.6.0

21 Jul 17:38
  • stop field added to the GenerateResponse action
  • LangChain wrapper supports VLMs
  • llama.cpp updated

2.5.4

14 Jul 20:38

Streaming feature added to llama_client_node and the LangChain wrapper

2.5.3

09 Jul 19:21

llama.cpp updated (new detokenize function)

2.5.2

05 Jul 16:14

llama.cpp updated

2.5.1

04 Jul 18:17

llama.cpp updated and minor fixes for llama_cli

2.5.0

03 Jul 21:04

llama_cli added with two commands:

  • launch: command to launch LLMs
  • prompt: command to generate responses using a prompt
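A minimal usage sketch of the two commands above, assuming llama_cli registers under the standard ros2 command-line interface as `ros2 llama` (the YAML file name here is hypothetical):

```shell
# Launch an LLM described by a (hypothetical) parameters file
ros2 llama launch my_model_params.yaml

# In another terminal, generate a response from the running LLM
ros2 llama prompt "Who is Leonardo da Vinci?"
```

The exact flags accepted by each command may differ between releases; check `ros2 llama launch --help` and `ros2 llama prompt --help` for the version installed.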