Llama agent
- Tools server
- Chat server (required if the search_source tool is used)
- Embeddings server (required if the search_source tool is used)
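If you start these llama.cpp servers yourself instead of letting an env start them, a quick reachability check can save time. Below is a minimal sketch, assuming local servers; the ports are placeholders, not llama-vscode defaults, so adjust them to your setup:

```typescript
// Minimal sketch: verify that locally started llama.cpp servers are reachable.
// The ports below are placeholders -- use the ports your servers actually listen on.
const servers: Record<string, string> = {
  tools: "http://127.0.0.1:8080",      // assumed port for the tools server
  chat: "http://127.0.0.1:8081",       // assumed port for the chat server (search_source)
  embeddings: "http://127.0.0.1:8082", // assumed port for the embeddings server (search_source)
};

async function checkServers(): Promise<void> {
  for (const [name, baseUrl] of Object.entries(servers)) {
    try {
      // llama.cpp's llama-server exposes a /health endpoint that returns 200 when ready.
      const res = await fetch(`${baseUrl}/health`);
      console.log(`${name}: ${res.ok ? "ready" : `HTTP ${res.status}`}`);
    } catch {
      console.log(`${name}: not reachable at ${baseUrl}`);
    }
  }
}

checkServers();
```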
Llama agent uses AI and tools to answer questions, change and add files, and do everything else the available tools provide.
Llama agent is still in development, but it can already produce useful results with capable models that support tool calling.
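As a rough illustration of what "uses AI and tools" means, the sketch below sends a tool-enabled chat completion request to a llama.cpp server's OpenAI-compatible endpoint. The port and the list_files tool are made-up placeholders for illustration, not llama-vscode's actual internals:

```typescript
// Sketch of the kind of request an agent loop sends: a chat completion
// with a tool definition that the model may decide to call.
async function askWithTools(): Promise<void> {
  const response = await fetch("http://127.0.0.1:8080/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ role: "user", content: "List the files in the src folder." }],
      tools: [
        {
          type: "function",
          function: {
            name: "list_files", // hypothetical tool name, for illustration only
            description: "List files in a directory of the workspace",
            parameters: {
              type: "object",
              properties: { path: { type: "string" } },
              required: ["path"],
            },
          },
        },
      ],
    }),
  });

  const data = await response.json();
  // If the model chooses to use the tool, the reply contains tool_calls instead of plain text.
  console.log(JSON.stringify(data.choices?.[0]?.message, null, 2));
}

askWithTools();
```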
Llama agent doesn't ask for permission for each file change. Use VS Code's Source Control view or GitHub to review the changes and roll them back if needed.
Llama agent asks for permission before executing a terminal command. However, if the setting Tool_permit_some_terminal_commands is enabled, it stops asking for permission for commands that are considered safe.
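For illustration only, the sketch below shows the general idea of such a "safe command" allowlist. It is not llama-vscode's actual logic or its real list of safe commands:

```typescript
// Illustrative sketch only -- the commands below are assumed examples.
const SAFE_COMMANDS = new Set(["ls", "pwd", "git status", "git diff", "cat"]);

function needsPermission(command: string): boolean {
  const base = command.trim();
  // Treat a command as safe only if it exactly matches an allowlisted entry
  // or starts with one followed by a space (e.g. "cat package.json").
  for (const safe of SAFE_COMMANDS) {
    if (base === safe || base.startsWith(safe + " ")) {
      return false;
    }
  }
  return true;
}

console.log(needsPermission("git status"));          // false -> runs without asking
console.log(needsPermission("rm -rf node_modules")); // true  -> asks for permission
```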
The best way to prepare the environment for the agent is to select an Env (a group of models). The standard workflow is:
- Select "Show Llama Agent" from the llama-vscode menu or press Ctrl+Shift+A to show Llama Agent.
- Click the "Select Env" button (visible if no env is selected) and choose an env that supports the agent and fits your needs. This downloads the required models and starts llama.cpp servers with them. For external servers (like OpenRouter), llama-vscode will ask for an API key if needed.
- Write your request and send it with Enter or the "Send" button.
Optional
- You can add files to the context with the @ button.
- Activating the agent (Ctrl+Shift+A or from the llama-vscode menu) adds the currently open file to the agent context.
- You can select source code and activate the agent (Ctrl+Shift+A or from the llama-vscode menu) to attach the selected lines to the context.
- You can choose which tools to use via the "Select Tools" button (to the right of the "New Chat" button). If you have installed and started MCP servers in VS Code, their tools will be available for selection too. Don't forget to click the OK button after changing the tool selection.
Click button "Deselect Env" (vislble if there is a selected env with agent model) to deselect the env and selected models and stop the servers, which were started by llama-vscode. Click button "Selected Models" to show details about the selected models