This package supports a threaded AI chat inside any org-mode buffer.
The design is inspired by the ChatGPT web interface. There are many AI chat packages for Emacs, but I am not aware of any that naturally support multiple conversational pathways. (Update: gptel now has a similar feature, which I haven’t yet tried; it depends upon a pre-release version of Org.) I’m also using this package as an experiment in ways to tighten the editing loop with AI in Emacs, via context, diffs, etc.
The package comes with one main function, ai-org-chat-respond, that operates in any org-mode buffer. It extracts a conversation history from the parent entries, treating an entry as belonging to the AI if its heading is “AI” and otherwise to the user. This conversation history is passed along to the underlying LLM library (gptel or llm) to generate a response.
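For instance, a conversation buffer might be structured as follows (a sketch; here the user headings assume ai-org-chat-user-name is set to "Paul", as in the configuration further below):

```org
* Paul
What does mapcar do in Emacs Lisp?
** AI
It applies a function to each element of a list and returns the results...
*** Paul
How is mapcan different?
```

Because history is read from the parent entries, adding a second child under any entry starts an independent branch of the conversation.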
Narrowing provides a simple way to truncate the conversation history if it becomes too long – only the restricted part of the buffer is considered.
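For example, you can narrow to the current subtree before requesting a response, so that only that thread is sent (using only built-in narrowing commands):

```elisp
;; Restrict the buffer to the subtree at point (C-x n s interactively);
;; subsequent calls to `ai-org-chat-respond' then see only this subtree.
(org-narrow-to-subtree)
;; ... chat as usual ...
;; Undo the restriction later with `widen' (C-x n w).
```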
The PROPERTIES drawers are used to provide
- context, which augments the system message with the contents of certain buffers or files, and
- tools, which allow the AI to operate directly on the environment.
There is support for quickly launching Ediff sessions that compare modifications suggested by the AI to their sources, making it easier to review and apply changes (see Compare Feature).
Download this repository, install using M-x package-install-file (or package-vc-install, straight, elpaca, …), and add (use-package ai-org-chat) to your init file. For a more comprehensive setup, use something like the following:
(use-package ai-org-chat
  :bind
  (:map global-map
        ("C-c /" . ai-org-chat-new))
  (:map ai-org-chat-minor-mode-map
        ("C-c <return>" . ai-org-chat-respond)
        ("C-c n" . ai-org-chat-branch)
        ("C-c e" . ai-org-chat-compare))
  :custom
  (ai-org-chat-user-name "Paul")
  (ai-org-chat-dir "~/gpt")
  :config
  ;; See below
  ;; (ai-org-chat-select-model "sonnet 3.5")
  )
To use the package, you’ll need to set up either the gptel or llm library. By default, ai-org-chat works with gptel, so if you have that configured already, you’re all set. However, features related to “tools” or “function calls” are currently only supported by the llm library, making it the recommended option for full functionality.
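If you go the gptel route, a minimal configuration might look like the following sketch. It assumes an OpenAI key stored in the OPENAI_API_KEY environment variable; see gptel’s own documentation for other providers and key-storage options such as auth-source:

```elisp
(use-package gptel
  :custom
  ;; gptel can also read the key from auth-source; a plain
  ;; environment variable is just the simplest option.
  (gptel-api-key (getenv "OPENAI_API_KEY")))
```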
If you choose to use llm, you’ll need to configure the ai-org-chat-provider variable with a valid provider as defined by the llm library. Here’s how to do that:
- Customize the ai-org-chat-models user option to include the models you want to use and the environment variables containing your API keys.
- Use the ai-org-chat-select-model command to choose your preferred model. You can do this by uncommenting and adjusting the line at the bottom of the above use-package declaration.
As a final tip, the following makes shell environment variables (such as API keys) available to Emacs on macOS:
(use-package exec-path-from-shell
  :ensure
  :init
  (exec-path-from-shell-initialize))
When you want to ask the AI something, do M-x ai-org-chat-new (or C-c /, if you followed the above configuration). This visits a new file in the specified directory (“~/gpt” by default). If the region was active, then it will be quoted in the new buffer. With a prefix argument (C-u), it will immediately add visible buffers as context to the new chat.
The org-mode buffer has ai-org-chat-minor-mode activated, whose only purpose is to support user-defined keybindings like in the above use-package declaration. If you want to work in some other org file, you can either activate this minor mode manually or do M-x ai-org-chat-setup-buffer.
We provide the following commands:
- ai-org-chat-respond (C-c <return>): The main command, which tells the AI to generate a new response to the conversation node at point. It works in any org-mode buffer, not just ones created via ai-org-chat-new.
- ai-org-chat-branch (C-c n): A convenience command that creates a new conversation branch at point.
- ai-org-chat-compare (C-c e): Launches an Ediff session to compare the org-mode block at point with the contents of another visible buffer. This helps you review and apply AI-suggested changes to your codebase. See Compare Feature for more details.
- ai-org-chat-convert-markdown-blocks-to-org: LLMs often return code in markdown format (even when you instruct them otherwise). This command converts all markdown code blocks between (point) and (point-max) to org-mode code blocks.
ai-org-chat uses PROPERTIES drawers to manage context and tools for the AI conversation. These can be set at the top level of the file or in individual nodes.
Context is managed through the CONTEXT property. This property can contain a list of items that provide additional information to the AI. These items can be:
- Buffer names
- File names as absolute paths, paths relative to the current directory, or paths relative to any subdirectory of the current Emacs project, searched in this order
- Elisp function names (functions that return strings to be included in the context)
Example:
:PROPERTIES:
:CONTEXT: buffer-name.txt project-file.el my-context-function
:END:
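The function case deserves a sketch: any zero-argument elisp function returning a string can serve as a context item. For example, the my-context-function above could be a hypothetical helper like this (not part of the package):

```elisp
(defun my-context-function ()
  "Return the current git branch, for use as chat context.
Hypothetical example of a function-valued CONTEXT item."
  (string-trim
   (shell-command-to-string "git rev-parse --abbrev-ref HEAD")))
```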
Tools (or “function calls”) are specified using the TOOLS property. This property should contain a list of llm-tool-function objects that the AI can use. This feature works only if:
- The variable ai-org-chat-provider is set to a provider from the llm package
- That provider supports tools/function calls
- The tools have been properly defined as llm-tool-function objects
Example:
:PROPERTIES:
:TOOLS: my-tool-function another-tool-function
:END:
While you can directly edit PROPERTIES drawers using Org mode’s built-in commands (e.g., C-c C-x p for org-set-property), ai-org-chat provides some helper commands for managing context and tools:
- ai-org-chat-add-buffer-context: Add selected buffers as context.
- ai-org-chat-add-visible-buffers-context: Add all visible buffers as context.
- ai-org-chat-add-file-context: Add selected files as context.
- ai-org-chat-add-project-files-context: Add all files from a selected project as context.
- ai-org-chat-add-tools: Add selected llm-tool-function objects to the current node.
These commands are designed to simplify context/tool management, but are not required for using the package.
The “compare” feature streamlines the process of reviewing and applying code changes suggested by the AI, as follows.
- Narrow the buffer containing your original code to the function or section of interest.
- In the AI chat buffer, place your cursor on the AI-suggested code block.
- Execute the command ai-org-chat-compare (bound to C-c e by default).
- If you have multiple visible windows, you’ll be prompted to select the window containing the original code using ace-window.
- An Ediff session will launch in a new tab, comparing the AI-suggested code with your original code. The tab is cleaned up automatically when you’re done, keeping your workspace tidy.
There are many ways to do this. Here’s one typical workflow:
- Create a buffer containing the region that you want to modify, either immediately via C-x n n (narrow-to-region) or after first doing M-x clone-indirect-buffer (which I bind to C-x c).
- C-x t 2 (tab-bar-new-tab) to create a new tab containing just the buffer with the region of interest, then C-u C-c / (or C-u M-x ai-org-chat-new) to launch a chat session with that buffer as context.
- Ask the LLM to revise it, requesting the response in a source block (a good system message, or the function ai-org-chat-convert-markdown-blocks-to-org, may come in handy here).
- When you receive a response in a source block, use C-c e (ai-org-chat-compare) to inspect what was changed, and standard Ediff commands to apply parts of that change.
- Iterate until you’re happy with the changes.
The point here is that this is a very flexible workflow that leverages built-in features such as narrowing and Ediff.
As a shortcut, the first step (narrowing the buffer) is not necessary if the code block consists of a single function – in that case, narrowing should be taken care of automatically provided that the relevant buffer is either visible or appears in the context.
Here’s an example of using functions as context to create an AI chat interface to your agenda. This setup gives the AI access to the current time, your diary, weekly agenda, and yearly project timeline. The setup assumes you use the diary, org-mode and the agenda for task management, with long-term travel-related items stored in a file called projects.org. It should be easy to adapt this setup to other ways of managing your schedule.
- Create an org file, say planner.org
- Run M-x ai-org-chat-setup-buffer
- Add a top-level PROPERTIES drawer containing:
  :CONTEXT: my/current-date-and-time ~/.emacs.d/diary my/agenda-for-week my/projects-for-year
After these steps, the beginning of planner.org should look like this:

# -*- eval: (ai-org-chat-minor-mode 1); -*-
:PROPERTIES:
:CONTEXT: my/current-date-and-time ~/.emacs.d/diary my/agenda-for-week my/projects-for-year
:END:
The context consists of:
- A function that provides the current date and time
- Your diary file (~/.emacs.d/diary)
- A function that provides your agenda for the next seven days
- A function that provides your project timeline for the next year
With this setup, you can chat with an AI that has continuous access to your schedule and plans. For example, you can ask “Which afternoons do I have free this week?” or “When’s the best time to schedule a trip in March?”
The three agenda-related functions can be implemented as follows:
(defun my/current-date-and-time ()
"Return string describing current date and time."
(format-time-string "%A, %B %d, %Y at %I:%M %p"))
(defun my/agenda-for-week ()
"Return string containing full agenda for the next seven days."
(interactive)
(save-window-excursion
(require 'org-agenda)
(let ((org-agenda-span 'day)
(org-agenda-start-on-weekday nil) ; start from today regardless of weekday
(org-agenda-start-day (format-time-string "%Y-%m-%d"))
(org-agenda-ndays 7)
(org-agenda-prefix-format
'((agenda . " %-12:c%?-12t%6e %s"))))
(org-agenda nil "a")
(buffer-substring-no-properties (point-min) (point-max)))))
(defun my/filter-diary-contents ()
  "Return diary contents without holiday entries."
  (with-temp-buffer
    (insert-file-contents diary-file)
    (goto-char (point-min))
    ;; Non-marking diary entries (e.g. imported holidays) begin with
    ;; "&"; keep only the lines that don't start with that character.
    (keep-lines "^[^&]" (point-min) (point-max))
    (buffer-string)))
(defun my/with-filtered-diary (fn)
"Execute FN with a filtered version of the diary.
Temporarily creates and uses a diary file without holiday entries."
(let ((filtered-contents (my/filter-diary-contents)))
(with-temp-file "/tmp/temp-diary"
(insert filtered-contents))
(let ((diary-file "/tmp/temp-diary"))
(funcall fn))))
(defvar my-projects-file "~/projects.org" ; adjust to wherever your projects.org lives
  "Org file containing long-term project entries.")

(defun my/projects-for-year ()
  "Return string containing projects.org agenda for next year.
Skips empty days and diary holidays."
  (interactive)
(save-window-excursion
(require 'org-agenda)
(let ((org-agenda-files (list my-projects-file))
(org-agenda-span 365)
(org-agenda-start-on-weekday nil)
(org-agenda-start-day (format-time-string "%Y-%m-%d"))
(org-agenda-prefix-format
'((agenda . " %-12:c%?-12t%6e %s")))
(org-agenda-include-diary t)
(diary-show-holidays-flag nil)
(org-agenda-show-all-dates nil))
(my/with-filtered-diary
(lambda ()
(org-agenda nil "a")
(buffer-substring-no-properties (point-min) (point-max)))))))