Hi,
I've been using Emacs for decades and notmuch for years. I just want to
recommend that notmuch users check out ellama with a local ollama installation.
It has been working like a charm: from within notmuch I have my emails
grammar-checked by a local LLM. The only configuration change I made was to
override the default "flowed" content type, which was causing issues for
recipients.
I also recommend using unidecode, because ollama returns text containing
Unicode characters and I want to send pure-ASCII emails.
;; settings in my emacs config
;; prevent format=flowed
(setq message-default-headers "Content-Type: text/plain; charset=utf-8")
;; setup ellama
(use-package ellama
  :init
  (setopt ellama-language "English")
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama
           :chat-model "gpt-oss"
           :embedding-model "gpt-oss"
           :host "my-gpu-machine-running-ollama")))
(global-set-key (kbd "<f9>") 'ellama-chat)
(global-set-key (kbd "<S-f9>") 'ellama-proofread)
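In case anyone wants the unidecode step to happen automatically, here is a
minimal sketch of how it could be hooked into message sending. This is
untested and the function name `unidecode-region' is my assumption about what
the unidecode package exports, so check it against your installed version:

;; transliterate the message body to ASCII just before sending
;; (assumes unidecode.el provides `unidecode-region'; verify the name)
(require 'unidecode)
(defun my/unidecode-message-body ()
  "Convert the message body to plain ASCII before sending."
  (save-excursion
    (message-goto-body)                 ; skip the headers
    (unidecode-region (point) (point-max))))
(add-hook 'message-send-hook #'my/unidecode-message-body)

`message-goto-body' and `message-send-hook' are standard message.el, so this
should compose fine with notmuch's message mode.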
Here are the links to the projects:
ellama: https://github.com/s-kostyaev/ellama
unidecode: https://github.com/sindikat/unidecode
ollama: https://github.com/ollama/ollama
Have fun,
Sebastian
_______________________________________________
notmuch mailing list -- [email protected]
To unsubscribe send an email to [email protected]