
Ollama with API keys and Emacs LLM


Ollama is a nice tool for managing language models and serving them for other programs to consume. I prefer to host it myself while integrating it with both my smartphone using gpt_mobile and my Emacs using Ellama.

To seal it off from malicious users on the Internet, I decided to make authentication mandatory. gpt_mobile supports this easily with API keys, but Ellama+llm did not out of the box.

Fortunately, llm already supports adding arbitrary HTTP headers through plz, so I just extended the existing Ollama struct to add the Bearer token.

diff --git a/llm-ollama.el b/llm-ollama.el
index c428e25..233c0b3 100644
--- a/llm-ollama.el
+++ b/llm-ollama.el
@@ -57,10 +57,13 @@ default to localhost.

 PORT is the localhost port that Ollama is running on.  It is optional.

+KEY is an API key used to authenticate with a restricted
+Ollama service.  It is optional.
+
 CHAT-MODEL is the model to use for chat queries.  It is required.

 EMBEDDING-MODEL is the model to use for embeddings.  It is required."
-  (scheme "http") (host "localhost") (port 11434) chat-model embedding-model)
+  (scheme "http") (host "localhost") (port 11434) key chat-model embedding-model)

 ;; Ollama's models may or may not be free, we have no way of knowing.  There's no
 ;; way to tell, and no ToS to point out here.
@@ -73,6 +76,9 @@ EMBEDDING-MODEL is the model to use for embeddings.  It is required."
   (format "%s://%s:%d/api/%s" (llm-ollama-scheme provider) (llm-ollama-host provider)
           (llm-ollama-port provider) method))

+(cl-defmethod llm-provider-headers ((provider llm-ollama))
+  `(("Authorization" . ,(format "Bearer %s" (llm-ollama-key provider)))))
+
 (cl-defmethod llm-provider-embedding-url ((provider llm-ollama) &optional _)
   (llm-ollama--url provider "embed"))

--

Now a Bearer token can be added to Ollama requests by setting the :key field in your configuration.

(make-llm-ollama
 :scheme "https"
 :key "some_secret_key"
 :host "ollama.example.com"
 :port 443
 :chat-model "zephyr"
 :embedding-model "zephyr")
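With the patch applied, the provider can be wired straight into Ellama; a minimal sketch, reusing the placeholder hostname, key, and model names from above:

```elisp
;; Point Ellama at the authenticated Ollama instance.
;; Host, key, and model names here are illustrative placeholders.
(setopt ellama-provider
        (make-llm-ollama
         :scheme "https"
         :key "some_secret_key"
         :host "ollama.example.com"
         :port 443
         :chat-model "zephyr"
         :embedding-model "zephyr"))
```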

A smarter way to add authentication to Ollama requests probably exists, since this one requires copying credentials into your configuration, but I scratched an itch 😀 and it made me appreciate how easy Emacs makes it to patch small problems.
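One way to avoid the hard-coded credential is Emacs's built-in auth-source library, which can read the key from ~/.authinfo.gpg; a sketch, assuming a hypothetical entry like `machine ollama.example.com password some_secret_key`:

```elisp
;; Fetch the API key from ~/.authinfo(.gpg) instead of the init file.
;; The host name is a placeholder matching the example entry above.
(make-llm-ollama
 :scheme "https"
 :key (auth-source-pick-first-password :host "ollama.example.com")
 :host "ollama.example.com"
 :port 443
 :chat-model "zephyr"
 :embedding-model "zephyr")
```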

#development #emacs #llm #ollama
