nixcfg/systems/x86_64-linux/mx/nextcloud-opencode-bot
Harald Hoyer 42c52bd87f refactor(mx): drive opencode bot via direct chat-completions API
The bot no longer shells out to `opencode run`. Instead it POSTs directly
to the OpenAI-compatible /chat/completions endpoint exposed by llama-server
on halo.hoyer.tail:8000. This removes the Bun/sqlite cold-start
overhead per request, drops the pkgs.opencode runtime dependency, and
eliminates the ExecStartPre dance that materialized config.json into the
service's $HOME.
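
A minimal sketch of what such a direct call can look like, assuming the
standard chat-completions request body; the function name and the use of
`urllib.request` are illustrative and not taken from bot.py:

```python
import json
import urllib.request

def chat_completion(base_url, model, messages, timeout=120):
    """POST one chat-completions request to llama-server and return the reply text."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        data = json.load(resp)
    # Standard OpenAI-compatible response shape: first choice's message content.
    return data["choices"][0]["message"]["content"]
```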

Conversation history is now stored as a proper OpenAI `messages` list
with system/user/assistant roles, instead of the XML blob that was
inlined into a single `opencode run` argument. The interactive opencode
setup (config/opencode/config.json) is unchanged — only the bot stops
depending on it.
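
A sketch of how one turn could be recorded under that scheme; the system
prompt text and the `handle_message` helper are assumptions, only the
role/content structure comes from the change described above:

```python
# Hypothetical per-conversation state; only the role/content layout mirrors
# the change described above, the prompt text itself is made up.
history = [{"role": "system", "content": "You are the Nextcloud Talk bot."}]

def handle_message(user_text):
    """Append the user turn, query the model, then record and return the assistant turn."""
    history.append({"role": "user", "content": user_text})
    # chat_completion is the sketch above; URL and model name as in this commit.
    reply = chat_completion("http://halo.hoyer.tail:8000", "halo-8000", history)
    history.append({"role": "assistant", "content": reply})
    return reply
```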

The module gains a `modelBaseUrl` option; `model` is now the bare model
name (`halo-8000`) without the `provider/` prefix that the opencode CLI
required.
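
How the option values reach the bot is not visible in this listing; a
hedged sketch, assuming the module hands them to bot.py as environment
variables (the variable names are invented for illustration):

```python
import os

# Assumed wiring only: module.nix could export the option values through the
# systemd unit's environment; MODEL_BASE_URL and MODEL are invented names.
MODEL_BASE_URL = os.environ.get("MODEL_BASE_URL", "http://halo.hoyer.tail:8000")
MODEL = os.environ.get("MODEL", "halo-8000")  # bare name, no provider/ prefix

reply = chat_completion(MODEL_BASE_URL, MODEL, [{"role": "user", "content": "ping"}])
```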
2026-05-13 16:38:58 +02:00
bot.py refactor(mx): drive opencode bot via direct chat-completions API 2026-05-13 16:38:58 +02:00
default.nix refactor(mx): drive opencode bot via direct chat-completions API 2026-05-13 16:38:58 +02:00
module.nix refactor(mx): drive opencode bot via direct chat-completions API 2026-05-13 16:38:58 +02:00