nixcfg/systems/x86_64-linux
Harald Hoyer 42c52bd87f refactor(mx): drive opencode bot via direct chat-completions API
The bot no longer shells out to `opencode run`. Instead it POSTs directly
to the OpenAI-compatible /chat/completions endpoint exposed by
llama-server on halo.hoyer.tail:8000. This removes the Bun/sqlite cold-start
overhead per request, drops the pkgs.opencode runtime dependency, and
eliminates the ExecStartPre dance that materialized config.json into the
service's $HOME.
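
The request the bot now makes can be sketched roughly as follows. This is a minimal illustration, not the bot's actual code: the base URL and model name are taken from the commit message, but the endpoint path layout and the helper names (`build_payload`, `chat`) are assumptions.

```python
import json
import urllib.request

# Values taken from the commit message; whether llama-server is reached
# with or without a /v1 prefix is an assumption of this sketch.
BASE_URL = "http://halo.hoyer.tail:8000"
MODEL = "halo-8000"  # bare model name, no provider/ prefix

def build_payload(messages):
    """Build a chat-completions request body from an OpenAI-style messages list."""
    return {"model": MODEL, "messages": messages}

def chat(messages, base_url=BASE_URL):
    """POST the conversation to the OpenAI-compatible endpoint, return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_payload(messages)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint is OpenAI-compatible, the same payload shape works against any such server; only `BASE_URL` and `MODEL` need to change.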

Conversation history is now stored as a proper OpenAI `messages` list
with system/user/assistant roles, instead of the XML blob that was
inlined into a single `opencode run` argument. The interactive opencode
setup (config/opencode/config.json) is unchanged — only the bot stops
depending on it.
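
The history format described above amounts to something like the following sketch; the system prompt text and the helper names are illustrative, not taken from the module.

```python
# Conversation history as an OpenAI `messages` list with explicit roles,
# replacing the XML blob previously inlined into a single CLI argument.
SYSTEM_PROMPT = "You are a helpful chat bot."  # placeholder prompt

def new_history():
    """Start a conversation with a single system message."""
    return [{"role": "system", "content": SYSTEM_PROMPT}]

def add_turn(history, user_text, assistant_text):
    """Record one user/assistant exchange in place."""
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_text})
    return history
```

Each list entry is a plain dict, so the whole history can be passed as the `messages` field of a chat-completions request with no serialization step beyond JSON encoding.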

The module gains a `modelBaseUrl` option; `model` is now the bare model
name (`halo-8000`) without the provider/ prefix that the opencode CLI
required.
2026-05-13 16:38:58 +02:00
amd chore(x1,amd): disable cratedocs-mcp service 2026-05-13 11:35:59 +02:00
attic feat(headscale): add ACL policy, isolate mx, make mx an exit node 2026-05-13 09:06:40 +02:00
halo fix(halo): set --alias halo-8000 2026-05-13 14:52:49 +02:00
mx refactor(mx): drive opencode bot via direct chat-completions API 2026-05-13 16:38:58 +02:00
nixtee1 refactor(nix): extract common system configs into reusable modules 2026-01-30 10:42:09 +01:00
sgx refactor(opencode): extract serve service into shared NixOS module 2026-05-05 13:43:27 +02:00
t15 refactor(nix): extract common system configs into reusable modules 2026-01-30 10:42:09 +01:00
x1 chore(x1,amd): disable cratedocs-mcp service 2026-05-13 11:35:59 +02:00