feat(runtime): add reasoning toggle for ollama

This commit is contained in:
Chummy 2026-02-19 16:51:25 +08:00
parent 8f13fee4a6
commit a5d7911923
10 changed files with 289 additions and 31 deletions

@@ -67,6 +67,21 @@ credential is not reused for fallback providers.
- Cross-region inference profiles supported (e.g., `us.anthropic.claude-*`).
- Model IDs use Bedrock format: `anthropic.claude-sonnet-4-6`, `anthropic.claude-opus-4-6-v1`, etc.
### Ollama Reasoning Toggle
You can control Ollama's reasoning/thinking behavior in `config.toml`:
```toml
[runtime]
reasoning_enabled = false
```
Behavior:
- `false`: sends `think: false` to Ollama `/api/chat` requests.
- `true`: sends `think: true`.
- Unset: omits `think` and keeps Ollama/model defaults.
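The three-way mapping above can be sketched as follows. This is a minimal illustration, not the runtime's actual code; `build_chat_payload` and its parameters are hypothetical names chosen for the example:

```python
def build_chat_payload(messages, model, reasoning_enabled=None):
    """Build an Ollama /api/chat request body (hypothetical helper).

    reasoning_enabled mirrors the [runtime] config key:
    True/False set "think" explicitly; None (key unset) omits
    "think" so the Ollama/model default applies.
    """
    payload = {"model": model, "messages": messages, "stream": False}
    if reasoning_enabled is not None:
        payload["think"] = reasoning_enabled
    return payload
```

Using an explicit `None` sentinel (rather than defaulting to `False`) is what lets "unset" remain distinct from "disabled", preserving each model's own default behavior.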
### Kimi Code Notes
- Provider ID: `kimi-code`