docs: add custom provider endpoint configuration guide
Add comprehensive documentation for custom API endpoint configuration to address the missing documentation reported in issue #567.

Changes:
- Create docs/custom-providers.md with a detailed guide for the custom: and anthropic-custom: formats
- Add custom endpoint examples to the README.md configuration section
- Add a note about the daemon requirement for channels in Quick Start
- Add a reference link to the custom providers guide

Addresses: #567

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
parent dd454178ed
commit f13553014b

2 changed files with 112 additions and 0 deletions
README.md (13 additions)
@@ -186,6 +186,9 @@ zeroclaw channel bind-telegram 123456789
 # Get integration setup details
 zeroclaw integrations info Telegram
 
+# Note: Channels (Telegram, Discord, Slack) require daemon to be running
+# zeroclaw daemon
+
 # Manage background service
 zeroclaw service install
 zeroclaw service status
@@ -431,6 +434,12 @@ default_provider = "openrouter"
 default_model = "anthropic/claude-sonnet-4-20250514"
 default_temperature = 0.7
 
+# Custom OpenAI-compatible endpoint
+# default_provider = "custom:https://your-api.com"
+
+# Custom Anthropic-compatible endpoint
+# default_provider = "anthropic-custom:https://your-api.com"
+
 [memory]
 backend = "sqlite" # "sqlite", "lucid", "markdown", "none"
 auto_save = true
@@ -531,6 +540,10 @@ api_url = "https://ollama.com"
 api_key = "ollama_api_key_here"
 ```
 
+### Custom Provider Endpoints
+
+For detailed configuration of custom OpenAI-compatible and Anthropic-compatible endpoints, see [docs/custom-providers.md](docs/custom-providers.md).
+
 ## Python Companion Package (`zeroclaw-tools`)
 
 For LLM providers with inconsistent native tool calling (e.g., GLM-5/Zhipu), ZeroClaw ships a Python companion package with **LangGraph-based tool calling** for guaranteed consistency:
docs/custom-providers.md (new file, 99 additions)
@@ -0,0 +1,99 @@
# Custom Provider Configuration

ZeroClaw supports custom API endpoints for both OpenAI-compatible and Anthropic-compatible providers.

## Provider Types

### OpenAI-Compatible Endpoints (`custom:`)

For services that implement the OpenAI API format:

```toml
default_provider = "custom:https://your-api.com"
api_key = "your-api-key"
default_model = "your-model-name"
```
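In practice, "OpenAI-compatible" means the endpoint serves the standard chat completions route. The sketch below shows the kind of request such an endpoint is expected to handle; the `/v1/chat/completions` path and Bearer authentication are assumptions based on the OpenAI API convention, so check your provider's documentation for deviations:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, message: str) -> urllib.request.Request:
    """Build the kind of request a `custom:` endpoint is expected to serve.

    Assumes the conventional OpenAI /v1/chat/completions route with
    Bearer authentication -- verify against your provider's docs.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Request is built but not sent; the host below is the guide's placeholder.
req = build_chat_request("https://your-api.com", "your-api-key", "your-model-name", "hello")
print(req.full_url)
```

If this shape of request fails against your endpoint, the `custom:` provider type is unlikely to work with it.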
### Anthropic-Compatible Endpoints (`anthropic-custom:`)

For services that implement the Anthropic API format:

```toml
default_provider = "anthropic-custom:https://your-api.com"
api_key = "your-api-key"
default_model = "your-model-name"
```
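Likewise, "Anthropic-compatible" means the endpoint serves the Anthropic Messages API shape. A minimal sketch, assuming the standard `/v1/messages` route with `x-api-key` authentication and an `anthropic-version` header (check your provider's documentation for deviations):

```python
import json
import urllib.request

def build_messages_request(base_url: str, api_key: str, model: str, message: str) -> urllib.request.Request:
    """Build the kind of request an `anthropic-custom:` endpoint is
    expected to serve.

    Assumes the conventional Anthropic /v1/messages route with x-api-key
    authentication -- verify against your provider's docs.
    """
    payload = {
        "model": model,
        "max_tokens": 1024,  # required by the Anthropic Messages API
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

# Request is built but not sent; the host below is the guide's placeholder.
req = build_messages_request("https://your-api.com", "your-api-key", "your-model-name", "hello")
print(req.full_url)
```

The main visible differences from the OpenAI shape are the route, the auth header, and the required `max_tokens` field.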
## Configuration Methods

### Config File

Edit `~/.zeroclaw/config.toml`:

```toml
api_key = "your-api-key"
default_provider = "anthropic-custom:https://api.example.com"
default_model = "claude-sonnet-4"
```

### Environment Variables

```bash
export ANTHROPIC_API_KEY="your-api-key"
zeroclaw agent
```
## Testing Configuration

Verify your custom endpoint:

```bash
# Interactive mode
zeroclaw agent

# Single message test
zeroclaw agent -m "test message"
```
## Troubleshooting

### Authentication Errors

- Verify the API key is correct
- Check the endpoint URL format (it must include a scheme such as `https://`)
- Ensure the endpoint is accessible from your network

### Model Not Found

- Confirm the model name matches the provider's available models
- Check the provider's documentation for exact model identifiers

### Connection Issues

- Test endpoint accessibility: `curl -I https://your-api.com`
- Verify firewall/proxy settings
- Check the provider's status page
## Examples

### Local LLM Server

```toml
default_provider = "custom:http://localhost:8080"
default_model = "local-model"
```

### Corporate Proxy

```toml
default_provider = "anthropic-custom:https://llm-proxy.corp.example.com"
api_key = "internal-token"
```

### Cloud Provider Gateway

```toml
default_provider = "custom:https://gateway.cloud-provider.com/v1"
api_key = "gateway-api-key"
default_model = "gpt-4"
```