Implement `chat_with_tools()` on `CompatibleProvider` so OpenAI-compatible endpoints (OpenRouter, local LLMs, etc.) can use structured tool calling instead of prompt-injected tool descriptions.

Changes:

- `CompatibleProvider`: `capabilities()` reports `native_tool_calling`; new `chat_with_tools()` sends tools in the API request and parses `tool_calls` from the response; `chat()` bridges to `chat_with_tools()` when `ToolSpec`s are provided
- `RouterProvider`: `chat_with_tools()` delegation with model-hint resolution
- `loop_.rs`: expose `tools_to_openai_format` as `pub(crate)`; add `tools_to_openai_format_from_specs` for `ToolSpec`-based conversion

Adds 9 new tests and updates 1 existing test.
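To illustrate the `ToolSpec`-based conversion, here is a minimal, dependency-free sketch of what `tools_to_openai_format_from_specs` might do. The `ToolSpec` shape and field names here are assumptions for illustration only (the real type lives in this crate, and the real code would presumably build JSON with serde rather than by string formatting); the target wire format is the OpenAI chat-completions `tools` array of `{"type":"function","function":{...}}` entries.

```rust
// Hypothetical stand-in for the crate's ToolSpec; fields are assumptions.
struct ToolSpec {
    name: String,
    description: String,
    /// JSON Schema for the tool's arguments, already serialized.
    parameters: String,
}

/// Sketch: render specs as the OpenAI `tools` request field.
/// Builds JSON by hand to stay self-contained; real code would use serde_json.
fn tools_to_openai_format_from_specs(specs: &[ToolSpec]) -> String {
    let entries: Vec<String> = specs
        .iter()
        .map(|s| {
            // {:?} on String emits a quoted, escaped JSON-compatible literal
            // for simple ASCII strings like these.
            format!(
                "{{\"type\":\"function\",\"function\":{{\"name\":{:?},\"description\":{:?},\"parameters\":{}}}}}",
                s.name, s.description, s.parameters
            )
        })
        .collect();
    format!("[{}]", entries.join(","))
}

fn main() {
    let specs = [ToolSpec {
        name: "get_weather".into(),
        description: "Look up current weather".into(),
        parameters: r#"{"type":"object","properties":{"city":{"type":"string"}},"required":["city"]}"#.into(),
    }];
    println!("{}", tools_to_openai_format_from_specs(&specs));
}
```

A provider's `chat_with_tools()` would place this array under the request's `tools` key, then read any `tool_calls` entries (each carrying an `id` and a `function.name` / `function.arguments` pair) off the assistant message in the response.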