# CrateDocs MCP Usage Guide
This guide explains how to use the CrateDocs MCP server with different types of clients.
## Client Integration

### Using with MCP-compatible LLMs
Any LLM client that follows the Model Context Protocol (MCP) can connect to this documentation server. The LLM will gain the ability to:
- Look up documentation for any Rust crate
- Search the crates.io registry for libraries
- Get documentation for specific items within crates
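For quick reference, the argument shapes for these three tools can be sketched as plain JSON. The `crate_name`, `query`, and `limit` fields below match the examples later in this guide; the `item_path` field for `lookup_item` is an assumed parameter name and may differ in your version of the server.

```rust
use serde_json::json;

fn main() {
    // Argument shapes for the three documentation tools (sketch only).
    let lookup_crate_args = json!({ "crate_name": "tokio" });
    let search_crates_args = json!({ "query": "async runtime", "limit": 5 });
    // "item_path" is an assumed parameter name for lookup_item.
    let lookup_item_args = json!({
        "crate_name": "tokio",
        "item_path": "tokio::sync::mpsc::channel"
    });

    println!("{lookup_crate_args}\n{search_crates_args}\n{lookup_item_args}");
}
```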
### Command-Line Client
For testing purposes, you can use a simple command-line client like this one:
```rust
use mcp_client::{Client, transport::StdioTransport};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create a client using stdio transport
    let transport = StdioTransport::new();
    let mut client = Client::new(transport);

    // Connect to the server
    client.connect().await?;

    // Example: Looking up the 'tokio' crate
    let response = client.call_tool(
        "lookup_crate",
        serde_json::json!({
            "crate_name": "tokio"
        })
    ).await?;

    println!("Documentation response: {}", response[0].text());

    Ok(())
}
```
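The same pattern works for the other tools. The sketch below continues the example above and assumes the same `Client::call_tool` API; the `item_path` argument name for `lookup_item` is an assumption and may differ in the actual tool schema.

```rust
    // Example: Looking up a specific item (sketch only; "item_path" is an
    // assumed parameter name for the lookup_item tool).
    let response = client.call_tool(
        "lookup_item",
        serde_json::json!({
            "crate_name": "serde",
            "item_path": "serde::Deserialize"
        })
    ).await?;

    println!("Item documentation: {}", response[0].text());
```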
### Web Client
When using the Axum SSE mode, you can connect to the server using a simple web client:
```javascript
// Connect to the SSE endpoint
const eventSource = new EventSource('http://127.0.0.1:8080/sse');

// Get the session ID from the initial connection
let sessionId;
eventSource.addEventListener('endpoint', (event) => {
  sessionId = event.data.split('=')[1];
  console.log(`Connected with session ID: ${sessionId}`);
});

// Handle messages from the server
eventSource.addEventListener('message', (event) => {
  const data = JSON.parse(event.data);
  console.log('Received response:', data);
});

// Function to send a tool request
async function callTool(toolName, args) {
  const response = await fetch(`http://127.0.0.1:8080/sse?sessionId=${sessionId}`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      jsonrpc: '2.0',
      method: 'call_tool',
      params: {
        name: toolName,
        arguments: args
      },
      id: 1,
    }),
  });
  return response.ok;
}

// Example: Search for async crates
callTool('search_crates', { query: 'async runtime', limit: 5 });
```
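Note that with this pattern the tool results are delivered over the SSE stream and handled by the `message` listener above; the POST in `callTool` only acknowledges the request, which is why it returns `response.ok` rather than the tool output.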
## Example Workflows

### Helping an LLM Understand a New Crate
- LLM client connects to the documentation server
- User asks a question involving an unfamiliar crate
- LLM uses `lookup_crate` to get general documentation
- LLM uses `lookup_item` to get specific details on functions/types
- LLM can now provide an accurate response about the crate (the tool-call sequence is sketched below)
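As a rough sketch of that flow, the two lookup calls might look like this from a Rust client, reusing the hypothetical `mcp_client` API from the command-line example (the `item_path` parameter name is an assumption):

```rust
    // Sketch only: assumes `client` is already connected as in the earlier example.
    let overview = client.call_tool(
        "lookup_crate",
        serde_json::json!({ "crate_name": "axum" })
    ).await?;

    let detail = client.call_tool(
        "lookup_item",
        serde_json::json!({
            "crate_name": "axum",
            "item_path": "axum::Router" // assumed parameter name
        })
    ).await?;

    // The LLM combines both responses to answer the user's question.
    println!("{}\n{}", overview[0].text(), detail[0].text());
```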
### Helping Find the Right Library
- User asks "What's a good crate for async HTTP requests?"
- LLM uses `search_crates` with relevant keywords
- LLM reviews the top results and their descriptions
- LLM uses `lookup_crate` to get more details on promising options
- LLM provides a recommendation with supporting information (sketched below)
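A minimal sketch of that search-then-drill-down sequence, again assuming the hypothetical `mcp_client` API from the command-line example:

```rust
    // Sketch only: assumes `client` is already connected.
    let candidates = client.call_tool(
        "search_crates",
        serde_json::json!({ "query": "async http client", "limit": 5 })
    ).await?;
    println!("Search results: {}", candidates[0].text());

    // After reviewing the results, fetch full documentation for a promising crate.
    let details = client.call_tool(
        "lookup_crate",
        serde_json::json!({ "crate_name": "reqwest" })
    ).await?;
    println!("Details: {}", details[0].text());
```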