If you've ever wanted your AI agent to actually do something — read files, query databases, call APIs — you need MCP servers. The Model Context Protocol isn't just for clients consuming tools. It's a two-way street: servers expose capabilities, agents discover and call them.
I recently wrote about building MCP clients in Rust. But the flip side is just as important: building servers that expose your Rust functions as tools. This is where the magic happens — your code becomes callable by any MCP-compatible agent.
What MCP Servers Actually Do
An MCP server exposes functions as tools via JSON-RPC 2.0. To invoke one, the agent sends a tools/call request naming the tool:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "read_file",
    "arguments": { "path": "/etc/passwd" }
  }
}
And the server responds with the result. Simple. The power is in what functions you expose.
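Per the MCP spec, the result comes back wrapped in content blocks. A minimal success response looks roughly like this (the file contents here are a placeholder):

```
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [{ "type": "text", "text": "…file contents…" }]
  }
}
```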
The rmcp Crate
The rmcp crate makes this surprisingly ergonomic. Add to your Cargo.toml:
[dependencies]
rmcp = { version = "0.1", features = ["server"] }
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
schemars = "0.8"
anyhow = "1.0"
reqwest = { version = "0.11", features = ["json"] }
The key crates:
- rmcp — Core protocol (server, client, tools)
- tokio — Async runtime for I/O operations
- serde — JSON serialization/deserialization
- schemars — Generate JSON schemas from your structs (helps the AI understand parameters)
- anyhow — Flexible error handling
Example 1: File Explorer
Let's expose file system operations. The agent needs to know what parameters each function accepts — schemars generates that automatically.
use rmcp::{tool, tool_box};
use serde::Deserialize;
use schemars::JsonSchema;
use anyhow::Result;

#[derive(Clone)]
struct FileServer;

#[derive(Deserialize, JsonSchema)]
struct PathRequest {
    path: String,
}

#[tool_box]
impl FileServer {
    #[tool(description = "List directory contents")]
    fn list_dir(&self, #[tool(aggr)] req: PathRequest) -> Result<Vec<String>> {
        Ok(std::fs::read_dir(&req.path)?
            .filter_map(Result::ok)
            .map(|e| e.file_name().to_string_lossy().to_string())
            .collect())
    }

    #[tool(description = "Read a file")]
    fn read_file(&self, #[tool(aggr)] req: PathRequest) -> Result<String> {
        Ok(std::fs::read_to_string(&req.path)?)
    }
}
The macros do the heavy lifting:
- #[tool_box] — Marks the impl block, making its methods discoverable as MCP tools
- #[tool(description = "...")] — Human-readable description for the AI
- #[tool(aggr)] — Maps incoming JSON parameters to the struct fields
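With this server registered, an agent could invoke list_dir with a tools/call request along these lines (the path value is just an illustration):

```
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_dir",
    "arguments": { "path": "/tmp" }
  }
}
```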
Example 2: Currency Converter (Async HTTP)
Most real tools need to call external APIs. This one uses async HTTP:
use rmcp::{tool, tool_box};
use serde::Deserialize;
use schemars::JsonSchema;
use anyhow::Result;

#[derive(Clone)]
struct CurrencyServer;

#[derive(Debug, Deserialize, JsonSchema)]
struct ConvertRequest {
    from: String,
    to: String,
    amount: f64,
}

#[derive(Debug, Deserialize)]
struct ApiResponse {
    result: f64,
}

#[tool_box]
impl CurrencyServer {
    #[tool(description = "Convert currency using live exchange rates")]
    async fn convert(&self, #[tool(aggr)] req: ConvertRequest) -> Result<String> {
        let url = format!(
            "https://api.exchangerate.host/convert?from={}&to={}&amount={}",
            req.from, req.to, req.amount
        );
        let resp = reqwest::get(&url).await?.json::<ApiResponse>().await?;
        Ok(format!("{:.2} {}", resp.result, req.to))
    }
}
The async fn is necessary because we're doing network I/O. MCP servers run on tokio, so async is first-class here.
Example 3: Stateful Task Manager
Sometimes you need memory. This tool maintains a task list across requests:
use rmcp::{tool, tool_box};
use serde::{Deserialize, Serialize};
use schemars::JsonSchema;
use anyhow::Result;
use std::sync::{Arc, Mutex};

#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema)]
struct Task {
    title: String,
    done: bool,
}

#[derive(Clone)]
struct TaskManager {
    tasks: Arc<Mutex<Vec<Task>>>,
}

impl TaskManager {
    fn new() -> Self {
        TaskManager {
            tasks: Arc::new(Mutex::new(Vec::new())),
        }
    }
}

#[tool_box]
impl TaskManager {
    #[tool(description = "Add a new task")]
    fn add_task(&self, #[tool(param)] title: String) -> Result<String> {
        let mut tasks = self.tasks.lock().unwrap();
        tasks.push(Task { title, done: false });
        Ok("Task added.".to_string())
    }

    #[tool(description = "List all tasks")]
    fn list_tasks(&self) -> Result<Vec<Task>> {
        let tasks = self.tasks.lock().unwrap();
        Ok(tasks.clone())
    }

    #[tool(description = "Mark a task as done")]
    fn complete_task(&self, #[tool(param)] index: usize) -> Result<String> {
        let mut tasks = self.tasks.lock().unwrap();
        if index < tasks.len() {
            tasks[index].done = true;
            Ok("Task completed.".to_string())
        } else {
            Err(anyhow::anyhow!("Task index out of bounds"))
        }
    }
}
Arc<Mutex<T>> is the standard pattern for sharing state across concurrent requests. The lock ensures only one request modifies the list at a time.
Example 4: JSON Formatter (Single Parameter)
Some tools take a single unnamed parameter:
use rmcp::{tool, tool_box};
use anyhow::Result;

#[derive(Clone)]
struct JsonServer;

#[tool_box]
impl JsonServer {
    #[tool(description = "Prettify a JSON string")]
    fn format_json(&self, #[tool(param)] raw: String) -> Result<String> {
        let value: serde_json::Value = serde_json::from_str(&raw)?;
        Ok(serde_json::to_string_pretty(&value)?)
    }
}
#[tool(param)] marks an individual parameter that is passed directly in the JSON-RPC params, rather than being aggregated into a request struct with #[tool(aggr)].
How the Agent Discovers Tools
MCP servers advertise their capabilities via a tool list response. The agent doesn't just guess — it asks "what tools do you have?" and the server responds with schema information generated by schemars.
This is why the JsonSchema derive matters: it tells the agent exactly what parameters each function expects, their types, and any constraints.
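Concretely, a tools/list response entry for the file server's read_file tool would carry a schema along these lines (field names per the MCP spec; the exact schemars output may differ slightly):

```
{
  "tools": [
    {
      "name": "read_file",
      "description": "Read a file",
      "inputSchema": {
        "type": "object",
        "properties": { "path": { "type": "string" } },
        "required": ["path"]
      }
    }
  ]
}
```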
Running the Server
The actual server initialization depends on your transport (stdio, HTTP, WebSocket) and on your rmcp version — check the crate docs for the exact builder API. In outline:
use rmcp::Server;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let server = Server::builder()
        .with_tool(FileServer)
        .with_tool(CurrencyServer)
        .build();
    server.run().await?;
    Ok(())
}
What Makes This Powerful
MCP servers transform your Rust code into discoverable, type-safe tool APIs. The agent doesn't need to guess parameter names or types — the schema tells it everything.
Three patterns emerge as you build more servers:
- Stateless functions — File operations, calculations, transformations. Easy to test, no state to manage.
- External API wrappers — Currency rates, weather, databases. Use async fn with reqwest or your HTTP client of choice.
- Stateful services — Task managers, session tracking, caches. Arc<Mutex<T>> for thread-safe state, or Arc<RwLock<T>> if reads outnumber writes.
The protocol handles the rest: discovery, parameter validation, error responses, timeouts. Your job is just exposing functions.
The Bigger Picture
MCP servers are the building blocks of agentic systems. Every tool your agent can use — file reads, web searches, code execution — is backed by an MCP server (or something speaking the same protocol).
Build servers for:
- Your company's internal APIs
- Database queries
- File system operations
- CI/CD pipelines
- Monitoring and alerting
Then any MCP-compatible agent can consume them. The protocol standardizes what was previously custom integration work.
The protocol is evolving — Anthropic, OpenAI, and others are aligning on MCP as the standard for tool calling. If you're building AI-powered systems in Rust, this is the foundation.