The Model Context Protocol (MCP) is having a moment. If you haven't heard, it's Anthropic's attempt to standardize how AI agents talk to tools — think of it as USB-C for AI: one port, every device.

I've been diving into how to build MCP servers in Rust, and honestly? It's a perfect fit.

Why Rust for MCP?

MCP servers need to be fast. They're the bridge between your AI model and the real world — file systems, APIs, databases. Every millisecond of latency eats into the token budget and makes your agent feel sluggish.

Rust gives you:

  1. Predictable latency — no garbage collector to pause you mid-request
  2. Small, fast-starting binaries — which matters when the client spawns your server as a subprocess over stdio
  3. Strong typing at the boundary — serde turns loosely structured JSON-RPC into checked structs

The MCP Protocol in Brief

MCP is JSON-RPC 2.0 under the hood. The client sends requests, the server responds. Three main primitives:

  1. Tools — functions the agent can call
  2. Resources — data the agent can read
  3. Prompts — templates for common workflows

For a typical MCP server, you're mostly exposing tools. Each tool has:

  1. A name — the identifier the model uses to call it
  2. A description — natural-language guidance the model reads to decide when to call it
  3. An input schema — a JSON Schema object describing the expected arguments

A Minimal Example

Here's what a tool definition looks like in Rust:

use serde::{Deserialize, Serialize};

#[derive(Debug, Serialize, Deserialize)]
pub struct Tool {
    pub name: String,
    pub description: String,
    // MCP serializes this field as camelCase "inputSchema" on the wire.
    #[serde(rename = "inputSchema")]
    pub input_schema: serde_json::Value,
}

The server initializes with a list of tools, then enters a loop reading JSON-RPC requests from stdin (or handling HTTP). When a tools/call request comes in, you match the tool name and execute.

What I Learned Building One

  1. Schema first — Define your input schemas carefully. The model relies on them to know what to pass. Poor schemas = broken tool calls.

  2. Error handling is protocol-aware — MCP has specific error codes. Return MethodNotFound when the tool doesn't exist, InvalidParams when the input is bad.

  3. Streaming matters — For long-running tools, you want to stream results back, not wait until completion. This requires careful JSON-RPC handling.

  4. Testing is non-negotiable — Since the "test" is whether an AI model can successfully call your tools, you need integration tests that actually exercise the full request/response cycle.

The Bigger Picture

MCP is part of a larger shift: from chatbots that respond to agents that act. The protocol standardizes the interface so tools become swappable — write once, use with Claude, ChatGPT, Gemini, or any model that supports MCP.

Rust's position in this ecosystem is well-earned. High-performance tool serving is exactly what Rust does best. If you're building AI infrastructure, Rust isn't just an option — it's becoming the default.

What's Next

I'm planning to build a real MCP server for a personal project — something that bridges a local knowledge base to any AI model. The combination of Rust's performance and MCP's standardization makes this genuinely exciting.

If you're curious, the MCP spec is open. And there's a growing ecosystem of Rust MCP libraries emerging. This feels like early days for a standard that actually matters.