Tools

MCP

Also known as: Model Context Protocol

An open protocol (created by Anthropic) that standardizes how AI models connect to external tools and data sources. Think of it as the USB-C of AI: a standard interface instead of a bespoke integration for every tool. MCP servers expose capabilities; MCP clients (such as Claude) consume those capabilities.

Why It Matters

Before MCP, every AI-tool integration was custom-built. With MCP, a tool built once works with any compatible AI. It is already supported by Claude, Cursor, and others. This is how AI goes from chatbot to genuine assistant.

Deep Dive

MCP works on a client-server architecture over JSON-RPC. An MCP server is a small program that exposes a set of tools, resources, and prompts through a standardized interface. An MCP client — typically an AI application like Claude Desktop, Cursor, or Windsurf — discovers what the server offers and makes those capabilities available to the model. When the model decides to use a tool, the client sends a JSON-RPC request to the server, gets the result, and feeds it back into the conversation. The transport layer is flexible: servers can communicate over stdio (for local processes), HTTP with server-sent events (for remote services), or streamable HTTP (the newest transport, combining request-response and streaming in a single connection).
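To make the request-response cycle concrete, here is a sketch of a single tool-call exchange. The method and field names follow the JSON-RPC shape the protocol uses; the tool name (`get_weather`) and its arguments are invented for illustration.

```python
import json

# Hypothetical tools/call exchange between an MCP client and server.
# The client asks the server to run a tool the model has chosen:
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",           # a tool the server exposes
        "arguments": {"city": "Lisbon"}  # matches the tool's input schema
    },
}

# The server runs the tool and replies with a result keyed to the same id;
# the client feeds the content back into the model's conversation.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "18°C, partly cloudy"}]
    },
}

# Over the stdio transport, each message travels as one line of JSON.
wire = json.dumps(request)
print(wire)
```

The same message shapes travel unchanged over HTTP transports; only the framing differs.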

Building a Server

Building an MCP server is deliberately simple. In Python, you can use the official mcp SDK and have a working server in about 20 lines of code — you decorate a function with @server.tool(), give it a description and typed parameters, and the SDK handles JSON-RPC, schema generation, and transport. In TypeScript, the @modelcontextprotocol/sdk package works the same way. The server declares its capabilities on initialization (what tools it has, whether it supports resources or prompts), and the client negotiates what it wants to use. This means you can start small — a server that just wraps your company's internal API — and add capabilities incrementally.
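To show what the SDK's decorator automates, here is a toy re-implementation of the registration step in plain Python: a `tool()` decorator that records a function and derives a JSON schema from its type hints. This is not the real `mcp` SDK (which also handles JSON-RPC framing and transport); the decorator, registry, and example tool are all illustrative.

```python
import inspect
import json

# Toy sketch of what an MCP SDK's tool decorator does under the hood:
# register the function and build a JSON schema from its typed parameters.
TOOLS = {}
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool(description):
    def register(fn):
        properties = {
            name: {"type": _JSON_TYPES.get(p.annotation, "string")}
            for name, p in inspect.signature(fn).parameters.items()
        }
        TOOLS[fn.__name__] = {
            "description": description,
            "inputSchema": {"type": "object", "properties": properties},
            "handler": fn,
        }
        return fn
    return register

@tool("Add two numbers")   # hypothetical example tool
def add(a: int, b: int) -> int:
    return a + b

# What a client's tools/list request would roughly see:
print(json.dumps({name: t["inputSchema"] for name, t in TOOLS.items()}))
# What a tools/call request would dispatch to:
print(TOOLS["add"]["handler"](a=2, b=3))  # → 5
```

In the real SDKs the decorated function is all you write; schema generation, discovery responses, and transport come for free, which is why a working server fits in about 20 lines.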

The USB Moment

The real power of MCP over bespoke tool integrations becomes clear when you consider the combinatorial problem it solves. Before MCP, if you had 10 AI applications and 10 tools, you needed 100 custom integrations. With MCP, you need 10 servers and 10 clients — each built once. This is exactly the same pattern that made USB successful: standardize the interface, and the ecosystem scales. In practice, this means a Postgres MCP server built by one developer works with Claude, Cursor, Zed, and any other MCP-compatible client without modification. The MCP server ecosystem already includes hundreds of community-built servers for databases, APIs, dev tools, and cloud services.

Gotchas and Nuances

There are some important nuances practitioners should know. First, MCP servers come in two flavors: local servers that run on your machine (great for file access, local databases, dev tools) and remote servers that run as hosted services (better for shared infrastructure, SaaS integrations). Second, security is a real consideration — an MCP server has whatever permissions the process running it has, so a poorly scoped server could expose sensitive data. The protocol includes a capability negotiation step, but access control and authentication for remote servers are still evolving. Third, MCP is not the same as tool use — it's a transport and discovery layer that sits underneath tool use. A model that supports tool calling can work with MCP, but MCP also handles things like resource subscriptions (live-updating context) and prompt templates that go beyond simple function calls.
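The capability negotiation step mentioned above happens in an `initialize` handshake before any tools are used. The sketch below shows its rough shape; field names follow the spec's JSON-RPC layout, but the client name, version strings, and the exact capability set are assumptions for illustration.

```python
import json

# Hypothetical initialize handshake (capability negotiation).
# The client announces itself and what it supports:
initialize = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # assumed spec revision
        "clientInfo": {"name": "example-client", "version": "0.1"},
        "capabilities": {},
    },
}

# The server replies with what it actually offers, so the client knows
# whether tools, resources, or prompts are on the table:
server_reply = {
    "jsonrpc": "2.0",
    "id": 0,
    "result": {
        "capabilities": {
            "tools": {},
            "resources": {"subscribe": True},  # live-updating context
        }
    },
}

negotiated = server_reply["result"]["capabilities"]
print(json.dumps(negotiated))
```

Note that this negotiation covers *features*, not *permissions*: a server that declares only `tools` still runs with the full OS-level access of its host process, which is why scoping local servers carefully matters.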

Where It's Heading

Anthropic open-sourced MCP in late 2024, and adoption has been remarkably fast. By early 2025, it was supported by Claude, Cursor, Windsurf, Zed, Sourcegraph, and many others. The specification continues to evolve — the addition of streamable HTTP transport, OAuth-based auth for remote servers, and elicitation (letting servers ask the user for input mid-flow) all landed in 2025. If you're choosing between building a custom tool integration and building an MCP server, the MCP route almost always wins: you get compatibility with every MCP client for free, and the protocol is simple enough that migration cost is near zero.
