Have you ever tried asking an AI to read a file on your computer? Or query your database? Or call an internal API?
In most cases, the answer is: it can't. LLMs like ChatGPT, Claude, and Gemini are trained on public data, but they have no access to anything that's yours. They live in a bubble.
MCP (Model Context Protocol) is the protocol that pops that bubble.
The problem: smart AIs, but blind
Imagine a brilliant consultant who knows everything about programming, system architecture, and best practices. But this consultant is locked in a room with no internet, no access to your repositories, no way to see your logs. You have to copy and paste everything manually for them to analyze.
That's exactly how most people use AI today. Copy the code, paste it in the chat, get the response, go back to the editor. A manual, tedious loop.
MCP eliminates that loop.
What MCP actually is
MCP is an open protocol created by Anthropic that defines how an AI connects to external tools. Think of it as a "USB-C for AIs" — a standardized interface that lets any model communicate with any tool.
An MCP server is a program that exposes specific tools. Real examples:
- A filesystem MCP server lets the AI read, write, and search files on your computer
- A database MCP server lets the AI run SQL queries
- A GitHub MCP server lets the AI open PRs, read issues, do code review
- A Slack MCP server lets the AI send messages and read channels
The client is the AI application: Claude Code, Cursor, a chat app. It discovers which tools are available, and the model decides when to use each one based on what you ask.
How it works under the hood
Communication happens in two ways:
- stdio: the server runs as a local process and communicates via stdin/stdout. Ideal for tools on your computer.
- SSE (Server-Sent Events): the server runs remotely over HTTP. Ideal for cloud services. (Newer revisions of the spec replace this transport with "Streamable HTTP", but the idea is the same: remote servers over HTTP.)
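The stdio transport exchanges newline-delimited JSON-RPC messages over stdin/stdout. A minimal sketch of that framing (the function names are mine, for illustration):

```python
import json

def frame_message(msg: dict) -> bytes:
    """Serialize a JSON-RPC message for the stdio transport:
    one JSON object per line, newline-terminated."""
    return (json.dumps(msg) + "\n").encode("utf-8")

def parse_line(line: bytes) -> dict:
    """Parse one incoming newline-delimited JSON-RPC message."""
    return json.loads(line.decode("utf-8"))

# A client "initialize" request as it would travel over the pipe
request = {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}
wire = frame_message(request)
assert parse_line(wire) == request
```

This is why stdio is so convenient for local tools: no ports, no TLS, no auth setup, just a child process and two pipes.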
The flow is simple:
- The client (the AI) connects to the MCP server
- The server announces its tools — name, description, parameters
- The AI decides to use a tool when it makes sense
- The server executes and returns the result
- The AI incorporates the result into its response
This all happens transparently. You ask "search my memory files for the February incident" and the AI knows it needs to call the MCP server's search tool.
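The flow above can be sketched as a toy in-memory "server". The descriptor shape (name, description, inputSchema) mirrors what real MCP servers announce; the tool itself and its handler are invented for illustration:

```python
# Toy registry of tool descriptors plus their handlers.
TOOLS = {
    "search_notes": {
        "description": "Keyword search over note files",
        "inputSchema": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
        "handler": lambda args: [f"note matching {args['query']!r}"],
    }
}

def list_tools() -> list[dict]:
    """Step 2: the server announces its tools (name, description, schema)."""
    return [
        {"name": name, "description": t["description"], "inputSchema": t["inputSchema"]}
        for name, t in TOOLS.items()
    ]

def call_tool(name: str, arguments: dict):
    """Step 4: the server executes a tool and returns the result."""
    return TOOLS[name]["handler"](arguments)

# Steps 1-5 in miniature: discover, decide, call, incorporate.
available = list_tools()
result = call_tool("search_notes", {"query": "February incident"})
```

The key design choice: the schema travels with the tool, so the model never needs prior knowledge of the server. Discovery is the contract.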
Real example: connecting Claude Code to a knowledge base
I used an MCP server called mcpvault to connect Claude Code to my knowledge base. Here's how.
The configuration is minimal:
```json
{
  "mcpServers": {
    "vault": {
      "command": "npx",
      "args": ["-y", "@bitbonsai/mcpvault", "/root/brain"]
    }
  }
}
```
With this, Claude Code gained 14 tools: keyword search, file reading, writing, metadata manipulation. It went from "AI that only answers questions" to "AI that consults my documentation and makes informed decisions."
Where MCP is already working
The ecosystem grew fast. Some popular MCP servers:
- filesystem — local file access
- postgres / mysql — database queries
- github — repository operations
- playwright — browser automation
- cloudflare — DNS and Workers management
- context7 — up-to-date library documentation
Claude Code, Cursor, and other AI-powered editors already support MCP natively. The trend is for this to become standard.
MCP vs. function calling: what's the difference?
Function calling (popularized by OpenAI, now offered by most LLM providers) and MCP solve similar problems, but in different ways.
Function calling is proprietary — each LLM provider has its own implementation. You define functions in code and the model decides when to call them. The code that executes the function is yours.
MCP is an open, standardized protocol. Servers are reusable across different models and editors. You install a GitHub MCP server and it works in Claude Code, Cursor, any compatible client.
In short: function calling is like every app having its own charger. MCP is USB-C.
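The analogy holds at the wire level too: both formats describe a tool with a name, a description, and a JSON Schema for its parameters; only the envelope differs. A sketch of translating an MCP tool descriptor into an OpenAI-style function definition (shapes simplified from the commonly documented ones; the weather tool is invented):

```python
mcp_tool = {
    "name": "get_weather",
    "description": "Current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def to_function_def(tool: dict) -> dict:
    """Re-wrap an MCP tool descriptor as a function-calling definition.
    The JSON Schema passes through untouched."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool["description"],
            "parameters": tool["inputSchema"],
        },
    }

fn = to_function_def(mcp_tool)
```

This is how MCP-aware clients bridge the two worlds in practice: they list a server's tools and expose each one to the model through whatever tool-use API that model speaks.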
MCP in the real world
Theory and tutorial examples are one thing. But I want to show two cases where MCP (or its philosophy) solved real problems for me.
Design System + AI: living documentation via MCP
At a company I worked at, the team maintained a proprietary design system with dozens of components. The classic problem: documentation fell out of date, and anyone using AI to generate code with these components got suggestions with wrong props, obsolete patterns, invalid configurations.
The solution was to create an MCP server that exposed the complete design system documentation — each component with its valid props, validation rules, usage examples. When someone used Claude Code to build a page, the AI consulted the MCP to verify how to use each component correctly.
The result: the AI always had up-to-date documentation without needing to memorize anything. When a component changed, you just updated the MCP's data source. Zero retraining, zero copy-pasting docs. This is MCP at its best — a living bridge between documentation and AI-assisted development.
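The core of that server is just a lookup over a single source of truth. A hypothetical sketch (component names, props, and the error shape are invented; the real server exposed this through an MCP tool):

```python
# One data source for the design system; updating it updates what the AI sees.
COMPONENTS = {
    "Button": {
        "props": {"variant": ["primary", "secondary"], "disabled": "boolean"},
        "example": '<Button variant="primary">Save</Button>',
    },
}

def get_component(name: str) -> dict:
    """Tool handler: return current docs for one component, or an explicit
    error the AI can relay instead of hallucinating props."""
    if name not in COMPONENTS:
        return {"error": f"Unknown component: {name}"}
    return COMPONENTS[name]
```

Returning a structured error for unknown components matters: it gives the model something honest to say instead of room to invent an API that doesn't exist.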
WhatsApp AI Bot: function calling in practice
In Sistema Reino — my church management SaaS — I built a WhatsApp AI bot that lets church administrators do things like:
- Look up member information ("what's Maria Silva's phone number?")
- View upcoming events and schedules
- Check financial reports
- Register attendance at services
The bot uses function calling — which is conceptually MCP's cousin. The AI receives a natural language message, decides which "function" to call (search member, list events, check finances), executes it, and responds with real data.
This is MCP's philosophy in action: giving the AI access to external capabilities it wouldn't have on its own. The difference is that function calling is tied to the LLM provider, while MCP is an open protocol. But the core concept — AI that acts instead of just opining — is the same.
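The dispatch step of such a bot fits in a few lines: the model returns a function name plus arguments, and your code routes to a real handler. A minimal sketch with invented placeholder data (the real bot queries the SaaS database):

```python
MEMBERS = {"Maria Silva": {"phone": "+55 11 99999-0000"}}

def search_member(name: str) -> dict:
    """Handler for the 'search member' capability."""
    return MEMBERS.get(name, {"error": "not found"})

# Map of function names the model may choose to actual handlers.
HANDLERS = {"search_member": search_member}

def dispatch(call: dict):
    """Execute the function the model chose and return real data."""
    return HANDLERS[call["name"]](**call["arguments"])

# As if the model had replied: call search_member with name="Maria Silva"
model_choice = {"name": "search_member", "arguments": {"name": "Maria Silva"}}
result = dispatch(model_choice)
```

The model never touches the database; it only picks a name and fills in arguments. All execution, validation, and access control stay in your code, which is exactly where you want them for a bot handling member data.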
Why this matters for devs
MCP is the bridge between "AI that suggests code" and "AI that operates systems." When AI can read your logs, query your database, access your internal documentation — it stops being a glorified autocomplete and becomes a coworker.
If you use Claude Code, Cursor, or any AI tool daily, it's worth investing 30 minutes to understand and configure your first MCP servers. The productivity gain is real and immediate.
I manage 5 production projects with a Telegram bot that uses MCP to access servers, databases, and files. This isn't the future — it's what's already running today.