What is MCP (Model Context Protocol)
Model Context Protocol (MCP) is an open standard created by Anthropic that defines how AI applications communicate with external tools and data sources. Think of MCP as the "USB-C of AI": a universal connector that lets any AI client talk to any compatible server. Before MCP, every AI tool integration required custom code. MCP replaces that with a single standardized protocol that major AI clients and tool vendors are adopting.
Why MCP Matters
Without MCP, connecting an AI assistant to your tools requires building custom integrations for each combination of AI client and tool. If you have 5 AI clients and 10 tools, that's 50 custom integrations. With MCP, each tool exposes a single MCP server interface, and each AI client implements a single MCP client interface. Now those same 5 clients and 10 tools need only 15 implementations total — and any new client instantly works with all existing tools.
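The scaling argument can be sanity-checked with a few lines (the counts are the ones from the paragraph above):

```python
# Point-to-point integrations grow multiplicatively;
# MCP implementations grow additively.
clients, tools = 5, 10

without_mcp = clients * tools   # one custom integration per client/tool pair
with_mcp = clients + tools      # one MCP client impl + one MCP server impl each

print(without_mcp)  # 50
print(with_mcp)     # 15
```

Every additional tool then costs one server implementation instead of one integration per client.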
{
  "without_mcp": {
    "ai_clients": ["Claude Desktop", "Cursor", "Windsurf", "Continue", "Custom App"],
    "tools": ["Database", "Slack", "GitHub", "Calendar", "CRM"],
    "integrations_needed": "5 clients x 5 tools = 25 custom integrations"
  },
  "with_mcp": {
    "ai_clients": ["Claude Desktop", "Cursor", "Windsurf", "Continue", "Custom App"],
    "tools": ["Database MCP Server", "Slack MCP Server", "GitHub MCP Server", "Calendar MCP Server", "CRM MCP Server"],
    "integrations_needed": "5 client implementations + 5 server implementations = 10 total"
  }
}

Industry Adoption
MCP Architecture
MCP follows a client-server architecture with three core primitives that servers can expose to clients:
- Tools — Functions that the AI can call to perform actions (e.g., "send_email", "query_database", "create_ticket"). Tools have typed input schemas and return results.
- Resources — Data sources the AI can read (e.g., file contents, database records, API responses). Resources provide context without executing actions.
- Prompts — Pre-built prompt templates that guide the AI for specific tasks. Prompts are reusable instructions optimized for particular workflows.
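As a sketch of the first primitive, a tool definition follows the name/description/inputSchema shape from the MCP specification, with arguments typed via JSON Schema. The tool name and its fields below are illustrative:

```python
# A hypothetical MCP tool definition: a "send_email" action
# with its arguments typed via JSON Schema.
send_email_tool = {
    "name": "send_email",
    "description": "Send an email to a recipient.",
    "inputSchema": {                      # JSON Schema for the tool's arguments
        "type": "object",
        "properties": {
            "to": {"type": "string"},
            "subject": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["to", "subject"],
    },
}

# A client can inspect the schema before calling the tool.
required = send_email_tool["inputSchema"]["required"]
print(required)  # ['to', 'subject']
```

The typed schema is what lets the AI model construct valid arguments without ever seeing the tool's implementation.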
Transport Mechanisms
MCP supports two transport mechanisms for communication between clients and servers:
- stdio (Standard I/O) — The server runs as a local process, and the client communicates via stdin/stdout. Best for local tools like file system access or local databases. Low latency, no network required.
- SSE (Server-Sent Events) over HTTP — The server runs as a web service, and the client connects over HTTP. Best for remote servers, shared tools, and production deployments. This is what n8n uses for its MCP Server Trigger.
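Over the stdio transport, each JSON-RPC message is serialized as a single newline-terminated line of JSON written to the server process's stdin (and read back from its stdout). A minimal framing sketch, with an illustrative method and params:

```python
import json

def frame_message(method: str, params: dict, msg_id: int) -> bytes:
    """Serialize a JSON-RPC request as one newline-terminated line,
    the framing the MCP stdio transport expects."""
    msg = {"jsonrpc": "2.0", "method": method, "params": params, "id": msg_id}
    return (json.dumps(msg) + "\n").encode("utf-8")

line = frame_message("tools/list", {}, 1)
decoded = json.loads(line)
print(decoded["method"])  # tools/list
```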
{
  "mcp_message_example": {
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "query_orders",
      "arguments": {
        "customer_email": "user@example.com",
        "status": "shipped"
      }
    },
    "id": 1
  }
}

n8n + MCP = Superpower
MCP in the n8n Ecosystem
n8n implements MCP in two directions. The MCP Server Trigger exposes your n8n workflows as MCP tools that external AI clients can call. The MCP Client node lets your n8n workflows consume external MCP servers, giving your AI agents access to tools provided by other systems. Together, these make n8n a central hub in any MCP-powered AI architecture.
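As a sketch, an MCP client such as Claude Desktop could be pointed at a workflow exposed by the MCP Server Trigger through the client's JSON config. The server name and URL below are placeholders, and `mcp-remote` is one commonly used bridge from stdio-only clients to SSE endpoints; check your client's documentation for native SSE support:

```json
{
  "mcpServers": {
    "n8n-workflows": {
      "command": "npx",
      "args": ["mcp-remote", "https://your-n8n-instance.example.com/mcp/sse"]
    }
  }
}
```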