Model Context Protocol (MCP) is a way to build integrations for AI models.
Vercel now supports deploying MCP servers (which AI models can connect to) as well as MCP clients (AI chatbot applications which call the servers).
Get started with our Next.js MCP template today.
How is MCP different than APIs?
APIs allow different services to communicate with each other. MCP is slightly different.
Rather than thinking about MCP like a REST API, you can instead think about it like a tailored toolkit that helps an AI achieve a particular task. There may be multiple APIs and other business logic used behind the scenes for a single MCP tool.
If you are already familiar with tool-calling in AI, MCP is a way to invoke tools hosted on a different server.
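For intuition, here is a minimal, self-contained sketch (with hypothetical names throughout, not the MCP SDK's actual API) of what "tools hosted on a different server" means conceptually: a named registry of functions that a model can invoke by name with structured arguments.

```typescript
// Hypothetical sketch: a tool is a named function with a description and
// structured arguments. Conceptually, an MCP server is a registry of such
// tools that a remote model can discover and invoke.
type Tool = {
  description: string;
  execute: (args: Record<string, unknown>) => string;
};

// The server exposes a set of tools; behind a single tool there may be
// multiple APIs and other business logic.
const tools: Record<string, Tool> = {
  get_weather: {
    description: "Return a canned weather report for a city",
    execute: (args) => `Sunny in ${String(args.city)}`,
  },
};

// Dispatch a tool call the way a server would handle a tools/call request:
// look up the tool by name, run it with the model's arguments, return text.
export function callTool(name: string, args: Record<string, unknown>): string {
  const tool = tools[name];
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.execute(args);
}
```

In a real MCP server, the registry, argument schemas, and transport are handled by the protocol SDK; the part you write is essentially the `execute` bodies.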
MCP now supports a stateless protocol similar to other web APIs, using standard HTTP and OAuth. This is an improvement over the previous stateful Server-Sent Events (SSE) transport.
Deploying MCP servers to Vercel
To simplify building MCP servers on Vercel, we’ve published a new package, @vercel/mcp-adapter, which supports both the older SSE transport and the newer stateless HTTP transport. The majority of MCP clients currently only support the SSE transport option. To handle the state required for the SSE transport, you can integrate a Redis server through any provider in our marketplace, like Upstash and Redis Labs.
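As a rough sketch, a Next.js App Router route using the adapter might look like the following. This assumes the adapter exposes a `createMcpHandler` function and zod-based argument schemas; the tool name and logic here are illustrative, so check the package README for the current signature and options (including Redis configuration for the SSE transport).

```typescript
// app/api/mcp/route.ts — illustrative sketch, not a verbatim copy of the
// @vercel/mcp-adapter API. Consult the package docs before relying on it.
import { createMcpHandler } from "@vercel/mcp-adapter";
import { z } from "zod";

const handler = createMcpHandler((server) => {
  // Register one example tool; the schema describes its arguments
  // so clients and models know how to call it.
  server.tool(
    "roll_dice",
    "Roll an N-sided die",
    { sides: z.number().int().min(2) },
    async ({ sides }) => {
      const value = 1 + Math.floor(Math.random() * sides);
      return { content: [{ type: "text", text: `You rolled a ${value}.` }] };
    },
  );
});

// The same handler can serve both supported transports.
export { handler as GET, handler as POST };
```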
We’ve already seen customers successfully deploying MCP servers in production. One customer has seen over 90% savings using Fluid compute on Vercel versus traditional serverless. Fluid gives you full Node.js or Python compatibility on a more cost-effective and performant platform for AI inference and agentic workloads.
Get started with MCP
Vercel's AI SDK has built-in support for connecting your Node.js or Next.js apps to MCP servers.
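A connection from an AI SDK app to a remote MCP server might be sketched like this. It assumes the SDK's experimental MCP client (`experimental_createMCPClient`) and an OpenAI model provider; the server URL is a placeholder, and the API surface may change, so check the AI SDK documentation.

```typescript
// Sketch of an AI SDK app consuming a remote MCP server's tools.
// The URL below is a placeholder, not a real endpoint.
import { experimental_createMCPClient, generateText } from "ai";
import { openai } from "@ai-sdk/openai";

async function main() {
  // Connect to the MCP server over the SSE transport.
  const mcpClient = await experimental_createMCPClient({
    transport: { type: "sse", url: "https://my-mcp-server.example.com/sse" },
  });

  try {
    // The client exposes the server's tools as AI SDK tool definitions.
    const tools = await mcpClient.tools();

    // The model can now invoke the remote tools during generation.
    const { text } = await generateText({
      model: openai("gpt-4o"),
      tools,
      prompt: "What's the weather like today?",
    });
    console.log(text);
  } finally {
    await mcpClient.close();
  }
}
```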
We’re looking forward to seeing more MCP servers built with the HTTP transport, and we’re starting to explore the latest developments like OAuth support.
Other Vercel projects like shadcn/ui are exploring ways to integrate MCP. If you have suggestions for MCP server use cases on Vercel, you can share your feedback in our community.