
MCP and the innovation paradox: Why open standards will save AI from itself




The next wave of AI disruption will not come from bigger models. The real shift is quieter: standardization.

Launched by Anthropic in November 2024, the Model Context Protocol (MCP) standardizes how AI applications interact with the world beyond their training data. Much as HTTP and REST standardized how web applications connect to services, MCP standardizes how AI models connect to tools.
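To make the analogy concrete: MCP is built on JSON-RPC 2.0, so a tool invocation travels as the same standardized message no matter which model or vendor sits on either end. Below is a minimal sketch of that wire format using only the Python standard library; the tool name and arguments are hypothetical, not from any real server.

```python
import json

# A sketch of the wire format MCP standardizes: a JSON-RPC 2.0 request
# invoking a tool via the spec's "tools/call" method. The "get_ticket"
# tool and its arguments are illustrative placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_ticket",                      # hypothetical tool name
        "arguments": {"ticket_id": "JIRA-123"},    # hypothetical arguments
    },
}

wire = json.dumps(request)      # this string is what crosses the transport
decoded = json.loads(wire)
print(decoded["method"])        # -> tools/call
```

Because every MCP client and server agrees on this envelope, swapping the model on one end does not invalidate the tooling on the other.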

You have probably read plenty of articles describing what MCP is. What most of them miss is the boring, and powerful, part: MCP is a standard. Standards do not just organize technology; they create growth flywheels. Adopt them early and you ride the wave. Ignore them and you fall behind. This article explains why MCP matters now, what it changes and how it reshapes the ecosystem.

How MCP takes us from chaos to context

Meet Lily, a product manager at a cloud infrastructure company. She juggles projects across half a dozen tools: Jira, Figma, GitHub, Slack, Gmail and Confluence. Like many of us, she is drowning in updates.

By 2024, Lily had seen how good large language models (LLMs) were getting at synthesis. She spotted an opportunity: if she could feed her team's tools into a model, she could automate updates, draft communications and answer questions on demand. But every model had its own bespoke way of connecting to services, and each integration pulled her deeper into a single vendor's platform. When she needed to pull in transcripts from Gong, it meant building yet another custom connection, making it even harder to switch to a better LLM later.

Then Anthropic launched MCP: an open protocol for standardizing how context flows to LLMs. MCP quickly picked up support from OpenAI, AWS, Azure and Microsoft Copilot Studio, with Google soon to follow. Official SDKs are available for Python, TypeScript, Java, C#, Rust, Kotlin and Swift; community SDKs for Go and other languages followed. Adoption was swift.

Today, Lily runs everything through Claude, connected to her work apps via a local MCP server. Status reports draft themselves. Leadership updates are one prompt away. As new models emerge, she can swap them in without losing any of her integrations. When she writes code on the side, she uses Cursor with a model from OpenAI and the same MCP server she uses in Claude. Her IDE already understands the product she is building. MCP made it easy.

The power and consequences of a standard

Lily's story shows a simple truth: nobody likes using fragmented tools. No user likes being locked into a vendor. And no company wants to rewrite its integrations every time it changes models. People want the freedom to use the best tools. MCP delivers that freedom.

And with a standard come consequences.

First, SaaS providers without strong public APIs are vulnerable to obsolescence. MCP tools depend on those APIs, and customers will demand MCP support for their AI applications. With a de facto standard emerging, there are no excuses left.

Second, AI application development cycles are about to accelerate sharply. Developers no longer need to write custom glue code just to test a simple AI application. Instead, they can connect MCP servers to readily available MCP clients such as Claude Desktop and Cursor.
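The reason no custom glue code is needed is that every server implements the same request/response pattern, which any compliant client can drive. Here is a toy, standard-library-only sketch of that dispatch loop; a real server would use an official SDK, and the `summarize_updates` tool is a hypothetical stand-in.

```python
import json

# Toy sketch of the dispatch pattern an MCP server implements.
# Real servers use an official SDK; this tool name is hypothetical.
TOOLS = {
    "summarize_updates": lambda args: f"{len(args['items'])} updates summarized",
}

def handle(raw: str) -> str:
    """Dispatch one JSON-RPC request string and return the response string."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        result = {"tools": [{"name": name} for name in TOOLS]}
    elif req["method"] == "tools/call":
        params = req["params"]
        result = {"content": TOOLS[params["name"]](params["arguments"])}
    else:
        result = {}
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

reply = handle(json.dumps({
    "jsonrpc": "2.0", "id": 7, "method": "tools/call",
    "params": {"name": "summarize_updates", "arguments": {"items": ["a", "b"]}},
}))
```

Because the client side of this loop already ships in off-the-shelf tools, a developer only has to supply the server half to prototype an integration.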

Third, switching costs collapse. Because integrations are decoupled from specific models, organizations can migrate from Claude to OpenAI to Gemini, or blend models, without rebuilding their infrastructure. Future LLM providers will benefit from the existing ecosystem around MCP, letting them concentrate on better price performance.

Navigating the challenges of MCP

Every standard introduces new friction points or leaves existing ones unresolved. MCP is no exception.

Trust is critical: Dozens of MCP registries have appeared, offering thousands of community-maintained servers. But if you do not control a server, or trust the party that does, you risk leaking secrets to an unknown third party. If you are a SaaS company, provide official servers. If you are a developer, seek out official servers.

Quality is variable: APIs evolve, and poorly maintained MCP servers can easily fall out of sync. LLMs rely on high-quality metadata to decide which tools to use. No authoritative MCP registry exists yet, which reinforces the need for official servers from trusted parties. If you are a SaaS company, maintain your servers as your APIs evolve. If you are a developer, seek out official servers.

Big MCP servers raise costs and lower utility: Bundling too many tools into a single server drives up costs through token consumption and overwhelms models with too much choice. LLMs are easily confused when they can reach a large number of tools, so you get the worst of both worlds. Smaller, task-focused servers will matter. Keep that in mind as you build and distribute servers.
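The token-cost point is easy to see: every tool a server exposes adds its descriptor to the context the model must read before each decision. A rough standard-library illustration, using serialized bytes as a crude proxy for tokens (the tool names and schema are placeholders, not from a real server):

```python
import json

# Rough illustration (not a real tokenizer): each exposed tool adds its
# full descriptor to the listing a model must consider on every call.
def listing_size(n_tools: int) -> int:
    tools = [
        {
            "name": f"tool_{i}",                              # hypothetical tools
            "description": "Does one narrowly scoped task.",
            "inputSchema": {"type": "object", "properties": {}},
        }
        for i in range(n_tools)
    ]
    return len(json.dumps({"tools": tools}))                  # bytes as token proxy

small, large = listing_size(5), listing_size(200)
print(small, large)   # the 200-tool listing dwarfs the 5-tool one
```

A focused five-tool server keeps that per-call overhead small; a kitchen-sink server pays it on every single model invocation.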

Authorization and identity problems persist: These problems existed before MCP, and they still exist with MCP. Imagine Lily gives Claude the ability to send emails and issues a well-intentioned instruction: "Quickly send Chris a status update." Instead of emailing her boss, Chris, the model emails every Chris in her contact list to make sure the right one gets the message. Humans need to stay in the loop for high-judgment actions.
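One common way to keep a human in the loop is an approval gate between the model's proposed tool call and its execution. A minimal sketch, where the `send_email` tool and its recipient are hypothetical:

```python
# Minimal sketch of a human-in-the-loop gate for high-judgment tools.
# The send_email tool and the recipient address are hypothetical.
def send_email(to: str, body: str) -> str:
    return f"sent to {to}"

def gated_call(tool, args: dict, approved: bool) -> str:
    # High-judgment actions execute only after a human confirms
    # the exact tool and arguments the model proposed.
    if not approved:
        return f"awaiting approval: {tool.__name__}({args})"
    return tool(**args)

pending = gated_call(send_email, {"to": "chris@example.com", "body": "Status"},
                     approved=False)
done = gated_call(send_email, {"to": "chris@example.com", "body": "Status"},
                  approved=True)
```

The gate does not fix identity ambiguity, but it ensures a person sees which Chris is about to receive the email before anything is sent.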

Looking forward

MCP is not hype. It is a fundamental shift in the infrastructure of AI applications.

Like every well-adopted standard before it, MCP is creating a self-reinforcing flywheel: every new server, every new integration and every new application compounds the momentum.

New tools, platforms and registries are already emerging to simplify building, testing, deploying and discovering MCP servers. As the ecosystem matures, AI applications will offer simple interfaces for plugging into new capabilities. Teams that embrace the protocol will ship products faster, with better integration stories. Companies that offer public APIs and official MCP servers can make themselves part of that integration story. Late adopters will have to fight for relevance.

Noah Schwartz is the head of product at Postman.



