MCP is the coming of Web 2.0 2.0

  • The thing that a lot of people miss with MCP is that it's the right fit for enterprise software. LLMs, being universal translators, are the ideal glue between many disconnected systems that are extremely hard to connect without some fuzzy layer in between. And so that's why you see so much of the B2B SaaS world rolling out MCP servers, and internally at these companies, they're talking about how to re-jigger their APIs and restrictions for those APIs given different usage patterns.

    Yes, the protocol is not necessarily "enterprise ready" by various definitions, but as the author points out, this is not terribly important, and the history of standards shows us that even messy and "bad" things get widespread adoption because they hit the right notes at the right time for the right people.

  • > Compared to the olden days, when specs were written by pedantic old Unix dudes

    I think that is one of the reasons (among many others) that the semantic web failed (which doesn't contradict the author, whose point is literally the worse-is-better mantra).

    People really leaned into the eXtensible part of XML and I think a certain amount of fatigue set in. XSL, XHTML, XSD, WSDL, XSLT, RDF, RSS, et al. just became a bit too much. It was architecture astronautics for data formats when what the world at the time needed was simple interchange formats (and JSON fit the bill).

    But I actually believe XML's time has come. I've noticed that XML appears a lot in leaked system prompts from places like Anthropic. LLMs appear to work very well with structured text formats (Markdown and XML specifically).

    I believe that MCP is the wrong model, though. I believe we should be "pushing" context to the models rather than giving them directions on how to "pull" the context themselves.
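
    To make the "LLMs work well with structured text" point concrete, here is a minimal sketch of XML-style prompt assembly. The tag names (`instructions`, `document`) are my own invention for illustration, not taken from any real system prompt:

```python
from xml.sax.saxutils import escape

def build_prompt(instructions: str, documents: dict[str, str]) -> str:
    """Wrap each piece of context in its own XML tag so the model can
    tell the instructions apart from quoted material."""
    parts = ["<instructions>", escape(instructions), "</instructions>"]
    for name, text in documents.items():
        parts.append(f'<document name="{escape(name)}">')
        parts.append(escape(text))  # escaping keeps data from masquerading as tags
        parts.append("</document>")
    return "\n".join(parts)

prompt = build_prompt(
    "Summarize the menu in one sentence.",
    {"menu.txt": "Tacos: $2 <limited time>"},
)
print(prompt)
```

    The escaping is the point: untrusted content can't smuggle in its own closing tag, which is one plausible reason these formats show up in system prompts.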

  • "The rise of MCP gives hope that the popularity of AI amongst coders might pry open all these other platforms to make them programmable for any purpose, not just so that LLMs can control them."

    I think the opposite: MCP is destined to fail for the exact same reason the semantic web failed; nobody makes money when things aren't locked down.

    It makes me wonder how much functionality of things like AI searching the web for us (sorry, doing "deep research") might have been solved in better ways. We could have had restaurants publish their menus in a metadata format, and anyone could write a python script to, say, find the cheapest tacos in Texas. But no, the left hand locks down data behind artificial barriers and then the right hand builds AI (datacenters and all) to get around it. On a macro level it's just plain stupid.

  • I pity the fools thinking they will have access to anything because there's an MCP.

    Those things will be hidden behind a dozen layers of payment validation and authentication. And whitelisted IPs (v4, of course).

    ERR 402 is all that will be visible to y'all.

  • What worries me most about MCP isn't that the protocol is so poorly designed, but that fixing/improving it is entirely at the mercy of internal teams at Anthropic and OpenAI. It doesn't seem like the folks coming up with the protocol are also actual engineers trying to implement it.

    Vaguely seems like a Visa-Mastercard duopoly.

  • Isn’t conformance to a standard API arguably less necessary now that LLMs can read API docs and adapt? For me, the win is the expectation that sites _have_ an API, whether it conforms to the MCP spec or not.

  • We can now build the Semantic Web. All we have to do is create a tiny protocol (as an optional extension to MCP) for how organizations can share their SQL CREATE TABLE DDL as a static file that MCP apps can read to understand the data. Combined with the already-existing tools for AI/LLM function calling against SQL, that would become a Semantic Web.

    That would fill the missing link that always held back the Semantic Web which was the lack of any incentive for companies to bother to use a standard "Data Type" rather than all proprietary data types. Once we have an MCPQ (MCP with Queries), suddenly there's an incentive for organizations to collaborate at the data structure layer.
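
    A sketch of what the client side of that tiny protocol might look like. The idea of a well-known URL (say, `/.well-known/schema.sql`) and the schema itself are hypothetical; the point is just that published DDL is machine-readable enough to hand to an LLM's SQL function calling:

```python
import re

# Hypothetical published schema; in practice this would be fetched from
# something like https://example.com/.well-known/schema.sql
DDL = """
CREATE TABLE menu_items (
    name TEXT,
    price_cents INTEGER,
    city TEXT
);
"""

def tables_from_ddl(ddl: str) -> dict[str, list[str]]:
    """Very rough parse: map each table name to its column names.
    A real client would use a proper SQL parser."""
    tables = {}
    for match in re.finditer(r"CREATE TABLE (\w+)\s*\((.*?)\);", ddl, re.S):
        name, body = match.groups()
        cols = [line.strip().split()[0] for line in body.split(",") if line.strip()]
        tables[name] = cols
    return tables

print(tables_from_ddl(DDL))
```

    Once an app knows `menu_items` has a `price_cents` column, "find the cheapest tacos in Texas" becomes a query the model can write itself.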

  • Turns out the “Semantic Web” was a syntactic web all along, and maybe this is the real deal?

  • Wonder how long it's going to be until someone makes an MCP server for controlling a cockroach (or similar)?

    For reference:

    * https://www.technologynetworks.com/informatics/news/robo-roa...

    * https://www.sciencealert.com/scientists-turned-cockroaches-i...

    And many other examples going back more than a decade...

  • > when specs were written by pedantic old Unix dudes

    I'm tickled pink that this generation imagines "old Unix dudes" as pedantic, when Unix was the ultimate "move fast and break things" rebellion against the MIT school. Some things never change :-)

  • Won't this be used to spam AI with ads and "search engine optimization" equivalent?

  • "MCP is very nearly just a vague set of ideas, a hallucination of a specification, appropriate to the current era, where even the constitution is just a suggestion. A ~~ vibe protocol ~~."

    Oh, goodie.

  • My hot take: MCP's value for the foreseeable future will be in automated testing.

  • At a higher level, MCP seems to want to enforce a standard where no standard exists. I get that the low-level technical implementation allows AI to utilize these tools.

    But there doesn't seem to be any standardization or method in how to describe the tool to the AI so that it can utilize it well. And I guess part of the power of AI is that you shouldn't need to standardize that? But shouldn't there at least be some way to describe the tool's functionality in natural language or give some context to the tool?

  • I don't know what to say other than "I sure hope not".

  • MCP could have cracked the web open. The terrible standard was all about clients and local servers on the same host.

    Imagine it, everything is open, servers are as simple as a pip install ... You have full control of what servers you install. What functions you turn on, what access you allow.

    Now everyone and their blog is sticking MCPs on their servers and locking them down behind subscriptions and paywalls.

    What a wasted opportunity.

  • Everything seems to be susceptible to enshittification, and so far I see no evidence that MCP is any exception. First the value will go to users, then the users will be cut short to drive value to shareholders, and then it will turn into an absolute pile of garbage as businesses make every attempt to somehow cash in on this.

  • nothing will replace the Web

  • While I understand where the author is coming from, and I get his sentiment(s), I don't think what he proposes is actually possible: his vision relies on faux open tools and protocols and having access to walled gardens. The means of computation for these kinds of things are owned by a tiny minority. Nearly everything is a SaaS or is based, one way or the other, on rent extraction. We're essentially subject to the whims of someone who is letting us do something for as long as we play nice.

    >There is a chance, though, that younger developers, and those who weren't around to build back during that last era a generation ago, are going to get inspired by MCP to push for the web to go back towards its natural architecture. It was never meant to be proprietary.

    Alas, the reason APIs started closing and being metered is because, after all, there's someone owning and paying for the hardware upon which you are making calls and requests.

    As long as there's no way to agree upon how to have a bunch of servers providing computation for anyone while at the same time ensuring their upkeep without the need for a central authority, I don't think such a vision is sustainable long term. The current state of the Internet is proof of it.

  • MCP is not an open standard.

    People routinely mistake "protocol specification uploaded to GitHub, PRs welcome" for open standards. They are not. These protocols are open source, not open standards (no standards body was involved in the making of this protocol!); calling them "open protocols" is essentially a form of openwashing.

    This has been happening way too frequently lately (see also: ATProto), and it really needs to be called out.

  • People said the same thing about "APIs" 10-15 years ago when they were a craze. Everything had to be an API! Doesn't matter whether it made sense or not. It's going to change the world! We're going to have San Francisco events with microbrews for APIs! Everyone's going to publish API frameworks! Let me make api-blog-blog.blogger.blog!

    Blah. Bay Area Tech regularly goes through these bursts of baseless enthusiasm for rehashes of existing technology. Is MCP useful? Yeah, probably. Is the current froth based on anything but SF-scene self-licking-ice-cream-cone social cred? No.

  • Fun writing, and something to think about. To me, Web 2.0 is kind of a joke; jQuery, REST, AJAX, CSS2, RSS, single page apps were going to change everything overnight, it was THE buzzword, and then... incremental improvements. In retrospect, everything did change, but that loose collection of technologies was just links in the chain of incremental progress. So yeah, Web 2.0 2.0 makes sense.

    I've seen a lot of talk around here, and everywhere, about MCP. A lot of enthusiasm and a lot of criticism. I've written a few MCP servers and plan to write some more. It isn't how I would design a protocol, but it works, and everyone is using it, so hooray for interoperability.

    I think the hype represents the wider enthusiasm that people have about this moment in time, and the transformative power of these new tools. It's easy to look at MCP and say there it is, now it's easy to connect these tools to the things that I care about, it feels accessible, and there's community.

    MCP is NOT the new TCP for AI. It is, essentially, an RPC layer for chat. And while it could be extended to a wider variety of use cases, I expect that it will remain primarily a method for wiring up tool calls for user-facing use cases. We recognize the power of these tools and anticipate deep changes to workflows and systems, but we don't know how that will shake out.

    If I am writing a classifier for a backend system, I am not going to use MCP, even if I could, because it's inefficient. Every additional tool we offer the model consumes tokens and increases latency. I expect that the primary user of LLMs is going to be business automation of all kinds, and I don't expect them to reach for MCP to wire things up.

    Yeah, it's really cool to hook tools up to the chat, for that to feel accessible, to know how to do things in an idiomatic and standards-compliant way. That feels good! And yeah, the hype is overblown.
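
    "An RPC layer for chat" is fairly literal: MCP speaks JSON-RPC 2.0 underneath. A toy sketch of a tools/call exchange follows; the message shape (method name, content blocks in the result) follows the spec, but the `get_weather` tool and its result text are invented for illustration:

```python
import json

# What a client sends when the model decides to use a tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Austin"}},
}

def handle(req: dict) -> dict:
    """Toy dispatch for one tool. Real servers also answer 'initialize'
    and 'tools/list', which is how the client learns what exists."""
    params = req.get("params", {})
    if req["method"] == "tools/call" and params.get("name") == "get_weather":
        text = f"Sunny in {params['arguments']['city']}"
        # Results come back as content blocks, i.e. more chat text:
        # every tool wired in costs tokens in both directions.
        return {
            "jsonrpc": "2.0",
            "id": req["id"],
            "result": {"content": [{"type": "text", "text": text}]},
        }
    return {"jsonrpc": "2.0", "id": req["id"],
            "error": {"code": -32601, "message": "method not found"}}

print(json.dumps(handle(request)))
```

    That content-blocks-back-into-context shape is exactly why a backend classifier wouldn't bother: a direct function call skips the round trip through chat.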

  • I really wish we'd learn from Web 2.0.

    All the mistakes of "hey, everything has an API now" that we learned from, we're back to repeating.

    I feel like that meme from Watchmen with the blue guy sitting on Mars.

  • Rent-seeking is the name of the game for much of B2B SaaS.

    MCP is an attempt to make that easy, but the issue here is that a lot of the companies offering integration could be disintermediated entirely by LLMs. Hard to say what that means.

  • MCP is a brilliant way to wrap pre-existing APIs with LLM interpretation, and in a way it marks standard REST APIs as outdated; soon you'll see language calling them "legacy".

    What could just be a simple API call, or a set of API calls combined into a single response, now "must" be wrapped in an MCP server, ensuring that LLM providers effectively take a cut of every API call.
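
    A sketch of the wrapping being described. The endpoint and tool names are illustrative, and the underlying REST call is stubbed out; the `inputSchema` field is the shape MCP uses to advertise tools:

```python
def get_order_status(order_id: str) -> str:
    # Stand-in for a single REST call like GET /orders/{order_id}
    return "shipped"

# Direct use: one call, one answer, no intermediary.
assert get_order_status("A123") == "shipped"

# MCP-wrapped: the same call, plus a tool schema the model must read
# (and pay tokens for) before it can make it.
TOOL = {
    "name": "get_order_status",
    "description": "Look up the shipping status of an order.",
    "inputSchema": {
        "type": "object",
        "properties": {"order_id": {"type": "string"}},
        "required": ["order_id"],
    },
}

def call_tool(name: str, arguments: dict) -> dict:
    """Dispatch a tool call and wrap the answer as MCP content blocks."""
    if name == TOOL["name"]:
        return {"content": [{"type": "text",
                             "text": get_order_status(**arguments)}]}
    raise ValueError(f"unknown tool {name}")
```

    Same data either way; the MCP layer just inserts the model (and whoever hosts it) between the caller and the API.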