Just when the dust was settling on my recent post "MCP: The Ultimate API Consumer (Not the API Killer)", the landscape shifted dramatically. In a fascinating development that aligns with the evolution discussed in that piece, OpenAI has announced full support for MCP across its product line.

Sam Altman confirmed that MCP support is immediately available in OpenAI's Agents SDK, with support for ChatGPT's desktop app and the Responses API coming soon. As quoted in VentureBeat: "We're entering the protocol era of AI. This is how agents will actually do things."

The Convergence Has Begun

This isn't just another tech announcement—it's the beginning of a significant consolidation around standards in the AI industry. With both Anthropic and OpenAI now backing the protocol, we're seeing the emergence of a common language for AI-tool interactions that will benefit the entire ecosystem.

Microsoft has also thrown its weight behind MCP, releasing a Playwright-MCP server that allows AI assistants like Claude to browse the web and interact with sites using the Chrome accessibility tree. This collaboration demonstrates how quickly major tech players are aligning around MCP as a standard.

What makes this particularly fascinating is how quickly it's happening. The industry seems to be converging on what many in the API space have observed: MCP isn't replacing APIs—it's amplifying their usage and effectiveness by creating standardized ways for AI to consume them.
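To make "standardized ways for AI to consume APIs" concrete: MCP messages are JSON-RPC 2.0, so every tool invocation looks the same on the wire regardless of which vendor's agent sends it. Here's a minimal sketch of building a tools/call request; the method name and parameter shape follow the MCP specification, while the tool name and arguments (`get_weather`, `city`) are hypothetical examples.

```python
import json


def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })


msg = make_tool_call(1, "get_weather", {"city": "Austin"})
print(msg)
```

Because the envelope is identical for every server, any MCP-capable client, whether from Anthropic, OpenAI, or elsewhere, can call any MCP-exposed tool without bespoke integration code.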

OAuth2 Support: Enterprise-Grade Security Arrives

One of the key concerns I highlighted in my original article was security—specifically the expanded "blast radius" of potential vulnerabilities when AI agents interact with systems on our behalf. The MCP community has moved quickly to address this.

The recent addition of OAuth 2.1 support to the MCP specification may prove to be a crucial factor in enterprise adoption. MCP's authorization specification now implements OAuth 2.1 with appropriate security measures for both confidential and public clients, enabling secure authentication between clients and restricted servers.

This matters because it transforms MCP from an interesting technical experiment into something that can be deployed in production environments with proper security controls. OAuth 2.1 provides the authorization layer that enterprises require before they'll consider integrating AI systems with their critical business data.
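One concrete consequence of adopting OAuth 2.1 rather than plain OAuth 2.0 is that PKCE (RFC 7636) becomes mandatory for all clients, including public ones like a desktop MCP client that can't keep a secret. Here's a minimal sketch of generating the PKCE verifier/challenge pair using the S256 method, standard Python stdlib only:

```python
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes base64url-encoded without padding -> 43 chars,
    # the minimum verifier length the RFC allows.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # The challenge is the SHA-256 of the verifier, base64url-encoded without padding.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge
```

The client sends the challenge with the authorization request and the verifier with the token exchange, so an intercepted authorization code is useless on its own. That property is exactly what shrinks the "blast radius" when an AI agent is the one holding the credentials.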

What This Means for API Ecosystems

The rapid adoption we're seeing suggests that MCP is indeed functioning as an API consumer rather than an API replacement. MCP's growing success appears to be driving more API traffic as AI agents tap into more services through standardized connections.
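The "consumer, not replacement" dynamic is easy to see in how a typical MCP server is built: each tool is a thin adapter over an existing REST endpoint, so every agent tool call becomes an API call underneath. A minimal sketch of that routing layer is below; the tool names and endpoint paths are hypothetical, and a real server would add authentication and actually perform the HTTP request.

```python
# Hypothetical mapping from MCP tool names to the REST calls they proxy.
TOOL_ROUTES = {
    "list_orders": ("GET", "/v1/orders"),
    "create_order": ("POST", "/v1/orders"),
}


def route_tool_call(tool_name: str, arguments: dict) -> dict:
    """Translate an MCP tool invocation into the REST request it would issue."""
    method, path = TOOL_ROUTES[tool_name]
    request = {"method": method, "path": path}
    if method == "GET":
        request["query"] = arguments   # GET arguments become query parameters
    else:
        request["body"] = arguments    # write operations carry a JSON body
    return request
```

Every call an agent makes through this layer lands on the underlying API, which is why broader MCP adoption should show up as more API traffic, not less.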

Let's think about what this means for different players in the ecosystem.

Local vs. Remote: The Next Evolution

While most of the initial MCP implementations focused on local connections (running servers on your own machine), we're already seeing movement toward remote MCP servers. Cloudflare just announced support for building and deploying remote MCP servers, creating the opportunity to reach users who aren't going to install and run MCP servers locally.

This transition from local to remote MCP connections mirrors the evolution we saw from desktop software to web-based applications. It's a necessary step to reach mainstream adoption, and the OAuth 2.1 support we discussed earlier becomes even more critical in this context.
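What makes the local-to-remote shift tractable is that the protocol payload doesn't change, only the transport carrying it does. The sketch below illustrates that separation with an abstract transport interface; both implementations are simplified stand-ins (a real stdio transport would use subprocess pipes, and a real HTTP transport would perform network I/O and attach an OAuth bearer token), and the URL is a placeholder.

```python
import json
from abc import ABC, abstractmethod


class Transport(ABC):
    """Carries a JSON-RPC message to an MCP server; the payload is transport-agnostic."""

    @abstractmethod
    def send(self, message: dict) -> dict: ...


class StdioTransport(Transport):
    """Local case: in reality, write to a child process's stdin and read its stdout."""

    def send(self, message: dict) -> dict:
        wire = json.dumps(message)   # serialize exactly as it would cross the pipe
        return json.loads(wire)      # echoed back here for illustration only


class HTTPTransport(Transport):
    """Remote case: in reality, POST the same payload to the server's endpoint."""

    def __init__(self, url: str):
        self.url = url               # e.g. a Cloudflare-hosted MCP endpoint

    def send(self, message: dict) -> dict:
        wire = json.dumps(message)   # identical serialization; only the carriage differs
        return json.loads(wire)
```

Because client code targets `Transport` rather than a specific carriage, swapping a locally installed server for a remote one is a one-line configuration change, which is precisely where the OAuth 2.1 layer slots in.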