Real-world MCP: How companies are using MCP in production today

When Anthropic released the Model Context Protocol (MCP) in November 2024, early adopters wasted no time experimenting. But here’s the thing: Instead of being stuck as a demo on GitHub, MCP quietly started showing up in real-world production systems.

To understand the real impact of MCP, we investigated how companies are implementing it in production, from Zed building its Agent Panel around MCP from day one, to Solana developers using MCP to analyze DeFi activity through conversational interfaces, to Stripe using MCP to monetize AI tools, and more.

Here’s what we learned.

Block’s MCP implementation

Between Cash App and Square, Block handles billions of dollars in payments and processes transactions for over four million merchants.

Block’s AI agent goose (which uses Claude in Databricks as its default model) started out as a coding assistant. Now that goose can use MCP to connect to Block’s internal systems, the tool powers work across the entire company. About 4,000 of Block’s 10,000 employees actively use goose across 15 different job roles, from sales and design to customer success and operations.

The MCP-enabled system democratizes data access at Block: Employees who don’t know SQL can solve their own data problems by describing what they need in plain English. Security analysts create detection rules by describing threats naturally instead of wrestling with complex query syntax.

Block developers can build an MCP server for any tool and instantly make it available to AI agents. As the setup uses OAuth with short-lived credentials, employees don’t need to manage API keys.
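
Block’s internal servers aren’t public, but the basic shape of such a server is simple. Here is a minimal sketch using the official MCP TypeScript SDK; the lookup_merchant tool and its response are hypothetical stand-ins for an internal API:

```typescript
// Hypothetical sketch: exposing an internal lookup tool over MCP.
// Uses the official TypeScript SDK; the tool itself is illustrative.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "internal-tools", version: "1.0.0" });

// Register a tool the AI agent can call by name with typed arguments.
server.tool(
  "lookup_merchant",
  { merchantId: z.string().describe("Internal merchant identifier") },
  async ({ merchantId }) => {
    // A real server would call an internal API here, authenticated with
    // short-lived OAuth credentials, instead of returning canned text.
    return {
      content: [{ type: "text", text: `Merchant ${merchantId}: active, 12 locations` }],
    };
  }
);

// stdio transport: the MCP client (an agent like goose) spawns this process
// and exchanges JSON-RPC messages over stdin/stdout.
await server.connect(new StdioServerTransport());
```

Any MCP-capable agent that launches this process over stdio can discover the tool and call it, with no client-specific integration code.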

Three-quarters of Block’s engineers report saving 8-10 hours per week using goose. The tool has become so effective that one engineer says, “90% of my lines of code are now written by goose.”

Zed makes MCP feel native

The team behind the performance-focused code editor Zed built its Agent Panel around MCP from the start, rather than adding AI features to an existing architecture.

Take Zed’s database integration. You can type /pg-schema users and get the schema for your users table instantly.

But it gets more interesting with Zed’s Neon integration, which lets you safely modify production databases from the Agent Panel.

That’s the kind of workflow that would be nerve-wracking without proper guardrails, but MCP’s stateful sessions make it possible to build safely.

The technical implementation is clean: MCP servers run as separate processes that communicate with Zed’s Agent Panel via stdio, with support for HTTP and server-sent events coming soon. Extension developers can register servers in their extension.toml and implement a simple context_server_command method.

Replit’s MCP playground

Replit  is a cloud-based development environment platform that handles setup and hosting for coding projects. It supports multiple programming languages and lets you run code directly in the browser without any local installations.

Recently, Replit pivoted to agent-powered coding, positioning itself as an AI-first development platform for vibe coding entire applications with zero setup.

Replit’s MCP templates come with pre-configured servers, so you can start a new environment and have working AI integrations without installing or configuring anything yourself.

The Learn About MCP template includes YouTube processing, filesystem access, and external API integration out of the box. The configuration lives in a simple JSON file:
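
The template’s exact file isn’t reproduced here, but MCP client configs generally share the same shape: a JSON object mapping server names to the commands that launch them. An illustrative example rather than Replit’s exact template:

```json
{
  "mcpServers": {
    "youtube": {
      "command": "npx",
      "args": ["-y", "@anaisbetts/mcp-youtube"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```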

What makes this compelling is that you can type something like “Summarize this video https://youtu.be/1qxOcsj1TAg and write the summary to summary.txt” and watch it happen automatically. The AI coordinates between multiple services, fetching the video transcript, processing it, and saving the output – all through standardized MCP interfaces.

Replit Agent takes this multi-service coordination further by automatically integrating services like EmailJS and Mailtrap when you describe what you want to build. Ask for a React app with email functionality, and the agent configures the MCP servers, sets up the integrations, and builds the app. It’s the kind of seamless experience that makes MCP’s complexity worthwhile.

Code intelligence platforms embrace MCP

Sourcegraph and Codeium both see MCP as a way to differentiate their code intelligence platforms. Their implementations show how established companies are using MCP not just for novelty, but for genuine competitive advantage.

Sourcegraph implements MCP through OpenCtx (Sourcegraph’s own standard for external context), with Cody acting as the MCP client. Sourcegraph also has a batch changes MCP server that automates large-scale modifications across repositories, and a React Props server that helps developers understand component usage patterns across entire codebases.

The workflow feels natural: Cody can analyze your database schema to suggest query optimizations, pull GitHub issues directly into your editor context, or search across multiple repositories with semantic understanding. It’s code intelligence that actually understands the broader context of your work.

Codeium took a different approach with the Windsurf IDE. Windsurf Wave 3 includes MCP support, allowing Cascade to connect to multiple servers simultaneously. The configuration is straightforward:
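
As a rough illustration, Windsurf reads its MCP servers from a JSON config (~/.codeium/windsurf/mcp_config.json at the time of writing); the servers below are examples rather than a prescribed setup:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    }
  }
}
```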

Windsurf also features UI panels for managing MCP servers, one-click setup for popular servers, and proper sandboxing. It’s the kind of polish that suggests MCP integration is becoming table stakes for development tools.

Solana blockchain data access

Several developers have built MCP servers that make Solana blockchain data accessible to AI assistants. While you can’t execute transactions, these tools turn complex blockchain queries into simple conversations.

The most documented implementation comes from QuickNode’s comprehensive tutorial. This MCP server handles some typical blockchain analysis tasks: checking wallet balances, retrieving token accounts, examining transaction details, and querying account information. What makes it practical is the integration with Solana Agent Kit, which handles all the RPC complexity behind a clean interface.
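
The tutorial covers much more, but the core pattern is a thin MCP tool wrapping a Solana RPC call. A simplified, read-only sketch (not QuickNode’s exact code) using @solana/web3.js:

```typescript
// Simplified sketch of a read-only Solana MCP tool (illustrative only).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { Connection, PublicKey, clusterApiUrl, LAMPORTS_PER_SOL } from "@solana/web3.js";
import { z } from "zod";

const connection = new Connection(clusterApiUrl("mainnet-beta"));
const server = new McpServer({ name: "solana-tools", version: "0.1.0" });

// A single read-only tool: fetch the SOL balance for a wallet address.
server.tool(
  "get_sol_balance",
  { address: z.string().describe("Base58-encoded wallet address") },
  async ({ address }) => {
    const lamports = await connection.getBalance(new PublicKey(address));
    return {
      content: [{ type: "text", text: `${lamports / LAMPORTS_PER_SOL} SOL` }],
    };
  }
);

await server.connect(new StdioServerTransport());
```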

The Solscan MCP server takes a different approach. It’s built on Solscan’s API infrastructure, giving you access to richer data analysis capabilities, like token metadata (names, symbols, logos), market data from various exchanges, holder distribution analytics, and detailed DeFi activity tracking. The server can tell you what tokens a wallet holds, how those holdings have changed over time, and which DeFi protocols the wallet has interacted with.

Both Solana MCP implementations focus on data access rather than transaction creation. You can ask an AI assistant to “analyze this wallet’s DeFi activity” or “explain what happened in this transaction” and get useful answers without any security risks. The read-only design means you get the benefits of natural language blockchain analysis without worrying about accidental transactions or compromised keys. For most blockchain analysis use cases, this is exactly what you want: insight without risk.

Stripe introduces paid MCP servers

The Stripe Agent Toolkit lets developers create MCP servers that charge for tool usage. The toolkit runs on Cloudflare Workers and handles OAuth authentication.

When an AI agent calls a paid tool, the MCP server checks if the user has already paid. If not, it creates a Stripe checkout session and returns a payment URL. After payment, the tool becomes available.
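
The toolkit packages this flow up for you; under the hood it amounts to roughly the following sketch, written here against plain stripe-node rather than the toolkit’s own API. The hasPaid helper is a hypothetical lookup against your own payment records:

```typescript
// Sketch of the pay-before-use gate a paid MCP tool performs.
// The Stripe Agent Toolkit wraps this pattern; hasPaid() is a hypothetical helper.
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

async function callPaidTool(userId: string, runTool: () => Promise<string>) {
  if (!(await hasPaid(userId))) {
    // Not paid yet: create a Checkout Session and hand back the payment URL.
    const session = await stripe.checkout.sessions.create({
      mode: "payment",
      line_items: [{ price: process.env.TOOL_PRICE_ID!, quantity: 1 }],
      success_url: "https://example.com/paid?session_id={CHECKOUT_SESSION_ID}",
      cancel_url: "https://example.com/cancelled",
    });
    return `Payment required. Complete checkout here: ${session.url}`;
  }
  // Already paid: run the tool as normal.
  return runTool();
}

// Hypothetical lookup against your own records (e.g. populated by Checkout
// webhooks); shown here as a stub.
async function hasPaid(userId: string): Promise<boolean> {
  return false;
}
```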

The system supports one-time payments, subscriptions, and usage-based billing through Stripe’s metering API. Developers can charge per API call, search query, or processing job.
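
For the usage-based case, each successful tool call can report a meter event to Stripe. A minimal sketch, assuming a recent stripe-node version and a billing meter configured with the event name tool_call:

```typescript
// Report one billable tool invocation to a Stripe billing meter.
// Assumes a meter with event_name "tool_call" exists in your Stripe account.
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

async function recordToolUsage(customerId: string) {
  await stripe.billing.meterEvents.create({
    event_name: "tool_call",
    payload: {
      stripe_customer_id: customerId,
      value: "1",
    },
  });
}
```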

The growing MCP ecosystem

The MCP ecosystem has grown to over 15,000 community-built servers, creating network effects: every new server makes every MCP-compatible client more capable.

AWS ecosystem integration

The hundreds of services available from AWS are notoriously complex to navigate. MCP servers simplify this by letting AI assistants work directly with AWS through standard interfaces.

Amazon Bedrock Samples  enables natural language queries to knowledge bases with vector database searches, RAG pipeline integration with automatic document chunking, and multi-modal retrieval for text, images, and structured data.
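
As a rough sketch of what such a knowledge-base tool wraps, here is a retrieval call using the AWS SDK for JavaScript; the knowledge base ID is a placeholder, and this is not the Bedrock Samples implementation itself:

```typescript
// Sketch of a Bedrock knowledge-base retrieval call an MCP tool might wrap.
// The knowledge base ID is a placeholder.
import {
  BedrockAgentRuntimeClient,
  RetrieveCommand,
} from "@aws-sdk/client-bedrock-agent-runtime";

const client = new BedrockAgentRuntimeClient({ region: "us-east-1" });

async function searchKnowledgeBase(question: string) {
  const response = await client.send(
    new RetrieveCommand({
      knowledgeBaseId: "KB_ID_PLACEHOLDER",
      retrievalQuery: { text: question },
    })
  );
  // Each result carries the matched chunk of text plus source metadata.
  return (response.retrievalResults ?? []).map((r) => r.content?.text);
}
```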

The AWS Cloud Development Kit (CDK) MCP server generates infrastructure code from natural language descriptions, creating CloudFormation templates with validation, resource dependency mapping, and cost estimation.

Cost Analysis tools decode AWS billing complexity through usage analysis for optimization recommendations, CloudWatch-based rightsizing suggestions, and cost forecasting with budget alerts.

Database platform innovations

MCP bridges conversational AI with complex data operations, letting developers work with sophisticated databases using natural language instead of learning new APIs.

DataStax Astra DB combines NoSQL with vector search for AI applications; its MCP server handles vector search for knowledge bases, document CRUD operations with schema validation, collection management with automatic scaling, and real-time analytics with aggregation pipelines.

The ClickHouse MCP Server makes high-performance analytics accessible to non-specialists through time-series analysis for metrics and events, columnar storage optimization for analytical workloads, distributed query execution across cluster nodes, and real-time data ingestion with exactly-once processing.

The server handles query optimization automatically, turning natural language questions into optimized SQL that processes billions of events in seconds.
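
In practice, a server like this is a thin layer over the database client: the model supplies or generates the SQL, and the tool executes it and returns rows. A simplified sketch (illustrative only, not the ClickHouse MCP Server’s actual implementation):

```typescript
// Simplified sketch of a read-only query tool over ClickHouse (illustrative).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { createClient } from "@clickhouse/client";
import { z } from "zod";

const clickhouse = createClient({ url: process.env.CLICKHOUSE_URL });
const server = new McpServer({ name: "clickhouse-tools", version: "0.1.0" });

// Execute a SELECT statement supplied (or generated) by the model and
// return the rows as JSON text.
server.tool(
  "run_select_query",
  { sql: z.string().describe("A read-only SELECT statement") },
  async ({ sql }) => {
    const result = await clickhouse.query({ query: sql, format: "JSONEachRow" });
    const rows = await result.json();
    return { content: [{ type: "text", text: JSON.stringify(rows, null, 2) }] };
  }
);

await server.connect(new StdioServerTransport());
```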

Creative tool integration

Creative MCP implementations solve real workflow problems for artists, game developers, and content creators working with complex tools.

BlenderMCP by Siddharth Ahuja removes barriers between creative vision and technical execution. Instead of memorizing hundreds of keyboard shortcuts, users can focus on creativity through 3D scene generation from natural language descriptions, object manipulation with physics simulation, animation creation with keyframe interpolation, and rendering pipeline control with material and lighting setup. The server automatically configures rigid body dynamics, gravity, and collision detection when you say “make these objects fall realistically.”

MCP Unity Editor by Miguel Tomas streamlines game development workflows through game object creation and component management, scene manipulation with hierarchy management, script generation with C# code compilation, and asset pipeline integration with import and export workflows.

The script generation understands Unity’s patterns and conventions, producing properly structured C# code with appropriate component references rather than generic code that won’t work in Unity’s context.

Why companies are choosing MCP

Let’s take a look at the compelling technical and business reasons companies are adopting MCP.

The USB moment for AI tools

Just as the USB standard changed computing by replacing proprietary connectors, MCP standardizes how AI connects to tools.

Without MCP, connecting an AI assistant to Zendesk, Salesforce, and Slack means building three separate integrations from scratch. Each integration needs custom authentication, error handling, and data formatting – like having a different port for every device.

With MCP, you write one server implementation that exposes Zendesk’s capabilities to any AI system that speaks the protocol. Instead of building a custom integration for every pairing of AI client and tool (N x M integrations for N AI clients and M tools), you only need N clients and M servers. For example, five AI agents and 20 internal tools would mean 100 bespoke integrations without MCP, but only 25 MCP components with it. This matters when you’re dealing with dozens of internal tools and multiple AI agents.

Inherent security

MCP creates a single chokepoint for AI access to company systems. Instead of tracking permissions across dozens of custom integrations, connections flow through MCP servers that can be monitored, logged, and controlled consistently.

For example, Gram’s implementation of MCP allows for environment-specific tool access: you can create separate staging and production environments, each with its own credentials and server URL tailored to different access requirements.

Switching models without rebuilding from scratch

Since MCP servers are model-agnostic, companies can experiment with different AI providers without throwing away their tool integrations. This flexibility helps them avoid vendor lock-in, test new models for specific tasks, and lower costs as inference pricing comes down.

Getting started with MCP

If you’re considering MCP for your organization, MCP.so has an extensive list of MCP servers and clients, including servers that may already integrate with your existing tools.

To learn more about MCP and how to try it out, check out our quickstart guide to installing MCP servers.
