Using LangChain with Gram-hosted MCP servers

LangChain and LangGraph support MCP servers through the langchain-mcp-adapters library, which allows you to give your LangChain agents and LangGraph workflows direct access to your tools and infrastructure by connecting to Gram-hosted MCP servers.

This guide demonstrates how to connect LangChain to a Gram-hosted MCP server using an example Push Advisor API. You’ll learn how to create an MCP server from an OpenAPI document, set up the connection, configure authentication, and use natural language to query the example API.

Find the full code and OpenAPI document in the Push Advisor API repository.

Prerequisites

You’ll need:
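  • A Gram account
  • Python 3.10 or later
  • An API key for an LLM provider supported by LangChain (the examples in this guide use OpenAI)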

Creating a Gram MCP server

If you already have a Gram MCP server configured, you can skip to connecting LangChain to your Gram-hosted MCP server. For an in-depth guide to how Gram works and to creating a Gram-hosted MCP server, check out our introduction to Gram.

Setting up a new Gram project

In the Gram dashboard, click New Project to start the guided setup flow for creating a toolset and MCP server.

When prompted, upload the Push Advisor OpenAPI document.

Follow the steps to configure a toolset and publish an MCP server. At the end of the setup, you’ll have a Gram-hosted MCP server ready to use.

For this guide, we’ll use the public server URL: https://app.getgram.ai/mcp/canipushtoprod

For authenticated servers, you’ll need an API key. Generate an API key in the Settings tab.

Connecting LangChain to your Gram-hosted MCP server

LangChain supports MCP servers through the langchain-mcp-adapters library using the MultiServerMCPClient class. Here’s how to connect to your Gram-hosted MCP server:

Installation

First, install the required packages:
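For example, assuming you use OpenAI as the model provider and python-dotenv for environment variables:

```bash
pip install langchain-mcp-adapters langgraph langchain-openai python-dotenv
```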

Environment setup

Set up your environment variables by creating a .env file:
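For example, assuming the variable names used in the rest of this guide (GRAM_API_KEY is only required for authenticated servers):

```bash
# .env
GRAM_API_KEY=your-gram-api-key
OPENAI_API_KEY=your-openai-api-key
```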

Load these in your Python code:
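One way to do this is with python-dotenv:

```python
import os

from dotenv import load_dotenv

load_dotenv()  # Reads variables from the .env file into the process environment

GRAM_API_KEY = os.getenv("GRAM_API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
```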

The examples in the sections that follow use async code. To run them, import asyncio and wrap the code in an async function, as shown below:
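```python
import asyncio


async def main():
    # Paste the async snippets from the following sections here
    ...


if __name__ == "__main__":
    asyncio.run(main())
```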

Basic connection (public server)

Here’s a basic example using a public Gram MCP server with Streamable HTTP transport:
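A minimal sketch, assuming OpenAI’s gpt-4o as the model (any chat model LangChain supports will work), run inside the async wrapper shown above:

```python
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

client = MultiServerMCPClient(
    {
        "gram-pushadvisor": {
            "url": "https://app.getgram.ai/mcp/canipushtoprod",
            "transport": "streamable_http",
        }
    }
)

# Fetch the tools exposed by the Gram-hosted MCP server
tools = await client.get_tools()

# Build a ReAct-style agent that can call the Push Advisor tools
agent = create_react_agent("openai:gpt-4o", tools)

response = await agent.ainvoke(
    {"messages": [{"role": "user", "content": "Can I push to prod today?"}]}
)
print(response["messages"][-1].content)
```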

Authenticated connection

For authenticated Gram MCP servers, include your Gram API key in the headers:
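For example (this sketch assumes the Gram API key is passed as a bearer token; check your server’s settings for the exact scheme):

```python
import os

from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient(
    {
        "gram-pushadvisor": {
            "url": "https://app.getgram.ai/mcp/canipushtoprod",
            "transport": "streamable_http",
            # Assumes a bearer-token scheme for the Gram API key
            "headers": {"Authorization": f"Bearer {os.environ['GRAM_API_KEY']}"},
        }
    }
)
```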

Understanding the configuration

Here’s what each parameter in the MultiServerMCPClient configuration does:

  • The server key (for example, "gram-pushadvisor") provides a unique identifier for your MCP server.
  • url specifies your Gram-hosted MCP server URL.
  • transport: "streamable_http" selects HTTP-based communication for remote servers.
  • headers supplies optional HTTP headers, such as authentication credentials.

Using MCP tools in LangGraph workflows

LangChain MCP tools work smoothly with LangGraph workflows using the ToolNode:
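A sketch of a simple tool-calling graph, reusing the client from the previous examples and assuming an OpenAI chat model:

```python
from langchain.chat_models import init_chat_model
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.prebuilt import ToolNode, tools_condition

tools = await client.get_tools()
model = init_chat_model("openai:gpt-4o").bind_tools(tools)


async def call_model(state: MessagesState):
    # Let the model respond, possibly requesting tool calls
    response = await model.ainvoke(state["messages"])
    return {"messages": [response]}


builder = StateGraph(MessagesState)
builder.add_node("agent", call_model)
builder.add_node("tools", ToolNode(tools))  # Executes any requested MCP tool calls
builder.add_edge(START, "agent")
builder.add_conditional_edges("agent", tools_condition)  # Route to tools or end
builder.add_edge("tools", "agent")
graph = builder.compile()

result = await graph.ainvoke(
    {"messages": [{"role": "user", "content": "Can I push to prod on a Friday?"}]}
)
print(result["messages"][-1].content)
```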

Connecting to multiple MCP servers

LangChain’s MultiServerMCPClient allows you to connect to multiple MCP servers simultaneously:
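For example, combining the Push Advisor server with a second, hypothetical Gram-hosted server:

```python
from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient(
    {
        "gram-pushadvisor": {
            "url": "https://app.getgram.ai/mcp/canipushtoprod",
            "transport": "streamable_http",
        },
        # Hypothetical second Gram-hosted MCP server
        "gram-other": {
            "url": "https://app.getgram.ai/mcp/your-other-server",
            "transport": "streamable_http",
        },
    }
)

# get_tools() aggregates the tools from every configured server
tools = await client.get_tools()
```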

Error handling

Proper error handling ensures your application gracefully handles connection issues:
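A sketch: the exact exceptions raised depend on the transport and server, so this example catches broadly and falls back to an empty tool list:

```python
from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient(
    {
        "gram-pushadvisor": {
            "url": "https://app.getgram.ai/mcp/canipushtoprod",
            "transport": "streamable_http",
        }
    }
)

try:
    tools = await client.get_tools()
except Exception as exc:  # Connection, authentication, or protocol errors
    print(f"Failed to connect to the MCP server: {exc}")
    tools = []

if not tools:
    print("No tools available; check the server URL and API key.")
```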

Working with tool results

When using MCP tools in LangChain, you can access detailed tool call information:
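For example, the message history returned by the agent includes the model’s tool calls (on AI messages) and the corresponding tool results (as tool messages):

```python
from langchain_core.messages import AIMessage, ToolMessage

response = await agent.ainvoke(
    {"messages": [{"role": "user", "content": "Can I push to prod today?"}]}
)

for message in response["messages"]:
    if isinstance(message, AIMessage) and message.tool_calls:
        for tool_call in message.tool_calls:
            print(f"Tool called: {tool_call['name']} with args {tool_call['args']}")
    elif isinstance(message, ToolMessage):
        print(f"Tool result: {message.content}")
```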

Streaming responses

LangChain supports streaming responses with MCP tools:
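For example, streaming the agent’s intermediate steps with astream:

```python
async for chunk in agent.astream(
    {"messages": [{"role": "user", "content": "Can I push to prod today?"}]},
    stream_mode="values",
):
    # Each chunk contains the message history so far; print the latest message
    chunk["messages"][-1].pretty_print()
```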

Using local MCP servers with stdio

LangChain also supports connecting to local MCP servers using stdio transport:
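For example, assuming a local server script at ./my_mcp_server.py (a hypothetical path):

```python
from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient(
    {
        "local-tools": {
            "command": "python",
            # Hypothetical local MCP server script
            "args": ["./my_mcp_server.py"],
            "transport": "stdio",
        }
    }
)

tools = await client.get_tools()
```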

Complete example

Here’s a complete example that demonstrates connecting to a Gram MCP server and using it with LangChain:
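A minimal end-to-end sketch, assuming OpenAI as the model provider and the environment variables described earlier:

```python
import asyncio

from dotenv import load_dotenv
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

load_dotenv()  # Loads OPENAI_API_KEY (and GRAM_API_KEY, if needed)


async def main():
    client = MultiServerMCPClient(
        {
            "gram-pushadvisor": {
                "url": "https://app.getgram.ai/mcp/canipushtoprod",
                "transport": "streamable_http",
                # For authenticated servers, add (assumed bearer scheme):
                # "headers": {"Authorization": f"Bearer {os.environ['GRAM_API_KEY']}"},
            }
        }
    )

    # Discover the tools exposed by the Gram-hosted MCP server
    tools = await client.get_tools()
    print(f"Loaded {len(tools)} tools from the Gram MCP server")

    # Build a ReAct-style agent around the tools
    agent = create_react_agent("openai:gpt-4o", tools)

    response = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Can I push to prod today?"}]}
    )
    print(response["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```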

Differences from other MCP integrations

LangChain’s approach to MCP differs from direct API integrations:

Connection method

  • LangChain uses MultiServerMCPClient with support for multiple servers.
  • OpenAI uses the tools array with type: "mcp" in the Responses API.
  • Anthropic uses the mcp_servers parameter in the Messages API.
  • The Vercel AI SDK uses experimental_createMCPClient with a single server.

Transport support

  • LangChain supports both streamable_http and stdio transports.
  • OpenAI supports direct HTTP and HTTPS connections.
  • Anthropic supports URL-based HTTP connections.
  • The Vercel AI SDK supports SSE, stdio, and custom transports.

Tool management

  • LangChain fetches tools via the get_tools() method, used with agents or workflows.
  • OpenAI allows tool filtering via the allowed_tools parameter.
  • Anthropic uses a tool configuration object with an allowed_tools array.
  • The Vercel AI SDK supports both automatic schema discovery and manual schema definition, and uses activeTools for filtering.

Framework integration

  • LangChain includes deep integration with LangGraph for workflows and chains.
  • Others are limited to direct API usage without workflow abstractions.

Multi-server support

  • LangChain has native support for multiple MCP servers in one client.
  • Others only allow a single server per connection (requiring you to create multiple clients).

Testing your integration

If you encounter issues during integration, follow these steps to troubleshoot:

Validate MCP server connectivity

Before integrating into your application, test your Gram MCP server in the Gram Playground to ensure the tools work correctly.

Use the MCP Inspector

Anthropic provides an MCP Inspector command-line tool that helps you test and debug MCP servers before integrating them with LangChain. You can use it to validate your Gram MCP server’s connectivity and functionality.

To test your Gram MCP server with the Inspector, run this command:
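```bash
npx @modelcontextprotocol/inspector
```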

In the Transport Type field, select Streamable HTTP.

Enter your server URL in the URL field, for example:
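https://app.getgram.ai/mcp/canipushtoprod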

Click Connect to establish a connection to your MCP server.

Screenshot of the MCP Inspector connecting to a Gram MCP server

Use the Inspector to verify that your MCP server responds correctly before integrating it with your LangChain application.

Debug tool discovery

You can debug which tools are available from your MCP server:
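For example, listing each tool’s name and description after connecting:

```python
tools = await client.get_tools()

print(f"Discovered {len(tools)} tools:")
for tool in tools:
    print(f"- {tool.name}: {tool.description}")
```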

Environment setup

Ensure your environment variables are properly configured in your .env file, as described in the Environment setup section above.

Then verify that they load correctly in your application:
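A quick sanity check (assuming the variable names used earlier in this guide):

```python
import os

from dotenv import load_dotenv

load_dotenv()

# Fail fast if a required variable is missing
for var in ("GRAM_API_KEY", "OPENAI_API_KEY"):
    if not os.getenv(var):
        raise RuntimeError(f"Missing required environment variable: {var}")
```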

What’s next

You now have LangChain and LangGraph connected to your Gram-hosted MCP server, giving your agents and workflows access to your custom APIs and tools.

LangChain’s powerful abstractions for agents, chains, and workflows, combined with MCP tools, enable you to build sophisticated AI applications that can interact with your infrastructure.

Ready to build your own MCP server? Try Gram today and see how easy it is to turn any API into agent-ready tools that work with LangChain and all major AI frameworks.
