Using OpenAI's Agents SDK with Gram-hosted MCP servers

The OpenAI Agents SDK is a production-ready framework for building agentic AI applications. It provides advanced features like persistent sessions, agent handoffs, guardrails, and comprehensive tracing for complex workflows.

When combined with Gram-hosted MCP servers, the Agents SDK enables you to build sophisticated agents that can interact with your APIs, databases, and other services through natural language conversations with persistent context.

This guide shows you how to connect the OpenAI Agents SDK to a Gram-hosted MCP server using Taskmaster, a full-stack CRUD application for task and project management. Taskmaster includes a web UI for managing projects and tasks, a built-in HTTP API, OAuth 2.0 authentication, and a Neon PostgreSQL database for storing data. Try the demo app to see it in action.

You’ll learn how to set up the connection, configure agents with MCP tools, and build conversational task management workflows.

Understanding OpenAI API options

OpenAI provides three main approaches for integrating with MCP servers:

  • The Responses API: An API with a simple request-response pattern, ideal for basic tool calling and quick integrations.
  • The Agents SDK (this guide): An advanced agent framework with sessions, handoffs, and persistent context that is perfect for complex conversational workflows.
  • ChatGPT Connectors: Direct ChatGPT integration for end users via a web UI.

If you’re unsure which approach fits your needs, start with the Responses API guide for simpler implementations or try ChatGPT Connectors for a web UI solution.

Prerequisites

To follow this tutorial, you need:

  • An OpenAI account and API key
  • A Python environment for running the code examples
  • A Gram account for creating and hosting the MCP server

Creating a Taskmaster MCP server

Before connecting the OpenAI Agents SDK to a Taskmaster MCP server, you first need to create one.

Follow the guide to creating a Taskmaster MCP server, which walks you through:

  • Setting up a Gram project with the Taskmaster OpenAPI document
  • Getting a Taskmaster API key from your instance
  • Configuring environment variables
  • Publishing your MCP server with the correct authentication headers

Once you have your Taskmaster MCP server configured, return here to connect it to the OpenAI Agents SDK.

Connecting Agents SDK to your Gram-hosted MCP server

The OpenAI Agents SDK supports MCP servers through the HostedMCPTool class. Here’s how to connect to your Gram-hosted MCP server:

Installation

First, install the required packages:
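If you're working in Python, the openai-agents package provides the Agents SDK (the snippets in this guide assume it):

```bash
pip install openai-agents
```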

Set your OpenAI API key and Taskmaster credentials:
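For example, as environment variables in your shell. The TASKMASTER_API_KEY name is just the convention used by the snippets in this guide:

```bash
export OPENAI_API_KEY="sk-..."
export TASKMASTER_API_KEY="your-taskmaster-api-key"
```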

:::note[Code Examples]
Throughout this guide, replace your-taskmaster-slug in the MCP server URL with your actual server slug, and update the header names to match your server configuration from the guide to creating a Taskmaster MCP server.
:::

Basic connection (public server)

Here’s a basic example using a public Gram MCP server:
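The following sketch assumes the Python openai-agents package and the placeholder slug your-taskmaster-slug; swap in your own server URL:

```python
import asyncio

from agents import Agent, HostedMCPTool, Runner

agent = Agent(
    name="Task Assistant",
    instructions="You help users manage their tasks and projects.",
    tools=[
        HostedMCPTool(
            tool_config={
                "type": "mcp",
                "server_label": "taskmaster",
                "server_url": "https://app.getgram.ai/mcp/your-taskmaster-slug",
                "require_approval": "never",
            }
        )
    ],
)


async def main() -> None:
    result = await Runner.run(agent, "List all my projects.")
    print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```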

Authenticated connection

For authenticated Gram MCP servers, include the appropriate authentication headers. The exact format varies by MCP server.
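The sketch below assumes the Taskmaster server expects an MCP-TASKMASTER-API-KEY header, matching the configuration from the Taskmaster guide; substitute whatever header names your server defines:

```python
import os

from agents import Agent, HostedMCPTool

agent = Agent(
    name="Task Assistant",
    instructions="You help users manage their tasks and projects.",
    tools=[
        HostedMCPTool(
            tool_config={
                "type": "mcp",
                "server_label": "taskmaster",
                "server_url": "https://app.getgram.ai/mcp/your-taskmaster-slug",
                # The header name comes from your Gram server configuration.
                "headers": {
                    "MCP-TASKMASTER-API-KEY": os.environ["TASKMASTER_API_KEY"],
                },
                "require_approval": "never",
            }
        )
    ],
)
```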

Understanding the configuration

Each parameter in the tool_config does the following:

  • type: "mcp" identifies this as an MCP tool.
  • server_label sets a unique identifier for your MCP server.
  • server_url sets your Gram-hosted MCP server URL.
  • headers passes authentication headers (optional for public servers).
  • require_approval controls tool call approval behavior.

Advanced agent features

The Agents SDK provides several advanced features that go beyond simple tool calling.

Persistent sessions

Unlike the Responses API, agents maintain conversation history automatically:
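As a sketch, the SDK's SQLite-backed sessions store history under a session ID; agent here is the authenticated agent defined above:

```python
from agents import Runner, SQLiteSession

# History is persisted per session ID in a local SQLite file.
session = SQLiteSession("task-planning", "conversations.db")


async def plan_tasks() -> None:
    first = await Runner.run(
        agent, "Create a project called 'Website Redesign'.", session=session
    )
    print(first.final_output)

    # "That project" resolves because the session carries the earlier turns.
    followup = await Runner.run(
        agent,
        "Add a task to that project for drafting the homepage copy.",
        session=session,
    )
    print(followup.final_output)
```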

Tool approval workflows

For production environments, you can implement approval workflows:
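A sketch using the hosted MCP approval callback: require_approval is set to always, and a callback decides per tool call. Check the SDK reference for the exact approval types in your version:

```python
from agents import (
    Agent,
    HostedMCPTool,
    MCPToolApprovalFunctionResult,
    MCPToolApprovalRequest,
)


def approve_tool_call(request: MCPToolApprovalRequest) -> MCPToolApprovalFunctionResult:
    # Let read-only tools run automatically; block anything that writes data.
    if request.data.name.startswith("taskmaster_get"):
        return {"approve": True}
    return {"approve": False, "reason": "Write operations require human review."}


agent = Agent(
    name="Task Assistant",
    instructions="You help users manage their tasks and projects.",
    tools=[
        HostedMCPTool(
            tool_config={
                "type": "mcp",
                "server_label": "taskmaster",
                "server_url": "https://app.getgram.ai/mcp/your-taskmaster-slug",
                "require_approval": "always",
            },
            on_approval_request=approve_tool_call,
        )
    ],
)
```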

Error handling and retries

The Agents SDK provides built-in error handling:
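One approach is to catch the SDK's typed exceptions around Runner.run. This sketch assumes the agents.exceptions module and the max_turns parameter; names may vary slightly between versions:

```python
from agents import Runner
from agents.exceptions import AgentsException, MaxTurnsExceeded


async def run_with_handling(prompt: str) -> str | None:
    try:
        # max_turns caps how many model/tool round-trips a single run may take.
        result = await Runner.run(agent, prompt, max_turns=10)
        return result.final_output
    except MaxTurnsExceeded:
        print("The agent hit the turn limit before finishing.")
    except AgentsException as exc:
        print(f"Agent run failed: {exc}")
    return None
```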

Building conversational workflows

The following complete example demonstrates building a conversational task management workflow:
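This sketch combines the authenticated tool configuration with a session and a few hypothetical prompts; adjust the slug, header name, and instructions for your setup:

```python
import asyncio
import os

from agents import Agent, HostedMCPTool, Runner, SQLiteSession

taskmaster_tool = HostedMCPTool(
    tool_config={
        "type": "mcp",
        "server_label": "taskmaster",
        "server_url": "https://app.getgram.ai/mcp/your-taskmaster-slug",
        "headers": {"MCP-TASKMASTER-API-KEY": os.environ["TASKMASTER_API_KEY"]},
        "require_approval": "never",
    }
)

agent = Agent(
    name="Taskmaster Assistant",
    instructions=(
        "You are a project management assistant. Use the Taskmaster tools to "
        "create, list, and update projects and tasks on the user's behalf."
    ),
    tools=[taskmaster_tool],
)


async def main() -> None:
    # One session keeps the whole conversation in context across prompts.
    session = SQLiteSession("demo-conversation", "conversations.db")
    prompts = [
        "Create a project called 'Q3 Launch'.",
        "Add three tasks to it: write the announcement, update the docs, and brief support.",
        "Show me everything in that project.",
    ]
    for prompt in prompts:
        result = await Runner.run(agent, prompt, session=session)
        print(f"> {prompt}\n{result.final_output}\n")


if __name__ == "__main__":
    asyncio.run(main())
```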

Tool filtering and permissions

You can control which tools are available to your agent:
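One way is the allowed_tools filter in the hosted tool configuration, sketched below; the tool names are assumptions until you confirm them against your own server, as noted next:

```python
from agents import Agent, HostedMCPTool

read_only_taskmaster = HostedMCPTool(
    tool_config={
        "type": "mcp",
        "server_label": "taskmaster",
        "server_url": "https://app.getgram.ai/mcp/your-taskmaster-slug",
        # Expose only read operations to this agent; omit allowed_tools to expose everything.
        "allowed_tools": ["taskmaster_get_tasks", "taskmaster_get_projects"],
        "require_approval": "never",
    }
)

reporting_agent = Agent(
    name="Reporting Assistant",
    instructions="Answer questions about existing projects and tasks. Never modify data.",
    tools=[read_only_taskmaster],
)
```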

For example, Taskmaster MCP servers provide tools like taskmaster_get_tasks, taskmaster_create_task, taskmaster_delete_task, taskmaster_get_projects, and taskmaster_create_project. The exact tool names depend on your MCP server configuration.

Testing your integration

Validating MCP server connectivity

Before building complex workflows, test your Gram MCP server in the Gram Playground to ensure the tools work correctly.

Screenshot of the Gram Playground testing Taskmaster tools

Using the MCP Inspector

You can also use the MCP Inspector to test your Gram MCP server:
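Run the Inspector from a terminal (it requires Node.js and opens a browser UI):

```bash
npx @modelcontextprotocol/inspector
```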

When the browser opens:

  • In the Transport Type field, select Streamable HTTP (not the default stdio).
  • Enter your server URL: https://app.getgram.ai/mcp/your-taskmaster-slug.
  • For authentication, add API Token Authentication:
    • Header name: MCP-TASKMASTER-API-KEY
    • Bearer token: Your Taskmaster API key
  • Click Connect to test the connection.

Note: Taskmaster servers use custom authentication headers that may not be fully supported by the standard MCP Inspector interface. For reliable testing, use the Gram Playground or the code examples in this guide.

Screenshot of the MCP Inspector connecting to a Taskmaster MCP server

Debugging agent interactions

The Agents SDK provides built-in tracing for debugging:
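A sketch that groups related runs under one trace using the SDK's trace context manager; the grouped trace then appears in the OpenAI dashboard's Traces view:

```python
from agents import Runner, trace


async def debug_workflow() -> None:
    # Everything inside this block is grouped under a single trace.
    with trace("Taskmaster workflow"):
        created = await Runner.run(agent, "Create a task called 'Review launch checklist'.")
        listed = await Runner.run(agent, "List my open tasks.")
    print(created.final_output)
    print(listed.final_output)
```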

What’s next

You now have the OpenAI Agents SDK connected to your Gram-hosted MCP server with advanced task management capabilities.

The Agents SDK’s features, like persistent sessions, approval workflows, and built-in error handling, make it ideal for building production-ready conversational agents that can handle complex workflows.

Ready to build your own MCP server? Try Gram today and see how easy it is to turn any API into agent-ready tools that work with both OpenAI and Anthropic models.
