
OpenAI Apps SDK with Gram Functions

Deploy your first ChatGPT App

This walkthrough shows how to build and deploy an interactive ChatGPT App using the OpenAI Apps SDK backed by Gram Functions. You’ll learn how to:

  • Create a ChatGPT Apps SDK example and use it in your ChatGPT client
  • Serve MCP tools and resources from a hosted Gram Functions MCP Server

The finished experience lets a user ask for pizza recommendations, watch ChatGPT drop pins onto an interactive map, and keep the conversation flowing—all without leaving the chat.

Pizza Map widget rendered inside ChatGPT

If you’re new to Apps, the official Apps SDK docs provide a great overview of how connectors show up inside the product. You can view the full code for this pizza example starting at the pizza-app-gram README.

The OpenAI Apps SDK gives you a fast way to package tools, resources, and UI widgets for ChatGPT. Gram Functions provide the same ergonomic developer experience for deployments and observability. In this guide we’ll rebuild the Pizza Map sample from the official Apps SDK examples and ship it as a hosted Gram MCP server so you can light it up inside ChatGPT in a few minutes.

Enable ChatGPT developer mode

Custom connectors only appear after you turn on developer mode in ChatGPT:

  • Open ChatGPT.
  • Navigate to Settings → Apps & Connectors → Advanced Settings.
  • Toggle Developer mode on.

Once enabled you can install and test the Pizza Map app directly in your ChatGPT sidebar.

What you’ll do

  • Inline the Pizza Map widget so it can be shipped as a Gram function bundle
  • Deploy the packaged function to Gram
  • Implement the MCP server that exposes both tools and HTML widget resources
  • Install your new MCP server in ChatGPT and use it as an app

Prerequisites

  • Node.js 22.18+ and pnpm
  • Gram CLI - install via the official guide
  • The OpenAI Apps SDK CLI and an API key that can create ChatGPT apps
  • Basic familiarity with MCP concepts (tools, resources)

Project layout

Clone the Speakeasy fork for the OpenAI pizza app sample and install dependencies:

```shell
git clone https://github.com/speakeasy-api/openai-apps-sdk-examples.git
cd openai-apps-sdk-examples/pizzaz_server_node/pizza-app-gram
pnpm install
```

From here you’ll see two sibling directories:

| Path | Purpose |
| --- | --- |
| `pizzaz_server_node/src` | Source for the MCP server plus the bundled JS/HTML/CSS widget templates it serves locally |
| `pizzaz_server_node/pizza-app-gram` | Thin wrapper that knows how to package and deploy that server to Gram Functions |

Inline the widget assets

The base Pizza Map example expects you to host the widget’s JS/CSS/HTML from a separate asset server. Gram Functions can proxy that setup, but to keep deployment simple we’ll inline everything into a static blob the MCP server can serve directly. The project ships an `inline:app` script that snapshots the Pizza Map React UI into a `widget-templates.ts` module:

```shell
pnpm inline:app
```

Under the hood, `scripts/build-inlined.ts` walks the Pizza Map web assets, minifies them, and writes a `WIDGET_HTML_TEMPLATES` map that the MCP server can read without making additional network calls. Re-run the script whenever you touch the UI.
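The generated module is essentially a map from widget id to a self-contained HTML string. A minimal sketch of the shape `widget-templates.ts` ends up with (the keys and markup here are illustrative, not the real minified output):

```typescript
// Sketch of the generated widget-templates.ts module (illustrative values).
// The real file is produced by `pnpm inline:app` and contains the minified
// Pizza Map CSS and JS inlined directly into the HTML string.
export const WIDGET_HTML_TEMPLATES: Record<string, string> = {
  pizzaz: `<!doctype html>
<html>
  <head>
    <style>/* minified widget CSS inlined here */</style>
  </head>
  <body>
    <div id="pizzaz-root"></div>
    <script type="module">/* minified widget JS inlined here */</script>
  </body>
</html>`,
};
```

Because everything is baked into one string, the MCP server can return the widget as a plain resource with no asset server in the loop.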

Build the MCP server

All of the interesting Apps SDK wiring happens inside pizzaz_server_node/src/mcp-server.ts. The sample already sets up everything you need—no edits required—but it helps to understand how the tool and widget are structured. The module defines a single pizza-map tool, the HTML resource that backs the widget, and enough metadata for ChatGPT to know it can render inline UI. Here’s the core of that file:

pizzaz_server_node/src/mcp-server.ts
```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";
import { z } from "zod";
import { WIDGET_HTML_TEMPLATES } from "./widget-templates.ts";

const widgets = [
  {
    id: "pizza-map",
    title: "Show Pizza Map",
    templateUri: "ui://widget/pizza-map.html",
    invoking: "Hand-tossing a map",
    invoked: "Served a fresh map",
    html: WIDGET_HTML_TEMPLATES.pizzaz,
    responseText: "Rendered a pizza map!",
  },
];

const toolInputParser = z.object({ pizzaTopping: z.string() });

export function createPizzazServer(): Server {
  const server = new Server(
    { name: "pizzaz-node", version: "0.1.0" },
    { capabilities: { resources: {}, tools: {} } }
  );

  server.setRequestHandler(ListToolsRequestSchema, async () => ({
    tools: widgets.map((widget) => ({
      name: widget.id,
      description: widget.title,
      inputSchema: {
        type: "object",
        properties: {
          pizzaTopping: {
            type: "string",
            description: "Topping to mention in the widget.",
          },
        },
        required: ["pizzaTopping"],
      },
      _meta: {
        "openai/outputTemplate": widget.templateUri,
        "openai/resultCanProduceWidget": "true",
        "openai/widgetAccessible": "true",
      },
    })),
  }));

  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    const args = toolInputParser.parse(request.params.arguments ?? {});
    const widget = widgets.find((w) => w.id === request.params.name);
    if (!widget) {
      throw new Error(`Unknown tool: ${request.params.name}`);
    }
    return {
      content: [{ type: "text", text: widget.responseText }],
      structuredContent: { pizzaTopping: args.pizzaTopping },
      _meta: {
        "openai/outputTemplate": widget.templateUri,
        "openai/toolInvocation/invoking": widget.invoking,
        "openai/toolInvocation/invoked": widget.invoked,
      },
    };
  });

  return server;
}
```

Wrap the MCP server with Gram

Gram just needs a default export of a withGram-wrapped server. The pizza-app-gram sub-project keeps that glue tiny:

pizza-app-gram/src/mcp.ts
```typescript
import { createPizzazServer } from "../../src/mcp-server.ts";

export const server = createPizzazServer();
```
pizza-app-gram/src/gram.ts
```typescript
import { withGram } from "@gram-ai/functions/mcp";
import { server } from "./mcp.ts";

export default withGram(server);
```

Because withGram speaks MCP over stdio automatically, you don’t have to write an HTTP transport or worry about session stickiness. Gram handles the hosting, authentication, and scaling for you once you push the bundle.

Build and deploy to Gram

```shell
pnpm build   # runs `gf build`, producing dist/functions.js
gram auth    # once per machine
pnpm push    # wraps `gf push` to upload the bundle
```

After the push completes, your Gram project exposes a hosted MCP endpoint. Copy the connection string (or hosted URL) from the Gram dashboard; you’ll need it when you register the connector in ChatGPT.
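Before wiring anything up in ChatGPT, you can sanity-check the hosted endpoint with a raw JSON-RPC `initialize` request. This is a sketch: the URL is a placeholder for your real hosted MCP URL, and the protocol version shown is just one recent MCP revision.

```shell
# Optional sanity check: POST an MCP initialize request to the hosted endpoint.
# Substitute the hosted MCP URL from your Gram dashboard for this placeholder.
MCP_URL="https://example.invalid/mcp"

INIT_PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.0.1"}}}'

# Streamable HTTP MCP servers speak JSON-RPC over POST and may answer with
# plain JSON or an SSE stream, so advertise both in the Accept header.
curl -sS "$MCP_URL" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d "$INIT_PAYLOAD" || echo "request failed (placeholder URL)"
```

A successful response includes the server’s name and capabilities, which confirms the bundle is live before you touch the dashboard.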

Create a Gram Toolset and MCP Server

Open the Gram dashboard and wire everything together:

  1. Create a new Toolset (e.g., “Pizza Map”).
  2. Add the pizza-map tool that shipped with your deployment.
  3. Attach the pizza-map resource so ChatGPT can fetch the widget HTML.

Adding the Pizza App tool to your toolset

Adding the Pizza Map resource to the toolset

With the toolset configured, publish the MCP server and make it public so ChatGPT can reach it over HTTPS.

Creating a public Pizza MCP server

Add your MCP Server to ChatGPT

In ChatGPT, navigate to Settings → Apps & Connectors, click Create, and register a new connector that points to the public Gram MCP endpoint URL you just created.

Adding the Pizza MCP server inside ChatGPT
