If you have any kind of SaaS application, your users are probably used to interacting with it through a frontend. But now, instead of clicking through filters and forms, they want to simply ask an AI agent for what they need: “Show me overdue tasks,” or “Create a task for the client meeting tomorrow.”
This guide shows you how to add a chat interface to an existing application. You can give your users natural language interaction, while your current backend, security model, and business logic remain unchanged.
Specifically, we’ll use Gram to create an MCP server that works with your existing API, and build a basic chat modal that interacts with that MCP server using the OpenAI Agents SDK.
An overview of the TaskBoard project
The starting task management app is structured like this:
The app interface before adding chat:
Users interact with the app by clicking, typing, and moving tasks.
The app works, but we want users to be able to chat with their tasks.
Overview of the TaskBoard project with added chat
We’ll add a small chat microservice and popup chat component to the existing app:
Here’s the app interface with the chat component added:
Understanding the data flow of a chat-based task board
For the chat service, we’ll use the OpenAI Agents SDK to connect to the existing TaskBoard API. Here’s the complete flow:
User types “Show me overdue tasks.”
Chat popup sends request to Next.js backend, which forwards it to the chat service.
OpenAI Agents SDK calls the /api/items endpoint.
Response flows back with results formatted as natural language.
TaskBoard UI updates to show the filtered results.
The existing app doesn’t change: Same database, same authentication, same business logic. We’re just adding a new component on top of what’s already built.
Traditional vs chat-enabled workflows
Before the chat functionality is added, users only interact with the app manually:
User clicks “Add Task” → TaskBoard UI → POST /api/items → database → UI updates
User drags task to “Done” → TaskBoard UI → PATCH /api/items → database → UI updates
After users have the option to use the chat popup:
User types “Create a task for the client meeting tomorrow”
Chat UI → /api/chat → FastAPI chat service → OpenAI Agents SDK → MCP tools → POST /api/items → database
TaskBoard UI automatically refreshes and shows the new task
Note that the existing /api/items endpoints don’t change at all. The chat functionality just adds a way to call the same APIs the UI already uses, with an LLM interpreting the user’s intent.
The complete chat-enabled architecture looks like this:
First, get the basic CRUD app (without the chat functionality) running. TaskBoard is already dockerized for easy setup.
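Assuming you’ve cloned the TaskBoard repo, starting it looks something like this (the repo URL and directory name are placeholders for the actual ones):

```bash
# Placeholders: substitute the actual TaskBoard repository URL and directory name
git clone <taskboard-repo-url>
cd taskboard
docker compose up --build
```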
Visit http://localhost:3000, register an account, and create a few tasks. Click the arrows to move tasks between columns. This is the starting point: A working task management app that users interact with manually.
Step 1: Create an MCP server
The MCP server exposes the TaskBoard API operations as tools that AI agents can use.
To create the server, you need to generate an OpenAPI document and upload it to Gram. Gram will use the OpenAPI document to convert the API endpoints into MCP tools and host them as an MCP server. The OpenAI Agents SDK can connect to this server and use the tools to interact with the TaskBoard API.
Generate the OpenAPI document
The TaskBoard app includes a script to generate an OpenAPI document from the JSDoc comments.
Generate the OpenAPI document with the following commands:
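The exact command depends on the scripts defined in the repo’s package.json; assuming an npm script named generate-openapi, it looks roughly like this:

```bash
npm install
npm run generate-openapi   # assumed script name; check package.json for the exact one
```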
This creates the complete API specification, including authentication requirements and data schemas, at public/swagger.json.
The existing JSDoc comments in the API routes (like /api/items/route.ts) provide the structure Gram needs to understand the TaskBoard endpoints.
:::tip
If you’re adding chat functionality to your own CRUD app, you likely already have an OpenAPI document. If not, tools like next-swagger-doc and swagger-jsdoc can generate one from your existing code comments. To learn more about how to generate an OpenAPI document in other frameworks, visit the Speakeasy OpenAPI hub.
:::
Upload the OpenAPI document to Gram
Gram automatically transforms the uploaded OpenAPI document into a hosted MCP server.
If you’re using Gram for the first time:
In the Gram dashboard, click Toolsets in the sidebar (under CREATE).
Click Get Started.
Upload the OpenAPI document (public/swagger.json).
Name the API (for example, “TaskBoard”), the toolset, and the server slug (for example, “taskboard-demo”).
If you’re an existing Gram user:
Go to Toolsets in the sidebar.
In the API Sources section, click + ADD API.
Upload the OpenAPI document (public/swagger.json).
Name the API (for example, “TaskBoard”).
Click Continue.
In the Toolsets section of the Toolsets tab, click + ADD TOOLSET.
In the “Create a toolset” modal, give the toolset a name (for example, “TaskBoard”).
Click Enable All on the TaskBoard toolset page.
Gram parses the uploaded OpenAPI document and converts the endpoints into MCP tools.
In the MCP tab, find the MCP server and set it to Public under Visibility.
In the MCP Installation section, copy the server URL from the "args" array (for example, https://app.getgram.ai/mcp/your-server-id). You’ll use this when you configure the chat microservice in Step 3.
Expose your local API with ngrok
For local development, expose the TaskBoard API so Gram can access it:
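With TaskBoard running locally on port 3000, start a tunnel:

```bash
ngrok http 3000
```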
Copy the public URL from the ngrok terminal output (such as https://abc123.ngrok.io).
In Gram, go to Environments from the sidebar, and select the Default environment. Add the public URL you copied as a TASKBOARD_SERVER_URL environment variable.
:::note[Production deployment]
This guide uses ngrok for local development testing. In production, you’d replace TASKBOARD_SERVER_URL with your actual API URL (like https://api.taskboard.com).
:::
Test the MCP server
To verify that the MCP server has been configured correctly, navigate to the Playground from the sidebar (under CONSUME) and select the TaskBoard toolset.
Try queries like “Show me all my tasks” or “Create a task called ‘Review quarterly reports.’”
:::note
In the Playground, you’ll need to provide a userId or ask the agent to use authentication tools. The chat integration will handle user authentication automatically.
:::
Step 2: Add a chat popup to the frontend
Next, you’ll add a floating chat interface to the TaskBoard app that won’t interfere with existing functionality.
The chat component will appear as a floating button in the lower-right corner of the screen. When clicked, the button expands into a chat window.
Integrate the chat component into the main dashboard:
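As a sketch of the wiring, assuming the chat component is called ChatPopup (the component itself and both import paths here are placeholders; adapt them to your project):

```tsx
"use client";

import TaskBoard from "@/components/TaskBoard"; // hypothetical: your existing board UI
import ChatPopup from "@/components/ChatPopup"; // hypothetical: the chat component

export default function DashboardPage() {
  return (
    <>
      <TaskBoard />
      {/* Floating button in the lower-right corner; expands into the chat window */}
      <ChatPopup />
    </>
  );
}
```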
Install the required dependencies
Install the chat component’s dependencies:
Style the component
The chat component uses custom CSS classes and animations that need to be defined in the global stylesheet.
The complete CSS for the component is in globals.css in the completed-ai-chat branch of the TaskBoard repo.
Add the CSS classes and animations for the chat component from the completed version’s globals.css to your local globals.css file.
Step 3: Create the chat microservice
At this point, you should have a chat interface that looks functional but doesn’t connect to AI yet. The next step is to add the backend to make it work.
You’ll create a small FastAPI service (about 100 lines of code) that handles the AI processing using the OpenAI Agents SDK. Here’s how the service works:
Receives chat messages from the TaskBoard frontend.
Creates an AI agent that accesses the TaskBoard MCP tools via Gram.
The agent decides which tools to use based on user intent.
Returns natural language responses with the results.
The OpenAI Agents SDK handles understanding the user’s intent, choosing which MCP tools to call to achieve it, and generating a response from the tool results.
Set up the chat service
Create the chat service directory:
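The directory name matches the one referenced later in this guide:

```bash
mkdir mcp-agent-service
cd mcp-agent-service
```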
Create the requirements file:
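A minimal requirements.txt for this setup might look like the following (the exact packages and versions in the repo may differ):

```text
fastapi
uvicorn
openai-agents
python-dotenv
```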
Create the main FastAPI service. Important: Replace https://app.getgram.ai/mcp/your-server-id in the service code with your actual MCP server URL from Step 1.
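The complete service is in the repo. As a rough sketch of its shape, here’s a minimal version, assuming the openai-agents package and a streamable HTTP connection to the Gram-hosted MCP server (the /chat endpoint path, request fields, and the main.py file name are illustrative):

```python
# main.py (sketch): FastAPI service that hands chat messages to an OpenAI agent
# wired up to the TaskBoard MCP tools hosted on Gram.
from dotenv import load_dotenv
from fastapi import FastAPI
from pydantic import BaseModel

from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp

load_dotenv()  # loads OPENAI_API_KEY (and any other settings) from .env

# Replace with your actual MCP server URL from Step 1.
GRAM_MCP_URL = "https://app.getgram.ai/mcp/your-server-id"

app = FastAPI()


class ChatRequest(BaseModel):
    message: str
    user_id: str


@app.post("/chat")
async def chat(req: ChatRequest) -> dict:
    # Connect to the Gram-hosted MCP server for the duration of this request.
    async with MCPServerStreamableHttp(
        name="TaskBoard",
        params={"url": GRAM_MCP_URL},
    ) as mcp_server:
        agent = Agent(
            name="TaskBoard Assistant",
            instructions=(
                "You help users manage their TaskBoard tasks. "
                f"Always act on behalf of userId {req.user_id}."
            ),
            mcp_servers=[mcp_server],
        )
        # The SDK decides which MCP tools to call and formats the final answer.
        result = await Runner.run(agent, req.message)

    return {"response": result.final_output}

# Run locally with: uvicorn main:app --port 8085
```

The userId from the frontend is folded into the agent’s instructions so its tool calls stay scoped to the signed-in user. If your SDK version or Gram server uses SSE instead of streamable HTTP, swap in the SDK’s MCPServerSse class.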
How the agent works
Here’s how the OpenAI Agents SDK automates the workflow:
Tool discovery: The SDK connects the agent to the MCP server to access the available TaskBoard tools (like get_items and create_item).
Action planning: When a user says “show overdue tasks,” the agent decides which MCP tools to call.
Execution: The agent calls the appropriate MCP tools, which in turn call the /api/items endpoint.
Response formatting: The agent converts the raw JSON response into natural language.
You don’t need to engineer prompts, write tool calling logic, or format responses. The SDK handles it all.
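Once the service is running (Step 4 wires everything into Docker), you can exercise it directly. Assuming the sketched /chat endpoint above:

```bash
curl -X POST http://localhost:8085/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Show me overdue tasks", "user_id": "test"}'
```

With the sketch above, the reply comes back as natural language in the response field of the JSON body.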
Set up environment variables
In the mcp-agent-service directory, create a .env file with your OpenAI API key and ngrok URL. In the following command, replace your_openai_api_key_here and https://your-ngrok-url.ngrok.io with your actual values:
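A sketch of that command follows; the variable names are assumptions (TASKBOARD_SERVER_URL matches the name used in the Gram environment), so align them with what the service code expects:

```bash
cat > .env << 'EOF'
OPENAI_API_KEY=your_openai_api_key_here
TASKBOARD_SERVER_URL=https://your-ngrok-url.ngrok.io
EOF
```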
Step 4: Connect the frontend to the chat service
Now let’s create the API route that connects the chat frontend to the chat service, ensuring user authentication is preserved.
Add the chat API route
Create the chat API endpoint in the Next.js app:
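A sketch of what that route can look like, assuming the chat service from Step 3 listens on port 8085 at /chat and that your middleware attaches the authenticated user’s ID to the request (the x-user-id header and CHAT_SERVICE_URL variable are placeholders; adapt them to your auth setup):

```typescript
// app/api/chat/route.ts (sketch): proxy chat messages to the FastAPI chat service
import { NextRequest, NextResponse } from "next/server";

// Hypothetical env var; defaults to the chat service's local port.
const CHAT_SERVICE_URL = process.env.CHAT_SERVICE_URL ?? "http://localhost:8085";

export async function POST(request: NextRequest) {
  // Hypothetical: assumes the auth middleware has attached the user's ID.
  const userId = request.headers.get("x-user-id");
  if (!userId) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { message } = await request.json();

  // Forward the message and the authenticated userId to the chat service.
  const res = await fetch(`${CHAT_SERVICE_URL}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message, user_id: userId }),
  });

  return NextResponse.json(await res.json(), { status: res.status });
}
```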
Update the middleware
Update middleware.ts in the demo code to include the new /api/chat route, so it gets the same authentication handling as the existing API routes.
Set up Docker
Now set up Docker to run the TaskBoard app and the chat service together.
Create the Dockerfile inside the mcp-agent-service directory:
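A minimal Dockerfile along these lines should work; the Python version is an assumption, and port 8085 matches the port used elsewhere in this guide:

```dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8085
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8085"]
```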
Update the docker-compose.yml file in the root TaskBoard directory to include both services:
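The new service entry looks roughly like this (an excerpt only; keep your existing TaskBoard services as they are, and adjust names, ports, and variables to match your setup):

```yaml
# docker-compose.yml (excerpt): the chat service added alongside the existing services
services:
  mcp-agent-service:
    build: ./mcp-agent-service
    ports:
      - "8085:8085"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - TASKBOARD_SERVER_URL=${TASKBOARD_SERVER_URL}
```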
Set up environment variables for Docker
Update the .env file in the root TaskBoard directory (same level as docker-compose.yml), replacing the placeholder values with your actual API key and ngrok URL:
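The two values to fill in look like this (your root .env may contain other variables; leave those as they are):

```text
OPENAI_API_KEY=your_openai_api_key_here
TASKBOARD_SERVER_URL=https://your-ngrok-url.ngrok.io
```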
Start both services from the root TaskBoard directory (where docker-compose.yml is located):
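Depending on your Docker version, the command is docker compose or docker-compose:

```bash
docker compose up --build
```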
Test the complete integration
Visit http://localhost:3000, log in, and click the chat button. Try test queries like “Show me all my tasks,” “Create a task called ‘Review quarterly reports,’” and “Show me overdue tasks.”
You should see:
Chat responses in the chat window
Automatic TaskBoard updates when the agent performs actions
Only your own tasks visible and editable (user permissions respected)
Troubleshooting
Chat not responding
Check that both services are running (localhost:3000 and localhost:8085).
Verify that your OpenAI API key is valid.
Look for errors in both terminal windows.
AI can’t access your tasks
Confirm ngrok is running and you’ve updated the TASKBOARD_SERVER_URL in Gram.
Test the MCP server in the Gram Playground first.
Check that the user is properly authenticated in TaskBoard.
Permission errors
Make sure the userId is being passed correctly through the chat flow.
Verify that the API middleware is working: curl "http://localhost:3000/api/items?userId=test".