Context-length management and frontend tool turn resumption

This release adds two context-length management primitives that prevent upstream “prompt is too long” errors on long, tool-heavy conversations, and fixes chat turn resumption after client-side frontend tools complete.

Features

  • Tool output byte cap and context auto-compaction #2316 - Added two context-length management primitives to ElementsConfig. tools.maxOutputBytes caps the UTF-8 byte size of any single MCP tool call's result using a head-plus-tail truncation strategy with a notice suffix; it is disabled by default and opted into per page. contextCompaction auto-compacts older turns when the estimated token count passes a configurable fraction of the model's context ceiling (default 70%, enabled by default). (Author: @simplesagar)
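To illustrate the head-plus-tail strategy described above, the cap behaves roughly like this standalone sketch. The function name and the exact notice wording are assumptions for illustration, not the shipped implementation:

```typescript
// Illustrative sketch of a head-plus-tail byte cap with a notice suffix.
function capToolOutput(text: string, maxBytes: number): string {
  const bytes = new TextEncoder().encode(text);
  if (bytes.length <= maxBytes) return text; // under the cap: pass through unchanged
  const decoder = new TextDecoder();
  const half = Math.floor(maxBytes / 2);
  // Keep the first and last `half` bytes; replace the middle with a notice.
  // Note: slicing raw bytes can split a multibyte character at the boundary;
  // a production implementation would back off to a codepoint boundary.
  const head = decoder.decode(bytes.slice(0, half));
  const tail = decoder.decode(bytes.slice(bytes.length - half));
  const omitted = bytes.length - maxBytes;
  return head + "\n[... output truncated: " + omitted + " bytes omitted ...]\n" + tail;
}
```

Capping bytes rather than characters keeps the limit aligned with what the upstream model actually receives, and keeping both head and tail preserves the parts of a tool result (opening context, final status) that the model most often needs.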

Bug fixes

  • Resume chat turn after frontend tools #2322 - useChatRuntime now wires sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls, so clicking Skip or Save on a frontend-tool form no longer leaves the conversation stuck on an unresolved tool call. (Author: @danielkov)
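For context, a predicate like lastAssistantMessageIsCompleteWithToolCalls resumes the turn once the last assistant message called tools and every call carries a result. A simplified standalone sketch, with the message shape and names assumed for illustration:

```typescript
// Simplified sketch of a "complete with tool calls" predicate; the message
// part shape below is illustrative, not the library's actual types.
type ToolPart = { type: "tool"; state: "input-available" | "output-available" };
type TextPart = { type: "text"; text: string };
type Message = { role: "user" | "assistant"; parts: (ToolPart | TextPart)[] };

function shouldAutoSend(messages: Message[]): boolean {
  const last = messages[messages.length - 1];
  if (!last || last.role !== "assistant") return false;
  const toolParts = last.parts.filter((p): p is ToolPart => p.type === "tool");
  // Resume only when the turn actually called tools and every call has a result.
  return toolParts.length > 0 && toolParts.every((p) => p.state === "output-available");
}
```

Auto-sending on this condition is what turns a completed frontend tool interaction (e.g. a form submission) back into a model turn instead of leaving the chat idle.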
