TL;DR useChat stores messages in React component state. When the component unmounts - on navigation, reload, or tab close - the history is gone. The AI SDK's built-in stream resumption covers one case: reconnecting to a generation that is still in progress after a page reload. Returning to a completed conversation is a different problem. A channel with persistent history covers initial load, live updates, and reconnect catch-up in a single mechanism, without separate infrastructure for each.
By default, useChat keeps messages in React component state. That means a page reload wipes them. There's no automatic write to localStorage or a database. Vercel made this a deliberate non-decision: how you store and retrieve conversation history depends on your stack and your sync requirements. The SDK doesn't pick for you.
What stream resumption covers
The AI SDK's built-in stream resumption covers one specific scenario: a page reload during an active generation. If you reload mid-stream, the resume option in useChat can reconnect to the in-progress stream, whose state the resumable-stream package keeps in Redis, and pick up where delivery stopped.
If the generation has already finished, there's nothing to reconnect to. The history doesn't exist anywhere to resume from. Coming back to a completed conversation needs a persistence layer. That's a different thing to build.
Storing messages yourself
The most common approach: write messages to a database after each exchange, then load them into useChat via the messages prop on the next visit. The AI SDK documentation covers this pattern in its message persistence guide.
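A minimal sketch of that pattern, with a Map standing in for the database and a simplified message type (the function names here are illustrative, not part of the SDK):

```typescript
// Sketch of the save-then-reload pattern. The Map stands in for a real
// database table keyed by chat ID; UIMessage is simplified to the fields
// this example needs.
type UIMessage = { id: string; role: "user" | "assistant"; content: string };

const store = new Map<string, UIMessage[]>();

// Called server-side after each exchange, e.g. from an onFinish callback.
function saveMessages(chatId: string, messages: UIMessage[]): void {
  store.set(chatId, messages);
}

// Called on the next visit; the result is passed into useChat.
function loadMessages(chatId: string): UIMessage[] {
  return store.get(chatId) ?? [];
}
```

On the client, the loaded array is handed to the hook, roughly useChat({ id: chatId, messages: loadMessages(chatId) }) - the exact prop name depends on your AI SDK version (initialMessages in v4, messages in v5).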
It works well enough until requirements get more specific. A second device viewing the same conversation in real time needs a sync layer on top. A colleague picking up the thread needs access control and live updates. A user who dropped out and reconnected needs you to decide what to replay and from where. None of these come included.
Most teams hit the same realization at roughly the same point: they've ended up building several things that could have been one.
Using a channel with persistent history
A channel with history stores every message as it's published and makes it available to subscribers. A client that connects late, or drops and reconnects, gets the messages it missed from where it left off.
That takes care of a few things at once: the initial load, live updates, reconnect catch-up, and syncing across devices. You're not managing separate systems for each. The session state lives in the channel rather than spread across a database, a polling service, and a WebSocket connection.
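The mechanics can be sketched with an in-memory stand-in (a real provider persists history server-side with configurable retention; the HistoryChannel class and its method names are invented for illustration):

```typescript
// Illustrative in-memory channel with history. A subscriber that attaches
// late first receives everything already stored, then live messages - one
// mechanism covering initial load, catch-up, and cross-device sync.
type Msg = { data: string };
type Listener = (msg: Msg) => void;

class HistoryChannel {
  private history: Msg[] = [];
  private listeners: Listener[] = [];

  publish(data: string): void {
    const msg = { data };
    this.history.push(msg);                // stored for late joiners
    this.listeners.forEach((l) => l(msg)); // delivered live
  }

  subscribe(listener: Listener): void {
    this.history.forEach(listener); // replay what was missed
    this.listeners.push(listener);  // then deliver live
  }
}
```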
Most purpose-built realtime platforms provide channel history as part of their infrastructure. Retention windows vary by provider and plan.
Why this matters more for AI applications
Message loss matters more in AI applications than in most chat apps. In a regular chat, losing a few messages on reconnect is annoying. In an AI conversation, losing the history means the model loses its context. A response generated on day one shapes the question you ask on day two. If that's gone, the next generation is working without the reasoning chain that makes the conversation coherent.
There are a couple of other scenarios worth keeping in mind. AI responses take time, and people often come back to them later, on a different device, or share them with a colleague. And if you're running agents that take minutes to complete, you need those results to reach the client even if the user navigated away mid-run.
Connecting channel history to useChat
The ChatTransport interface in the AI SDK makes the transport layer pluggable. A transport connected to a channel with history handles history replay automatically on connect, so useChat receives messages without a separate fetch.
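A sketch of the shape, with local stand-in types rather than the SDK's own (the method name reconnectToStream follows AI SDK 5's ChatTransport interface; the HistoryRecord shape and ChannelClient are invented for illustration, and a real implementation would use your provider's SDK):

```typescript
// Local stand-ins for the AI SDK's transport types, simplified for the sketch.
type HistoryRecord = { role: "user" | "assistant"; text: string };
type UIMessageChunk = { type: "text-delta"; delta: string };

// Pure step: turn stored channel history into the chunk stream useChat
// consumes, so replayed messages look like live ones to the hook.
function historyToChunks(records: HistoryRecord[]): UIMessageChunk[] {
  return records
    .filter((r) => r.role === "assistant")
    .map((r) => ({ type: "text-delta" as const, delta: r.text }));
}

interface ChannelClient {
  history(sessionId: string): Promise<HistoryRecord[]>;
}

// Transport skeleton: on reconnect, fetch channel history and stream it back.
class ChannelChatTransport {
  constructor(private channel: ChannelClient, private sessionId: string) {}

  async reconnectToStream(): Promise<ReadableStream<UIMessageChunk>> {
    const records = await this.channel.history(this.sessionId);
    const chunks = historyToChunks(records);
    return new ReadableStream({
      start(controller) {
        chunks.forEach((c) => controller.enqueue(c));
        controller.close(); // a real transport keeps streaming live updates
      },
    });
  }
}
```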
When you return to the session, the transport subscribes to the channel and rewinds to the start. The messages array fills in from history automatically. From the component's perspective, it's just messages arriving.
Choosing an approach
If you're building something simple for a single user on a single device, a database plus the messages prop is the direct path. The AI SDK persistence documentation covers it.
Channel history is worth the extra step when things get more complicated: multiple devices, colleagues sharing a session, agents running in the background while users are away. Basically any scenario where the connection between the user and the generation is unreliable or non-linear.
For long-term retention, you'd typically combine both. Channel history covers the active window. A database handles everything older than that.
What to look for in a transport for persistent messaging
Channel history on subscribe. The transport should replay stored messages on connect without requiring a separate API call. A session that handles initial load, live updates, and reconnect catch-up from the same mechanism avoids building separate infrastructure for each.
Offset-based reconnection. A client returning after a disconnect should receive only the messages it missed, from its last known position. Full replay on every reconnect wastes bandwidth and forces the client to reprocess messages it already has.
Multi-device fan-out. The same session should be accessible from any device with the right session ID. A second device joining mid-conversation should receive history and then live updates without additional infrastructure.
Configurable retention. Short windows cover reconnects. Longer windows cover returning sessions. For permanent storage, you'd combine channel history with a database — but the transport determines how much of that work you need to do yourself.
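The offset-based behavior above reduces to a filter over stored history; the serial numbering and record shape here are illustrative, not any particular provider's API:

```typescript
// Illustrative offset-based catch-up: given stored history and the last
// serial a client acknowledged, return only the messages it missed.
type Stored = { serial: number; data: string };

function catchUp(history: Stored[], lastSeenSerial: number): Stored[] {
  return history.filter((m) => m.serial > lastSeenSerial);
}
```

A client that last saw serial 2 gets only serials 3 and up on reconnect, instead of a full replay.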
Ably AI Transport implements the Vercel AI SDK ChatTransport interface and handles persistent message history as part of the session layer, covering reconnects, multi-device sync, and delivery to clients that were offline when results arrived. Visit the Ably AI Transport overview, read the documentation, or sign up free to start building.
Sources: AI SDK UI chatbot message persistence; AI SDK stream resumption; resumable-stream; AI SDK UI Transport documentation. GitHub issues cited: #8390 (resume and stop incompatible, acknowledged by Vercel).