Interruption lets users send a new message while the agent is still streaming a response. When a user sends a new message, the client transport creates a new turn via an HTTP POST to the server. Because turns are independent, the new turn starts immediately regardless of whether a previous turn is still streaming.
## How it works
When the user sends a new message, the client transport posts it to the server, which starts a new turn. The user doesn't need to wait for the current response to finish before sending a new message.
There are two patterns for handling interruption:
| Pattern | Behavior | Use case |
|---|---|---|
| Cancel-then-send | Cancel the current turn, then send a new message | Stop button + new prompt |
| Send-alongside | Send a new message while the current turn continues | Follow-up without waiting |
With cancel-then-send, the active turn is aborted before the new message is dispatched. The agent stops generating, cleans up, and starts a fresh turn. With send-alongside, both turns run concurrently - each with its own stream and cancel handle.
## Cancel-then-send
Detect whether a turn is active, cancel it, then send a new message. This is the most common interruption pattern - it mimics a user pressing a stop button and immediately re-prompting.
```tsx
import { useActiveTurns, useClientTransport, useView } from '@ably/ai-transport/react'

function Chat({ channel, clientId }) {
  const transport = useClientTransport({ channel, clientId })
  const { nodes, send } = useView(transport)
  const activeTurns = useActiveTurns(transport)

  const handleSend = async (text) => {
    // If the agent is streaming, cancel first
    if (activeTurns.size > 0) {
      await transport.cancel()
    }
    await send([{ id: crypto.randomUUID(), role: 'user', parts: [{ type: 'text', text }] }])
  }
}
```

`transport.cancel()` publishes a cancel signal on the channel. The server's abort signal fires, the LLM stream stops, and the turn ends with reason `'cancelled'`. The new message is then sent on a clean turn.
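The server-side behavior described above can be sketched with a plain `AbortSignal`: the signal stops the token stream mid-generation and the turn ends with reason `'cancelled'`. This is an illustrative model, not the library's actual server API; `runTurn` and `fakeTokenStream` are made-up names:

```typescript
// Illustrative server-side sketch (not the library's API): an AbortSignal
// stops the token stream and ends the turn with reason 'cancelled'.
async function* fakeTokenStream(): AsyncGenerator<string> {
  for (const token of ['Hello', ' ', 'world', '!']) {
    yield token
  }
}

async function runTurn(signal: AbortSignal): Promise<{ text: string; reason: string }> {
  let text = ''
  for await (const token of fakeTokenStream()) {
    // Stop generating as soon as the cancel signal fires.
    if (signal.aborted) {
      return { text, reason: 'cancelled' }
    }
    text += token
  }
  return { text, reason: 'completed' }
}

// Simulate a cancel arriving while the turn is still streaming.
async function demo(): Promise<string> {
  const controller = new AbortController()
  const pending = runTurn(controller.signal)
  controller.abort()
  const { reason } = await pending
  return reason
}

demo().then((reason) => console.log(reason)) // 'cancelled'
```

A turn that is never aborted runs the stream to completion and ends with reason `'completed'` instead.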
## Send-alongside
Send a new message without cancelling the active turn. Both turns run concurrently - the agent continues streaming the first response while processing the new input.
```ts
const handleSend = async (text) => {
  // Send without cancelling - both turns run concurrently
  await send([{ id: crypto.randomUUID(), role: 'user', parts: [{ type: 'text', text }] }])
}
```

Each concurrent turn has its own stream and its own cancel handle. You can cancel them independently:
```ts
// Cancel a specific turn, leave others running
await transport.cancel({ turnId: specificTurnId })
```

## Detect active turns
The `useActiveTurns` hook returns a `Map<clientId, Set<turnId>>` of all currently streaming turns. Use it to check whether the agent is mid-response:
```ts
const activeTurns = useActiveTurns(transport)

// Check if any turns are streaming
const isStreaming = activeTurns.size > 0

// Check if a specific client has active turns
const agentTurns = activeTurns.get('agent-client-id')
const agentIsStreaming = agentTurns && agentTurns.size > 0
```

This is useful for toggling the UI between a send button and a stop button, or for disabling input while a cancellation is in progress.
## Related features
- Cancellation - cancel signals, filters, and server-side abort handling
- Concurrent turns - multiple turns running in parallel
- Double texting - handling multiple user messages in quick succession
- Client transport API - reference for `cancel`, `send`, and other client methods
- Sessions and turns - how bidirectional sessions enable interruption
- Get started - build your first AI Transport application