Chain of thought

Chain of thought in AI Transport streams reasoning content alongside the main response text. The codec supports multiple stream types within a single turn: text and reasoning are delivered as separate streams that the UI can render independently.

How it works

When an LLM produces reasoning or thinking tokens, the codec multiplexes them alongside text tokens on the same Ably channel. Each stream type is tagged so the client can route reasoning content to one part of the UI and response text to another.
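The routing step can be sketched as follows. This is a minimal illustration of demultiplexing by stream tag, not the codec's actual wire format: the chunk shape (`streamType`, `delta`) and the handler names are hypothetical.

```javascript
// Route a tagged chunk to the handler registered for its stream type.
// Chunk shape is illustrative; the real codec defines its own format.
function routeChunk(chunk, handlers) {
  const handler = handlers[chunk.streamType]
  if (handler) handler(chunk.delta)
}

// Accumulate reasoning and text deltas into separate buffers,
// mirroring how a UI would feed two different panels.
const reasoning = []
const text = []
const handlers = {
  reasoning: (delta) => reasoning.push(delta),
  text: (delta) => text.push(delta),
}

routeChunk({ streamType: 'reasoning', delta: 'Consider the input...' }, handlers)
routeChunk({ streamType: 'text', delta: 'Here is the answer.' }, handlers)
```

Because each chunk carries its stream tag, the two buffers can be updated in any interleaving order as tokens arrive.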

With the Vercel AI SDK integration, reasoning arrives as a separate reasoning stream type within the UI message stream:

JavaScript

// Server: stream a response that includes reasoning
app.post('/api/chat', async (req, res) => {
  const { turnId, clientId, messages } = req.body
  const turn = transport.newTurn({ turnId, clientId })

  const result = streamText({
    model: anthropic('claude-sonnet-4-20250514'),
    messages,
    abortSignal: turn.abortSignal,
  })

  // The codec handles multiplexing text and reasoning streams
  await turn.streamResponse(result.toUIMessageStream())
  await turn.end('complete')
  res.json({ ok: true })
})

No additional server configuration is needed. If the model produces reasoning tokens, the codec encodes them as a distinct stream within the turn.
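Whether reasoning tokens are produced at all is a model and provider setting, not a transport setting. With the Anthropic provider for the Vercel AI SDK, extended thinking can be requested via `providerOptions`; the budget value below is illustrative:

```javascript
// Config fragment: ask the model to emit thinking tokens.
// The codec then streams them as a separate reasoning stream.
const result = streamText({
  model: anthropic('claude-sonnet-4-20250514'),
  messages,
  providerOptions: {
    anthropic: {
      thinking: { type: 'enabled', budgetTokens: 12000 },
    },
  },
})
```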

Display reasoning in the UI

On the client, message nodes include both text and reasoning content. Render them separately to show the agent's thinking process:

JavaScript

const { nodes } = useView(transport)

// Each node may contain text parts, reasoning parts, or both
for (const node of nodes) {
  for (const part of node.message.parts) {
    if (part.type === 'reasoning') {
      renderThinkingPanel(part.reasoning)
    } else if (part.type === 'text') {
      renderResponsePanel(part.text)
    }
  }
}

Both streams update in real time. Users see the reasoning content appear as the model thinks, followed by (or alongside) the response text.
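A streaming-friendly renderer can simply recompute both panels on every update, so partial reasoning and partial text each grow as deltas arrive. The sketch below accumulates content per part type; the node and part shapes match the example above, and the accumulation strategy is an assumption, not prescribed by the transport:

```javascript
// Rebuild the thinking and response panel contents from the current
// set of nodes. Called on every update, so partial streams render
// incrementally.
function collectPanels(nodes) {
  let thinking = ''
  let response = ''
  for (const node of nodes) {
    for (const part of node.message.parts) {
      if (part.type === 'reasoning') thinking += part.reasoning
      else if (part.type === 'text') response += part.text
    }
  }
  return { thinking, response }
}

const { thinking, response } = collectPanels([
  {
    message: {
      parts: [
        { type: 'reasoning', reasoning: 'Check the units first. ' },
        { type: 'text', text: 'The distance is 42 km.' },
      ],
    },
  },
])
```

Recomputing from the full node list keeps the renderer stateless, which avoids ordering bugs when reasoning and text deltas interleave.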