Get started with Vercel AI SDK

Build a streaming AI chat application using Vercel AI SDK and Ably AI Transport. By the end, you'll have a chat app with durable sessions that survive disconnections and work across multiple tabs.

Prerequisites

  • Node.js 20+
  • An Ably account with an API key
  • An Anthropic API key (or an OpenAI API key; adapt the model configuration accordingly)

Install dependencies

npm install @ably/ai-transport ably ai @ai-sdk/react @ai-sdk/anthropic next react react-dom jsonwebtoken

Set up environment variables

Create .env.local:

ABLY_API_KEY=your-ably-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key

Step 1: Create an Ably token endpoint

Create the file app/api/auth/ably-token/route.ts. This endpoint issues JWT tokens for client authentication. The token contains the client's identity and channel permissions.

TypeScript

import jwt from 'jsonwebtoken'
import { NextResponse } from 'next/server'

export async function GET(req: Request) {
  const apiKey = process.env.ABLY_API_KEY
  if (!apiKey) {
    return new NextResponse('ABLY_API_KEY is not set', { status: 500 })
  }
  const [keyName, keySecret] = apiKey.split(':')

  const url = new URL(req.url)
  const clientId = url.searchParams.get('clientId') ?? `user-${crypto.randomUUID().slice(0, 8)}`

  const token = jwt.sign(
    {
      'x-ably-clientId': clientId,
      'x-ably-capability': JSON.stringify({ '*': ['publish', 'subscribe', 'history'] }),
    },
    keySecret,
    { algorithm: 'HS256', keyid: keyName, expiresIn: '1h' },
  )

  return new NextResponse(token, {
    headers: { 'Content-Type': 'application/jwt' },
  })
}

Step 2: Create an Ably provider

Create the file app/providers.tsx. Wrap your app with the Ably provider. It creates an authenticated Ably client using the token endpoint.

TypeScript

'use client'

import { useEffect, useState, type ReactNode } from 'react'
import * as Ably from 'ably'
import { AblyProvider } from 'ably/react'

export function Providers({ clientId, children }: { clientId?: string; children: ReactNode }) {
  const [client, setClient] = useState<Ably.Realtime | null>(null)

  useEffect(() => {
    const ably = new Ably.Realtime({
      authCallback: async (_tokenParams, callback) => {
        try {
          const response = await fetch(`/api/auth/ably-token?clientId=${encodeURIComponent(clientId ?? '')}`)
          const jwt = await response.text()
          callback(null, jwt)
        } catch (err) {
          callback(err instanceof Error ? err.message : String(err), null)
        }
      },
    })
    setClient(ably)
    return () => ably.close()
  }, [clientId])

  if (!client) return null
  return <AblyProvider client={client}>{children}</AblyProvider>
}

Step 3: Create the server API route

Create the file app/api/chat/route.ts. The server handles user messages, invokes the LLM, and streams the response through an Ably channel. The HTTP response returns immediately; the tokens flow through the durable session instead.

TypeScript

import { after } from 'next/server'
import { streamText, convertToModelMessages } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import Ably from 'ably'
import { createServerTransport } from '@ably/ai-transport/vercel'

const ably = new Ably.Realtime({ key: process.env.ABLY_API_KEY })

export async function POST(req: Request) {
  const { messages, history, id, turnId, clientId, forkOf, parent } = await req.json()
  const channel = ably.channels.get(id)

  const transport = createServerTransport({ channel })
  const turn = transport.newTurn({ turnId, clientId, parent, forkOf })

  await turn.start()

  if (messages.length > 0) {
    await turn.addMessages(messages, { clientId })
  }

  const allMessages = [...(history ?? []).map((h) => h.message), ...messages.map((m) => m.message)]

  const result = streamText({
    model: anthropic('claude-sonnet-4-20250514'),
    system: 'You are a helpful assistant.',
    messages: convertToModelMessages(allMessages),
    abortSignal: turn.abortSignal,
  })

  after(async () => {
    const { reason } = await turn.streamResponse(result.toUIMessageStream())
    await turn.end(reason)
    transport.close()
  })

  return new Response(null, { status: 200 })
}

Step 4: Create the chat component

Create the file app/chat.tsx. The client uses Vercel's useChat hook with an AI Transport adapter. The transport subscribes to the Ably channel and syncs messages across all connected clients.

TypeScript

'use client'

import { useChat } from '@ai-sdk/react'
import { useChannel } from 'ably/react'
import { useClientTransport, useActiveTurns, useView } from '@ably/ai-transport/react'
import { useChatTransport, useMessageSync } from '@ably/ai-transport/vercel/react'
import { UIMessageCodec } from '@ably/ai-transport/vercel'
import { useState } from 'react'

export function Chat({ chatId, clientId }) {
  const { channel } = useChannel({ channelName: chatId })
  const [input, setInput] = useState('')

  const transport = useClientTransport({ channel, codec: UIMessageCodec, clientId })
  const chatTransport = useChatTransport(transport)
  const { messages, setMessages, sendMessage, stop } = useChat({
    id: chatId,
    transport: chatTransport,
  })

  useMessageSync(transport, setMessages)
  useView(transport, { limit: 30 })

  const activeTurns = useActiveTurns(transport)
  const isStreaming = activeTurns.size > 0

  return (
    <div>
      {messages.map((msg) => (
        <div key={msg.id}>
          <strong>{msg.role}:</strong>{' '}
          {msg.parts.map((part, i) =>
            part.type === 'text' ? <span key={i}>{part.text}</span> : null
          )}
        </div>
      ))}
      <form onSubmit={(e) => {
        e.preventDefault()
        sendMessage({ text: input })
        setInput('')
      }}>
        <input value={input} onChange={(e) => setInput(e.target.value)} placeholder="Type a message..." />
        {isStreaming ? (
          <button type="button" onClick={stop}>Stop</button>
        ) : (
          <button type="submit">Send</button>
        )}
      </form>
    </div>
  )
}

Step 5: Wire it together

Create the file app/page.tsx:

TypeScript

import { Providers } from './providers'
import { Chat } from './chat'

export default function Page() {
  // A fixed chatId gives every visitor the same shared session; generate one
  // per user or session in a real app. Pass the same clientId to both the
  // provider (for token auth) and the chat (for the transport) so they agree.
  const chatId = 'my-chat-session'
  const clientId = 'demo-user'
  return (
    <Providers clientId={clientId}>
      <Chat chatId={chatId} clientId={clientId} />
    </Providers>
  )
}

Run npm run dev and open http://localhost:3000. Open a second tab to the same URL; both tabs share the same durable session.

What's happening

  1. The client sends the user message via HTTP POST to your API route.
  2. The server publishes the message to the Ably channel, invokes the LLM, and streams tokens to the channel.
  3. Every client subscribed to the channel receives tokens in realtime.
  4. If a client disconnects, it automatically reconnects and resumes from where it left off.

This is a durable session. The HTTP request triggers the agent, but all communication flows through the Ably channel.

What to explore next