Set up the assistant-ui runtime provider that manages the connection between your frontend and backend. This handles message streaming, tool execution, and state management.
What this demonstrates:
useChatRuntime creates a runtime that connects to your backend API
AssistantChatTransport handles the communication protocol with your /api/chat endpoint
AssistantRuntimeProvider makes the runtime available to all child components
"use client";import { AssistantRuntimeProvider } from "@assistant-ui/react";import { useChatRuntime, AssistantChatTransport,} from "@assistant-ui/react-ai-sdk";function App() { const runtime = useChatRuntime({ transport: new AssistantChatTransport({ api: "/api/chat" }), }); return ( <AssistantRuntimeProvider runtime={runtime}> {/* Your chat UI */} </AssistantRuntimeProvider> );}
Create a tool on your server that the LLM can call. The tool returns structured JSON that matches a Tool UI component schema. Here we're creating a link preview tool that returns data for the LinkPreview component.
What this demonstrates:
SerializableLinkPreviewSchema ensures your tool output matches the LinkPreview component's expected format
id identifies this rendering in the conversation
parseSerializableLinkPreview on the frontend validates the result at runtime before rendering
Your tool can fetch real data (APIs, databases) and return it in the component schema format
```ts
import { streamText, tool, convertToModelMessages, jsonSchema } from "ai";
import { openai } from "@ai-sdk/openai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o"),
    messages: convertToModelMessages(messages),
    tools: {
      previewLink: tool({
        description: "Show a preview card for a URL",
        inputSchema: jsonSchema<{ url: string }>({
          type: "object",
          properties: { url: { type: "string", format: "uri" } },
          required: ["url"],
          additionalProperties: false,
        }),
        async execute({ url }) {
          // In production, you'd fetch real metadata here
          return {
            id: "link-preview-1",
            href: url,
            title: "Example Site",
            description: "A description of the linked content",
            image: "https://example.com/image.jpg",
          };
        },
      }),
    },
  });

  return result.toUIMessageStreamResponse();
}
```
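If you want execute to return real data instead of the placeholder above, one option is to fetch the page and read its Open Graph tags. The sketch below is illustrative only: fetchPageMetadata is a hypothetical helper, and the regex scraping is a stand-in for a real HTML parser with timeouts and error handling.

```ts
// Hypothetical helper — not part of the AI SDK or assistant-ui.
// Fetches a page and pulls basic Open Graph metadata with a naive regex;
// a production version would use a proper HTML parser and handle timeouts,
// redirects, and non-HTML responses.
async function fetchPageMetadata(url: string) {
  const res = await fetch(url);
  const html = await res.text();

  const pick = (property: string) =>
    html.match(
      new RegExp(`<meta[^>]+property="${property}"[^>]+content="([^"]*)"`, "i"),
    )?.[1];

  return {
    title: pick("og:title") ?? url,
    description: pick("og:description") ?? "",
    image: pick("og:image"),
  };
}
```

Inside execute, you would await fetchPageMetadata(url) and merge the result into the returned object, keeping the same shape so parseSerializableLinkPreview still accepts it.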
Connect your backend tool to a frontend component. When the LLM calls the previewLink tool, assistant-ui will automatically render the LinkPreview component with the returned data.
What this demonstrates:
makeAssistantToolUI creates a component that listens for tool calls by name
parseSerializableLinkPreview validates the result at runtime and returns typed props
The toolName must exactly match your backend tool name
Rendering <PreviewLinkUI /> anywhere inside your AssistantRuntimeProvider registers the UI for that tool
```tsx
import { makeAssistantToolUI } from "@assistant-ui/react";
import {
  LinkPreview,
  LinkPreviewErrorBoundary,
  parseSerializableLinkPreview,
} from "@/components/tool-ui/link-preview";

export const PreviewLinkUI = makeAssistantToolUI({
  toolName: "previewLink", // Must match backend tool name
  render: ({ result }) => {
    // Tool outputs stream in; `result` will be `undefined` until the tool resolves.
    if (result === undefined) {
      return (
        <div className="bg-card/60 text-muted-foreground w-full max-w-md rounded-2xl border px-5 py-4 text-sm shadow-xs">
          Loading preview…
        </div>
      );
    }

    const preview = parseSerializableLinkPreview(result);

    return (
      <LinkPreviewErrorBoundary>
        <LinkPreview {...preview} maxWidth="420px" />
      </LinkPreviewErrorBoundary>
    );
  },
});

// Register in your app
function App() {
  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <PreviewLinkUI />
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```
That's it! When a user asks the assistant to preview a link, it will call your tool and render a beautiful LinkPreview component.
Installation
The quickest way to get started is using the assistant-ui CLI, which will set up your project with all necessary dependencies:
```bash
npx assistant-ui@latest init
```
Or install packages manually if you prefer to configure yourself:
```bash
pnpm add @assistant-ui/react @assistant-ui/react-ai-sdk ai @ai-sdk/openai zod
```
Add your OpenAI API key to your environment variables:
```
OPENAI_API_KEY=sk-...
```
Frontend tools (interactive Tool UIs)
Some Tool UIs (especially decision surfaces like Option List) are easiest to model as frontend tools: the model calls a tool with Tool UI props as the arguments, and your UI calls addResult(...) after the user confirms.
If you're using AssistantChatTransport, it forwards system instructions and registered tools in the request body. Make sure your /api/chat route reads them from the body and passes them on to the model; otherwise the model won't see your frontend tools.
import { openai } from "@ai-sdk/openai";import { streamText, convertToModelMessages, jsonSchema, type UIMessage, type JSONSchema7,} from "ai";type ForwardedTools = Record< string, { description?: string; parameters: JSONSchema7 }>;function toStreamTextTools(tools?: ForwardedTools) { if (!tools) return undefined; return Object.fromEntries( Object.entries(tools).map(([name, tool]) => [ name, { ...(tool.description ? { description: tool.description } : {}), inputSchema: jsonSchema(tool.parameters), }, ]), );}export async function POST(req: Request) { const body = (await req.json()) as { messages: UIMessage[]; system?: string; tools?: ForwardedTools; }; const result = streamText({ model: openai("gpt-4o"), messages: convertToModelMessages(body.messages), system: body.system, tools: toStreamTextTools(body.tools), }); return result.toUIMessageStreamResponse();}
Register a frontend tool (assistant-ui)
When the tool is frontend/interactive, you register the tool on the client (so it gets forwarded by AssistantChatTransport) and render a Tool UI when it is called.
"use client";import { makeAssistantTool } from "@assistant-ui/react";import { OptionList, parseSerializableOptionList, SerializableOptionListSchema, type SerializableOptionList, type OptionListSelection,} from "@/components/tool-ui/option-list";export const SelectFormatTool = makeAssistantTool< SerializableOptionList, OptionListSelection>({ toolName: "selectFormat", description: "Ask the user to choose an output format.", parameters: SerializableOptionListSchema, render: ({ args, result, addResult, toolCallId }) => { // Tool args can be incomplete while streaming; wait until required fields exist. if (!Array.isArray((args as any)?.options)) return null; const optionList = parseSerializableOptionList({ ...args, // `id` is required and should be stable/meaningful; fall back for DX. id: (args as any)?.id ?? `format-selection-${toolCallId}`, }); // Receipt state if (result !== undefined) { return ( <OptionList {...optionList} value={undefined} confirmed={result} /> ); } return ( <OptionList {...optionList} // Avoid controlled selection in LLM-driven payloads. value={undefined} onConfirm={(selection) => addResult(selection)} /> ); },});
Mount <SelectFormatTool /> under <AssistantRuntimeProvider> (just like the Tool UI registration step above) to register the tool and UI.
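A minimal sketch of that mounting step, combining the runtime setup from the first example with the tool component. The import paths for Thread and SelectFormatTool are placeholders; use wherever those files live in your project.

```tsx
"use client";

import { AssistantRuntimeProvider } from "@assistant-ui/react";
import {
  useChatRuntime,
  AssistantChatTransport,
} from "@assistant-ui/react-ai-sdk";
import { Thread } from "@/components/assistant-ui/thread"; // placeholder path
import { SelectFormatTool } from "./select-format-tool"; // placeholder path

function App() {
  const runtime = useChatRuntime({
    transport: new AssistantChatTransport({ api: "/api/chat" }),
  });

  return (
    <AssistantRuntimeProvider runtime={runtime}>
      {/* Rendering the tool component registers the frontend tool and its UI */}
      <SelectFormatTool />
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```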
Auto-continue after addResult(...) (AI SDK runtime)
If you want the assistant to continue automatically after the user confirms (instead of waiting for the next user message), enable AI SDK's auto-resubmit behavior:
```tsx
import {
  useChatRuntime,
  AssistantChatTransport,
} from "@assistant-ui/react-ai-sdk";
import { lastAssistantMessageIsCompleteWithToolCalls } from "ai";

const runtime = useChatRuntime({
  transport: new AssistantChatTransport({ api: "/api/chat" }),
  sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls,
});
```
Other Frameworks
Tool UI components work with any React app. Without assistant-ui, you'll need to manually parse tool outputs and render components. For the best experience, we recommend using assistant-ui.
When copying components manually, always include the shared directory alongside the component you need. It contains utilities (action buttons, schemas, formatters) used by all Tool UI components.
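For example, if your own chat stack hands you a tool call's raw JSON result, you can still validate and render it directly. This is only a sketch: ManualLinkPreview and the toolResult prop are illustrative names, not part of the library.

```tsx
import {
  LinkPreview,
  LinkPreviewErrorBoundary,
  parseSerializableLinkPreview,
} from "@/components/tool-ui/link-preview";

// Illustrative wrapper: `toolResult` is whatever raw JSON your chat
// integration received from the model's tool call.
export function ManualLinkPreview({ toolResult }: { toolResult: unknown }) {
  // Validate the untyped result at runtime before handing it to the component.
  const preview = parseSerializableLinkPreview(toolResult);

  return (
    <LinkPreviewErrorBoundary>
      <LinkPreview {...preview} maxWidth="420px" />
    </LinkPreviewErrorBoundary>
  );
}
```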
Runtime Options
assistant-ui supports multiple runtimes: AI SDK, LangGraph, LangServe, Mastra, or custom backends. The examples above use AI SDK v5.
"use client";import { AssistantRuntimeProvider } from "@assistant-ui/react";import { useChatRuntime, AssistantChatTransport,} from "@assistant-ui/react-ai-sdk";function App() { const runtime = useChatRuntime({ transport: new AssistantChatTransport({ api: "/api/chat" }), }); return ( <AssistantRuntimeProvider runtime={runtime}> {/* Your chat UI */} </AssistantRuntimeProvider> );}
import { streamText, tool, convertToModelMessages, jsonSchema } from "ai";import { openai } from "@ai-sdk/openai";export async function POST(req: Request) { const { messages } = await req.json(); const result = streamText({ model: openai("gpt-4o"), messages: convertToModelMessages(messages), tools: { previewLink: tool({ description: "Show a preview card for a URL", inputSchema: jsonSchema<{ url: string }>({ type: "object", properties: { url: { type: "string", format: "uri" } }, required: ["url"], additionalProperties: false, }), async execute({ url }) { // In production, you'd fetch real metadata here return { id: "link-preview-1", href: url, title: "Example Site", description: "A description of the linked content", image: "https://example.com/image.jpg", }; }, }), }, }); return result.toUIMessageStreamResponse();}
npx assistant-ui@latest init
pnpm add @assistant-ui/react @assistant-ui/react-ai-sdk ai @ai-sdk/openai zod
import { openai } from "@ai-sdk/openai";import { streamText, convertToModelMessages, jsonSchema, type UIMessage, type JSONSchema7,} from "ai";type ForwardedTools = Record< string, { description?: string; parameters: JSONSchema7 }>;function toStreamTextTools(tools?: ForwardedTools) { if (!tools) return undefined; return Object.fromEntries( Object.entries(tools).map(([name, tool]) => [ name, { ...(tool.description ? { description: tool.description } : {}), inputSchema: jsonSchema(tool.parameters), }, ]), );}export async function POST(req: Request) { const body = (await req.json()) as { messages: UIMessage[]; system?: string; tools?: ForwardedTools; }; const result = streamText({ model: openai("gpt-4o"), messages: convertToModelMessages(body.messages), system: body.system, tools: toStreamTextTools(body.tools), }); return result.toUIMessageStreamResponse();}
"use client";import { makeAssistantTool } from "@assistant-ui/react";import { OptionList, parseSerializableOptionList, SerializableOptionListSchema, type SerializableOptionList, type OptionListSelection,} from "@/components/tool-ui/option-list";export const SelectFormatTool = makeAssistantTool< SerializableOptionList, OptionListSelection>({ toolName: "selectFormat", description: "Ask the user to choose an output format.", parameters: SerializableOptionListSchema, render: ({ args, result, addResult, toolCallId }) => { // Tool args can be incomplete while streaming; wait until required fields exist. if (!Array.isArray((args as any)?.options)) return null; const optionList = parseSerializableOptionList({ ...args, // `id` is required and should be stable/meaningful; fall back for DX. id: (args as any)?.id ?? `format-selection-${toolCallId}`, }); // Receipt state if (result !== undefined) { return ( <OptionList {...optionList} value={undefined} confirmed={result} /> ); } return ( <OptionList {...optionList} // Avoid controlled selection in LLM-driven payloads. value={undefined} onConfirm={(selection) => addResult(selection)} /> ); },});
import { useChatRuntime, AssistantChatTransport,} from "@assistant-ui/react-ai-sdk";import { lastAssistantMessageIsCompleteWithToolCalls } from "ai";const runtime = useChatRuntime({ transport: new AssistantChatTransport({ api: "/api/chat" }), sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls,});
import { makeAssistantToolUI } from "@assistant-ui/react";import { LinkPreview, LinkPreviewErrorBoundary, parseSerializableLinkPreview,} from "@/components/tool-ui/link-preview";export const PreviewLinkUI = makeAssistantToolUI({ toolName: "previewLink", // Must match backend tool name render: ({ result }) => { // Tool outputs stream in; `result` will be `undefined` until the tool resolves. if (result === undefined) { return ( <div className="bg-card/60 text-muted-foreground w-full max-w-md rounded-2xl border px-5 py-4 text-sm shadow-xs"> Loading preview… </div> ); } const preview = parseSerializableLinkPreview(result); return ( <LinkPreviewErrorBoundary> <LinkPreview {...preview} maxWidth="420px" /> </LinkPreviewErrorBoundary> ); },});// Register in your appfunction App() { return ( <AssistantRuntimeProvider runtime={runtime}> <PreviewLinkUI /> <Thread /> </AssistantRuntimeProvider> );}