Overview

Tool UI is a React component framework for conversation‑native UIs inside AI chats.
Tools return JSON; Tool UI renders it as inline, narrated, referenceable surfaces.

At a glance

  • Conversation‑native components that live inside messages, optimized for chat width and scroll.
  • Schema‑first rendering: every surface is driven by a serializable schema with stable IDs.
  • Assistant‑anchored: the assistant introduces, interprets, and closes each surface.
  • Stack‑agnostic: works with any LLM/provider and orchestration layer.

Where it fits

Radix/shadcn (primitives) → Tool UI (conversation‑native components & schema) → AI SDK / LangGraph / etc. (LLM orchestration)

Who it’s for

  • React devs & design engineers building LLM chat apps.
  • Teams who want predictable, accessible UI that the assistant can reference (“the second row”).
  • Anyone who wants typed, schema‑validated tool outputs instead of brittle HTML strings.

Why Tool UI

  • Predictable rendering: Tools emit schema‑validated JSON; components render consistently across themes/devices.
  • Built for chat: Mobile‑first, glanceable, minimal chrome; no in‑message navigation flows.
  • Developer control: Components render; your app owns side effects via callbacks/server actions.
  • Type safety: Serializable schemas on the server, parsers on the client; pairs well with the AI SDK's InferUITools (see the sketch after this list).
  • Lifecycle & time aware: Clear phases and honest "As of …" markers; stale/superseded behavior is explicit.
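A minimal sketch of the type-safety point, assuming an AI SDK v5 setup. The previewLink tool mirrors the minimal example further down; InferUITools derives client-side message types from the same definitions the server uses:

import { tool, type InferUITools, type UIDataTypes, type UIMessage } from "ai";
import { z } from "zod";
import { serializableMediaCardSchema } from "@tool-ui/media-card";

// The same tool definitions passed to streamText on the server.
const tools = {
  previewLink: tool({
    description: "Show a preview card for a URL",
    inputSchema: z.object({ url: z.string().url() }),
    outputSchema: serializableMediaCardSchema,
  }),
};

// Client code can now type tool parts against the server schemas.
type MyUITools = InferUITools<typeof tools>;
type MyUIMessage = UIMessage<never, UIDataTypes, MyUITools>;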

Mental model

See UI Guidelines for the full philosophy. Quick summary:

  • 5 Fundamentals: One intent · Inline · Schema‑first · Assistant‑anchored · Lifecycle & temporal honesty
  • Roles: information, decision, control, state, composite
  • Lifecycle: invocation → output‑pending → interactive → committing → receipt → errored (sketched as type unions after this list)
  • Conversation coherence: Every important action ↔ one canonical sentence; side effects yield a durable receipt (status, summary, identifiers, timestamp)
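As an illustration only (not a published Tool UI export), the roles and lifecycle phases above map directly onto TypeScript unions:

// Illustrative types mirroring the roles and phases listed above.
type SurfaceRole = "information" | "decision" | "control" | "state" | "composite";

type LifecyclePhase =
  | "invocation"
  | "output-pending"
  | "interactive"
  | "committing"
  | "receipt"
  | "errored";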

How it works

The assistant calls a tool, the tool returns JSON matching a schema, and the UI renders inline.

Assistant calls tool → JSON output → <Component {...props} /> → User interacts → Your app handles effects

Minimal example

Server side: Define a tool that returns schema-validated JSON.

What this demonstrates:

  • serializableMediaCardSchema from @tool-ui/media-card ensures type-safe output
  • The returned object includes temporal metadata (asOf) for freshness tracking
  • Actions are defined with both UI labels and canonical sentences for conversation coherence
  • The schema guarantees the frontend receives reconstructable, addressable data
import { streamText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";
import { serializableMediaCardSchema } from "@tool-ui/media-card";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: openai("gpt-4o"),
    messages,
    tools: {
      previewLink: tool({
        description: "Show a preview card for a URL",
        inputSchema: z.object({ url: z.string().url() }),
        outputSchema: serializableMediaCardSchema,
        async execute({ url }) {
          return {
            id: "link-1",
            kind: "link",
            href: url,
            title: "Example Site",
            description: "A description of the linked content",
            thumb: "https://example.com/image.jpg",
            asOf: new Date().toISOString(),
            actions: [{ id: "open", label: "Open", sentence: "Open the link" }],
          };
        },
      }),
    },
  });
  return result.toUIMessageStreamResponse();
}

Client side: Register the component and let assistant-ui handle rendering.

What this demonstrates:

  • makeAssistantToolUI connects the tool name to a React component
  • The result object is automatically typed and validated against your schema
  • Simply mounting <PreviewLinkUI /> registers it—no manual plumbing required
  • assistant-ui manages the runtime, streaming, and tool call lifecycle
"use client";import { AssistantRuntimeProvider } from "@assistant-ui/react";import { useChatRuntime, AssistantChatTransport, makeAssistantToolUI } from "@assistant-ui/react-ai-sdk";import { MediaCard } from "@tool-ui/media-card";const PreviewLinkUI = makeAssistantToolUI({  toolName: "previewLink",  render: ({ result }) => <MediaCard {...result} maxWidth="420px" />,});export default function App() {  const runtime = useChatRuntime({ transport: new AssistantChatTransport({ api: "/api/chat" }) });  return (    <AssistantRuntimeProvider runtime={runtime}>      <PreviewLinkUI />      {/* your <Thread /> component here */}    </AssistantRuntimeProvider>  );}

Anatomy of a surface

Every Tool UI surface is addressable and reconstructable. Here's the common schema structure:

What this shows:

  • id makes the surface referenceable ("the second card")
  • role declares the primary purpose (information, decision, control, state, or composite)
  • asOf enables temporal honesty and staleness detection
  • actions with sentence fields enable natural language interaction
  • receipt provides durable proof of side effects with timestamps and identifiers
{
  id: string;                     // stable across turns
  role: "information" | "decision" | "control" | "state" | "composite";
  asOf?: string;                  // ISO time for freshness
  actions?: Array<{ id: string; label: string; sentence: string }>;
  // optional receipt after side effects:
  receipt?: {
    outcome: "success" | "partial" | "failed" | "cancelled";
    summary: string;
    identifiers?: Record<string, string>;
    at: string;                   // ISO timestamp
  };
}
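Putting it together, a hypothetical decision surface that has committed a side effect might carry values like these (all identifiers and timestamps illustrative):

// Illustrative only: a "decision" surface plus its durable receipt.
const surface = {
  id: "refund-42",                // stable, so "the refund card" resolves across turns
  role: "decision",
  asOf: "2025-01-15T12:00:00Z",
  actions: [
    { id: "confirm", label: "Confirm", sentence: "Confirm the refund for order 1042" },
  ],
  receipt: {
    outcome: "success",
    summary: "Refunded $12.00 to order 1042",
    identifiers: { orderId: "1042", refundId: "rf_123" },
    at: "2025-01-15T12:00:31Z",
  },
};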

Next steps