Module 4 of 10: Generative UI

Rendering Tool Outputs as UI

Jan 2, 2026 • 20 min read

This is where the magic happens. When the LLM calls a tool (e.g., `get_stock_price`), instead of returning raw JSON text, we return an Interactive Component.

1. The Mental Model: "Tool as a Component Factory"

In traditional LLM apps, tools return Data Strings.
In Generative UI, tools return React Components.
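
To make the contrast concrete, here is a minimal sketch of the same tool written both ways, assuming a `fetchStockData` helper and the `StockSkeleton`/`StockChart` components used later in this module:

// Traditional tool: hands the model a string, which it narrates back as more text
async function getStockPriceAsData({ symbol }) {
  const data = await fetchStockData(symbol); // your own data-layer helper
  return JSON.stringify(data);
}

// Generative UI tool: acts as a component factory and streams real UI into the chat
async function* getStockPriceAsUI({ symbol }) {
  yield <StockSkeleton symbol={symbol} />;   // instant placeholder
  const data = await fetchStockData(symbol);
  return <StockChart data={data} />;         // interactive chart, not prose
}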

2. The `render()` Function Mechanics

The Vercel AI SDK (via the `ai/rsc` package) provides a `render()` function that cleanly maps each tool name to a generator function.

import { render } from 'ai/rsc';
import { z } from 'zod';
import OpenAI from 'openai';
import StockChart from '@/components/StockChart';
import StockSkeleton from '@/components/StockSkeleton';
import ErrorMessage from '@/components/ErrorMessage';

const openai = new OpenAI();

const ui = render({
  model: 'gpt-4o',
  provider: openai,   // render() talks to the model through an OpenAI client
  messages: history,  // the conversation so far
  tools: {
    get_stock_price: {
      description: 'Get stock price and history',
      parameters: z.object({ symbol: z.string() }),

      // THE MAGIC: Return a Component!
      render: async function* ({ symbol }) {
        // Step 1: Instant Feedback (the Skeleton)
        yield <StockSkeleton symbol={symbol} />;

        try {
          // Step 2: The API Call (server-side)
          const data = await fetchStockData(symbol); // your own data-layer helper

          // Step 3: The Final Interactive Chart
          return <StockChart data={data} />;
        } catch (e) {
          // Step 4: Graceful Error Handling
          return <ErrorMessage error="Failed to fetch stock data" />;
        }
      },
    },
  },
});

return ui;
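
For completeness, here is roughly how that `ui` value reaches the browser, assuming the usual `createAI` setup from `ai/rsc`. The file name, action name, and AI-state handling below are illustrative, not part of the SDK:

// app/actions.jsx (sketch); reuses `render`, `openai`, and the tool from the block above
import { createAI, getMutableAIState } from 'ai/rsc';

export async function submitMessage(userInput) {
  'use server';

  const history = getMutableAIState().get(); // prior turns stored in AI state

  const ui = render({
    model: 'gpt-4o',
    provider: openai,
    messages: [...history, { role: 'user', content: userInput }],
    tools: { /* get_stock_price from the block above */ },
  });

  return { id: Date.now(), display: ui };
}

// Wrap your app in <AI> (e.g. in layout.jsx) so useActions() works on the client
export const AI = createAI({
  actions: { submitMessage },   // anything listed here is reachable via useActions()
  initialAIState: [],
  initialUIState: [],
});

Whatever the action returns (`display` here) is what the client appends to its UI state and renders in the message list.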

3. Deep Dive: Generator Functions (`function*`)

Notice the `function*`. This is a JavaScript generator: instead of returning once, it can yield intermediate values (streamed UI updates) before producing a final return value.

  • `yield <Skeleton />`: sent immediately as an HTTP chunk. The browser renders it and the Suspense boundary resolves.
  • `await fetch()`: the server keeps the connection open while it waits on the DB/API. The browser keeps showing the skeleton.
  • `return <Chart />`: the server sends the final payload, the browser replaces the skeleton with the chart, and the stream closes.
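
Stripped of JSX and the SDK, this is ordinary generator behaviour. The sketch below drives an async generator by hand so you can see the same three phases:

// A bare async generator: any number of yields, then one final return
async function* progress() {
  yield 'skeleton';                              // phase 1: instant placeholder
  await new Promise((r) => setTimeout(r, 500));  // phase 2: stand-in for the slow API call
  return 'chart';                                // phase 3: final value, the stream is done
}

const gen = progress();
console.log(await gen.next()); // { value: 'skeleton', done: false }
console.log(await gen.next()); // { value: 'chart', done: true }

The SDK runs the equivalent of this loop for you, flushing each value down the open HTTP stream as it arrives.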

4. Making the Component Interactive

The component you return (`<StockChart />`) is a normal React component. It runs on the client, and it can even call Server Actions to loop back to the AI!

// Client Component: StockChart.jsx
'use client';
import { useActions } from 'ai/rsc';
import { toast } from 'sonner';                  // or your preferred toast library
import LineChart from '@/components/LineChart';  // your charting component

export default function StockChart({ data, symbol }) {
  const { buyStock } = useActions(); // Server Actions registered with createAI

  return (
    <div className="chart-card">
      <LineChart data={data} />

      <button
        onClick={async () => {
          await buyStock(symbol);
          toast.success(`Bought ${symbol}`);
        }}
        className="btn-primary"
      >
        Buy {symbol}
      </button>
    </div>
  );
}

This creates a powerful "Human-in-the-Loop" workflow: the AI proposes an action (showing the Buy button), and the user confirms it (by clicking).
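
For that loop to close, `buyStock` has to exist as a Server Action registered in the same `createAI` call as `submitMessage`. A minimal sketch, assuming a hypothetical `placeOrder` brokerage helper:

// app/actions.jsx (continued sketch)
export async function buyStock(symbol) {
  'use server';

  const order = await placeOrder(symbol, 1); // hypothetical brokerage/API helper
  // Optionally append a confirmation message to the AI state here
  return { ok: true, orderId: order.id };
}

// Register it next to submitMessage so useActions() can reach it from the client:
// createAI({ actions: { submitMessage, buyStock }, ... })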

5. Best Practices

  • Error Boundaries: Always wrap your tool logic in `try/catch` and return a friendly error component instead of crashing the whole stream.
  • Zod Descriptions: The LLM decides which component to show based on the tool description, so be descriptive, e.g. "Display a candlestick chart for a stock symbol."
  • Skeleton Height: Ensure your skeleton component has the same pixel height as the final component to avoid CLS (Cumulative Layout Shift); see the sketch after this list.
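
One way to keep that swap shift-free is to pin the skeleton to the chart's rendered height. Below is a minimal sketch of the `StockSkeleton` used earlier; the 320px value is an assumption, so match it to whatever `<StockChart />` actually measures:

// components/StockSkeleton.jsx
export default function StockSkeleton({ symbol }) {
  return (
    <div className="chart-card" style={{ height: 320 }}>
      {/* assumption: <StockChart /> also renders 320px tall */}
      Loading {symbol}…
    </div>
  );
}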

Conclusion

By treating Tools as Component Factories, we transform the chat from a text log into a dynamic dashboard. The user doesn't just read about data; they interact with it.