opncrafter
Module 3 of 10: Generative UI

Deep Dive: Vercel AI SDK (RSC)

Jan 2, 2026 • 25 min read

Updated for v3.4.15

The Vercel AI SDK is the industry-standard library for building AI apps in Next.js. Its RSC API (`ai/rsc`) is the glue between your Server Actions and your client state.

1. The Architecture: State Triad

Understanding the distinction between these three states is the difference between a junior and senior AI engineer.

1. AIState

Where: Server & Client (Synced)
Type: Serializable JSON
Use Case: The "Truth". Message history, User IDs, Session Metadata. Use this to hydrate the chat on page load.

2. UIState

Where: Client Only
Type: React Components (ReactNode[])
Use Case: The "View". A list of <StockChart /> and <UserMessage /> components. This is what the user sees.

3. Actions

Where: Server Functions
Type: async (input) => UI
Use Case: The "Bridge". They accept user input, mutate AIState, and stream back UIState / JSX.
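To make the triad concrete, here is a framework-free sketch in plain JavaScript. Plain objects stand in for ReactNodes and for the SDK's internal store; the names and shapes are illustrative, not the SDK's actual API.

```javascript
// AIState: the serializable truth. Safe to persist and hydrate from a DB.
let aiState = [{ role: 'user', content: 'Show me AAPL' }];

// UIState: what the user sees. In the real SDK these are ReactNodes;
// here plain descriptors stand in for <UserMessage /> etc.
let uiState = [{ component: 'UserMessage', props: { text: 'Show me AAPL' } }];

// Action: the bridge. Accepts input, mutates AIState, returns UI.
async function submitUserMessage(content) {
  aiState = [...aiState, { role: 'user', content }];
  return { component: 'AssistantMessage', props: { text: `Echo: ${content}` } };
}

// The key property: AIState round-trips through JSON, UIState need not.
console.log(JSON.stringify(aiState)); // survives serialization intact
```

Notice that only AIState has to survive `JSON.stringify` — that constraint is why message history lives there while components live in UIState.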

2. Code: The "createAI" Setup

This is your root provider. It wraps your layout.

// app/action.js
import { createAI, getMutableAIState } from 'ai/rsc';
import { AssistantMessage } from './components'; // your own UI component

// 1. Define the Action
export async function submitUserMessage(content) {
  'use server';
  const aiState = getMutableAIState();

  // Update the server-side truth and mark it final for this turn
  aiState.done([...aiState.get(), { role: 'user', content }]);

  // The return value is streamed to the client as UIState
  return <AssistantMessage content="I received your message" />;
}

// 2. Export the Provider
export const AI = createAI({
  actions: { submitUserMessage },
  initialAIState: [],
  initialUIState: [],
});

3. The Client Patterns

On the client, we use useUIState to render the chat, and useActions to send messages.

Optimistic Updates

Never make the user wait for the server to acknowledge their input. Render it instantly.

// Client Component (Chat.jsx)
'use client';
import { useUIState, useActions } from 'ai/rsc';

const [messages, setMessages] = useUIState();
const { submitUserMessage } = useActions();

async function handleSubmit(text) {
  // 1. OPTIMISTIC UPDATE: show the user message instantly
  setMessages(current => [
    ...current,
    <UserMessage key={Date.now()}>{text}</UserMessage>
  ]);

  try {
    // 2. SERVER CALL: get the response (which is streamed JSX)
    const response = await submitUserMessage(text);

    // 3. SYNC: append the response
    setMessages(current => [...current, response]);
  } catch (err) {
    // 4. ROLLBACK: drop the optimistic message if the call failed
    setMessages(current => current.slice(0, -1));
    toast.error('Failed to send');
  }
}

4. Advanced: Streaming Data Objects

Sometimes you don't want to stream JSX. You want to stream raw JSON data (e.g., for a progress bar or a coordinate map).

Use createStreamableValue.

import { createStreamableValue } from 'ai/rsc';

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Server Action
export async function generateReport() {
  'use server';
  const stream = createStreamableValue({ status: 'init', percent: 0 });

  // Run the long task without blocking the response
  (async () => {
    for (let i = 0; i <= 100; i += 10) {
      stream.update({ status: 'generating', percent: i });
      await sleep(200);
    }
    stream.done({ status: 'complete', percent: 100 });
  })();

  return stream.value;
}

// Client Component
import { useStreamableValue } from 'ai/rsc';

const [data, error, pending] = useStreamableValue(streamSource);
return <ProgressBar value={data?.percent ?? 0} />;

5. Debugging Network Traffic

Pro Tip: Open Chrome DevTools > Network and find the `fetch` request for your action. The Response tab won't show JSON; instead you'll see a line-delimited format:

0:["$@1",["$","div",null,{"children":"Hello world"}]]
1:"$Sreact.suspense"

This is the RSC Payload. If you see this, your streaming is working correctly.
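If you want to poke at that payload, each line is an `id:value` pair, where the id addresses a chunk and the value is JSON-ish (the `$S` prefix marks a symbol reference). Here is a toy parser for inspection only; the wire format is an internal React implementation detail and can change between versions:

```javascript
// Split a line-delimited RSC payload into { id, raw } chunks.
// Only the first ':' separates the id from the value; values may
// themselves contain colons.
function parseRscLines(payload) {
  return payload
    .trim()
    .split('\n')
    .map((line) => {
      const idx = line.indexOf(':');
      return { id: line.slice(0, idx), raw: line.slice(idx + 1) };
    });
}

const rows = parseRscLines(
  '0:["$@1",["$","div",null,{"children":"Hello world"}]]\n1:"$Sreact.suspense"'
);
console.log(rows.map((r) => r.id)); // each chunk is addressed by its own id
```

Treat this as a learning aid, not something to parse in production code.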

Conclusion

The Vercel AI SDK (RSC) is State Management for the AI era. It replaces Redux/Zustand for chat applications by syncing the "Truth" (AIState) on the server with the "View" (UIState) on the client.