opncrafter
Module 10 of 10: Generative UI

Capstone Project: AI Financial Analyst Dashboard

Jan 3, 2026 • 30 min read

You've mastered RSC streaming, skeleton loading states, generative charts, AI form generation, and voice control. Now we combine them all into a single cohesive application: a conversational financial analyst that renders charts, streams analysis, confirms trade orders, and accepts voice commands. This is how all the pieces fit together in production.

1. Application User Stories

| User Input | Expected Output | Pattern Used |
| --- | --- | --- |
| "Show me AAPL vs GOOGL last 6 months" | Multi-line area chart, streaming commentary | RSC tool → Chart skeleton → Recharts |
| "Why did Tesla drop Friday?" | Streaming text analysis with news citations | streamText with citations |
| "Buy 10 shares of NVDA" | Pre-filled purchase confirmation form | AI Form Generation + Server Action |
| (hold mic) "Add Amazon to watchlist" | Whisper transcription → agent action | Voice Control → submitUserMessage |
| Page refresh | Full conversation history restored | Postgres persistence + hydration |

2. Tech Stack

npx create-next-app@latest financial-agent \
  --typescript --app --src-dir --tailwind --use-npm

cd financial-agent

# ai + @ai-sdk/openai:      Vercel AI SDK v3 with RSC
# openai:                   OpenAI SDK (for Whisper transcription)
# zod:                      schema validation (shared client/server)
# recharts:                 chart rendering
# react-hook-form + @hookform/resolvers: AI form generation (zod resolver)
# @prisma/client + prisma:  Postgres persistence
# lucide-react, clsx, tailwind-merge: UI utilities
npm install ai @ai-sdk/openai openai zod recharts \
  react-hook-form @hookform/resolvers \
  @prisma/client prisma \
  lucide-react clsx tailwind-merge

# Database setup (Postgres via Neon.tech for serverless)
npx prisma init
# Add DATABASE_URL to .env.local

# Prisma schema for chat persistence (prisma/schema.prisma):

model Chat {
  id        String    @id @default(cuid())
  createdAt DateTime  @default(now())
  messages  Message[]
}

model Message {
  id        String   @id @default(cuid())
  chatId    String
  role      String   // "user" | "assistant"
  content   Json     // stores both text and structured tool results
  createdAt DateTime @default(now())
  chat      Chat     @relation(fields: [chatId], references: [id])
}

3. The AI State and Actions (Core File)

// src/app/actions.tsx
'use server';

import { streamUI, createAI, getMutableAIState } from 'ai/rsc';
import { openai } from '@ai-sdk/openai';
import type { CoreMessage } from 'ai';
import { z } from 'zod';
import { StockChartSkeleton } from '@/components/skeletons/StockChartSkeleton';
import { StockChart } from '@/components/StockChart';
import { AnalysisSkeleton } from '@/components/skeletons/AnalysisSkeleton';
import { StockAnalysis } from '@/components/StockAnalysis';
import { PurchaseForm } from '@/components/PurchaseForm';
import { BotCard, BotMessage } from '@/components/Message';
import { db } from '@/lib/db';
// App-local helpers: implement against your market-data provider
// (Polygon.io, Alpha Vantage, ...) and your broker's sandbox API.
import { fetchStockData, fetchRecentNews, fetchCurrentPrice } from '@/lib/market-data';
import { brokerAPI } from '@/lib/broker';
import { saveChatHistory } from '@/lib/chat-history';

const PurchaseSchema = z.object({
    symbol: z.string(),
    shares: z.number().positive(),
    order_type: z.enum(['market', 'limit']),
    limit_price: z.number().positive().optional(),
});

export async function submitUserMessage(content: string, chatId: string) {
    const aiState = getMutableAIState<typeof AI>();

    // Append user message to AI state
    aiState.update([...aiState.get(), { role: 'user', content }]);

    const ui = await streamUI({
        model: openai('gpt-4o'),
        system: `You are a financial analyst assistant for a trading dashboard.
Available actions:
- show_stock_chart: Display price history for one or more stock symbols
- analyze_stock: Stream a detailed analysis of a stock's recent performance  
- buy_stock: Show a purchase confirmation form (MUST show form, not execute immediately)
- add_to_watchlist: Add symbols to user's watchlist

Always prefer showing data visualizations over text summaries.
Today's date: ${new Date().toLocaleDateString()}`,

        messages: aiState.get(),

        tools: {
            show_stock_chart: {
                description: 'Display a historical price chart for one or more stock tickers',
                parameters: z.object({
                    symbols: z.array(z.string()).describe('Stock tickers, e.g. ["AAPL", "GOOGL"]'),
                    period: z.enum(['1W', '1M', '3M', '6M', '1Y', '5Y']).default('3M'),
                }),
                generate: async function* ({ symbols, period }) {
                    // Immediate yield — user sees skeleton in <50ms
                    yield <BotCard><StockChartSkeleton symbols={symbols} period={period} /></BotCard>;

                    // Fetch real market data (replace with your data provider)
                    const data = await fetchStockData(symbols, period);

                    return (
                        <BotCard>
                            <StockChart data={data} symbols={symbols} period={period} />
                        </BotCard>
                    );
                },
            },

            analyze_stock: {
                description: 'Stream a detailed analysis of a stock including recent news and technicals',
                parameters: z.object({
                    symbol: z.string(),
                    focus: z.string().describe('What aspect to focus on: earnings, technicals, news, etc.'),
                }),
                generate: async function* ({ symbol, focus }) {
                    yield <BotCard><AnalysisSkeleton symbol={symbol} /></BotCard>;

                    const [priceData, newsData] = await Promise.all([
                        fetchStockData([symbol], '1M'),
                        fetchRecentNews(symbol),
                    ]);

                    // Stream the analysis text progressively
                    return (
                        <BotMessage>
                            <StockAnalysis priceData={priceData} newsData={newsData} symbol={symbol} focus={focus} />
                        </BotMessage>
                    );
                },
            },

            buy_stock: {
                description: 'Show a trade confirmation form for purchasing shares',
                parameters: z.object({
                    symbol: z.string(),
                    suggested_shares: z.number().describe('AI-suggested quantity based on context'),
                    order_type: z.enum(['market', 'limit']).default('market'),
                }),
                generate: async function* ({ symbol, suggested_shares, order_type }) {
                    yield <BotCard><div style={{color:'var(--text-secondary)'}}>Preparing purchase form...</div></BotCard>;

                    const currentPrice = await fetchCurrentPrice(symbol);

                    // Server action called when user submits the form
                    async function executePurchase(formData: z.infer<typeof PurchaseSchema>) {
                        'use server';
                        // Re-validate on server — never trust client data
                        const validated = PurchaseSchema.parse(formData);
                        // Execute trade via your broker API
                        const order = await brokerAPI.createOrder(validated);
                        return { success: true, orderId: order.id };
                    }

                    return (
                        <BotCard>
                            <PurchaseForm
                                defaultValues={{ symbol, shares: suggested_shares, order_type }}
                                currentPrice={currentPrice}
                                onSubmit={executePurchase}
                            />
                        </BotCard>
                    );
                },
            },
        },

        text: ({ content, done }) => {
            if (done) {
                aiState.done([...aiState.get(), { role: 'assistant', content }]);
            }
            return <BotMessage>{content}</BotMessage>;
        },
    });

    // Persist the user message right away; the assistant side of the turn
    // is persisted by onSetAIState once the stream completes.
    await db.message.create({
        data: { chatId, role: 'user', content },
    });

    return { id: Date.now(), display: ui.value };
}

// AI state provider (wraps your Next.js app)
export const AI = createAI({
    actions: { submitUserMessage },
    initialAIState: [] as CoreMessage[],
    initialUIState: [] as { id: number; display: React.ReactNode }[],
    onSetAIState: async ({ state, done }) => {
        'use server';
        if (done) await saveChatHistory(state);
    },
});
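For completeness, here is a sketch of how a client component invokes `submitUserMessage` through the `AI` provider above, using `useUIState` and `useActions` from `ai/rsc`. The component and file names are illustrative, and the markup is deliberately minimal:

```typescript
// src/components/Chat.tsx (sketch): client-side wiring for the AI provider.
'use client';

import { useState } from 'react';
import { useUIState, useActions } from 'ai/rsc';
import type { AI } from '@/app/actions';

export function Chat({ chatId }: { chatId: string }) {
  const [messages, setMessages] = useUIState<typeof AI>();
  const { submitUserMessage } = useActions<typeof AI>();
  const [input, setInput] = useState('');

  async function onSubmit(e: React.FormEvent) {
    e.preventDefault();
    const content = input;
    setInput('');

    // Optimistically render the user's message, then append the streamed UI
    // returned by the server action (skeleton first, final chart/form later).
    setMessages((cur) => [...cur, { id: Date.now(), display: <div>{content}</div> }]);
    const response = await submitUserMessage(content, chatId);
    setMessages((cur) => [...cur, response]);
  }

  return (
    <main>
      {messages.map((m) => (
        <div key={m.id}>{m.display}</div>
      ))}
      <form onSubmit={onSubmit}>
        <input value={input} onChange={(e) => setInput(e.target.value)} />
      </form>
    </main>
  );
}
```

The voice path reuses the same `submitUserMessage` action; the only difference is that `content` arrives from a transcription endpoint instead of the text input.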

4. Deployment to Vercel

# 1. Push to GitHub
git init && git add . && git commit -m "Initial financial agent"
git remote add origin https://github.com/you/financial-agent.git
git push -u origin main

# 2. Deploy to Vercel
npx vercel --prod

# 3. Set environment variables in Vercel Dashboard:
# OPENAI_API_KEY = sk-...
# DATABASE_URL = postgresql://...  (Neon.tech serverless Postgres)
# MARKET_DATA_API_KEY = ...        (Polygon.io, Alpha Vantage, or similar)
# BROKER_API_KEY = ...             (Alpaca, Interactive Brokers sandbox)

# 4. Run Prisma migrations on production DB
npx prisma migrate deploy

# 5. Verify streaming works:
# RSC requires non-edge runtime for Prisma and complex node modules
# Add to app/page.tsx:
# export const runtime = 'nodejs';  ← NOT 'edge'
# export const maxDuration = 60;    ← Allow up to 60s for complex analysis
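The voice row in the user-story table needs a transcription endpoint. A sketch, assuming the client records audio and POSTs it as multipart form data to a hypothetical /api/transcribe route (the route path and the "audio" field name are assumptions, not part of the SDK):

```typescript
// src/app/api/transcribe/route.ts (sketch): Whisper transcription endpoint.
import { NextResponse } from 'next/server';
import OpenAI from 'openai';

export const runtime = 'nodejs'; // file uploads + openai SDK want the Node runtime

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function POST(req: Request) {
  const form = await req.formData();
  const audio = form.get('audio');
  if (!(audio instanceof File)) {
    return NextResponse.json({ error: 'missing audio file' }, { status: 400 });
  }

  // Transcribe with Whisper; the client then feeds the text into
  // submitUserMessage exactly as if the user had typed it.
  const transcription = await client.audio.transcriptions.create({
    file: audio,
    model: 'whisper-1',
  });

  return NextResponse.json({ text: transcription.text });
}
```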

Frequently Asked Questions

How do I handle real-time stock price streaming in the dashboard?

The AI response streaming (RSC) is separate from market-data streaming. For real-time prices, open a WebSocket connection to a provider such as Polygon.io or Alpaca Markets alongside the AI chart generation; for near-real-time data, React Query or SWR can poll at 1-5 second intervals. The chart component subscribes to the WebSocket and updates live while the LLM-generated commentary streams in separately. For continuous price updates that shouldn't cost a round-trip AI call on every tick, server-sent events (the browser's EventSource API) are a simpler alternative to WebSockets.
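One practical detail on the WebSocket path: providers can emit many ticks per second per symbol, and re-rendering a Recharts chart on every tick is wasteful. A small coalescer (a sketch; the names are illustrative) keeps only the latest price per symbol and hands the chart a batched snapshot on a fixed interval:

```typescript
// Coalesce high-frequency price ticks: remember only the latest price per
// symbol and flush a snapshot to the chart once per interval.
type Tick = { symbol: string; price: number };

export function createTickCoalescer(
  onFlush: (latest: Map<string, number>) => void,
  intervalMs = 1000,
) {
  const latest = new Map<string, number>();
  const timer = setInterval(() => {
    if (latest.size === 0) return;   // nothing new this interval
    onFlush(new Map(latest));        // hand the chart a stable snapshot
    latest.clear();
  }, intervalMs);

  return {
    push(tick: Tick) {
      latest.set(tick.symbol, tick.price); // newer ticks overwrite older ones
    },
    stop() {
      clearInterval(timer);
    },
  };
}
```

Wire `push` to the WebSocket's message handler and `onFlush` to the chart's state setter; the chart then re-renders at most once per interval regardless of tick volume.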

Can RSC handle concurrent users without sessions bleeding into each other?

Yes — React Server Components are request-scoped, not shared across users. Each streamUI call creates an independent stream. The AI state (getMutableAIState) is isolated per request. The key is to never store AI state in module-level variables (which persist across requests on warm serverless instances) — always use the SDK's state management. With Postgres persistence, each chat session has a unique chatId, and all state reads/writes are scoped to that ID. The application is safe for concurrent users as long as you follow the SDK's patterns.
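The "never store AI state in module-level variables" warning is worth seeing concretely. A minimal simulation in plain TypeScript (no SDK involved) of two users hitting the same warm instance:

```typescript
// Anti-pattern: module-level mutable state survives between requests on a
// warm serverless instance, so user B's request sees user A's messages.
const sharedHistory: string[] = []; // shared by EVERY request on this instance

export function handleRequestBroken(user: string, message: string): string[] {
  sharedHistory.push(`${user}: ${message}`);
  return sharedHistory; // leaks other users' turns
}

// Correct shape: state is looked up per request by chatId. Here an in-memory
// Map stands in for Postgres; the point is the per-chat keying, which is what
// the SDK's state management and the chatId-scoped DB reads give you.
const store = new Map<string, string[]>();

export function handleRequestScoped(chatId: string, user: string, message: string): string[] {
  const history = store.get(chatId) ?? [];
  history.push(`${user}: ${message}`);
  store.set(chatId, history);
  return history; // contains only this chat's turns
}
```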

Course Complete 🎓

You have now built a production-grade Generative UI application combining RSC streaming, skeleton loading states, Recharts visualizations, voice input, AI form generation with server-side re-validation, and Postgres chat persistence. The patterns you've learned — yield skeleton before async work, validate data on both client and server, separate AI intent from data fetching, granular Suspense boundaries — transfer to any Generative UI application domain. The future of interfaces is intent-driven: users describe what they want, and the UI creates itself to meet that need in real time.

🎓 You are now a Generative UI Engineer

You understand the principles, patterns, and implementation details of the most advanced AI UI architecture available today.

Back to Course Hub →
👨‍💻
Written by

Vivek

AI Engineer

Full-stack AI engineer with 4+ years building LLM-powered products, autonomous agents, and RAG pipelines. I've shipped AI features to production for startups and worked hands-on with GPT-4o, LangChain, LlamaIndex, and the Vercel AI SDK. I started OpnCrafter to share everything I wish I had when learning — no fluff, just working code and real-world context.

GPT-4oLangChainNext.jsVector DBsRAGVercel AI SDK
