225 changes: 225 additions & 0 deletions docs/content/docs/openui-lang/examples/analytics-demo.mdx
@@ -0,0 +1,225 @@
---
title: Analytics
description: A conversational analytics app that streams OpenUI Lang charts, tables, and metric cards using Google Gemini and server-side tool execution.
---

A full-stack example that combines [OpenUI Lang](/docs/openui-lang/overview) with server-side tool calling to build a conversational analytics assistant. It uses `@openuidev/react-ui`'s [FullScreen](/docs/chat/fullscreen) layout and the built-in `openuiChatLibrary` — no custom component definitions needed — to render charts, tables, and metric cards from live tool data.

[View source on GitHub →](https://github.com/thesysdev/crayon/tree/main/examples/analytics-demo)

<div className="bg-[rgba(0,0,0,0.05)] dark:bg-gray-900 rounded-2xl h-125 flex p-2">
<img
src="/images/openui-lang/analytics-demo.png"
alt="Analytics demo screenshot"
className="w-full rounded-lg m-auto object-contain"
/>
</div>

The example includes:

- A **Next.js frontend** with a FullScreen chat layout and built-in conversation starters
- A **Next.js API route** that calls Google Gemini with streaming and a multi-round tool execution loop
- **Four mock analytics tools** with built-in sample data, so nothing external is required

## Architecture

```
Browser (FullScreen) -- POST /api/chat --> Next.js route --> Gemini
                     <-- SSE stream ----- (OpenUI Lang + tool calls)
```

The client sends a conversation to `/api/chat`. The API route loads a generated `system-prompt.txt`, runs a tool-calling loop against Gemini (up to 5 rounds), and streams the final OpenUI Lang response as SSE. On the client side, `openAIAdapter()` parses the stream, and `openuiChatLibrary` maps each node to a chart, table, or metric card that renders progressively as tokens arrive.

## Project layout

```
examples/analytics-demo/
|- src/app/ # Next.js app (layout, page, API route)
|- src/data/ # Built-in sample data
|- src/tools/ # Analytics tool definitions and implementations
|- src/generated/ # Generated system prompt
|- src/library.ts # Re-exports openuiChatLibrary and promptOptions
```

## Run the example

Starting from the repository root:

1. Install dependencies:

```bash
cd examples/analytics-demo
pnpm install
```

2. Set your Gemini API key:

```bash
export GEMINI_API_KEY=your-key-here
```

3. Start the dev server:

```bash
pnpm dev
```

This automatically generates the system prompt from the library definition before starting Next.js.
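The API route then reads the generated prompt at request time. A minimal sketch of that load, assuming the path written by the `generate:prompt` script (the helper name and signature are illustrative, not from the example's source):

```ts
import { readFileSync } from "node:fs";
import { join } from "node:path";

// Hypothetical helper: loads the prompt that the generate:prompt script
// writes to src/generated/system-prompt.txt. Name and path handling are
// assumptions for illustration.
export function loadSystemPrompt(baseDir: string = process.cwd()): string {
  const promptPath = join(baseDir, "src", "generated", "system-prompt.txt");
  return readFileSync(promptPath, "utf8").trim();
}
```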

## What to expect

Open the app and try one of the built-in conversation starters:

- "Revenue trends" — a line or area chart of monthly revenue, expenses, and profit
- "Q1 vs Q2 sales" — a grouped bar chart comparing sales by product category across quarters
- "Key metrics" — metric cards for MRR, ARR, churn rate, NPS, CAC, LTV, and more
- "Customer segments" — a pie or bar chart breaking down customers by tier or region

All responses are streamed progressively. The assistant picks the right chart type for the data and can combine multiple visualizations (a summary metric, a chart, and a detail table) in a single response card.

## Key files

### `src/app/page.tsx` — FullScreen chat setup

The page wires up the FullScreen layout with the built-in `openuiChatLibrary`:

```tsx
import { openAIAdapter, openAIMessageFormat } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";
import { openuiChatLibrary } from "@openuidev/react-ui/genui-lib";

export default function Page() {
  return (
    <div className="h-screen w-screen overflow-hidden relative">
      <FullScreen
        processMessage={async ({ messages, abortController }) => {
          return fetch("/api/chat", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({
              messages: openAIMessageFormat.toApi(messages),
            }),
            signal: abortController.signal,
          });
        }}
        streamProtocol={openAIAdapter()}
        componentLibrary={openuiChatLibrary}
        agentName="Analytics Demo"
        theme={{ mode: "light" }}
        conversationStarters={{
          variant: "short",
          options: [
            {
              displayText: "Revenue trends",
              prompt: "Show me monthly revenue trends for the past year.",
            },
            {
              displayText: "Q1 vs Q2 sales",
              prompt: "Compare Q1 vs Q2 sales by product category.",
            },
            {
              displayText: "Key metrics",
              prompt: "What are our key business metrics right now?",
            },
            {
              displayText: "Customer segments",
              prompt: "Break down our customer base by segment and show spending patterns.",
            },
          ],
        }}
      />
    </div>
  );
}
```

Because this example uses the standard `openuiChatLibrary`, there is no custom component library to define. The same library is used by the CLI to generate the system prompt.

See [Chat FullScreen Layout](/docs/chat/fullscreen) and [GenUI](/docs/chat/genui) for the full API.

### `src/library.ts` — library re-export

The library file re-exports the built-in library and prompt options so the CLI can discover them:

```ts
export {
  openuiChatLibrary as library,
  openuiChatPromptOptions as promptOptions,
} from "@openuidev/react-ui/genui-lib";
```

The `generate:prompt` script points the CLI at this file. The exported names `library` and `promptOptions` are the convention the CLI uses for auto-detection.

### `src/app/api/chat/route.ts` — Gemini streaming with tool execution

The API route implements a multi-round tool-calling loop against the Gemini API, then streams the final text response as OpenAI-compatible SSE:

```ts
const MAX_TOOL_ROUNDS = 5;
for (let round = 0; round < MAX_TOOL_ROUNDS; round++) {
  const response = await ai.models.generateContentStream({ model, contents, config });

  let hasToolCalls = false;
  const functionResponses = [];

  for await (const chunk of response) {
    if (chunk.text) enqueue(sseContentChunk(encoder, chunk.text));

    if (chunk.functionCalls?.length) {
      hasToolCalls = true;
      for (const fc of chunk.functionCalls) {
        const result = await toolImpls[fc.name](fc.args ?? {});
        functionResponses.push({
          name: fc.name,
          args: fc.args ?? {},
          response: JSON.parse(result),
        });
      }
    }
  }

  if (hasToolCalls) {
    // Echo the model's function calls with their original arguments, then
    // append the matching results, so the next round has full context.
    contents.push({
      role: "model",
      parts: functionResponses.map((fr) => ({
        functionCall: { name: fr.name, args: fr.args },
      })),
    });
    contents.push({
      role: "user",
      parts: functionResponses.map((fr) => ({
        functionResponse: { name: fr.name, response: fr.response },
      })),
    });
    continue;
  }
  break;
}
```

The response is formatted as OpenAI-compatible SSE so the client-side `openAIAdapter()` can parse it without modification — this is a useful pattern when switching between providers.
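The `sseContentChunk` helper is referenced in the route but not shown above. A plausible sketch, assuming it wraps each text delta in OpenAI's `chat.completion.chunk` shape (the helper name comes from the excerpt; this body is an assumption, not the example's actual implementation):

```ts
// Hypothetical implementation: wraps a text delta in an OpenAI-compatible
// SSE line so the client-side openAIAdapter() can parse it unchanged.
export function sseContentChunk(encoder: TextEncoder, text: string): Uint8Array {
  const payload = {
    object: "chat.completion.chunk",
    choices: [{ index: 0, delta: { content: text }, finish_reason: null }],
  };
  // Each SSE event is a "data: <json>" line followed by a blank line.
  return encoder.encode(`data: ${JSON.stringify(payload)}\n\n`);
}
```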

### `src/tools/analytics-tools.ts` — tool definitions

Each tool is defined with a JSON Schema description (for the LLM) and a plain async function (executed server-side):

```ts
{
  type: "function",
  function: {
    name: "query_revenue",
    description:
      "Query revenue, expenses, and profit data. Can return monthly or quarterly breakdowns.",
    parameters: {
      type: "object",
      properties: {
        period: { type: "string", description: "Time period: 'monthly' or 'quarterly'." },
      },
    },
    // Executed server-side; not part of the schema shown to the LLM.
    function: queryRevenue,
  },
}
```
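The matching implementation is a plain async function over the built-in sample data, returning a JSON string for the route to parse. A minimal sketch (the data values below are made up for illustration; the real sample data lives in `src/data/`):

```ts
// Illustrative mock only: the example's real data lives in src/data/.
// Tools return a JSON string, which the route parses before sending the
// result back to Gemini as a functionResponse.
const monthlyRevenue = [
  { month: "Jan", revenue: 120_000, expenses: 85_000, profit: 35_000 },
  { month: "Feb", revenue: 135_000, expenses: 90_000, profit: 45_000 },
];

export async function queryRevenue(args: { period?: string }): Promise<string> {
  const period = args.period === "quarterly" ? "quarterly" : "monthly";
  // Only monthly data is sketched here; a real tool would also aggregate
  // into quarters when period === "quarterly".
  return JSON.stringify({ period, data: monthlyRevenue });
}
```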

## Tools

The API route defines four server-side tools with built-in sample data:

| Tool | Description |
| ----------------- | -------------------------------------------------------------------------------------- |
| `query_revenue` | Monthly or quarterly revenue, expenses, and profit. Includes YoY growth. |
| `query_sales` | Sales grouped by product category (per quarter), region, or individual product. |
| `query_metrics` | Key business metrics: MRR, ARR, churn, NPS, conversion rate, CAC, LTV, customer count. |
| `query_customers` | Customer segmentation by tier (Enterprise, Mid-Market, SMB, Individual) or region. |

No external API keys or data sources are needed. In a production app, replace the mock implementations with real database queries or API calls.
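One way to make that swap is to keep the tool's contract (args in, JSON string out) and inject the data source, so the route code is untouched. A sketch under that assumption (the factory, row type, and query-runner shape are all illustrative, not from the example):

```ts
type RevenueRow = { period: string; revenue: number; expenses: number };

// Hypothetical factory: wraps any query runner (a database client, an
// HTTP API, etc.) while preserving the tool signature the route expects.
export function makeQueryRevenue(
  runQuery: (granularity: "month" | "quarter") => Promise<RevenueRow[]>,
) {
  return async function queryRevenue(args: { period?: string }): Promise<string> {
    const granularity = args.period === "quarterly" ? "quarter" : "month";
    const rows = await runQuery(granularity);
    return JSON.stringify({ period: args.period ?? "monthly", data: rows });
  };
}
```

Because the data source is injected, the mock and the production implementation are interchangeable behind the same tool name.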
1 change: 1 addition & 0 deletions docs/content/docs/openui-lang/meta.json
@@ -16,6 +16,7 @@
    "---Examples---",
    "examples/shadcn-chat",
    "examples/react-native",
    "examples/analytics-demo",
    "examples/vercel-ai-chat"
  ]
}
41 changes: 41 additions & 0 deletions examples/analytics-demo/.gitignore
@@ -0,0 +1,41 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.*
.yarn/*
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/versions

# testing
/coverage

# next.js
/.next/
/out/

# production
/build

# misc
.DS_Store
*.pem

# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*

# env files (can opt-in for committing if needed)
.env*

# vercel
.vercel

# typescript
*.tsbuildinfo
next-env.d.ts
50 changes: 50 additions & 0 deletions examples/analytics-demo/README.md
@@ -0,0 +1,50 @@
This is an [OpenUI](https://openui.com) Analytics Demo — a conversational analytics chat app showcasing OpenUI's chart, table, and metric card components.

## Getting Started

First, set your Gemini API key:

```bash
export GEMINI_API_KEY=your-key-here
```

Then run the development server (the `dev` script itself shells out to `pnpm`, so use pnpm from inside the monorepo):

```bash
pnpm dev
```

Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.

Ask natural-language analytics questions like:

- "Show me monthly revenue trends"
- "Compare Q1 vs Q2 sales by category"
- "What are our key business metrics?"
- "Break down customers by segment"

The LLM fetches sample data via tools and generates rich UI (charts, tables, metric cards) using OpenUI Lang, streamed progressively to the browser.

## How It Works

The demo includes four mock analytics tools with built-in sample data:

- **query_revenue** — monthly/quarterly revenue, expenses, and profit
- **query_sales** — sales by product category, region, or product
- **query_metrics** — key business metrics (MRR, ARR, churn, NPS, CAC, LTV, etc.)
- **query_customers** — customer segmentation by tier or region

No external data source is needed — everything works out of the box.

## Learn More

To learn more about OpenUI, take a look at the following resources:

- [OpenUI Documentation](https://openui.com/docs) - learn about OpenUI features and API.
- [OpenUI GitHub repository](https://github.com/thesysdev/openui) - your feedback and contributions are welcome!
18 changes: 18 additions & 0 deletions examples/analytics-demo/eslint.config.mjs
@@ -0,0 +1,18 @@
import { defineConfig, globalIgnores } from "eslint/config";
import nextVitals from "eslint-config-next/core-web-vitals";
import nextTs from "eslint-config-next/typescript";

const eslintConfig = defineConfig([
  ...nextVitals,
  ...nextTs,
  // Override default ignores of eslint-config-next.
  globalIgnores([
    // Default ignores of eslint-config-next:
    ".next/**",
    "out/**",
    "build/**",
    "next-env.d.ts",
  ]),
]);

export default eslintConfig;
8 changes: 8 additions & 0 deletions examples/analytics-demo/next.config.ts
@@ -0,0 +1,8 @@
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  turbopack: {},
  transpilePackages: ["@openuidev/react-ui", "@openuidev/react-headless", "@openuidev/react-lang"],
};

export default nextConfig;
33 changes: 33 additions & 0 deletions examples/analytics-demo/package.json
@@ -0,0 +1,33 @@
{
  "name": "analytics-demo",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "generate:prompt": "pnpm --filter @openuidev/cli build && pnpm exec openui generate src/library.ts --out src/generated/system-prompt.txt",
    "dev": "pnpm generate:prompt && next dev",
    "build": "next build",
    "start": "next start",
    "lint": "eslint"
  },
  "dependencies": {
    "@google/genai": "^1.45.0",
    "@openuidev/cli": "workspace:*",
    "@openuidev/react-headless": "workspace:*",
    "@openuidev/react-lang": "workspace:*",
    "@openuidev/react-ui": "workspace:*",
    "lucide-react": "^0.575.0",
    "next": "16.1.6",
    "react": "19.2.3",
    "react-dom": "19.2.3"
  },
  "devDependencies": {
    "@tailwindcss/postcss": "^4",
    "@types/node": "^20",
    "@types/react": "^19",
    "@types/react-dom": "^19",
    "eslint": "^9",
    "eslint-config-next": "16.1.6",
    "tailwindcss": "^4",
    "typescript": "^5"
  }
}
7 changes: 7 additions & 0 deletions examples/analytics-demo/postcss.config.mjs
@@ -0,0 +1,7 @@
const config = {
  plugins: {
    "@tailwindcss/postcss": {},
  },
};

export default config;