41 changes: 41 additions & 0 deletions examples/analytics-demo/.gitignore
@@ -0,0 +1,41 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
/node_modules
/.pnp
.pnp.*
.yarn/*
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/versions

# testing
/coverage

# next.js
/.next/
/out/

# production
/build

# misc
.DS_Store
*.pem

# debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*

# env files (can opt-in for committing if needed)
.env*

# vercel
.vercel

# typescript
*.tsbuildinfo
next-env.d.ts
202 changes: 202 additions & 0 deletions examples/analytics-demo/README.md
@@ -0,0 +1,202 @@
# Analytics Demo

A conversational analytics app that streams OpenUI Lang charts, tables, and metric cards using OpenAI and server-side tool execution.

It uses `@openuidev/react-ui`'s FullScreen layout and the built-in `openuiChatLibrary` — no custom component definitions needed — to render those components from live tool data.

The example includes:

- A **Next.js frontend** with a FullScreen chat layout and built-in conversation starters
- A **Next.js API route** that calls OpenAI with streaming and automatic tool execution via `runTools()`
- **Four mock analytics tools** with built-in sample data, so nothing external is required

## Architecture

```
Browser (FullScreen) -- POST /api/chat --> Next.js route --> OpenAI
                     <-- SSE stream (OpenUI Lang + tool calls) --
```

The client sends a conversation to `/api/chat`. The API route loads a generated `system-prompt.txt`, uses OpenAI's `runTools()` to handle multi-round tool calling automatically, and streams the response as SSE. On the client side, `openAIAdapter()` parses the stream, and `openuiChatLibrary` maps each node to a chart, table, or metric card that renders progressively as tokens arrive.
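On the wire, the stream is a sequence of SSE `data:` frames, each carrying one JSON chunk. The real `openAIAdapter()` also handles buffering, aborts, and tool-call events, but the core parsing step can be sketched like this (`parseSSEChunk` is a hypothetical helper for illustration, not part of the SDK):

```typescript
// Minimal sketch of SSE frame parsing, roughly what openAIAdapter() does
// internally. parseSSEChunk is a hypothetical name for illustration only.
export function parseSSEChunk(chunk: string): unknown[] {
  const events: unknown[] = [];
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip blank lines and comments
    const payload = line.slice("data: ".length);
    if (payload === "[DONE]") continue; // OpenAI's end-of-stream sentinel
    events.push(JSON.parse(payload));
  }
  return events;
}
```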

## Project layout

```
examples/analytics-demo/
|- src/app/ # Next.js app (layout, page, API route)
|- src/data/ # Built-in sample data
|- src/tools/ # Analytics tool definitions and implementations
|- src/generated/ # Generated system prompt
|- src/library.ts # Re-exports openuiChatLibrary and promptOptions
```

## Getting Started

1. Install dependencies:

```bash
cd examples/analytics-demo
pnpm install
```

2. Set your OpenAI API key:

```bash
export OPENAI_API_KEY=your-key-here
```

3. Start the dev server:

```bash
pnpm dev
```

This automatically generates the system prompt from the library definition before starting Next.js.

Open [http://localhost:3000](http://localhost:3000) in your browser to see the result.

## What to expect

Try one of the built-in conversation starters:

- "Revenue trends" — a line or area chart of monthly revenue, expenses, and profit
- "Q1 vs Q2 sales" — a grouped bar chart comparing sales by product category across quarters
- "Key metrics" — metric cards for MRR, ARR, churn rate, NPS, CAC, LTV, and more
- "Customer segments" — a pie or bar chart breaking down customers by tier or region

All responses are streamed progressively. The assistant picks the right chart type for the data and can combine multiple visualizations (a summary metric, a chart, and a detail table) in a single response card.

## Key files

### `src/app/page.tsx` — FullScreen chat setup

The page wires up the FullScreen layout with the built-in `openuiChatLibrary`:

```tsx
import { openAIAdapter, openAIMessageFormat } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";
import { openuiChatLibrary } from "@openuidev/react-ui/genui-lib";

export default function Page() {
return (
<div className="h-screen w-screen overflow-hidden relative">
<FullScreen
processMessage={async ({ messages, abortController }) => {
return fetch("/api/chat", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
messages: openAIMessageFormat.toApi(messages),
}),
signal: abortController.signal,
});
}}
streamProtocol={openAIAdapter()}
componentLibrary={openuiChatLibrary}
agentName="Analytics Demo"
theme={{ mode: "light" }}
conversationStarters={{
variant: "short",
options: [
{
displayText: "Revenue trends",
prompt: "Show me monthly revenue trends for the past year.",
},
{
displayText: "Q1 vs Q2 sales",
prompt: "Compare Q1 vs Q2 sales by product category.",
},
{ displayText: "Key metrics", prompt: "What are our key business metrics right now?" },
{
displayText: "Customer segments",
prompt: "Break down our customer base by segment and show spending patterns.",
},
],
}}
/>
</div>
);
}
```

Because this example uses the standard `openuiChatLibrary`, there is no custom component library to define. The same library is used by the CLI to generate the system prompt.

### `src/library.ts` — library re-export

The library file re-exports the built-in library and prompt options so the CLI can discover them:

```ts
export {
openuiChatLibrary as library,
openuiChatPromptOptions as promptOptions,
} from "@openuidev/react-ui/genui-lib";
```

The `generate:prompt` script points the CLI at this file. The exported names `library` and `promptOptions` are the convention the CLI uses for auto-detection.

### `src/app/api/chat/route.ts` — OpenAI streaming with tool execution

The API route uses the OpenAI SDK's `runTools()` to handle multi-round tool calling automatically, streaming content and tool call events as SSE:

```ts
const runner = (client.chat.completions as any).runTools({
model,
messages: chatMessages,
tools,
stream: true,
});

runner.on("functionToolCall", (fc) => {
enqueue(sseToolCallStart(encoder, { id, function: { name: fc.name } }, callIdx));
});

runner.on("functionToolCallResult", (result) => {
enqueue(sseToolCallArgs(encoder, { id: tc.id, function: { arguments: tc.arguments } }, result, resultIdx));
});

runner.on("chunk", (chunk) => {
const delta = chunk.choices?.[0]?.delta;
if (delta?.content) enqueue(encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`));
});
```

`runTools()` handles the full tool-calling loop internally — calling tools, feeding results back to the model, and repeating until a final text response is produced.
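Each event the route forwards is framed as an SSE `data:` line before being enqueued on the response stream. A minimal framing helper might look like this (a hypothetical sketch; the example's actual `sseToolCallStart`/`sseToolCallArgs` helpers layer tool-call metadata on top of this framing):

```typescript
// Hypothetical sketch of SSE framing: each JSON event becomes one
// "data: ...\n\n" frame, encoded for the response's ReadableStream.
export function sseFrame(encoder: TextEncoder, event: unknown): Uint8Array {
  return encoder.encode(`data: ${JSON.stringify(event)}\n\n`);
}
```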

### `src/tools/analytics-tools.ts` — tool definitions

Each tool is defined with a JSON Schema description (for the LLM) and a plain async function (executed server-side):

```ts
{
type: "function",
function: {
name: "query_revenue",
description: "Query revenue, expenses, and profit data. Can return monthly or quarterly breakdowns.",
parameters: {
type: "object",
properties: {
period: { type: "string", description: "Time period: 'monthly' or 'quarterly'." },
},
},
function: queryRevenue,
},
}
```
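The implementation side is a plain async function over the built-in sample data. A simplified sketch of what `queryRevenue` might look like (the real sample data lives in `src/data/` and is richer; the rows and figures below are made up for illustration):

```typescript
// Hypothetical sketch of a mock tool implementation. The real queryRevenue
// reads richer sample data from src/data/ and supports quarterly roll-ups.
interface RevenueRow {
  period: string;
  revenue: number;
  expenses: number;
  profit: number;
}

const monthly: RevenueRow[] = [
  { period: "Jan", revenue: 120_000, expenses: 80_000, profit: 40_000 },
  { period: "Feb", revenue: 135_000, expenses: 82_000, profit: 53_000 },
];

export async function queryRevenue(args: {
  period?: "monthly" | "quarterly";
}): Promise<RevenueRow[]> {
  // runTools parses the model's JSON arguments and passes them in here;
  // this sketch ignores the quarterly case and always returns monthly rows.
  return monthly;
}
```

Because `runTools()` serializes whatever the function returns and feeds it back to the model, returning plain JSON-friendly objects is all a tool needs to do.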

## Tools

The API route defines four server-side tools with built-in sample data:

| Tool | Description |
| ----------------- | -------------------------------------------------------------------------------------- |
| `query_revenue` | Monthly or quarterly revenue, expenses, and profit. Includes YoY growth. |
| `query_sales` | Sales grouped by product category (per quarter), region, or individual product. |
| `query_metrics` | Key business metrics: MRR, ARR, churn, NPS, conversion rate, CAC, LTV, customer count. |
| `query_customers` | Customer segmentation by tier (Enterprise, Mid-Market, SMB, Individual) or region. |

No external API keys or data sources are needed. In a production app, replace the mock implementations with real database queries or API calls.

## Learn More

- [OpenUI Documentation](https://openui.com/docs) — learn about OpenUI features and API.
- [OpenUI GitHub repository](https://github.com/thesysdev/openui) — your feedback and contributions are welcome!
18 changes: 18 additions & 0 deletions examples/analytics-demo/eslint.config.mjs
@@ -0,0 +1,18 @@
import { defineConfig, globalIgnores } from "eslint/config";
import nextVitals from "eslint-config-next/core-web-vitals";
import nextTs from "eslint-config-next/typescript";

const eslintConfig = defineConfig([
...nextVitals,
...nextTs,
// Override default ignores of eslint-config-next.
globalIgnores([
// Default ignores of eslint-config-next:
".next/**",
"out/**",
"build/**",
"next-env.d.ts",
]),
]);

export default eslintConfig;
8 changes: 8 additions & 0 deletions examples/analytics-demo/next.config.ts
@@ -0,0 +1,8 @@
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
turbopack: {},
transpilePackages: ["@openuidev/react-ui", "@openuidev/react-headless", "@openuidev/react-lang"],
};

export default nextConfig;
33 changes: 33 additions & 0 deletions examples/analytics-demo/package.json
@@ -0,0 +1,33 @@
{
"name": "analytics-demo",
"version": "0.1.0",
"private": true,
"scripts": {
"generate:prompt": "pnpm --filter @openuidev/cli build && pnpm exec openui generate src/library.ts --out src/generated/system-prompt.txt",
"dev": "pnpm generate:prompt && next dev",
"build": "next build",
"start": "next start",
"lint": "eslint"
},
"dependencies": {
"openai": "^6.22.0",
"@openuidev/cli": "workspace:*",
"@openuidev/react-headless": "workspace:*",
"@openuidev/react-lang": "workspace:*",
"@openuidev/react-ui": "workspace:*",
"lucide-react": "^0.575.0",
"next": "16.1.6",
"react": "19.2.3",
"react-dom": "19.2.3"
},
"devDependencies": {
"@tailwindcss/postcss": "^4",
"@types/node": "^20",
"@types/react": "^19",
"@types/react-dom": "^19",
"eslint": "^9",
"eslint-config-next": "16.1.6",
"tailwindcss": "^4",
"typescript": "^5"
}
}
7 changes: 7 additions & 0 deletions examples/analytics-demo/postcss.config.mjs
@@ -0,0 +1,7 @@
const config = {
plugins: {
"@tailwindcss/postcss": {},
},
};

export default config;