feat: add vercel chat example (#1532)
Co-authored-by: Marcus Schiesser <[email protected]>
thucpn and marcusschiesser authored Dec 2, 2024
1 parent 510191c commit fd38a25
Showing 13 changed files with 452 additions and 1 deletion.
7 changes: 7 additions & 0 deletions .changeset/clever-monkeys-switch.md
@@ -0,0 +1,7 @@
---
"@llamaindex/vercel": patch
"@llamaindex/doc": patch
"@llamaindex/examples": patch
---

Add vercel tool adapter to use query engine tool
@@ -1,5 +1,5 @@
{
  "title": "Integration",
  "description": "See our integrations",
  "pages": ["open-llm-metry", "lang-trace"]
  "pages": ["open-llm-metry", "lang-trace", "vercel"]
}
80 changes: 80 additions & 0 deletions apps/next/src/content/docs/llamaindex/integration/vercel.mdx
@@ -0,0 +1,80 @@
---
title: Vercel
description: Integrate LlamaIndex with Vercel's AI SDK
---

LlamaIndex provides integration with Vercel's AI SDK, allowing you to create powerful search and retrieval applications. Below are examples of how to use LlamaIndex with `streamText` from the Vercel AI SDK.

## Setup

First, install the required dependencies:

```bash
npm install @llamaindex/vercel ai
```

## Using a Local Vector Store

Here's how to create a simple vector store index and query it using Vercel's AI SDK:

```typescript
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";
import { Document, VectorStoreIndex } from "llamaindex";

// Create an index from your documents
const document = new Document({ text: yourText, id_: "unique-id" });
const index = await VectorStoreIndex.fromDocuments([document]);

// Create a query tool
const queryTool = llamaindex({
  index,
  description: "Search through the documents", // optional
});

// Use the tool with Vercel's AI SDK
streamText({
  tools: { queryTool },
  prompt: "Your question here",
  model: openai("gpt-4"),
  onFinish({ response }) {
    console.log("Response:", response.messages); // log the response
  },
}).toDataStream();
```

## Using LlamaCloud

For production deployments, you can use LlamaCloud to store and manage your documents:

```typescript
import { LlamaCloudIndex } from "llamaindex";

// Create a LlamaCloud index
const index = await LlamaCloudIndex.fromDocuments({
  documents: [document],
  name: "your-index-name",
  projectName: "your-project",
  apiKey: process.env.LLAMA_CLOUD_API_KEY,
});

// Use it the same way as VectorStoreIndex
const queryTool = llamaindex({
  index,
  description: "Search through the documents",
});

// Use the tool with Vercel's AI SDK
streamText({
  tools: { queryTool },
  prompt: "Your question here",
  model: openai("gpt-4"),
}).toDataStream();
```
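
Note that `.toDataStream()` produces a stream for wiring into an HTTP response; when running in a plain script you can instead iterate the result's `textStream`, as the example files in this commit do. Here is a minimal, dependency-free sketch of that consumption pattern, where `fakeTextStream` is a hypothetical stand-in for the `textStream` returned by `streamText`:

```typescript
// Sketch of consuming a streamed answer chunk-by-chunk. The async
// generator below is a stand-in for `result.textStream`.
async function* fakeTextStream(): AsyncGenerator<string> {
  for (const part of ["Hello", ", ", "world", "!"]) {
    yield part;
  }
}

async function consume(): Promise<string> {
  let full = "";
  for await (const textPart of fakeTextStream()) {
    process.stdout.write(textPart); // print incrementally as chunks arrive
    full += textPart;
  }
  return full;
}

consume().then((full) => console.log("\nFull answer:", full));
// prints "Hello, world!" incrementally, then "Full answer: Hello, world!"
```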

## Next Steps

1. Explore [LlamaCloud](https://cloud.llamaindex.ai/) for managed document storage and retrieval
2. Join our [Discord community](https://discord.gg/llamaindex) for support and discussions

3 changes: 3 additions & 0 deletions examples/package.json
@@ -3,17 +3,20 @@
  "private": true,
  "version": "0.0.16",
  "dependencies": {
    "@ai-sdk/openai": "^1.0.5",
    "@aws-crypto/sha256-js": "^5.2.0",
    "@azure/cosmos": "^4.1.1",
    "@azure/identity": "^4.4.1",
    "@datastax/astra-db-ts": "^1.4.1",
    "@llamaindex/core": "^0.4.10",
    "@llamaindex/readers": "^1.0.11",
    "@llamaindex/workflow": "^0.0.6",
    "@llamaindex/vercel": "^0.0.1",
    "@notionhq/client": "^2.2.15",
    "@pinecone-database/pinecone": "^4.0.0",
    "@vercel/postgres": "^0.10.0",
    "@zilliz/milvus2-sdk-node": "^2.4.6",
    "ai": "^4.0.0",
    "chromadb": "^1.8.1",
    "commander": "^12.1.0",
    "dotenv": "^16.4.5",
50 changes: 50 additions & 0 deletions examples/vercel/README.md
@@ -0,0 +1,50 @@
# Vercel Examples

These examples demonstrate how to integrate LlamaIndexTS with Vercel's AI SDK. The examples show how to use LlamaIndex for search and retrieval in both local vector store and LlamaCloud environments.

## Setup

To run these examples, first install the required dependencies from the parent folder `examples`:

```bash
npm i
```

## Running the Examples

Make sure to run the examples from the parent folder called `examples`. The following examples are available:

### Vector Store Example

Run the local vector store example with:

```bash
npx tsx vercel/vector-store.ts
```

This example demonstrates:

- Creating a vector store index from one document
- Using Vercel's AI SDK with LlamaIndex for streaming responses

### LlamaCloud Example

To run the LlamaCloud example:

```bash
npx tsx vercel/llamacloud.ts
```

This example requires a LlamaCloud API key set in your environment and an embedding model set in the `EMBEDDING_MODEL` environment variable:

```bash
export LLAMA_CLOUD_API_KEY=your_api_key_here
export EMBEDDING_MODEL="text-embedding-3-small"
```

The example demonstrates:

- Creating a LlamaCloud index from one document
- Streaming responses using Vercel's AI SDK

For more detailed information about the Vercel integration, check out [the documentation](https://ts.llamaindex.ai/docs/llamaindex/integration/vercel).
38 changes: 38 additions & 0 deletions examples/vercel/llamacloud.ts
@@ -0,0 +1,38 @@
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";
import { Document, LlamaCloudIndex } from "llamaindex";
import fs from "node:fs/promises";

async function main() {
  const path = "node_modules/llamaindex/examples/abramov.txt";
  const essay = await fs.readFile(path, "utf-8");
  const document = new Document({ text: essay, id_: path });

  const index = await LlamaCloudIndex.fromDocuments({
    documents: [document],
    name: "test-pipeline",
    projectName: "Default",
    apiKey: process.env.LLAMA_CLOUD_API_KEY,
  });
  console.log("Successfully created index");

  const result = streamText({
    model: openai("gpt-4o"),
    prompt: "Cost of moving cat from Russia to UK?",
    tools: {
      queryTool: llamaindex({
        index,
        description:
          "get information from your knowledge base to answer questions.", // optional description
      }),
    },
    maxSteps: 5,
  });

  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }
}

main().catch(console.error);
34 changes: 34 additions & 0 deletions examples/vercel/vector-store.ts
@@ -0,0 +1,34 @@
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";
import { Document, VectorStoreIndex } from "llamaindex";

import fs from "node:fs/promises";

async function main() {
  const path = "node_modules/llamaindex/examples/abramov.txt";
  const essay = await fs.readFile(path, "utf-8");
  const document = new Document({ text: essay, id_: path });

  const index = await VectorStoreIndex.fromDocuments([document]);
  console.log("Successfully created index");

  const result = streamText({
    model: openai("gpt-4o"),
    prompt: "Cost of moving cat from Russia to UK?",
    tools: {
      queryTool: llamaindex({
        index,
        description:
          "get information from your knowledge base to answer questions.", // optional description
      }),
    },
    maxSteps: 5,
  });

  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }
}

main().catch(console.error);
50 changes: 50 additions & 0 deletions packages/providers/vercel/package.json
@@ -0,0 +1,50 @@
{
  "name": "@llamaindex/vercel",
  "description": "Vercel Adapter for LlamaIndex",
  "version": "0.0.1",
  "type": "module",
  "main": "./dist/index.cjs",
  "module": "./dist/index.js",
  "exports": {
    ".": {
      "edge-light": {
        "types": "./dist/index.edge-light.d.ts",
        "default": "./dist/index.edge-light.js"
      },
      "workerd": {
        "types": "./dist/index.edge-light.d.ts",
        "default": "./dist/index.edge-light.js"
      },
      "require": {
        "types": "./dist/index.d.cts",
        "default": "./dist/index.cjs"
      },
      "import": {
        "types": "./dist/index.d.ts",
        "default": "./dist/index.js"
      }
    }
  },
  "files": [
    "dist"
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/run-llama/LlamaIndexTS.git",
    "directory": "packages/providers/vercel"
  },
  "scripts": {
    "build": "bunchee",
    "dev": "bunchee --watch"
  },
  "devDependencies": {
    "bunchee": "5.6.1"
  },
  "dependencies": {
    "@llamaindex/core": "workspace:*",
    "zod": "^3.23.8"
  },
  "peerDependencies": {
    "ai": "^4.0.0"
  }
}
1 change: 1 addition & 0 deletions packages/providers/vercel/src/index.ts
@@ -0,0 +1 @@
export { llamaindex } from "./tool";
29 changes: 29 additions & 0 deletions packages/providers/vercel/src/tool.ts
@@ -0,0 +1,29 @@
import type { BaseQueryEngine } from "@llamaindex/core/query-engine";
import { type CoreTool, tool } from "ai";
import { z } from "zod";

interface DatasourceIndex {
  asQueryEngine: () => BaseQueryEngine;
}

export function llamaindex({
  index,
  description,
}: {
  index: DatasourceIndex;
  description?: string;
}): CoreTool {
  const queryEngine = index.asQueryEngine();
  return tool({
    description: description ?? "Get information about your documents.",
    parameters: z.object({
      query: z
        .string()
        .describe("The query to get information about your documents."),
    }),
    execute: async ({ query }) => {
      const result = await queryEngine?.query({ query });
      return result?.message.content ?? "No result found in documents.";
    },
  });
}
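
The adapter is a thin bridge: it wraps the index's query engine in a Vercel AI SDK tool whose `execute` forwards the query and returns the response message content. Below is a rough, dependency-free sketch of that control flow; `makeTool`, `MockQueryEngine`-style types, and `mockIndex` are hypothetical stand-ins for illustration, not part of the package:

```typescript
// Hypothetical sketch of the adapter's control flow: an index exposes
// asQueryEngine(), and the tool's execute() forwards the query string
// and returns the answer text (with a fallback when nothing is found).
interface QueryResult {
  message: { content: string };
}

interface QueryEngine {
  query: (params: { query: string }) => Promise<QueryResult>;
}

interface DatasourceIndex {
  asQueryEngine: () => QueryEngine;
}

// Simplified stand-in for wrapping with `tool()` from the "ai" package
function makeTool(index: DatasourceIndex, description?: string) {
  const queryEngine = index.asQueryEngine();
  return {
    description: description ?? "Get information about your documents.",
    execute: async ({ query }: { query: string }) =>
      (await queryEngine.query({ query }))?.message.content ??
      "No result found in documents.",
  };
}

// Mock index standing in for VectorStoreIndex / LlamaCloudIndex
const mockIndex: DatasourceIndex = {
  asQueryEngine: () => ({
    query: async ({ query }) => ({
      message: { content: `answer for: ${query}` },
    }),
  }),
};

makeTool(mockIndex)
  .execute({ query: "hello" })
  .then((r) => console.log(r)); // logs "answer for: hello"
```

In the real implementation, the `parameters` zod schema is what tells the LLM how to call the tool; the sketch omits it to isolate the query-forwarding logic.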
19 changes: 19 additions & 0 deletions packages/providers/vercel/tsconfig.json
@@ -0,0 +1,19 @@
{
  "extends": "../../../tsconfig.json",
  "compilerOptions": {
    "target": "ESNext",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "outDir": "./lib",
    "tsBuildInfoFile": "./lib/.tsbuildinfo"
  },
  "include": ["./src"],
  "references": [
    {
      "path": "../../core/tsconfig.json"
    },
    {
      "path": "../../env/tsconfig.json"
    }
  ]
}