feat: add vercel chat example (#1532)
Co-authored-by: Marcus Schiesser <[email protected]>
1 parent 510191c · commit fd38a25
Showing 13 changed files with 452 additions and 1 deletion.
@@ -0,0 +1,7 @@
---
"@llamaindex/vercel": patch
"@llamaindex/doc": patch
"@llamaindex/examples": patch
---

Add vercel tool adapter to use query engine tool
@@ -1,5 +1,5 @@
 {
   "title": "Integration",
   "description": "See our integrations",
-  "pages": ["open-llm-metry", "lang-trace"]
+  "pages": ["open-llm-metry", "lang-trace", "vercel"]
 }
apps/next/src/content/docs/llamaindex/integration/vercel.mdx (80 additions, 0 deletions)

@@ -0,0 +1,80 @@
---
title: Vercel
description: Integrate LlamaIndex with Vercel's AI SDK
---

LlamaIndex provides integration with Vercel's AI SDK, allowing you to create powerful search and retrieval applications. Below are examples of how to use LlamaIndex with `streamText` from the Vercel AI SDK.

## Setup

First, install the required dependencies:

```bash
npm install @llamaindex/vercel ai
```

## Using Local Vector Store

Here's how to create a simple vector store index and query it using Vercel's AI SDK:

```typescript
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";
import { Document, VectorStoreIndex } from "llamaindex";

// Create an index from your documents
const document = new Document({ text: yourText, id_: "unique-id" });
const index = await VectorStoreIndex.fromDocuments([document]);

// Create a query tool
const queryTool = llamaindex({
  index,
  description: "Search through the documents", // optional
});

// Use the tool with Vercel's AI SDK
streamText({
  tools: { queryTool },
  prompt: "Your question here",
  model: openai("gpt-4"),
  onFinish({ response }) {
    console.log("Response:", response.messages); // log the response
  },
}).toDataStream();
```

## Using LlamaCloud

For production deployments, you can use LlamaCloud to store and manage your documents:

```typescript
import { LlamaCloudIndex } from "llamaindex";

// Create a LlamaCloud index
const index = await LlamaCloudIndex.fromDocuments({
  documents: [document],
  name: "your-index-name",
  projectName: "your-project",
  apiKey: process.env.LLAMA_CLOUD_API_KEY,
});

// Use it the same way as VectorStoreIndex
const queryTool = llamaindex({
  index,
  description: "Search through the documents",
});

// Use the tool with Vercel's AI SDK
streamText({
  tools: { queryTool },
  prompt: "Your question here",
  model: openai("gpt-4"),
}).toDataStream();
```

## Next Steps

1. Explore [LlamaCloud](https://cloud.llamaindex.ai/) for managed document storage and retrieval
2. Join our [Discord community](https://discord.gg/llamaindex) for support and discussions
@@ -0,0 +1,50 @@
# Vercel Examples

These examples demonstrate how to integrate LlamaIndexTS with Vercel's AI SDK. The examples show how to use LlamaIndex for search and retrieval in both local vector store and LlamaCloud environments.

## Setup

To run these examples, first install the required dependencies from the parent folder `examples`:

```bash
npm i
```

## Running the Examples

Make sure to run the examples from the parent folder called `examples`. The following examples are available:

### Vector Store Example

Run the local vector store example with:

```bash
npx tsx vercel/vector-store.ts
```

This example demonstrates:

- Creating a vector store index from one document
- Using Vercel's AI SDK with LlamaIndex for streaming responses

### LlamaCloud Example

To run the LlamaCloud example:

```bash
npx tsx vercel/llamacloud.ts
```

This example requires a LlamaCloud API key set in your environment and an embedding model set in the `EMBEDDING_MODEL` environment variable:

```bash
export LLAMA_CLOUD_API_KEY=your_api_key_here
export EMBEDDING_MODEL="text-embedding-3-small"
```

The example demonstrates:

- Creating a LlamaCloud index from one document
- Streaming responses using Vercel's AI SDK

For more detailed information about the Vercel integration, check out [the documentation](https://ts.llamaindex.ai/docs/llamaindex/integration/vercel).
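Both example scripts also call OpenAI models through `@ai-sdk/openai`, which reads the standard `OPENAI_API_KEY` environment variable, so export that as well before running either example:

```shell
export OPENAI_API_KEY=your_openai_key_here
```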
@@ -0,0 +1,38 @@
```typescript
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";
import { Document, LlamaCloudIndex } from "llamaindex";
import fs from "node:fs/promises";

async function main() {
  const path = "node_modules/llamaindex/examples/abramov.txt";
  const essay = await fs.readFile(path, "utf-8");
  const document = new Document({ text: essay, id_: path });

  const index = await LlamaCloudIndex.fromDocuments({
    documents: [document],
    name: "test-pipeline",
    projectName: "Default",
    apiKey: process.env.LLAMA_CLOUD_API_KEY,
  });
  console.log("Successfully created index");

  const result = streamText({
    model: openai("gpt-4o"),
    prompt: "Cost of moving cat from Russia to UK?",
    tools: {
      queryTool: llamaindex({
        index,
        description:
          "get information from your knowledge base to answer questions.", // optional description
      }),
    },
    maxSteps: 5,
  });

  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }
}

main().catch(console.error);
```
@@ -0,0 +1,34 @@
```typescript
import { openai } from "@ai-sdk/openai";
import { llamaindex } from "@llamaindex/vercel";
import { streamText } from "ai";
import { Document, VectorStoreIndex } from "llamaindex";

import fs from "node:fs/promises";

async function main() {
  const path = "node_modules/llamaindex/examples/abramov.txt";
  const essay = await fs.readFile(path, "utf-8");
  const document = new Document({ text: essay, id_: path });

  const index = await VectorStoreIndex.fromDocuments([document]);
  console.log("Successfully created index");

  const result = streamText({
    model: openai("gpt-4o"),
    prompt: "Cost of moving cat from Russia to UK?",
    tools: {
      queryTool: llamaindex({
        index,
        description:
          "get information from your knowledge base to answer questions.", // optional description
      }),
    },
    maxSteps: 5,
  });

  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }
}

main().catch(console.error);
```
@@ -0,0 +1,50 @@
{ | ||
"name": "@llamaindex/vercel", | ||
"description": "Vercel Adapter for LlamaIndex", | ||
"version": "0.0.1", | ||
"type": "module", | ||
"main": "./dist/index.cjs", | ||
"module": "./dist/index.js", | ||
"exports": { | ||
".": { | ||
"edge-light": { | ||
"types": "./dist/index.edge-light.d.ts", | ||
"default": "./dist/index.edge-light.js" | ||
}, | ||
"workerd": { | ||
"types": "./dist/index.edge-light.d.ts", | ||
"default": "./dist/index.edge-light.js" | ||
}, | ||
"require": { | ||
"types": "./dist/index.d.cts", | ||
"default": "./dist/index.cjs" | ||
}, | ||
"import": { | ||
"types": "./dist/index.d.ts", | ||
"default": "./dist/index.js" | ||
} | ||
} | ||
}, | ||
"files": [ | ||
"dist" | ||
], | ||
"repository": { | ||
"type": "git", | ||
"url": "https://github.com/run-llama/LlamaIndexTS.git", | ||
"directory": "packages/providers/vercel" | ||
}, | ||
"scripts": { | ||
"build": "bunchee", | ||
"dev": "bunchee --watch" | ||
}, | ||
"devDependencies": { | ||
"bunchee": "5.6.1" | ||
}, | ||
"dependencies": { | ||
"@llamaindex/core": "workspace:*", | ||
"zod": "^3.23.8" | ||
}, | ||
"peerDependencies": { | ||
"ai": "^4.0.0" | ||
} | ||
} |
@@ -0,0 +1 @@
```typescript
export { llamaindex } from "./tool";
```
@@ -0,0 +1,29 @@
```typescript
import type { BaseQueryEngine } from "@llamaindex/core/query-engine";
import { type CoreTool, tool } from "ai";
import { z } from "zod";

interface DatasourceIndex {
  asQueryEngine: () => BaseQueryEngine;
}

export function llamaindex({
  index,
  description,
}: {
  index: DatasourceIndex;
  description?: string;
}): CoreTool {
  const queryEngine = index.asQueryEngine();
  return tool({
    description: description ?? "Get information about your documents.",
    parameters: z.object({
      query: z
        .string()
        .describe("The query to get information about your documents."),
    }),
    execute: async ({ query }) => {
      const result = await queryEngine?.query({ query });
      return result?.message.content ?? "No result found in documents.";
    },
  });
}
```
@@ -0,0 +1,19 @@
{ | ||
"extends": "../../../tsconfig.json", | ||
"compilerOptions": { | ||
"target": "ESNext", | ||
"module": "ESNext", | ||
"moduleResolution": "bundler", | ||
"outDir": "./lib", | ||
"tsBuildInfoFile": "./lib/.tsbuildinfo" | ||
}, | ||
"include": ["./src"], | ||
"references": [ | ||
{ | ||
"path": "../../core/tsconfig.json" | ||
}, | ||
{ | ||
"path": "../../env/tsconfig.json" | ||
} | ||
] | ||
} |