From c88332366b794a4afacefe5d8fb02425e84f044e Mon Sep 17 00:00:00 2001
From: Logan Markewich
Date: Tue, 14 Nov 2023 10:30:55 -0600
Subject: [PATCH 1/3] readme update

---
 packages/create-llama/README.md | 35 +++++++++++++++++++++++++++------
 1 file changed, 29 insertions(+), 6 deletions(-)

diff --git a/packages/create-llama/README.md b/packages/create-llama/README.md
index 5cb6102661..41160446c8 100644
--- a/packages/create-llama/README.md
+++ b/packages/create-llama/README.md
@@ -1,7 +1,16 @@
 # Create LlamaIndex App
 
-The easiest way to get started with LlamaIndex is by using `create-llama`. This CLI tool enables you to quickly start building a new LlamaIndex application, with everything set up for you.
-To get started, use the following command:
+The easiest way to get started with [LlamaIndex](https://www.llamaindex.ai/) is by using `create-llama`. This CLI tool enables you to quickly start building a new LlamaIndex application, with everything set up for you.
+
+## Features
+
+- NextJS, ExpressJS, or FastAPI (python) stateless backend generation 💻
+- Streaming or non-streaming backend ⚡
+- Optional `shadcn` or `html` frontend generation 🎨
+
+## Get Started
+
+You can run `create-llama` in interactive or non-interatactive mode.
 
 ### Interactive
 
@@ -17,15 +26,25 @@ yarn create llama
 pnpm create llama
 ```
 
-You will be asked for the name of your project, and then which framework you want to use
-create a TypeScript project:
+You will be asked for the name of your project, along with other configuration options.
+
+Here is an example:
 
 ```bash
+>> npm create llama
+Need to install the following packages:
+  create-llama@0.0.3
+Ok to proceed? (y) y
+✔ What is your project named? … my-app
+✔ Which template would you like to use? › Chat with streaming
 ✔ Which framework would you like to use? › NextJS
+✔ Which UI would you like to use? › Just HTML
+✔ Which chat engine would you like to use? › ContextChatEngine
+✔ Please provide your OpenAI API key (leave blank to skip): …
+✔ Would you like to use ESLint? … No / Yes
+Creating a new LlamaIndex app in /home/my-app.
 ```
 
-You can choose between NextJS and Express.
-
 ### Non-interactive
 
 You can also pass command line arguments to set up a new project
@@ -52,3 +71,7 @@ Options:
 
 ```
 
+## LlamaIndex Documentation
+
+- [TS/JS docs](https://ts.llamaindex.ai/)
+- [Python docs](https://docs.llamaindex.ai/en/stable/)

From d6700113635015caef548f7bd9f1b44423b2d83b Mon Sep 17 00:00:00 2001
From: Logan Markewich
Date: Tue, 14 Nov 2023 10:57:16 -0600
Subject: [PATCH 2/3] typo

---
 packages/create-llama/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/packages/create-llama/README.md b/packages/create-llama/README.md
index 41160446c8..b18a8ad139 100644
--- a/packages/create-llama/README.md
+++ b/packages/create-llama/README.md
@@ -10,7 +10,7 @@ The easiest way to get started with [LlamaIndex](https://www.llamaindex.ai/) is
 
 ## Get Started
 
-You can run `create-llama` in interactive or non-interatactive mode.
+You can run `create-llama` in interactive or non-interactive mode.
 
 ### Interactive
 

From c1ce84ececf767cb788521cf26617be2cde09269 Mon Sep 17 00:00:00 2001
From: yisding
Date: Tue, 14 Nov 2023 09:06:56 -0800
Subject: [PATCH 3/3] Update README.md

---
 packages/create-llama/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/packages/create-llama/README.md b/packages/create-llama/README.md
index b18a8ad139..e326767377 100644
--- a/packages/create-llama/README.md
+++ b/packages/create-llama/README.md
@@ -6,7 +6,7 @@ The easiest way to get started with [LlamaIndex](https://www.llamaindex.ai/) is
 
 - NextJS, ExpressJS, or FastAPI (python) stateless backend generation 💻
 - Streaming or non-streaming backend ⚡
-- Optional `shadcn` or `html` frontend generation 🎨
+- Optional `shadcn` frontend generation 🎨
 
 ## Get Started
 
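For reference, this series is in standard git mbox/format-patch form and can be applied locally with ordinary git tooling. The sketch below assumes each `From ...` block above has been saved to its own `.patch` file (the filenames are hypothetical) inside a checkout of the repository containing `packages/create-llama/`:

```bash
# Hypothetical filenames: save each patch above ("From c88333...", "From d67001...",
# "From c1ce84...") as a separate mbox-style .patch file before running this.
git am 0001-readme-update.patch 0002-typo.patch 0003-Update-README.md.patch

# If a patch does not apply cleanly, back out of the in-progress apply:
#   git am --abort
```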