This is a LlamaIndex project bootstrapped with create-llama.
We use our recursive retriever, combined with an OpenAI Agent, to create a bot capable of tabular/semi-structured/unstructured analysis within complex documents. It also streams the intermediate results from the agent via a custom callback handler.
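As a rough sketch of how these pieces could fit together (not the project's exact code: the handler, the `build_agent` helper, and the node mappings are illustrative, and the imports assume the pre-0.10 `llama_index` package layout):

```python
from typing import Any, Dict, List, Optional

from llama_index import VectorStoreIndex
from llama_index.agent import OpenAIAgent
from llama_index.callbacks.base import BaseCallbackHandler, CallbackManager
from llama_index.callbacks.schema import CBEventType
from llama_index.query_engine import RetrieverQueryEngine
from llama_index.retrievers import RecursiveRetriever
from llama_index.tools import QueryEngineTool, ToolMetadata


class StreamHandler(BaseCallbackHandler):
    """Illustrative handler that records intermediate agent events for streaming."""

    def __init__(self) -> None:
        super().__init__(event_starts_to_ignore=[], event_ends_to_ignore=[])
        self.events: List[str] = []

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        parent_id: str = "",
        **kwargs: Any,
    ) -> str:
        # Record tool calls so the chat endpoint can surface them to the client.
        if event_type == CBEventType.FUNCTION_CALL:
            self.events.append(f"agent is calling a tool: {payload}")
        return event_id

    def on_event_end(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        **kwargs: Any,
    ) -> None:
        pass

    def start_trace(self, trace_id: Optional[str] = None) -> None:
        pass

    def end_trace(
        self,
        trace_id: Optional[str] = None,
        trace_map: Optional[Dict[str, List[str]]] = None,
    ) -> None:
        pass


def build_agent(index: VectorStoreIndex, node_mappings: dict) -> OpenAIAgent:
    # Recursive retriever: top-level retrieval over summary/index nodes that can
    # follow references down to e.g. the underlying table nodes.
    retriever = RecursiveRetriever(
        "vector",
        retriever_dict={"vector": index.as_retriever(similarity_top_k=3)},
        node_dict=node_mappings,
    )
    query_engine = RetrieverQueryEngine.from_args(retriever)
    tool = QueryEngineTool(
        query_engine=query_engine,
        metadata=ToolMetadata(
            name="document_qa",
            description="Answers questions over the indexed documents.",
        ),
    )
    # The callback manager gives the handler visibility into the agent's steps.
    return OpenAIAgent.from_tools(
        [tool],
        callback_manager=CallbackManager([StreamHandler()]),
        verbose=True,
    )
```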
This extends beyond the simple create-llama example. To see changes, look at the following files:
- `backend/app/utils/index.py` - contains the core logic for constructing and getting the agent
- `backend/app/api/routers/chat.py` - contains the implementation of the chat endpoint (a stripped-down sketch follows below)
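For orientation, here is a minimal sketch of what such a streaming chat endpoint might look like (the `get_agent` dependency and the request schema are assumptions, not the project's exact code):

```python
from fastapi import APIRouter, Depends
from fastapi.responses import StreamingResponse
from llama_index.agent import OpenAIAgent
from pydantic import BaseModel

# Hypothetical helper exposed by backend/app/utils/index.py.
from app.utils.index import get_agent

chat_router = APIRouter()


class ChatRequest(BaseModel):
    message: str


@chat_router.post("/chat")
async def chat(request: ChatRequest, agent: OpenAIAgent = Depends(get_agent)):
    # Stream the agent's answer token by token; intermediate tool-call events
    # captured by the custom callback handler would be interleaved here.
    response = agent.stream_chat(request.message)

    def event_stream():
        for token in response.response_gen:
            yield token

    return StreamingResponse(event_stream(), media_type="text/plain")
```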
First, start up the backend as described in the backend README.
Second, run the frontend development server as described in the frontend README.
Open http://localhost:3000 with your browser to see the result.
To learn more about LlamaIndex, take a look at the following resources:
- LlamaIndex Documentation - learn about LlamaIndex (Python features).
- LlamaIndexTS Documentation - learn about LlamaIndex (TypeScript features).
You can check out the LlamaIndexTS GitHub repository - your feedback and contributions are welcome!