An AI-powered chat interface for querying PDF documents. Built with LangchainJS, OpenAI, Pinecone, and Next.js 13.
To run this app, you need the following:

- Next.js
- LangchainJS
- Pinecone vector database

Follow these steps to set up and run the app locally:
- Clone the repository:

  ```bash
  git clone https://github.com/Urias-T/StudyBuddy
  ```

- Navigate to the project directory:

  ```bash
  cd StudyBuddy
  ```

- Install the dependencies:

  ```bash
  npm install
  ```

- Create a `.env.local` file and populate it with your `OPENAI_API_KEY`, `PINECONE_API_KEY`, and `PINECONE_ENVIRONMENT` variables.

- Create a `documents` directory and add the PDF files you want to query.

- Run the app:

  ```bash
  npm run dev
  ```
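The `.env.local` file described in the steps above might look like this (the values shown are placeholders; substitute your own keys from the OpenAI and Pinecone dashboards):

```env
OPENAI_API_KEY=your-openai-api-key
PINECONE_API_KEY=your-pinecone-api-key
PINECONE_ENVIRONMENT=your-pinecone-environment
```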
That's it! The web app should now be running on `localhost:3000`. 🤗
The first time you run the app, you need to run the setup flow:

- Put your PDF files in the `documents` directory.
- Click the "Create index and embeddings" link to set up your Pinecone index with your documents.
After the initial setup, simply type your questions into the text box, and the LLM will respond using your document embeddings as context.
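To illustrate what "using your document embeddings as context" means, here is a toy, dependency-free sketch of the retrieval step. In the real app, LangchainJS produces OpenAI embeddings and Pinecone stores the index; the `fakeEmbed` function below is a hypothetical stand-in so the example runs on its own, without any API keys.

```typescript
// Hypothetical embedding: counts of a few keywords (illustration only).
// A real embedding model maps text to a dense high-dimensional vector.
function fakeEmbed(text: string): number[] {
  const vocab = ["pinecone", "langchain", "pdf", "query"];
  const lower = text.toLowerCase();
  return vocab.map((w) => lower.split(w).length - 1);
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const na = Math.sqrt(a.reduce((s, x) => s + x * x, 0));
  const nb = Math.sqrt(b.reduce((s, x) => s + x * x, 0));
  return na && nb ? dot / (na * nb) : 0;
}

// "Index": embed each document chunk once, as the setup flow does.
const chunks = [
  "Pinecone stores the vector index.",
  "LangchainJS loads and splits the PDF files.",
];
const index = chunks.map((text) => ({ text, vector: fakeEmbed(text) }));

// "Query": embed the question and return the most similar chunk,
// which would then be passed to the LLM as context.
function retrieve(question: string): string {
  const q = fakeEmbed(question);
  return index.reduce((best, item) =>
    cosine(item.vector, q) > cosine(best.vector, q) ? item : best
  ).text;
}

console.log(retrieve("Which pdf loader is used?"));
```

The same embed-once, compare-at-query-time shape is what the "Create index and embeddings" setup step prepares; Pinecone simply does the similarity search at scale instead of the in-memory loop shown here.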
If you want to contribute to this project, please open an issue or submit a pull request.
This project is made available under the MIT License.