This project aims to create a simple chat interface backed by a local large language model (LLM) served by Ollama. The purpose of this project is to provide hands-on experience in building an LLM application from scratch.
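As a rough sketch of the core interaction, the Python server can talk to Ollama over its local HTTP API (by default at `http://localhost:11434`). This is a minimal example, not the project's actual implementation; the model name `llama3` is an assumption and should be whatever model you have pulled.

```python
import json
from urllib import request

# Ollama's default local endpoint for single-turn generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt_payload(model: str, prompt: str) -> bytes:
    """Encode a generation request for Ollama's /api/generate endpoint."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")

def ask_ollama(model: str, prompt: str) -> str:
    """Send the prompt to a locally running Ollama server and return the reply text."""
    req = request.Request(
        OLLAMA_URL,
        data=build_prompt_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled (e.g. `ollama pull llama3`).
    print(ask_ollama("llama3", "Say hello in one sentence."))
```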
TBA
- `make serve` — runs the Python gRPC server
- `make grpc-proxy` — runs the proxy between the gRPC server and the web app
- `make ui-dev` — runs the Angular UI (a very simple app for now)
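For a chat UI, the server will typically want Ollama's streaming mode (`"stream": true`), where the reply arrives as newline-delimited JSON chunks, each carrying a `response` fragment and a `done` flag. The sketch below shows how the Python server might consume that stream before forwarding it on; it is illustrative only, and the function names are not from this project.

```python
import json
from typing import Iterable, Iterator

def stream_fragments(lines: Iterable[bytes]) -> Iterator[str]:
    """Yield text fragments from Ollama-style NDJSON streaming output until done=true."""
    for raw in lines:
        if not raw.strip():
            continue  # skip keep-alive blank lines
        chunk = json.loads(raw)
        yield chunk.get("response", "")
        if chunk.get("done"):
            break

def collect_reply(lines: Iterable[bytes]) -> str:
    """Concatenate all streamed fragments into the full model reply."""
    return "".join(stream_fragments(lines))

# Simulated stream, as two NDJSON lines from Ollama:
sample = [
    b'{"response": "Hel", "done": false}',
    b'{"response": "lo", "done": true}',
]
print(collect_reply(sample))  # → Hello
```

In the real server, `stream_fragments` would be fed from the HTTP response body line by line, and each fragment could be pushed to the web app through the gRPC proxy as it arrives.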
To improve this project, consider adding more features such as:
- Support for multiple users
- Persistent user sessions
- Integration with other NLP models (e.g., sentiment analysis, embeddings, function execution)