# PizzaBot

An AI-powered pizza ordering assistant built with FastAPI, LangChain, Google Gemini, and a Vite + React frontend.
PizzaBot supports conversational ordering, policy/FAQ lookup, and flexible vector search.
## Quick Start

### Backend

```bash
cd backend
cp .env.example .env   # set GOOGLE_API_KEY (and other keys if needed)
pip install -r requirements.txt
uvicorn app.main:app --reload
```
### Frontend

```bash
cd frontend
npm install
npm run dev
```
### Ingest FAQ / policy documents

```bash
curl -X POST http://localhost:8000/api/ingest \
  -H 'Content-Type: application/json' \
  -d '[{"id":"faq1","text":"Delivery within 5km. Cash or card."}]'
```
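Behind this endpoint, the backend can push the submitted texts into the Chroma store. Below is a minimal sketch of what such a router could look like, assuming the `langchain-chroma` and `langchain-google-genai` integration packages; the module layout, embedding model, and names used here are illustrative, not the actual `backend/app` code.

```python
# Hypothetical sketch of an /api/ingest router backed by Chroma.
# Assumes the langchain-chroma and langchain-google-genai packages.
from fastapi import APIRouter
from pydantic import BaseModel
from langchain_chroma import Chroma
from langchain_google_genai import GoogleGenerativeAIEmbeddings

router = APIRouter(prefix="/api")

class Doc(BaseModel):
    id: str
    text: str

# Embeddings + persistent Chroma store (directory matching VECTORDB_DIR in .env).
embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
store = Chroma(persist_directory="./data/chroma", embedding_function=embeddings)

@router.post("/ingest")
def ingest(docs: list[Doc]) -> dict:
    # Store raw texts with their ids so later lookups can cite the source document.
    store.add_texts(texts=[d.text for d in docs], ids=[d.id for d in docs])
    return {"ingested": len(docs)}
```

Under this sketch, the `curl` call above would return `{"ingested": 1}`.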
## Project Structure
```
pizzabot/
├── backend/            # FastAPI + LangChain services
│   ├── app/            # routers, graph, config, schemas
│   ├── .env.example    # environment template
│   └── requirements.txt
├── frontend/           # Vite + React client
│   ├── src/            # components, pages, API calls
│   └── package.json
└── README.md
```
## Architecture
Routers → Graph → LLM / Vector → Config / Schemas

- Loose coupling with dependency injection for easier testing & swapping components
- Pydantic structured output ensures consistent contracts for the UI (see the sketch below)
- Vector DB abstraction: swap Chroma for another database without touching routers or UI
- LangSmith tracing auto-enabled if `LANGSMITH_API_KEY` is set
- Gemini-compatible via `langchain-google-genai` (`ChatGoogleGenerativeAI`)
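As an illustration of the structured-output contract, here is a minimal sketch that pairs `ChatGoogleGenerativeAI` with a Pydantic schema. The `OrderDraft` model and its fields are hypothetical examples, not the schemas actually defined in `backend/app`.

```python
# Minimal sketch: Gemini + Pydantic structured output (hypothetical OrderDraft schema).
from pydantic import BaseModel
from langchain_google_genai import ChatGoogleGenerativeAI

class OrderDraft(BaseModel):
    size: str              # e.g. "small" | "medium" | "large"
    toppings: list[str]
    delivery: bool

# Reads GOOGLE_API_KEY from the environment.
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
structured_llm = llm.with_structured_output(OrderDraft)

draft = structured_llm.invoke("One large pizza with mushrooms and olives, for delivery")
print(draft)  # e.g. OrderDraft(size='large', toppings=['mushrooms', 'olives'], delivery=True)
```

Because the UI only ever receives validated Pydantic fields, the frontend contract stays stable even when prompts or models change.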
## Tech Stack
- Backend: FastAPI, LangChain, ChromaDB, Google Gemini
- Frontend: React, Vite, Bootstrap
- Infra: dotenv, Pydantic, Uvicorn
## Environment Variables
Create .env inside backend/ (see .env.example):
```env
# Google Gemini
GOOGLE_API_KEY=your_api_key_here

# LangSmith (optional)
LANGSMITH_API_KEY=your_langsmith_key_here

# Backend Config
ENV=dev
HOST=0.0.0.0
PORT=8000
VECTORDB_DIR=./data/chroma
FRONTEND_ORIGIN=http://localhost:5173
```
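As a rough illustration of how these variables might be loaded with Pydantic and dotenv, here is a minimal sketch assuming the `pydantic-settings` package; the actual `backend/app` config module may be organized differently.

```python
# Minimal sketch of a settings loader for the variables above
# (assumes pydantic-settings; the real config module may differ).
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    google_api_key: str
    langsmith_api_key: str | None = None
    env: str = "dev"
    host: str = "0.0.0.0"
    port: int = 8000
    vectordb_dir: str = "./data/chroma"
    frontend_origin: str = "http://localhost:5173"

settings = Settings()  # reads backend/.env, falling back to the defaults above
```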
## Roadmap
- Save orders to a database
- User authentication & profiles
- Payment gateway integration
- Multi-LLM backend support (OpenAI, Anthropic, etc.)
- Deployment with Docker
## Contributing
Contributions are welcome!
Fork the repo, create a feature branch, and open a Pull Request.
## License
MIT License © 2025 PizzaBot Authors