A customer support chatbot for a shoe store, running locally with FastAPI + Ollama. The project stores conversation history in a database and exposes a REST API that can be integrated with other applications.
- Python 3.11 (recommended) and a virtual environment
- macOS/Linux:

  ```bash
  python -m venv .venv && source .venv/bin/activate
  ```

- Windows (PowerShell):

  ```powershell
  python -m venv .venv; .\.venv\Scripts\Activate.ps1
  ```
- Clone the repository

  ```bash
  git clone <your-repo-url>
  cd Customer-Support-Chatbot-Project
  ```
- Install dependencies

  ```bash
  pip install -r requirements.txt
  ```
- Install Ollama and a local LLM model
  - Install Ollama: see the official documentation (https://ollama.com)
  - Run the server:

    ```bash
    ollama serve
    ```

  - Pull the model:

    ```bash
    ollama pull llama3.2:3b
    ```
- Set up the database
  - Default: SQLite (automatically created and seeded on first run)
  - Optional: MySQL (set the `DATABASE_URL` environment variable, e.g. `mysql+pymysql://root:root@127.0.0.1:3306/shoe_support`)
- Run the application (REST API on localhost)

  ```bash
  uvicorn app.main:app --reload
  # API     : http://localhost:8000
  # Docs    : http://localhost:8000/docs (Swagger UI for API testing)
  # OpenAPI : http://localhost:8000/openapi.json (JSON spec for BE/FE)
  # Web UI  : http://localhost:8000/web
  ```
- Optional (bonus): Docker

  ```bash
  docker compose up --build
  # Services: api + db (MySQL 8) + ollama
  ```
Purpose: store chat history so conversation context can be reused by the LLM.
- Challenge-required table: `chat_history`
  - Columns: `id` (primary key), `user_message`, `bot_response`, `timestamp`
Example SQL schema:

```sql
CREATE TABLE IF NOT EXISTS chat_history (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    user_message TEXT NOT NULL,
    bot_response TEXT NOT NULL,
    timestamp DATETIME DEFAULT CURRENT_TIMESTAMP
);
```

Note: this repository's implementation already stores conversations in the `conversations` table (with user/assistant roles) and also has catalog tables (`products`, `product_sizes`) and an orders table (`orders`).
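The schema above can be exercised directly with Python's built-in `sqlite3` module. This is a standalone sketch against an in-memory database, not the repository's SQLAlchemy code:

```python
import sqlite3

# Create the challenge-required chat_history table in memory and
# round-trip one message/response pair through it.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS chat_history (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        user_message TEXT NOT NULL,
        bot_response TEXT NOT NULL,
        timestamp DATETIME DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute(
    "INSERT INTO chat_history (user_message, bot_response) VALUES (?, ?)",
    ("Where is my order?", "Your last order is shipped."),
)
rows = conn.execute(
    "SELECT id, user_message, bot_response FROM chat_history"
).fetchall()
# rows now holds the stored turn, with id auto-assigned by SQLite
```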
- FastAPI: REST API
- Uvicorn: ASGI server
- SQLAlchemy: ORM/database access
- PyMySQL: MySQL driver (optional)
- Pydantic: request/response schema
- Requests: HTTP client (calling Ollama API)
- python-dotenv: load environment variables
- cryptography: modern MySQL auth support
- Ollama: local LLM runtime
(As per requirements.txt.)
- Llama 3.2 (3B) via Ollama (running locally)
- Reason: lightweight, open-source, and meets challenge requirements for local LLM
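The model is served through Ollama's HTTP API, called with the Requests library. Below is a hedged sketch of building a request body for Ollama's `POST /api/chat` endpoint; the endpoint and field names follow Ollama's public API, while the helper name and defaults are illustrative, not taken from this repo:

```python
# Illustrative helper: build the JSON body Ollama's /api/chat expects.
def build_ollama_payload(messages, model="llama3.2:3b"):
    """messages: list of {"role": ..., "content": ...} dicts."""
    return {
        "model": model,
        "messages": messages,
        "stream": False,  # ask for one complete response instead of chunks
    }

payload = build_ollama_payload(
    [{"role": "user", "content": "Recommend a running shoe."}]
)
# Send with: requests.post(f"{OLLAMA_HOST}/api/chat", json=payload)
```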
- Order status
- Examples: "Where is my order?", "status of sela's order", "check order #12"
- Product information
- Examples: "What are the advantages of Air Max 90?", "details of Ultraboost 22"
- Size availability & stock per size
- Examples: "How much stock for Ultraboost 22 size 42?", "what sizes are available?"
- Warranty policy
- Examples: "How do I claim warranty?"
- Note: can be expanded for other questions as needed.
Demo users (seed) for order checking: adit, sela, gilang. When testing order status/delivery, set the `user` field in the payload to match one of these names.
- Order Status Lookup
  - The chatbot calls an external function to check order status based on intent (regex) and/or an `order_id` extracted from the message. If `order_id` is not available, the system uses the last order belonging to the `user` in the payload.
  - Standard status output: `processing`, `shipped`, `delivered`, along with the product name.
  - Example payload:

    ```json
    { "user": "sela", "message": "order status" }
    ```
- Catalog Lookup (expandable)
- Product details, available sizes, stock per size.
- Warranty Info (expandable)
- Returns fixed warranty policy text.
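The Order Status Lookup logic described above can be sketched in a few lines. The data, helper name, and fallback rule here are hypothetical stand-ins for the repository's actual `app/utils.py`/`app/db.py` code:

```python
import re

# Hypothetical in-memory orders, keyed by order id.
ORDERS = {
    12: {"user": "sela", "product": "Air Max 90", "status": "shipped"},
    13: {"user": "gilang", "product": "Ultraboost 22", "status": "processing"},
}

def lookup_order(user: str, message: str):
    """Extract an order id via regex; fall back to the user's last order."""
    match = re.search(r"#?(\d+)", message)
    if match and int(match.group(1)) in ORDERS:
        order = ORDERS[int(match.group(1))]
    else:
        owned = [o for o in ORDERS.values() if o["user"] == user]
        if not owned:
            return None  # no order id in message and no orders for this user
        order = owned[-1]
    return f'{order["product"]}: {order["status"]}'

print(lookup_order("sela", "check order #12"))  # Air Max 90: shipped
print(lookup_order("gilang", "order status"))   # Ultraboost 22: processing
```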
Additional: the Swagger docs at http://localhost:8000/docs can be used for interactive endpoint testing, and the OpenAPI JSON specification is available at http://localhost:8000/openapi.json for BE/FE integration (generate a client or import into Postman/Insomnia).
- Endpoints: `GET /health`, `GET /products`, `GET /orders/{user}`, `POST /chat`
- Docs: http://localhost:8000/docs (interactive testing); OpenAPI: /openapi.json
- UI: http://localhost:8000/web
- RAG focus: catalog data from the DB, with the LLM as a complement.
- Simple web UI at /web; configuration via environment variables.
- Ready-to-use intents: order status, product info, size/stock, warranty.
app/ (main.py, db.py, models.py, utils.py, schemas.py, config.py, llm.py)
frontend/ (index.html)
data/ (database.mysql.sql)
Dockerfile, docker-compose.yml, requirements.txt, LICENSE

- Requirements: Python 3.11+, Ollama (model `llama3.2:3b`), SQLite (default) / MySQL (optional)
- Main environment variables: `DATABASE_URL`, `OLLAMA_HOST`, `OLLAMA_MODEL`, `MAX_HISTORY_MESSAGES`
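A minimal sketch of reading these environment variables with fallback defaults. The actual defaults live in `app/config.py` and may differ; the values below are assumptions for illustration:

```python
import os

# Read configuration from the environment, with assumed defaults.
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///data/shoe_support.db")
OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://localhost:11434")
OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "llama3.2:3b")
MAX_HISTORY_MESSAGES = int(os.getenv("MAX_HISTORY_MESSAGES", "10"))
```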
- Local: `uvicorn app.main:app --reload`
- Docker: `docker compose up --build`
- See detailed steps in section 1 (Installation)
- `GET /health` → simple health check
- `GET /products` → brief product list
- `GET /orders/{user}` → last order for the user
- `POST /chat` → chatbot conversation (RAG-first)
Request:

```json
{
  "user": "gilang",
  "message": "How much stock for size 42 Ultraboost 22?"
}
```

Response:

```json
{ "answer": "Ultraboost 22 size 42 stock: 4 pairs." }
```

Example cURL:

```bash
curl -s http://localhost:8000/chat \
  -H 'Content-Type: application/json' \
  -d '{"user":"gilang","message":"What types of shoes are available?"}'
```

- Open http://localhost:8000/docs to try the endpoints interactively (Swagger UI). You can fill in the request body, press "Execute", and see the response directly.
- The OpenAPI specification is available at http://localhost:8000/openapi.json. This is useful for:
  - BE/FE integration (generate a client with tools like openapi-generator/swagger-codegen)
  - Importing into Postman/Insomnia for an automatic request collection
Example of fetching the OpenAPI JSON:

```bash
curl -s http://localhost:8000/openapi.json | jq '.info, .paths["/chat"]'
```

- Access http://localhost:8000/web
- The input stays at the bottom of the screen, suitable for mobile
- Quick prompt examples are available below the input
- Main tables: `products`, `product_sizes` (stock per size), `orders`, `conversations`.
- The seed automatically creates 16+ products across various categories/brands, stock per size, and sample orders for users like `gilang` and `sela`.
- Intent detection and extraction (regex/heuristic) in `app/utils.py`.
- Catalog/stock queries in `app/db.py` (SQLAlchemy). Tool answers (from the database) are formatted and prioritized.
- If not answered by a tool, a combined prompt (history + tool context) is sent to Ollama via `app/llm.py`.
- Conversation history is stored in the `conversations` table; the number of messages sent to the LLM is limited by `MAX_HISTORY_MESSAGES`.
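The combined-prompt step can be sketched as follows. This is a simplified stand-in for `app/llm.py`: the function name, system-message wording, and history cap are assumptions for illustration:

```python
# Assumed cap; in the app this comes from the MAX_HISTORY_MESSAGES env var.
MAX_HISTORY_MESSAGES = 4

def build_messages(history, tool_context, user_message):
    """Assemble the message list sent to the LLM: tool context as a
    system message, the most recent history turns, then the new message."""
    messages = [{
        "role": "system",
        "content": f"Use this store data when relevant:\n{tool_context}",
    }]
    messages += history[-MAX_HISTORY_MESSAGES:]  # cap the context window
    messages.append({"role": "user", "content": user_message})
    return messages

history = [{"role": "user", "content": f"msg {i}"} for i in range(10)]
msgs = build_messages(history, "Ultraboost 22 size 42: 4 pairs", "And size 43?")
# 1 system message + 4 history turns + 1 new user message = 6 entries
```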
- Order Status Lookup: checks status based on `order_id` or the user's last order.
- Catalog Lookup: product details, available sizes, stock per size.
- Warranty Info: fixed warranty policy text.
Example order status test payload:

```json
{ "user": "sela", "message": "order status" }
```

Demo users: adit, sela, gilang.
- Questions requiring precise data (order status, stock per size, size availability, price list, category/brand/size filters) → a tool is called.
- General questions (how to choose a size, care tips, casual model suggestions) → answered directly by the LLM (concise, empathetic), without fabricating numbers.
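This tool-vs-LLM routing rule can be sketched with a simple keyword check. The patterns below are illustrative assumptions, not the repository's actual regexes in `app/utils.py`:

```python
import re

# Illustrative data-lookup triggers; real intent patterns are richer.
TOOL_PATTERNS = [
    r"\border\b", r"\bstock\b", r"\bsize\b", r"\bprice\b", r"\bwarranty\b",
]

def route(message: str) -> str:
    """Return 'tool' for precise-data questions, 'llm' for general ones."""
    if any(re.search(p, message, re.IGNORECASE) for p in TOOL_PATTERNS):
        return "tool"
    return "llm"

print(route("How much stock for Ultraboost 22 size 42?"))  # tool
print(route("Any tips for caring for leather shoes?"))     # llm
```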
Available sample users: adit, sela, gilang. To check order status/delivery, set the `user` field in the payload to match one of these names.
Examples:

```bash
curl -s http://localhost:8000/chat \
  -H 'Content-Type: application/json' \
  -d '{"user":"adit","message":"check my order"}'

curl -s http://localhost:8000/chat \
  -H 'Content-Type: application/json' \
  -d '{"user":"sela","message":"order status"}'

# Or directly by order id (user can be anything):
curl -s http://localhost:8000/chat \
  -H 'Content-Type: application/json' \
  -d '{"user":"gilang","message":"check order #12"}'
```

Helper endpoint: `GET /orders/{user}` → get the last order for `adit` | `sela` | `gilang`.
POST /chat

```json
{
  "user": "gilang",
  "message": "How much stock for Ultraboost 22 size 42?"
}
```

Response:

```json
{
  "answer": "Ultraboost 22 size 42 stock: 4 pairs. Need help checking colors or similar alternatives?"
}
```

- API version: see `app/main.py` → `FastAPI(..., version="1.5.0")`.
- Static UI mounted at `/web` with `StaticFiles`.
- Ensure Ollama is running and the model is available (`OLLAMA_HOST` is correct). In Docker, the `ollama` service automatically exposes port `11434`.
- For local MySQL, ensure the DSN `mysql+pymysql://...` is valid and the user has rights to create tables.
- If seed data doesn't appear, delete the SQLite file `data/shoe_support.db` and restart, or ensure the DB is empty.
See LICENSE file.
Planned: pytest (TBD). A tests/ folder is not yet included.