A smart AI-powered web app for analyzing and managing project/task data, featuring advanced function calling and data analysis. Enjoy an AI chat that remembers previous conversations and delivers actionable insights.
- 🤖 Intelligent AI Chat with session-based memory
- 🔍 Function Calling for precise data search and filtering
- 📊 Advanced Data Analysis with statistics and insights
- 💬 Automatic Conversation Routing based on question type
- 📱 Responsive Frontend built with Vue.js
- 📝 Comprehensive Logging for monitoring and debugging
- 🎯 Optimized Prompt Engineering for best results
- Search tasks by user/employee name
- Filter by status (todo, in-progress, done)
- Search by project/task name
- Filter by specific month
- Team and individual performance statistics
- Productivity comparison between users
- Monthly work trend analysis
- Identify bottlenecks and improvement areas
- Executive reports with actionable insights
- Complete team member list
- Detailed user and task assignment info
- Natural conversation in English (and Indonesian)
- Explains system features and capabilities
- General help and usage guidance
ai-task-analyst/
│
├── app.py                  # FastAPI backend with function calling
├── requirements.txt        # Python dependencies
├── .env                    # LLM API key
├── database_task.sql       # SQL schema + sample data
├── static/
│   └── index.html          # Responsive chat frontend
├── tasks.db                # (auto-generated) SQLite DB
├── app.log                 # Log file for monitoring
├── Dockerfile              # Docker image configuration
├── docker-compose.yml      # Production Docker Compose
├── docker-compose.dev.yml  # Development Docker Compose
├── nginx.conf              # Nginx reverse proxy config
├── .dockerignore           # Docker build ignore file
└── README-Docker.md        # Docker documentation
The system uses a smart AI router to select the right function:
- search_activities - Search and filter data
- get_user_list - List users/employees
- analyze_full_data - Complex analysis and statistics
- general_conversation - General chat
- `search_activities(user_name, status, task_name, month)`
- `get_user_list()`
- Context stuffing for complex analysis
- General conversation for common questions
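The routing described above can be sketched as a simple dispatcher. This is an illustrative sketch only: the real app.py routes via an LLM prompt, and the keyword heuristics and `route_message` helper below are assumptions, not the app's actual code.

```python
# Hypothetical keyword-based router sketch. app.py uses an LLM router
# prompt instead; these heuristics only illustrate the dispatch idea.
def route_message(message: str) -> str:
    """Map a user message to one of the four handler names."""
    text = message.lower()
    # Analysis questions: comparisons, statistics, reports
    if any(kw in text for kw in ("most productive", "compare", "statistics", "report")):
        return "analyze_full_data"
    # User listing questions
    if any(kw in text for kw in ("team members", "employees", "user list")):
        return "get_user_list"
    # Data search: tasks, statuses, months, projects
    if any(kw in text for kw in ("task", "status", "activities", "project")):
        return "search_activities"
    # Everything else falls through to general chat
    return "general_conversation"
```

The ordering matters: analysis keywords are checked before search keywords so that "compare team performance" is not mistaken for a plain data search.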
- Docker Desktop or Docker Engine
- Docker Compose
# 1. Clone the repository
git clone <repo-url>
cd ai-task-analyst
# 2. Create .env file
cp env.example .env
# Edit .env with your API keys
# 3. Run with Docker (Development)
docker-compose -f docker-compose.dev.yml up --build
# 4. Or run with Docker (Production)
docker-compose up --build
# 5. Access the application
# Open: http://localhost:8000/static/index.html

# Development mode (with hot reload)
docker-compose -f docker-compose.dev.yml up --build
# Production mode
docker-compose up --build
# With nginx reverse proxy
docker-compose --profile production up --build
# Stop containers
docker-compose down
# View logs
docker-compose logs -f

- ✅ Hot Reload for development
- ✅ Nginx Reverse Proxy for production
- ✅ Health Checks for monitoring
- ✅ Volume Persistence for database and logs
- ✅ Environment Variables support
- ✅ Network Isolation for security
📖 Docker Documentation: See README-Docker.md for the complete Docker guide.
git clone <repo-url>
cd ai-task-analyst

# Using conda
conda create -n ai-task python=3.10
conda activate ai-task
# Or with venv
python -m venv venv
venv\Scripts\activate # Windows
source venv/bin/activate   # Linux/Mac

pip install -r requirements.txt

Create a .env file in the root directory:
LLM_API_KEY=your_google_ai_studio_api_key
LLM_API_URL=https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=your_google_ai_studio_api_key
OLLAMA_HOST=http://127.0.0.1:11434
OLLAMA_MODEL=llama2

Replace with your actual API key and endpoint as needed.
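A minimal sketch of how app.py might read these settings. The variable names match the .env example above, but the `load_llm_config` helper is hypothetical; the real app may use python-dotenv rather than plain `os.getenv`.

```python
import os

# Hypothetical config loader: reads the same variables shown in the
# .env example above, with the documented Ollama defaults as fallbacks.
def load_llm_config() -> dict:
    return {
        "api_key": os.getenv("LLM_API_KEY", ""),
        "api_url": os.getenv("LLM_API_URL", ""),
        "ollama_host": os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434"),
        "ollama_model": os.getenv("OLLAMA_MODEL", "llama2"),
    }
```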
Already included in the repo, but make sure the top of the file looks like:
CREATE TABLE IF NOT EXISTS user (...);
CREATE TABLE IF NOT EXISTS task (...);
CREATE TABLE IF NOT EXISTS activity (...);
-- then INSERT ...

# Development mode (with hot reload)
docker-compose -f docker-compose.dev.yml up --build
# Production mode
docker-compose up --build
# With nginx reverse proxy
docker-compose --profile production up --build

# Start the backend
uvicorn app:app --reload

- The server will auto-create `tasks.db` from `database_task.sql` if it doesn't exist.
- Logs are written to `app.log` for monitoring.
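The auto-create step can be sketched as follows. The file names match the project layout, but `ensure_database` is a hypothetical helper, not necessarily the code in app.py.

```python
import os
import sqlite3

# Sketch of the auto-create behavior: if tasks.db is missing, build it
# by running every statement in database_task.sql (CREATE TABLE + INSERT).
def ensure_database(db_path: str = "tasks.db",
                    schema_path: str = "database_task.sql") -> None:
    if os.path.exists(db_path):
        return  # already created on a previous run
    with open(schema_path, encoding="utf-8") as f:
        script = f.read()
    conn = sqlite3.connect(db_path)
    try:
        conn.executescript(script)  # executes the whole SQL script
        conn.commit()
    finally:
        conn.close()
```

This is also why the troubleshooting advice below is simply "delete `tasks.db` and restart": the next startup rebuilds it from the SQL file.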
Open in your browser:
- Docker: http://localhost:8000/static/index.html
- Manual: http://127.0.0.1:8000/static/index.html
- With Nginx: http://localhost/static/index.html
- "Find tasks assigned to Budi"
- "Show tasks with status done"
- "Activities in January"
- "Website project tasks"
- "Who is the most productive?"
- "How many tasks were completed this month?"
- "Compare team performance"
- "Generate monthly report"
- "Work statistics"
- "Who are the team members?"
- "Show all employees"
- "User list"
- "Hello, how are you?"
- "What can you do?"
- "Thank you"
# Check what's using port 8000
netstat -tulpn | grep :8000
# Kill process using port 8000
sudo kill -9 $(lsof -t -i:8000)
# Or change port in docker-compose.yml
ports:
- "8001:8000" # Use port 8001 instead# Check container logs
docker-compose logs -f
# Rebuild without cache
docker-compose build --no-cache
# Remove and recreate containers
docker-compose down -v --remove-orphans
docker-compose up --build

# Remove existing database
rm tasks.db
# Restart container (will recreate database)
docker-compose restart

Cause: The `tasks.db` file exists but is empty or has the wrong schema.
Solution:
- Delete the `tasks.db` file in the project folder.
- Restart the server (`uvicorn app:app --reload`).
- The server will automatically recreate the database from `database_task.sql`.

- Make sure your environment is active.
- Run: `pip install fastapi uvicorn`

- Run: `pip install uvicorn`

- Ensure `.env` is correct and the API key is active.
- Check `app.log` for error details.
- Make sure the API response format matches expectations.

- Use the `/chat` endpoint (not `/chat_stream`).
- Make sure the frontend is updated to use the correct endpoint.
# Fix file permissions
sudo chown -R $USER:$USER .
# Or run with user permissions
docker-compose run --user $(id -u):$(id -g) ai-task-analyst

- Router Prompt: Smart AI for function selection
- Summary Prompt: Informative and structured data presentation
- Analysis Prompt: Deep analysis with actionable insights
- General Prompt: Friendly, natural conversation
- HTML tables for structured data
- Markdown support for formatting
- Code blocks for technical output
- Responsive design for mobile
- 45-second timeout for API calls
- Detailed logging for monitoring
- Robust error handling
- Efficient session management
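The timeout and logging bullets above can be sketched with the standard library. This is a sketch only: the 45-second value and `app.log` target come from this README, while the logger name and format string are assumptions.

```python
import logging

# Matches the 45-second API timeout noted above; pass this to whatever
# HTTP client app.py uses when calling the LLM endpoint.
API_TIMEOUT_SECONDS = 45

# Hypothetical logging setup writing to app.log, as the project layout
# describes; the format string is an assumption.
logging.basicConfig(
    filename="app.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logger = logging.getLogger("ai-task-analyst")
```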
- All data is sourced from the SQLite database (`tasks.db`).
- To update data, edit `database_task.sql`, delete `tasks.db`, and restart the server.
- Chat history per session is stored in the database.
- Detailed logs are available in `app.log` for debugging.
- For production, use `docker-compose --profile production up -d`.
- Add SSL/HTTPS certificates to the nginx configuration.
- Use Docker Swarm or Kubernetes for scaling
- Implement health checks and monitoring
- For large datasets, limit data sent to the LLM.
- Switch the LLM to OpenAI, Ollama, etc. by updating `.env`.
- For deployment, use a production server (e.g., `uvicorn app:app --host 0.0.0.0 --port 80`).
- Add authentication and authorization as needed.
- Implement streaming responses for better UX.
- Add more AI models support (OpenAI, Anthropic, etc.)
- Implement real-time notifications
- Add user authentication and role-based access
- Create mobile app version
- Add data export/import features
Free to use for learning and development purposes.
If you encounter other errors, check `app.log` or ask here!
