A production-inspired backend system that schedules and executes tasks across multiple distributed worker nodes. Built with Python, FastAPI, Celery, Redis, and PostgreSQL.
- Task Submission API – REST API to submit and query tasks with unique IDs
- Distributed Task Queue – Celery with Redis, at-least-once execution, retry with backoff
- Regional Workers – US, India, and EU worker nodes
- Fault Tolerance – Worker crash detection, task reassignment, idempotent execution
- Monitoring – Prometheus metrics, structured logging, health checks
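The retry-with-backoff and idempotent-execution guarantees above can be sketched with two small helpers. This is an illustrative sketch, not the project's actual API — the function names and constants are assumptions:

```python
import hashlib
import json

def backoff_delay(attempt: int, base: float = 2.0, cap: float = 300.0) -> float:
    """Exponential backoff for retry N: 2s, 4s, 8s, ... capped at 5 minutes."""
    return min(base ** attempt, cap)

def idempotency_key(task_type: str, payload: dict) -> str:
    """Stable key so a re-delivered (at-least-once) task can detect prior completion."""
    blob = json.dumps({"task_type": task_type, "payload": payload}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()
```

A worker would check the idempotency key against a completed-set in Redis before executing, and pass the backoff delay as the Celery `countdown` when retrying.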
- Docker & Docker Compose
- Python 3.11+ (for local development)
```bash
# Copy environment file
cp .env.example .env

# Start all services (PostgreSQL, Redis, API, workers, beat)
docker-compose up -d

# Run database migrations (first time)
docker-compose exec api alembic upgrade head
```

- API: http://localhost:8000
- API Docs: http://localhost:8000/docs
- Health: http://localhost:8000/health/ready
- Metrics: http://localhost:8000/metrics
```powershell
# Create virtual environment
python -m venv venv
.\venv\Scripts\activate     # Windows
# source venv/bin/activate  # Linux/Mac

# Install dependencies
pip install -r requirements.txt

# Start PostgreSQL and Redis (via Docker)
docker-compose up -d postgres redis

# Run migrations
$env:DATABASE_URL="postgresql://postgres:postgres@localhost:5432/task_scheduler"
$env:REDIS_URL="redis://localhost:6379/0"
$env:CELERY_BROKER_URL="redis://localhost:6379/1"
alembic upgrade head

# Start API
uvicorn app.main:app --reload --port 8000

# In another terminal - start a worker
$env:REGION="US"
celery -A app.workers.celery_app worker -Q us-tasks -l info
```

```bash
# Submit a task
curl -X POST http://localhost:8000/api/v1/tasks \
  -H "Content-Type: application/json" \
  -d '{
    "task_type": "process_data",
    "payload": {"data": "hello", "delay_seconds": 1},
    "region": "US",
    "max_retries": 3
  }'

# Get a task by ID
curl http://localhost:8000/api/v1/tasks/{task_id}

# List tasks with filters
curl "http://localhost:8000/api/v1/tasks?status=PENDING&region=US&page=1&page_size=20"

# Cancel a task
curl -X POST http://localhost:8000/api/v1/tasks/{task_id}/cancel
```

| Type | Description | Payload Example |
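The same submission can be made from Python. A minimal stdlib-only sketch — the `API_BASE` URL comes from the quick start above, while the helper names and the response shape are assumptions:

```python
import json
from urllib import request

API_BASE = "http://localhost:8000/api/v1"  # default from the quick start

def build_task(task_type: str, payload: dict, region: str = "US", max_retries: int = 3) -> dict:
    """Request body matching the POST /tasks example above."""
    return {
        "task_type": task_type,
        "payload": payload,
        "region": region,
        "max_retries": max_retries,
    }

def submit_task(task: dict) -> dict:
    """POST the task and return the parsed JSON response (requires the API to be running)."""
    req = request.Request(
        f"{API_BASE}/tasks",
        data=json.dumps(task).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```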
|---|---|---|
| process_data | Process data with delay | {"data": "value", "delay_seconds": 1} |
| send_email | Simulated email send | {"to": "user@example.com", "subject": "Hi"} |
| generate_report | Simulated report | {"type": "summary"} |
| custom | Returns payload as result | Any JSON |
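One way the task types above could be dispatched is a simple handler registry. This is a hypothetical sketch, not the project's `app/workers` code:

```python
from typing import Any, Callable

HANDLERS: dict[str, Callable[[dict], Any]] = {}

def handler(task_type: str):
    """Register a function as the executor for a task type."""
    def register(fn):
        HANDLERS[task_type] = fn
        return fn
    return register

@handler("custom")
def run_custom(payload: dict) -> dict:
    # Per the table above, "custom" returns its payload as the result.
    return payload

def execute(task_type: str, payload: dict):
    """Look up and run the handler; unknown task types raise KeyError."""
    return HANDLERS[task_type](payload)
```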
```
├── app/
│   ├── main.py           # FastAPI app
│   ├── config.py         # Settings
│   ├── api/              # REST endpoints
│   ├── core/             # DB, Redis
│   ├── models/           # SQLAlchemy models
│   ├── schemas/          # Pydantic schemas
│   ├── services/         # Business logic
│   ├── workers/          # Celery tasks
│   └── monitoring/       # Logs, metrics
├── alembic/              # Migrations
├── docker-compose.yml
└── Dockerfile
```
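`app/config.py` centralizes settings. A stdlib-only sketch of env-driven configuration — the defaults are taken from the local-development commands above, but the field names are assumptions about the real `Settings` class:

```python
import os
from dataclasses import dataclass, field

def _env(name: str, default: str) -> str:
    """Read a setting from the environment, falling back to the local-dev default."""
    return os.getenv(name, default)

@dataclass(frozen=True)
class Settings:
    database_url: str = field(default_factory=lambda: _env(
        "DATABASE_URL", "postgresql://postgres:postgres@localhost:5432/task_scheduler"))
    redis_url: str = field(default_factory=lambda: _env(
        "REDIS_URL", "redis://localhost:6379/0"))
    celery_broker_url: str = field(default_factory=lambda: _env(
        "CELERY_BROKER_URL", "redis://localhost:6379/1"))
    region: str = field(default_factory=lambda: _env("REGION", "US"))
```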
MIT