
tejeshvenkat/Globally-Distributed-Task-Scheduling-Monitoring-System


Globally Distributed Task Scheduling & Monitoring System

A production-inspired backend system that schedules and executes tasks across multiple distributed worker nodes. Built with Python, FastAPI, Celery, Redis, and PostgreSQL.

Features

  • Task Submission API – REST API to submit and query tasks with unique IDs
  • Distributed Task Queue – Celery with Redis, at-least-once execution, retry with backoff
  • Regional Workers – US, India, and EU worker nodes
  • Fault Tolerance – Worker crash detection, task reassignment, idempotent execution
  • Monitoring – Prometheus metrics, structured logging, health checks
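The retry-with-backoff and idempotent-execution behaviors listed above can be sketched in plain Python. This is illustrative only: the real system implements this inside Celery tasks under `app/workers/`, and the function name, the in-memory completed-ID set (standing in for a shared store such as Redis), and the delay values here are assumptions.

```python
import time

# Stand-in for a shared store of completed task IDs (assumption: the real
# system would use Redis or PostgreSQL so all workers see the same state).
_completed: set = set()

def run_with_retries(task_id, fn, max_retries=3, base_delay=0.01):
    """Execute fn at-least-once with exponential backoff between attempts.

    If the task ID was already completed (e.g. a duplicate delivery after a
    worker crash), skip execution -- this is what makes the task idempotent.
    """
    if task_id in _completed:
        return "SKIPPED"  # already executed; safe under at-least-once delivery
    for attempt in range(max_retries + 1):
        try:
            fn()
            _completed.add(task_id)
            return "SUCCESS"
        except Exception:
            if attempt == max_retries:
                return "FAILED"
            # Exponential backoff: base_delay, 2x, 4x, ...
            time.sleep(base_delay * (2 ** attempt))
```

In Celery itself the equivalent knobs are `max_retries` and `retry_backoff` on the task decorator; the sketch just makes the control flow explicit.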

Quick Start

Prerequisites

  • Docker & Docker Compose
  • Python 3.11+ (for local development)

Run with Docker Compose

# Copy environment file
cp .env.example .env

# Start all services (PostgreSQL, Redis, API, workers, beat)
docker-compose up -d

# Run database migrations (first time)
docker-compose exec api alembic upgrade head

Local Development (API outside Docker)

# Create virtual environment
python -m venv venv
venv\Scripts\activate   # Windows
# source venv/bin/activate  # Linux/Mac

# Install dependencies
pip install -r requirements.txt

# Start PostgreSQL and Redis (via Docker)
docker-compose up -d postgres redis

# Run migrations
$env:DATABASE_URL="postgresql://postgres:postgres@localhost:5432/task_scheduler"
$env:REDIS_URL="redis://localhost:6379/0"
$env:CELERY_BROKER_URL="redis://localhost:6379/1"
alembic upgrade head

# Start API
uvicorn app.main:app --reload --port 8000

# In another terminal - Start a worker
$env:REGION="US"
celery -A app.workers.celery_app worker -Q us-tasks -l info
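The `REGION` variable above determines which queue a worker consumes. A minimal sketch of that region-to-queue routing follows; only `us-tasks` appears in this README, so the India and EU queue names (`in-tasks`, `eu-tasks`) are assumptions following the same pattern.

```python
# Map a task's region to its Celery queue name. "us-tasks" is taken from the
# worker command above; the other two names are assumed to follow suit.
REGION_QUEUES = {"US": "us-tasks", "INDIA": "in-tasks", "EU": "eu-tasks"}

def queue_for_region(region):
    """Return the queue a task should be routed to, defaulting to US."""
    return REGION_QUEUES.get(region.upper(), "us-tasks")
```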

API Usage

Submit a Task

curl -X POST http://localhost:8000/api/v1/tasks \
  -H "Content-Type: application/json" \
  -d '{
    "task_type": "process_data",
    "payload": {"data": "hello", "delay_seconds": 1},
    "region": "US",
    "max_retries": 3
  }'

Get Task Status

curl http://localhost:8000/api/v1/tasks/{task_id}

List Tasks

curl "http://localhost:8000/api/v1/tasks?status=PENDING&region=US&page=1&page_size=20"

Cancel Task

curl -X POST http://localhost:8000/api/v1/tasks/{task_id}/cancel
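The same endpoints can be called from Python. Below is a minimal client sketch using only the standard library; the URL and JSON body mirror the curl examples above, while the helper function names are this sketch's own.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000/api/v1"

def build_task_payload(task_type, payload, region="US", max_retries=3):
    """Assemble the JSON body expected by POST /api/v1/tasks."""
    return {
        "task_type": task_type,
        "payload": payload,
        "region": region,
        "max_retries": max_retries,
    }

def submit_task(body):
    """POST a task and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}/tasks",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Usage: `submit_task(build_task_payload("process_data", {"data": "hello", "delay_seconds": 1}))` with the API running on port 8000.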

Task Types (Built-in)

| Type | Description | Payload Example |
| --- | --- | --- |
| `process_data` | Process data with a delay | `{"data": "value", "delay_seconds": 1}` |
| `send_email` | Simulated email send | `{"to": "user@example.com", "subject": "Hi"}` |
| `generate_report` | Simulated report | `{"type": "summary"}` |
| `custom` | Returns the payload as the result | Any JSON |
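Dispatch on `task_type` can be sketched as a simple handler table. This is illustrative: the handler implementations are assumptions, except that `custom` echoes its payload as the table describes.

```python
def process_data(payload):
    """Simulated data processing (assumption: real handler also sleeps)."""
    return {"processed": payload.get("data")}

def custom(payload):
    """The 'custom' type returns the payload unchanged as its result."""
    return payload

# Handler table: task_type -> callable.
HANDLERS = {"process_data": process_data, "custom": custom}

def execute(task_type, payload):
    """Look up the handler for task_type and run it; unknown types are errors."""
    if task_type not in HANDLERS:
        raise ValueError(f"unknown task_type: {task_type}")
    return HANDLERS[task_type](payload)
```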

Project Structure

├── app/
│   ├── main.py           # FastAPI app
│   ├── config.py         # Settings
│   ├── api/              # REST endpoints
│   ├── core/             # DB, Redis
│   ├── models/           # SQLAlchemy models
│   ├── schemas/          # Pydantic schemas
│   ├── services/         # Business logic
│   ├── workers/          # Celery tasks
│   └── monitoring/       # Logs, metrics
├── alembic/              # Migrations
├── docker-compose.yml
└── Dockerfile

License

MIT
