23 changes: 23 additions & 0 deletions .env.qadr.example
@@ -0,0 +1,23 @@
WM_PORT=38081
REDIS_TOKEN=replace-with-long-random-token
SEED_INTERVAL_SECONDS=1800

# Internal LiteLLM on QADR
LLM_API_URL=http://qadr-ai-gateway-litellm:4000/v1/chat/completions
LLM_API_KEY=replace-with-litellm-key
LLM_MODEL=freegpt-local

# Optional upstream data keys
OPENROUTER_API_KEY=
GROQ_API_KEY=
FINNHUB_API_KEY=
EIA_API_KEY=
FRED_API_KEY=
ACLED_EMAIL=
ACLED_PASSWORD=
ACLED_ACCESS_TOKEN=
UCDP_ACCESS_TOKEN=
NASA_FIRMS_API_KEY=
AISSTREAM_API_KEY=
CLOUDFLARE_API_TOKEN=
AVIATIONSTACK_API=
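The `REDIS_TOKEN` placeholder should be replaced with a genuinely random value rather than a guessable string, since it gates the Redis REST proxy. A minimal way to generate one, assuming `openssl` is available on the workstation:

```bash
# Generate a 64-character hex token suitable for REDIS_TOKEN
openssl rand -hex 32
```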
35 changes: 35 additions & 0 deletions Dockerfile.qadr-prebuilt
@@ -0,0 +1,35 @@
FROM node:22-alpine AS final

RUN apk add --no-cache nginx supervisor gettext && \
mkdir -p /tmp/nginx-client-body /tmp/nginx-proxy /tmp/nginx-fastcgi \
/tmp/nginx-uwsgi /tmp/nginx-scgi /var/log/supervisor && \
addgroup -S appgroup && adduser -S appuser -G appgroup

WORKDIR /app

# Prebuilt runtime bundle:
# - api/**/*.js compiled locally via docker/build-handlers.mjs
# - dist/ built locally via vite build
COPY src-tauri/sidecar/local-api-server.mjs ./local-api-server.mjs
COPY src-tauri/sidecar/package.json ./package.json
COPY api ./api
COPY data ./data
COPY dist /usr/share/nginx/html

COPY docker/nginx.conf /etc/nginx/nginx.conf.template
COPY docker/supervisord.conf /etc/supervisor/conf.d/worldmonitor.conf
COPY docker/entrypoint.sh /app/entrypoint.sh
RUN chmod +x /app/entrypoint.sh

RUN chown -R appuser:appgroup /app /tmp/nginx-client-body /tmp/nginx-proxy \
/tmp/nginx-fastcgi /tmp/nginx-uwsgi /tmp/nginx-scgi /var/log/supervisor \
/var/lib/nginx /var/log/nginx /usr/share/nginx/html

USER appuser

EXPOSE 8080

HEALTHCHECK --interval=30s --timeout=5s --start-period=15s --retries=3 \
CMD wget -qO- http://localhost:8080/api/health || exit 1

CMD ["/app/entrypoint.sh"]
11 changes: 10 additions & 1 deletion README.md
@@ -2,6 +2,15 @@

**Real-time global intelligence dashboard** — AI-powered news aggregation, geopolitical monitoring, and infrastructure tracking in a unified situational awareness interface.

## Gantor fork

This fork powers the QADR deployment at [monitor.gantor.ir](https://monitor.gantor.ir/) and adds:

- Persian (`fa`) locale with RTL support
- Gantor branding and domain metadata
- QADR-specific Docker Compose deployment
- Same-origin runtime API behavior for self-hosted web mode

[![GitHub stars](https://img.shields.io/github/stars/koala73/worldmonitor?style=social)](https://github.com/koala73/worldmonitor/stargazers)
[![GitHub forks](https://img.shields.io/github/forks/koala73/worldmonitor?style=social)](https://github.com/koala73/worldmonitor/network/members)
[![Discord](https://img.shields.io/badge/Discord-Join-5865F2?style=flat&logo=discord&logoColor=white)](https://discord.gg/re63kWKxaz)
@@ -45,7 +54,7 @@
- **Local AI** — run everything with Ollama, no API keys required
- **5 site variants** from a single codebase (world, tech, finance, commodity, happy)
- **Native desktop app** (Tauri 2) for macOS, Windows, and Linux
- **21 languages** with native-language feeds and RTL support
- **22 languages** with native-language feeds and RTL support, including Persian

For the full feature list, architecture, data sources, and algorithms, see the **[documentation](https://docs.worldmonitor.app)**.

128 changes: 128 additions & 0 deletions compose.qadr.yaml
@@ -0,0 +1,128 @@
services:
worldmonitor:
build:
context: .
dockerfile: Dockerfile.qadr-prebuilt
image: gantor-worldmonitor:20260330
container_name: qadr-worldmonitor
ports:
- "127.0.0.1:${WM_PORT:-38081}:8080"
environment:
UPSTASH_REDIS_REST_URL: "http://qadr-worldmonitor-redis-rest:80"
UPSTASH_REDIS_REST_TOKEN: "${REDIS_TOKEN:-wm-qadr-token}"
LOCAL_API_PORT: "46123"
LOCAL_API_MODE: "docker"
LOCAL_API_CLOUD_FALLBACK: "false"
WS_RELAY_URL: "http://qadr-worldmonitor-ais-relay:3004"
LLM_API_URL: "${LLM_API_URL:-http://qadr-ai-gateway-litellm:4000/v1/chat/completions}"
LLM_API_KEY: "${LLM_API_KEY:-}"
LLM_MODEL: "${LLM_MODEL:-freegpt-local}"
GROQ_API_KEY: "${GROQ_API_KEY:-}"
AISSTREAM_API_KEY: "${AISSTREAM_API_KEY:-}"
FINNHUB_API_KEY: "${FINNHUB_API_KEY:-}"
EIA_API_KEY: "${EIA_API_KEY:-}"
FRED_API_KEY: "${FRED_API_KEY:-}"
ACLED_EMAIL: "${ACLED_EMAIL:-}"
ACLED_PASSWORD: "${ACLED_PASSWORD:-}"
ACLED_ACCESS_TOKEN: "${ACLED_ACCESS_TOKEN:-}"
UCDP_ACCESS_TOKEN: "${UCDP_ACCESS_TOKEN:-}"
NASA_FIRMS_API_KEY: "${NASA_FIRMS_API_KEY:-}"
CLOUDFLARE_API_TOKEN: "${CLOUDFLARE_API_TOKEN:-}"
AVIATIONSTACK_API: "${AVIATIONSTACK_API:-}"
depends_on:
redis-rest:
condition: service_started
ais-relay:
condition: service_started
restart: unless-stopped
cpus: 3
mem_limit: 4096m
networks:
- default
- fgpt_ingress
- fgpt_ai

ais-relay:
build:
context: .
dockerfile: Dockerfile.relay
image: gantor-worldmonitor-ais-relay:20260330
container_name: qadr-worldmonitor-ais-relay
environment:
AISSTREAM_API_KEY: "${AISSTREAM_API_KEY:-}"
PORT: "3004"
restart: unless-stopped
cpus: 1
mem_limit: 512m

redis:
image: docker.io/redis:7-alpine
container_name: qadr-worldmonitor-redis
command: redis-server --maxmemory 256mb --maxmemory-policy allkeys-lru
volumes:
- redis-data:/data
restart: unless-stopped
mem_limit: 512m

redis-rest:
build:
context: docker
dockerfile: Dockerfile.redis-rest
image: gantor-worldmonitor-redis-rest:20260330
container_name: qadr-worldmonitor-redis-rest
environment:
SRH_TOKEN: "${REDIS_TOKEN:-wm-qadr-token}"
SRH_CONNECTION_STRING: "redis://redis:6379"
depends_on:
- redis
restart: unless-stopped
mem_limit: 256m

seeders:
image: node:22-alpine
container_name: qadr-worldmonitor-seeders
working_dir: /app
command: >
sh -lc "npm ci --ignore-scripts --prefer-offline &&
while true; do
./scripts/run-seeders.sh || true;
sleep ${SEED_INTERVAL_SECONDS:-1800};
done"
environment:
UPSTASH_REDIS_REST_URL: "http://qadr-worldmonitor-redis-rest:80"
UPSTASH_REDIS_REST_TOKEN: "${REDIS_TOKEN:-wm-qadr-token}"
GROQ_API_KEY: "${GROQ_API_KEY:-}"
OPENROUTER_API_KEY: "${OPENROUTER_API_KEY:-}"
FINNHUB_API_KEY: "${FINNHUB_API_KEY:-}"
EIA_API_KEY: "${EIA_API_KEY:-}"
FRED_API_KEY: "${FRED_API_KEY:-}"
ACLED_EMAIL: "${ACLED_EMAIL:-}"
ACLED_PASSWORD: "${ACLED_PASSWORD:-}"
ACLED_ACCESS_TOKEN: "${ACLED_ACCESS_TOKEN:-}"
UCDP_ACCESS_TOKEN: "${UCDP_ACCESS_TOKEN:-}"
NASA_FIRMS_API_KEY: "${NASA_FIRMS_API_KEY:-}"
AISSTREAM_API_KEY: "${AISSTREAM_API_KEY:-}"
CLOUDFLARE_API_TOKEN: "${CLOUDFLARE_API_TOKEN:-}"
AVIATIONSTACK_API: "${AVIATIONSTACK_API:-}"
LLM_API_URL: "${LLM_API_URL:-http://qadr-ai-gateway-litellm:4000/v1/chat/completions}"
LLM_API_KEY: "${LLM_API_KEY:-}"
LLM_MODEL: "${LLM_MODEL:-freegpt-local}"
volumes:
- .:/app
depends_on:
redis-rest:
condition: service_started
worldmonitor:
condition: service_healthy
restart: unless-stopped
cpus: 1.5
mem_limit: 1536m

volumes:
redis-data:

networks:
fgpt_ingress:
external: true
fgpt_ai:
external: true
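A sketch of bringing the stack up on QADR. It assumes the two `external` networks already exist; they are normally provisioned by the surrounding FreeGPT stacks, but can be created manually if missing:

```bash
# Ensure the external networks exist (no-op if already created elsewhere)
docker network create fgpt_ingress 2>/dev/null || true
docker network create fgpt_ai 2>/dev/null || true

# Start the stack with the env file described in .env.qadr.example
docker compose -f compose.qadr.yaml --env-file .env up -d --build
```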
56 changes: 56 additions & 0 deletions docs/qadr-deployment.md
@@ -0,0 +1,56 @@
# QADR deployment

This fork is deployed on QADR behind the existing `fgpt_ingress` Caddy stack and exposed at `https://monitor.gantor.ir/`.

## What is customized

- Persian locale added as a first-class language.
- RTL support extended to Persian.
- Branding and metadata updated for `monitor.gantor.ir`.
- Runtime API fallback prefers the current web origin instead of upstream `worldmonitor.app`.
- QADR-specific compose stack added in `compose.qadr.yaml`.
- Optional seeder loop runs on QADR to keep Redis-backed datasets fresh.

## Required QADR files

- repo checkout: `/home/saman/workspaces/worldmonitor`
- env file: `/home/saman/workspaces/worldmonitor/.env`
- compose file: `/home/saman/workspaces/worldmonitor/compose.qadr.yaml`

## Required ingress

In the FreeGPT ingress configuration (the source of truth for Caddy routes):

- `monitor.gantor.ir` reverse proxies to `qadr-worldmonitor:8080`

Because QADR mounts the workspace `Caddyfile` directly, recreate the Caddy container after pulling ingress changes:

```bash
docker compose -f /home/saman/workspaces/freegpt/stacks/ingress-core/compose.yaml up -d --force-recreate caddy
```

## Required DNS

Add to the live `gantor.ir` zone on QADR:

```dns
monitor IN A 5.235.208.128
```

## Compose notes

- `worldmonitor` joins `fgpt_ingress` so Caddy can reach it.
- `worldmonitor` joins `fgpt_ai` so it can use the internal LiteLLM gateway.
- `worldmonitor` uses `Dockerfile.qadr-prebuilt` on QADR, not the full build-stage `Dockerfile`.
- Build the frontend and compiled handler bundle on a stable workstation first:

```bash
npm ci --ignore-scripts
node docker/build-handlers.mjs
npx tsc
npx vite build
```

- Then sync the repo contents, including `dist/` and generated `api/**/*.js`, to `/home/saman/workspaces/worldmonitor` before running `docker compose` on QADR.
- `seeders` uses `node:22-alpine` and runs `./scripts/run-seeders.sh` every 30 minutes by default.
- Redis state stays local to this stack through the `redis-data` volume.
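The "sync the repo contents" step above can be sketched with rsync; here `qadr` is a hypothetical SSH alias for the server, and the exclude list is illustrative (`node_modules` is not needed on the host because the `seeders` container runs `npm ci` itself):

```bash
# Push the workstation checkout, including dist/ and generated api/**/*.js
# --delete keeps the remote checkout in lockstep with the local one
rsync -az --delete \
  --exclude '.git' \
  --exclude 'node_modules' \
  ./ qadr:/home/saman/workspaces/worldmonitor/
```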