Quake Tracker API

Application to monitor natural disasters by ingesting real-time earthquake data from a public source (USGS). This data is then made available via a RESTful API.

Setup Instructions

To run the solution:

  1. Create a .env file in the root of the project (same level as the docker compose file) and add the POSTGRES_USER, POSTGRES_PASSWORD, and POSTGRES_DB (with value "QuakeTracking") variables.

  2. Create a .env file in /data_ingestor/src with the variables INTERVAL_SECONDS, POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_ADDRESS (with value "postgres_db") and POSTGRES_DB (with value "QuakeTracking").

  3. Create a .env file in /api_service with the variables POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_ADDRESS (with value "postgres_db") and POSTGRES_DB (with value "QuakeTracking").

It is not good practice to commit .env files to a GitHub repo, but I included some "fake" .env files so that you only have to fill in the missing data (see the example below). :)
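
For reference, the data ingestor .env file could look like the sketch below. The user, password, and interval values are placeholders; "postgres_db" and "QuakeTracking" are the values the services expect, and 10 matches the polling interval described later.

    # /data_ingestor/src/.env — illustrative values, replace user and password with your own
    INTERVAL_SECONDS=10
    POSTGRES_USER=your_user
    POSTGRES_PASSWORD=your_password
    POSTGRES_ADDRESS=postgres_db
    POSTGRES_DB=QuakeTracking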

After this, all you have to do is run the following command from the directory that contains the docker compose file:

docker compose up --build

Architecture

This solution was implemented following a microservices architecture, as shown in the diagram below.

[Architecture diagram]
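
A docker compose layout consistent with this architecture could look roughly like the sketch below. The service names (other than postgres_db, which the .env files reference through POSTGRES_ADDRESS), the exposed port, and the build paths are assumptions for illustration, not necessarily the exact file in the repository.

    services:
      postgres_db:
        image: postgres
        env_file: .env            # POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_DB
      data_ingestor:
        build: ./data_ingestor    # Rust service polling USGS every INTERVAL_SECONDS
        depends_on:
          - postgres_db
      api_service:
        build: ./api_service      # FastAPI service exposing the stored data
        ports:
          - "8000:8000"           # assumed port
        depends_on:
          - postgres_db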

Data Ingestor

The data ingestor container allows the solution to retrieve and parse the data in real time with very little overhead. This process is much faster in Rust, which is why that language was chosen.

Every 10 seconds this service retrieves the earthquake data published by USGS and stores any new information in the Postgres database. The only limitation is that USGS refreshes this data roughly once per minute, so even though the microservice polls every 10 seconds, new events appear with up to a one-minute delay.
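
To see the kind of data the ingestor consumes, you can fetch one of the public USGS real-time GeoJSON summary feeds yourself. The repository does not state which feed it polls; the "past hour" feed below is only an example.

    # Peek at a USGS real-time GeoJSON feed (example feed; the service may use a different one)
    import requests

    FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

    data = requests.get(FEED, timeout=10).json()
    for feature in data["features"]:
        props = feature["properties"]
        lon, lat, depth = feature["geometry"]["coordinates"]
        print(feature["id"], props["mag"], props["place"], lon, lat, depth)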

API Service

The api service microservice runs a FastAPI server that lets the user query the data stored in the Postgres database. At /docs (FastAPI's auto-generated interactive documentation) you will find all the available methods documented.
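
As an illustration of the pattern (not the actual code in api_service), an endpoint reading from the earthquakes table described in the data model might look like the sketch below; the route name, table name, and query are assumptions.

    # Illustrative FastAPI endpoint; route, table name, and column list are assumptions
    import os

    import psycopg2
    from fastapi import FastAPI

    app = FastAPI()

    def get_connection():
        # Same environment variables as in the api_service .env file
        return psycopg2.connect(
            host=os.environ["POSTGRES_ADDRESS"],
            dbname=os.environ["POSTGRES_DB"],
            user=os.environ["POSTGRES_USER"],
            password=os.environ["POSTGRES_PASSWORD"],
        )

    @app.get("/earthquakes")
    def list_earthquakes(min_mag: float = 0.0):
        # Return the most recent earthquakes above a magnitude threshold
        with get_connection() as conn, conn.cursor() as cur:
            cur.execute(
                "SELECT id, event_time, mag, lon, lat, alt, country_code "
                "FROM earthquakes WHERE mag >= %s "
                "ORDER BY unix_event_time DESC LIMIT 100",
                (min_mag,),
            )
            rows = cur.fetchall()
        return [
            {"id": r[0], "event_time": r[1], "mag": r[2],
             "lon": r[3], "lat": r[4], "alt": r[5], "country_code": r[6]}
            for r in rows
        ]

Run with uvicorn, such an endpoint would appear automatically in the interactive /docs page.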

Data Model

erDiagram
    EARTHQUAKES {
        string id PK
        string event_time
        int unix_event_time
        float mag
        float lon
        float lat
        float alt
        string country_code
    }

    EARTHQUAKE_PROPERTIES {
        uuid id PK
        string earthquake_id FK
        json properties
    }

    EARTHQUAKES ||--o{ EARTHQUAKE_PROPERTIES : has

Future improvements

  • Users Management
  • JWT Token authentication
  • Cron to clean older unnecessary data
  • A ClickHouse database to handle, with much better performance, the large volume of data that will eventually accumulate (based on my professional experience and on the results from my master's degree dissertation)
  • Data aggregation methods
  • Frontend dashboards
  • More code comments
  • Unit tests
