NOTE: The installation guide below has only been tested on Ubuntu; we recommend using Docker for other operating systems.

Local Installation Requirements

  • Install GDAL on your computer using the command below:
sudo apt-get update && \
sudo apt-get -y install gdal-bin python3-gdal && \
sudo apt-get -y autoremove && \
sudo apt-get clean
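
Optionally, a quick check that GDAL and its Python bindings are available:
gdalinfo --version
python3 -c "from osgeo import gdal; print(gdal.__version__)"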

  • Install Redis on your computer using the commands below:
sudo apt-get -y install redis
sudo apt-get -y install redis-tools # For client
  • Confirm Redis Installation
redis-cli

Type ping; it should return PONG.

If Redis is not running, check out its documentation.
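
On Ubuntu the packaged service is usually called redis-server; assuming systemd, you can check it and start it with:
sudo systemctl status redis-server
sudo systemctl start redis-server # If it is not already running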

  • Clone the Raw Data API repository on your computer
git clone https://github.com/hotosm/raw-data-api.git
  • Navigate to the repository directory
cd raw-data-api
  • Install the Python dependencies
pip install -r requirements.txt
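
Optionally, you may prefer to install the dependencies inside a virtual environment (standard Python tooling, not specific to this project):
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt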

Optional: For Tiles Output

If you opt for tiles output and have ENABLE_TILES: True set as an environment variable, make sure you install [Tippecanoe](https://github.com/felt/tippecanoe).
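
Tippecanoe is typically built from source; a rough sketch of the usual steps (check the Tippecanoe README for its exact build dependencies, e.g. build-essential, libsqlite3-dev and zlib1g-dev):
git clone https://github.com/felt/tippecanoe.git
cd tippecanoe
make -j
sudo make install

Assuming the flag is read from the environment, you can set it in your shell with:
export ENABLE_TILES=True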

Start the Server

uvicorn API.main:app --reload

Queues

Currently there are two types of queues implemented:

  • "raw_daemon" : Queue for default exports which will create each unique id for exports , This queue is attached to 24/7 available workers
  • "raw_ondemand" : Queue for recurring exports which will replace the previous exports if present on the system , can be enabled through uuid:false API Param . This queue will be attached to worker which will only spin up upon request.

Start Celery Worker

You should be able to start the Celery workers by running the following commands in separate shells:

  • Start the worker for the default daemon queue
    celery --app API.api_worker worker --loglevel=INFO --queues="raw_daemon" -n 'default_worker'
    
  • Start the worker for the on-demand queue
    celery --app API.api_worker worker --loglevel=INFO --queues="raw_ondemand" -n 'ondemand_worker'
    

Set the number of requests a worker can take at a time using --concurrency, for example:
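
celery --app API.api_worker worker --loglevel=INFO --queues="raw_daemon" --concurrency=4 -n 'default_worker' # Up to 4 exports processed at once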

Note

If you are using a PostgreSQL database as the result_backend for Celery, you need to install SQLAlchemy:

pip install SQLAlchemy==2.0.25
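
Celery's SQLAlchemy result backend expects a db+ prefixed connection URL; the variable name and credentials below are placeholders, so adapt them to the configuration your deployment actually reads:
export CELERY_RESULT_BACKEND="db+postgresql://user:password@localhost:5432/raw_data_db"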

Start Flower for monitoring the queues [OPTIONAL]

Raw Data API uses Flower for monitoring the Celery distributed queues. Run this command in a different shell; if you are running Redis on the same machine, your broker URL could be redis://localhost:6379//.

celery --broker=redis://redis:6379// --app API.api_worker flower --port=5000 --queues="raw_daemon,raw_ondemand"

Or simply use Flower from the application itself:

celery --broker=redis://localhost:6379// flower

Navigate to the docs to view Raw Data API endpoints

After successfully starting the server, visit http://127.0.0.1:8000/v1/docs in your browser to view the API docs.

http://127.0.0.1:8000/v1/docs

The Flower dashboard should be available on port 5000 on your localhost.

http://127.0.0.1:5000/

DEBUG

If you are running into worker crashes on macOS, you can use the OBJC_DISABLE_INITIALIZE_FORK_SAFETY environment variable to disable the Objective-C runtime's fork safety checks. However, use this with caution, as it may result in other errors.

export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES
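
Or set it inline when launching the worker:
OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES celery --app API.api_worker worker --loglevel=INFO --queues="raw_daemon" -n 'default_worker'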