This repository contains the backend for a Waste-to-Energy processing application. The backend is built using Flask and provides multiple API endpoints for different waste processing methods such as Fermentation, HTL (Hydrothermal Liquefaction), Combustion, and Anaerobic Digestion.
## Table of Contents

- Project Structure
- Installation
- Running the Application
- API Endpoints
- Environment Variables
- Docker Usage
- Contributing
- License
## Project Structure

```text
/api-backend
│
├── /app
│   ├── /blueprints                 # Contains Blueprints for each functional module
│   │   ├── fermentation.py         # Fermentation-related API routes
│   │   ├── htl.py                  # HTL-related API routes
│   │   ├── combustion.py           # Combustion-related API routes
│   │   └── digestion.py            # Digestion-related API routes
│   ├── /services                   # Business logic and data processing scripts
│   │   ├── fermentation_service.py
│   │   ├── htl_service.py
│   │   ├── combustion_service.py
│   │   └── digestion_service.py
│   ├── /data                       # Data files (CSV, Excel, etc.)
│   │   └── sludge_data_dmt.csv     # Sludge data for HTL functions
│   ├── __init__.py                 # App factory and Blueprint registration
│   └── config.py                   # Configuration for environment variables
│
├── Dockerfile                      # Dockerfile for containerizing the API
├── requirements.txt                # Python dependencies
├── .env                            # Environment variables for local development
└── wsgi.py                         # Entry point for WSGI server
```
- Blueprints: Each API route group lives in its own module (e.g., fermentation, HTL, combustion, digestion).
- Services: Contain the business logic and data handling for each processing method.
- Data: Stores the CSV and other data files used when processing requests.
- Configuration: Manages environment-specific settings through `config.py` and environment variables.
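To show how these pieces fit together, here is a minimal sketch of the app factory in `app/__init__.py`. The blueprint object names and URL prefixes are assumptions for illustration, not necessarily the repository's actual code:

```python
# app/__init__.py -- minimal app-factory sketch (blueprint names and prefixes are assumed)
from flask import Flask

from .config import Config
from .blueprints.fermentation import fermentation_bp
from .blueprints.htl import htl_bp
from .blueprints.combustion import combustion_bp
from .blueprints.digestion import digestion_bp


def create_app(config_class=Config):
    """Create the Flask app and register one blueprint per processing method."""
    app = Flask(__name__)
    app.config.from_object(config_class)

    app.register_blueprint(fermentation_bp, url_prefix="/api/v1/fermentation")
    app.register_blueprint(htl_bp, url_prefix="/api/v1/htl")
    app.register_blueprint(combustion_bp, url_prefix="/api/v1/combustion")
    app.register_blueprint(digestion_bp, url_prefix="/api/v1/digestion")

    return app
```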
## Installation

- **Clone the Repository:**

  ```bash
  git clone https://github.com/your-username/api-backend.git
  cd api-backend
  ```

- **Create and Activate a Virtual Environment:**

  ```bash
  python -m venv venv
  source venv/bin/activate   # On macOS/Linux
  venv\Scripts\activate      # On Windows
  ```

- **Install the Dependencies:**

  ```bash
  pip install -r requirements.txt
  ```

- **Set Up Environment Variables:** Create a `.env` file in the root directory and add your environment variables (e.g., `FLASK_ENV`, `SECRET_KEY`). You can use the `.env.example` file as a template:

  ```
  FLASK_ENV=development
  SECRET_KEY=your-secret-key
  ```
## Running the Application

- **Run the Flask App Locally:**

  ```bash
  flask run
  ```

  The app will be running on `http://127.0.0.1:5000/`.

- **WSGI Server:** For production environments, use a WSGI server (e.g., Gunicorn) to run the app via `wsgi.py`. Example:

  ```bash
  gunicorn --bind 0.0.0.0:5000 wsgi:app
  ```
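For reference, `wsgi.py` is typically just a thin wrapper around the app factory. A minimal sketch, assuming the factory is named `create_app`:

```python
# wsgi.py -- minimal WSGI entry point sketch (assumes an app factory named create_app)
from app import create_app

# Gunicorn looks up this module-level "app" object when started with "wsgi:app".
app = create_app()

if __name__ == "__main__":
    # Fallback for running directly with "python wsgi.py" during development.
    app.run(host="0.0.0.0", port=5000)
```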
## API Endpoints

### HTL (Hydrothermal Liquefaction)

- `GET /api/v1/htl/county/<countyname>`: Fetch HTL data for a given county.
- `GET /api/v1/htl/sludge?sludge=<value>&unit=<unit>`: Calculate diesel price and global warming potential based on sludge mass.
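As an illustration of the blueprint/service split, here is a hedged sketch of how the sludge endpoint could be implemented in `app/blueprints/htl.py`. The service function `calculate_sludge_metrics`, the default unit, and the error handling are assumptions, not the project's actual code:

```python
# app/blueprints/htl.py -- illustrative sketch of the sludge endpoint (names are assumed)
from flask import Blueprint, jsonify, request

# Hypothetical service function; the real business logic lives in app/services/htl_service.py.
from app.services.htl_service import calculate_sludge_metrics

htl_bp = Blueprint("htl", __name__)


@htl_bp.route("/sludge", methods=["GET"])
def sludge():
    """Calculate diesel price and global warming potential from a sludge mass."""
    try:
        sludge_mass = float(request.args["sludge"])
    except (KeyError, ValueError):
        return jsonify({"error": "Query parameter 'sludge' must be a number"}), 400

    unit = request.args.get("unit", "dmt")  # default unit is an assumption
    result = calculate_sludge_metrics(sludge_mass, unit)
    return jsonify(result), 200
```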
### Fermentation

- `GET /api/v1/fermentation/county/<countyname>`: Fetch fermentation data for a given county.
- `GET /api/v1/fermentation/biomass?mass=<value>`: Calculate ethanol production based on biomass mass.
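The corresponding calculation lives in the service layer. A minimal sketch of what `fermentation_service.py` might expose; the yield factor below is a placeholder for illustration only, not the project's actual conversion model:

```python
# app/services/fermentation_service.py -- illustrative sketch only
# The yield factor below is a placeholder, not the project's real conversion model.
ETHANOL_YIELD_L_PER_TONNE = 300.0  # hypothetical litres of ethanol per tonne of biomass


def calculate_ethanol_production(biomass_mass_tonnes: float) -> dict:
    """Return estimated ethanol production for a given biomass mass."""
    if biomass_mass_tonnes < 0:
        raise ValueError("Biomass mass must be non-negative")

    ethanol_litres = biomass_mass_tonnes * ETHANOL_YIELD_L_PER_TONNE
    return {
        "biomass_tonnes": biomass_mass_tonnes,
        "ethanol_litres": ethanol_litres,
    }
```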
### Combustion

- `GET /api/v1/combustion/county/<countyname>`: Fetch combustion data for a given county.
- `GET /api/v1/combustion/mass?mass=<value>`: Calculate electricity and emissions from feedstock.
### Anaerobic Digestion

- `GET /api/v1/digestion/county/<countyname>`: Fetch anaerobic digestion data for a given county.
- `GET /api/v1/digestion/mass?mass=<value>`: Calculate biogas production based on feedstock mass.
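With the app running locally, the endpoints can be exercised with `curl`; the county name and mass values below are arbitrary examples:

```bash
curl "http://127.0.0.1:5000/api/v1/htl/county/Champaign"
curl "http://127.0.0.1:5000/api/v1/htl/sludge?sludge=100&unit=dmt"
curl "http://127.0.0.1:5000/api/v1/fermentation/biomass?mass=50"
curl "http://127.0.0.1:5000/api/v1/digestion/mass?mass=25"
```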
## Environment Variables

- `FLASK_ENV`: Environment the app is running in (`development`, `testing`, or `production`).
- `SECRET_KEY`: Secret key for the Flask app.
- `DATABASE_URL`: If you add a database, configure its URL here.

You can add more environment-specific variables in your `.env` file.
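A minimal sketch of how `config.py` might read these variables; the class and attribute names are assumptions for illustration:

```python
# app/config.py -- illustrative configuration sketch (names are assumed)
import os


class Config:
    """Load environment-specific settings from environment variables / .env."""
    FLASK_ENV = os.environ.get("FLASK_ENV", "development")
    SECRET_KEY = os.environ.get("SECRET_KEY", "change-me")
    DATABASE_URL = os.environ.get("DATABASE_URL")  # optional; only used if a database is added
```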
## Docker Usage

- **Build the Docker Image:**

  ```bash
  docker build -t api-backend .
  ```

- **Run the Container:**

  ```bash
  docker run -p 5000:5000 api-backend
  ```

  This runs the app inside a Docker container and exposes it on port 5000.
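The repository ships its own Dockerfile; as a rough illustration of what a Gunicorn-based image for this layout could look like (not necessarily the project's actual Dockerfile):

```dockerfile
# Illustrative Dockerfile sketch; the repository's actual Dockerfile may differ.
FROM python:3.11-slim

WORKDIR /api-backend

# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code.
COPY . .

EXPOSE 5000

# Serve the app with Gunicorn via the WSGI entry point.
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "wsgi:app"]
```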
## Contributing

If you want to contribute to this project:

- Fork the repository.
- Create a feature branch (`git checkout -b feature/your-feature`).
- Commit your changes (`git commit -m 'Add a new feature'`).
- Push to the branch (`git push origin feature/your-feature`).
- Open a pull request.
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Notes

- The app is configured to handle multiple routes and dynamically process data based on requests.
- Always ensure that the environment variables are set correctly, especially in production environments.