Koku README


About

Koku's goal is to provide an open source solution for cost management of cloud and hybrid cloud environments. This is offered via a web interface that exposes resource consumption and cost data in easily digestible and filterable views. The project also aims to provide insight into this data and ultimately provide suggested optimizations for reducing cost and eliminating unnecessary resource usage.

Full documentation is available through readthedocs.

Getting Started

This is a Python project developed using Python 3.6. Make sure you have at least this version installed.
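You can check which version is available on your system with

python3 --version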

Development

To get started developing against Koku, first clone a local copy of the git repository.

git clone https://github.com/project-koku/koku
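Then change into the newly cloned directory

cd koku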

Developing inside a virtual environment is recommended. A Pipfile is provided. Pipenv is recommended because it combines virtual environment (virtualenv) and dependency management (pip). To install pipenv, use pip

pip3 install pipenv

Then project dependencies and a virtual environment can be created using

pipenv install --dev

To activate the virtual environment run

pipenv shell

Preferred Environment

Please refer to Working with Openshift.

Alternative Environment

If deploying with OpenShift seems overly complex, you can try an alternative local environment, where you will need to install and set up some of the dependencies and configuration yourself.

Configuration

This project is developed using the Django web framework. Many configuration settings can be read in from a .env file. An example file, .env.example, is provided in the repository. To use the defaults, simply

cp .env.example .env

Modify as you see fit.
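As a purely illustrative sketch (the variable names below are hypothetical and may not match .env.example exactly), the database-related settings referenced later in this README could look like

# Illustrative values only; check .env.example for the real variable names
DATABASE_NAME=koku
DATABASE_USER=koku
DATABASE_HOST=localhost
DATABASE_PORT=15432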

Database

PostgreSQL is used as the database backend for Koku. A docker-compose file is provided for creating a local database container. If modifications were made to the .env file, the docker-compose file will need to be modified to ensure the database credentials match. Several commands are available for interacting with the database.

# This will launch a Postgres container
make start-db

# This will run Django's migrations against the database
make run-migrations

# This will stop and remove a currently running database and run the above commands
make reinitdb

Assuming the default .env file values are used, run the following to access the database directly using psql

psql koku -U koku -h localhost -p 15432

There is a known limitation with docker-compose on Linux environments with SELinux enabled. You may see the following error during the Postgres container deployment:

mkdir: cannot create directory '/var/lib/pgsql/data/userdata': Permission denied

This can be resolved by granting ownership of ./pg_data to uid 26 (the postgres user in centos/postgresql-96-centos7).

If a docker container running Postgres is not feasible, it is possible to run Postgres locally as documented in the Postgres tutorial. The default port for local Postgres installations is 5432. Make sure to modify the .env file accordingly. To initialize the database run

make run-migrations

Server

To run a local development Django server, you can use

make serve

API Documentation Generation

To generate and host the API documentation locally, you need to install APIDoc.
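APIDoc is distributed as an npm package, so a typical installation (assuming Node.js and npm are available) is

npm install -g apidoc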

Generate the project API documentation by running the following command

make gen-apidoc

In order to host the docs locally, you need to collect the static files

make collect-static

Now start the server as described above and point your browser to http://127.0.0.1:8000/apidoc/index.html.

Testing and Linting

Koku uses tox to standardize the environment used when running tests. Essentially, tox manages its own virtual environment and a copy of the required dependencies to run tests. To ensure a clean tox environment, run

tox -r

This will rebuild the tox virtual env and then run all tests.

To run unit tests specifically:

tox -e py36

To lint the code base

tox -e lint

Contributing

Please refer to Contributing.