Dockerized ML solution using MiniBatchKMeans for predictions

Nitin-pgmr/live_stream_kafka_app

Streaming Static and Live Data API

Introduction

The app automates a machine learning pipeline by applying the MiniBatchKMeans algorithm to the Iris dataset to predict three species of flowers: setosa (0), versicolor (1), and virginica (2).
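
As a rough sketch of how such a model could be trained and pickled (the output file name Finalized_model matches the repository layout; the actual training script is not part of this README, so the details below are assumptions):

# Sketch: train MiniBatchKMeans on the Iris dataset and pickle the model.
# Note: k-means cluster indices are not guaranteed to match the species
# label order (0, 1, 2) without a remapping step.
import pickle

from sklearn.cluster import MiniBatchKMeans
from sklearn.datasets import load_iris

iris = load_iris()
model = MiniBatchKMeans(n_clusters=3, random_state=42)
model.fit(iris.data)

with open("Finalized_model", "wb") as f:
    pickle.dump(model, f)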

Functionality

The app provides a Swagger UI front end that accepts single user inputs, CSV file uploads, and a live streaming API, and produces live predictions.
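
A minimal sketch of what the single-record prediction endpoint behind the Swagger UI might look like, here using flask-restplus (the route, parameter names, and Swagger library are assumptions; the actual app.py may differ):

# Sketch: single-record prediction endpoint exposed through Swagger.
# Route and parameter names are assumptions, not the repository's exact API.
import pickle

from flask import Flask, request
from flask_restplus import Api, Resource

app = Flask(__name__)
api = Api(app)  # serves the Swagger UI at "/"

with open("Finalized_model", "rb") as f:
    model = pickle.load(f)

@api.route("/predict")
class Predict(Resource):
    @api.doc(params={"sepal_length": "float", "sepal_width": "float",
                     "petal_length": "float", "petal_width": "float"})
    def get(self):
        row = [[float(request.args["sepal_length"]),
                float(request.args["sepal_width"]),
                float(request.args["petal_length"]),
                float(request.args["petal_width"])]]
        return {"prediction": int(model.predict(row)[0])}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)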

File Structure

|src -- contains the main app to be run
   app -- main application file
      |template -- default templates
         |index.html -- welcome page
|Dockerfile -- used to spin up the Docker container
|Producer.py -- Kafka producer script
|Finalized_model -- pickled model file
|requirements.txt -- Python dependencies
|testinput_file.csv -- sample inputs fed to the app during development

Procedure

Start ZooKeeper.
Start Kafka.
Modify the host addresses for Kafka and ZooKeeper, since Docker Toolbox was used (the brokers run inside the Toolbox VM, so localhost will not work).

pip install -r requirements.txt
Run the producer on a topic (see the producer sketch after these steps).
python app.py
Access the app in the browser.
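
A sketch of the producer step, assuming kafka-python and the sample CSV shipped with the repository (the broker address and topic name are assumptions; with Docker Toolbox the broker is reachable at the Toolbox VM IP rather than localhost):

# Sketch: publish rows of testinput_file.csv to a Kafka topic.
# Broker address ("192.168.99.100:9092", the Docker Toolbox default VM IP)
# and topic name ("jts") are assumptions; adjust them to your setup.
import csv
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="192.168.99.100:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

with open("testinput_file.csv") as f:
    for row in csv.reader(f):  # assumes numeric feature rows, no header
        producer.send("jts", row)
        time.sleep(1)  # pace the messages to simulate a live stream

producer.flush()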

Accessing the live stream input request

The live stream endpoint is based on a generator and will not render in the Swagger UI; however, it can be accessed directly in the browser.

For example: http://127.0.0.1:5000/predictlive?topic_name=jts
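
A sketch of how such a generator-based endpoint could be implemented with a KafkaConsumer and a Flask streaming response (the route and topic_name query parameter match the example URL above; the broker address and message format are assumptions):

# Sketch: stream predictions for records consumed from a Kafka topic.
# Assumes each message is a list of the four numeric feature values.
import json
import pickle

from flask import Flask, Response, request
from kafka import KafkaConsumer

app = Flask(__name__)

with open("Finalized_model", "rb") as f:
    model = pickle.load(f)

@app.route("/predictlive")
def predictlive():
    topic = request.args["topic_name"]

    def generate():
        consumer = KafkaConsumer(
            topic,
            bootstrap_servers="192.168.99.100:9092",  # assumption: Toolbox VM IP
            value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        )
        for message in consumer:
            features = [[float(x) for x in message.value]]
            yield str(model.predict(features)[0]) + "\n"

    return Response(generate(), mimetype="text/plain")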
