Argo Dataflow is intended as a cloud-native, language-agnostic platform for executing large parallel data-processing pipelines composed of many steps, which are often small and homogeneous.

Use cases:
- Real-time "click" analytics
- Anomaly detection
- Fraud detection
- Operational (including IoT) analytics
```bash
pip install git+https://github.com/argoproj-labs/argo-dataflow#subdirectory=dsls/python
```
```python
from argo_dataflow import cron, pipeline

if __name__ == '__main__':
    (pipeline('hello')
     .namespace('argo-dataflow-system')
     .step(
         (cron('*/3 * * * * *')
          .cat('main')
          .log())
     )
     .run())
```
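The DSL above builds a `Pipeline` custom resource and submits it to the cluster: a cron source fires every three seconds, a `cat` step passes each message through unchanged, and a log sink prints it. As a rough sketch (field names are an approximation of the Pipeline CRD, not authoritative), the generated resource looks something like:

```yaml
apiVersion: dataflow.argoproj.io/v1alpha1
kind: Pipeline
metadata:
  name: hello
  namespace: argo-dataflow-system
spec:
  steps:
    - name: main
      cat: {}            # pass each message through unchanged
      sources:
        - cron:
            schedule: "*/3 * * * * *"   # emit a message every 3 seconds
      sinks:
        - log: {}        # print each message to the step's log
```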
Read in order:
Beginner:
Intermediate:
Advanced: