This repository was archived by the owner on May 19, 2023. It is now read-only.
- `broker`* - Host and port where the Kafka broker is running
- `group_id`* - Kafka [group id](https://docs.confluent.io/current/installation/configuration/consumer-configs.html#group.id) that uniquely identifies the streamz data consumer
- `input_topic` - Name of the input topic to which the input dataset is sent. Any name can be used here.
- `output_topic` - Name of the output topic to which the output data is sent. Any name can be used here.
- `model_file` - Path to your model file
- `label_file` - Path to your label file
- `cuda_visible_devices` - List of GPUs used to run streamz with Dask. The GPUs should be equal to, or a subset of, the devices indicated in the `docker run` command (in the example above the device list is set to `'"device=0,1,2"'`)
- `poll_interval`* - Interval (in seconds) at which to poll the Kafka input topic for data
- `max_batch_size`* - Maximum batch size of data (maximum number of logs) to ingest into streamz with each `poll_interval`
- `data` - The input dataset to use for this streamz example

`*` = More information on these parameters can be found in the streamz [documentation](https://streamz.readthedocs.io/en/latest/api.html#streamz.from_kafka_batched).
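As a rough sketch, the starred parameters feed directly into streamz's `Stream.from_kafka_batched` source. The broker address, group id, and values below are placeholders for illustration, not defaults taken from this example:

```python
# Consumer configuration passed through streamz to the Kafka client.
# "localhost:9092" and "streamz-cybert" are placeholder values.
consumer_conf = {
    "bootstrap.servers": "localhost:9092",  # `broker`
    "group.id": "streamz-cybert",           # `group_id`
    "session.timeout.ms": 60000,
}

# Keyword arguments corresponding to the parameters described above.
source_kwargs = {
    "poll_interval": "1s",    # `poll_interval`
    "max_batch_size": 1000,   # `max_batch_size`
    "dask": True,             # scatter batches across the Dask workers
}

# With a broker running, the stream would be built along these lines:
#   from streamz import Stream
#   source = Stream.from_kafka_batched("input", consumer_conf, **source_kwargs)
```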
View the data processing activity on the Dask dashboard by visiting `localhost:8787` or `<host>:8787`

View the cyBERT script output in the container logs:

```
docker logs cybert-streamz
```
Processed data will be pushed to the Kafka topic named `output`. To view all processed output run:
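The consumer command itself is cut off in this excerpt; one plausible way to inspect the topic, assuming the standard Kafka CLI tools are available on the broker host and the broker address used earlier, is:

```shell
# Read every message from the `output` topic, starting at the beginning.
# The broker address is a placeholder; substitute the value passed as `broker`.
kafka-console-consumer --bootstrap-server localhost:9092 \
    --topic output \
    --from-beginning
```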