In this case study we take stored stock market data and stream it in real time over Apache Kafka's producer/consumer model into the client's S3 bucket, where it is used further, e.g. for advertising purposes.
A Kafka cluster consists of one or more servers (Kafka brokers) running Kafka.
Producers are processes that push records into Kafka topics within the broker.
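A minimal sketch of the producer side, assuming the third-party kafka-python client and a broker reachable at localhost:9092; the topic name `stock-quotes` and the record fields are placeholders, not taken from the actual project:

```python
import json
from datetime import datetime, timezone

def serialize_quote(symbol: str, price: float, ts: datetime) -> bytes:
    """Encode one stock quote as UTF-8 JSON before pushing it to the topic."""
    return json.dumps({
        "symbol": symbol,
        "price": price,
        "timestamp": ts.isoformat(),
    }).encode("utf-8")

def push_quote(symbol: str, price: float,
               bootstrap: str = "localhost:9092",
               topic: str = "stock-quotes") -> None:
    """Send one serialized quote; needs a reachable broker and kafka-python."""
    from kafka import KafkaProducer  # third-party client, an assumption
    producer = KafkaProducer(bootstrap_servers=bootstrap)
    producer.send(topic, serialize_quote(symbol, price,
                                         datetime.now(timezone.utc)))
    producer.flush()  # block until the record is actually handed to the broker
```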
A consumer pulls records off a Kafka topic.
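The matching consumer side can be sketched as follows, under the same assumptions (kafka-python, local broker, placeholder topic and group names):

```python
import json

def handle_record(raw: bytes) -> dict:
    """Decode one consumed record back into a dict (inverse of the producer side)."""
    return json.loads(raw.decode("utf-8"))

def consume_quotes(bootstrap: str = "localhost:9092",
                   topic: str = "stock-quotes") -> None:
    """Poll the topic and process each quote; needs a broker and kafka-python."""
    from kafka import KafkaConsumer  # third-party client, an assumption
    consumer = KafkaConsumer(topic,
                             bootstrap_servers=bootstrap,
                             group_id="quote-readers",
                             auto_offset_reset="earliest")
    for msg in consumer:  # blocks, yielding records as they arrive
        quote = handle_record(msg.value)
        print(quote["symbol"], quote["price"])
```

Consumers in the same `group_id` share the topic's partitions between them, which is how Kafka scales out reading.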
Kafka Connect makes it easy to stream data from numerous sources into Kafka and from Kafka into numerous sinks, with hundreds of connectors available. Connectors can be used ready-made, but they can also be written from scratch, as we do here.
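For comparison with our hand-written connector, this is roughly what a configuration for the ready-made Confluent S3 sink connector looks like; the connector name, topic, bucket, and region are placeholders:

```json
{
  "name": "s3-sink-stock-quotes",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "stock-quotes",
    "s3.bucket.name": "client-bucket",
    "s3.region": "eu-central-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000",
    "tasks.max": "1"
  }
}
```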
The source database contains the stock market data, which is transferred into the client's S3 bucket.
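The final hop into S3 can be sketched with the AWS SDK; this is a simplified illustration of what a sink does, assuming boto3 and valid AWS credentials, with a key scheme modeled on how sink connectors typically name objects (one object per batch, keyed by topic, partition, and starting offset):

```python
import json

def object_key(topic: str, partition: int, offset: int) -> str:
    """Deterministic S3 key per batch: topic/partition=p/offset=000....json."""
    return f"{topic}/partition={partition}/offset={offset:012d}.json"

def upload_batch(records: list, topic: str, partition: int,
                 start_offset: int, bucket: str) -> str:
    """Write one batch of consumed records to the bucket (needs boto3 + AWS creds)."""
    import boto3  # third-party AWS SDK, an assumption
    key = object_key(topic, partition, start_offset)
    body = "\n".join(json.dumps(r) for r in records).encode("utf-8")
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)
    return key
```

Deterministic keys make uploads idempotent: if a batch is retried after a failure, it overwrites the same object instead of duplicating data.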