This repository demonstrates how to use the IBM MQ connector. Two source connectors are started: a Datagen connector that mocks clickstream data, and an IBM MQ source connector. KSQL is then used to join the two streams. No sink connector is configured.
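For reference, a configuration submitted to the Kafka Connect REST API might look roughly like the sketch below. This is only an illustration: it assumes the open-source ibm-messaging `kafka-connect-mq-source` connector and the IBM MQ developer defaults (queue manager `MQ1`, queue `DEV.QUEUE.1`); the host, channel, and credentials are placeholders, and the actual configuration applied by `make connect` may differ.

```sh
# Hedged sketch only: register an IBM MQ source connector via the Connect
# REST API (assumed at localhost:8083). Property names follow the
# ibm-messaging kafka-connect-mq-source connector; connection details and
# credentials are illustrative placeholders, not the repo's real values.
curl -s -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
  "name": "ibmmq-source",
  "config": {
    "connector.class": "com.ibm.eventstreams.connect.mqsource.MQSourceConnector",
    "topic": "ibmmq",
    "mq.queue.manager": "MQ1",
    "mq.connection.name.list": "ibmmq(1414)",
    "mq.channel.name": "DEV.APP.SVRCONN",
    "mq.queue": "DEV.QUEUE.1",
    "mq.user.name": "app",
    "mq.password": "passw0rd",
    "mq.record.builder": "com.ibm.eventstreams.connect.mqsource.builders.DefaultRecordBuilder"
  }
}'
```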
```sh
make build
make cluster
# wait a minute for the cluster to spin up
make topic
make connect
# wait a minute before moving on to the next step
```
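To confirm the connectors were created and are running before continuing, you can query the Kafka Connect REST API (assumed here to be on localhost:8083; the connector name in the second command is a placeholder, use the names returned by the first command).

```sh
# List the connectors created by `make connect`, then check one connector's status.
curl -s http://localhost:8083/connectors
curl -s http://localhost:8083/connectors/<connector-name>/status
```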
```
UserName=admin
Password=passw0rd
```
Go to the link below to view the AVRO schema the datagen connector registered with Schema Registry.
You need to send a message to IBM MQ before the schema will appear on the `ibmmq` topic in C3:
- Select `DEV.QUEUE.1` under "Queues on MQ1"
- Add a message
- You can now see the schema assigned to the `ibmmq` topic
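If you prefer the command line over the UI, the registered schemas can also be fetched directly from Schema Registry. This sketch assumes Schema Registry is reachable on localhost:8081 and the default `<topic>-value` subject naming; adjust the subject names to whatever the first command returns.

```sh
# List all registered subjects, then fetch the latest Avro schema for the
# ibmmq topic (available once a message has been sent to DEV.QUEUE.1).
curl -s http://localhost:8081/subjects
curl -s http://localhost:8081/subjects/ibmmq-value/versions/latest
```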
Run the `ibmmq` consumer to see messages coming in from `DEV.QUEUE.1`:

```sh
make consumer
```
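For reference, `make consumer` presumably wraps something along these lines (an assumption about the Makefile; broker and Schema Registry addresses are placeholders):

```sh
# Read Avro records from the ibmmq topic, deserializing with Schema Registry.
kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --topic ibmmq \
  --from-beginning
```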
Make sure to leave the timestamp blank so that the topic timestamp is used by default.
You can use the user names `bobk_43` or `akatz1022` to capture clickstreams for those users with a KSQL join.
This time we will use KSQL to create the stream. Paste the following statement into the KSQL Editor:
```sql
CREATE STREAM ibmmq
  WITH (KAFKA_TOPIC='ibmmq',
        VALUE_FORMAT='AVRO');
```
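If you would rather not use the C3 Editor, the same statement can be submitted to the KSQL server's `/ksql` REST endpoint. This is a sketch of an alternative path, not part of the repo's Makefile, and it assumes the KSQL server is on localhost:8088.

```sh
# Submit the CREATE STREAM statement through the KSQL REST API.
curl -s -X POST http://localhost:8088/ksql \
  -H "Content-Type: application/vnd.ksql.v1+json; charset=utf-8" \
  -d "{\"ksql\": \"CREATE STREAM ibmmq WITH (KAFKA_TOPIC='ibmmq', VALUE_FORMAT='AVRO');\", \"streamsProperties\": {}}"
```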
Paste the following KSQL statement into the KSQL Editor to perform the join:
```sql
SELECT * FROM CLICKSTREAM
  JOIN IBMMQ WITHIN 5 SECONDS
  ON text = username;
```
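The same join can also be run from a terminal as a streaming query against the KSQL server's `/query` endpoint (again assuming localhost:8088; results stream back until you cancel with Ctrl-C).

```sh
# Run the join as a continuous query; the offset-reset property makes it
# read both topics from the beginning.
curl -s -X POST http://localhost:8088/query \
  -H "Content-Type: application/vnd.ksql.v1+json; charset=utf-8" \
  -d "{\"ksql\": \"SELECT * FROM CLICKSTREAM JOIN IBMMQ WITHIN 5 SECONDS ON text = username;\", \"streamsProperties\": {\"ksql.streams.auto.offset.reset\": \"earliest\"}}"
```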