Releases: streamdal/plumber
Relay safe shutdown mode
When using relay mode, plumber will now ensure all relay messages are sent to Batch before exiting the application
Plumber As Replay Destination
Plumber's new dynamic destination mode allows it to act as a replay destination for your batch collections!
See https://docs.batch.sh/what-are/what-are-destinations/plumber-as-a-destination for more information
NSQ Support
- Support for reading and writing NSQ messages
- Support for relaying NSQ messages to your Batch collections
Headers, pretty output, protobuf fixes
Plumber release 0.26.0
Lots of good stuff here.
- Ability to specify headers when writing to Kafka
- Dramatically improved `read` output
- Added a `--json` flag which will attempt to format and colorize output that is valid JSON
- Kafka relay now includes headers
- Protobuf root message type must now include the full name with packages (i.e. `events.Message` instead of `Message`)
- `--line-numbers` has been removed; the message count is now always shown
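As a sketch of the improved read output (the `--json` and `--topic` flags are from this release; the subcommand layout, broker address, and topic name are illustrative placeholders):

```shell
# Read from Kafka and pretty-print/colorize any messages that are valid JSON
plumber read kafka --address localhost:9092 --topic orders --json
```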
Fixing grpc max message size for relayed messages
- Fixed the gRPC max message size for relayed messages (PR #117)
List batch replays and JSON output for batch resources
- Listing of replays has been added: `plumber batch list replay`
- `plumber batch list *` commands now support an `--output json` flag to display output as JSON instead of a table
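For example, the listing command named above could be combined with the new flag like so (both commands are taken directly from this release's notes):

```shell
# Default table output
plumber batch list replay

# Same listing rendered as JSON
plumber batch list replay --output json
```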
Adding batch archive replay command
- Added support for archiving Batch replays (PR #113)
Support for multiple kafka topics
It is now possible to read from (and relay) multiple Kafka topics at the same time.
To read or relay from multiple topics, specify multiple `--topic` flags (or separate topic names with commas, if using env vars).
Also:
- Fixed a relay recovery bug that could occur if a Kafka broker temporarily went away
- Fixed a stats display bug (the relay type was not included in the output)
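A sketch of multi-topic reading using the repeated `--topic` flag described above (the broker address and topic names are placeholders, not from the release notes):

```shell
# Repeat --topic once per topic to consume from all of them at once
plumber read kafka --address localhost:9092 --topic orders --topic payments
```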
Pulsar support and MQTT relaying
- Added support for Apache Pulsar
- You can now relay MQTT messages to a Batch.sh collection
Support for multiple kafka brokers
- Kafka read/write: multiple Kafka brokers can now be specified by repeating the `--address` flag for each broker
- Kafka relay: multiple Kafka brokers can now be specified by separating them with commas. Example: `PLUMBER_RELAY_KAFKA_ADDRESS="broker1.domain.com:9092,broker2.domain.com:9092"`
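Putting both forms together, a sketch (the broker hostnames come from the example above; the topic name is a placeholder, and the relay process is assumed to read the env var at startup):

```shell
# Read/write: repeat --address once per broker
plumber read kafka \
  --address broker1.domain.com:9092 \
  --address broker2.domain.com:9092 \
  --topic orders

# Relay: comma-separate brokers in the env var shown in the notes
export PLUMBER_RELAY_KAFKA_ADDRESS="broker1.domain.com:9092,broker2.domain.com:9092"
```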