kafkaMdm sending streams in strange encoding format to Kafka #484
Comments
Hey @Megge1, the message is posted to Kafka encoded in msgpack format. When the message is consumed, it must be unmarshalled into a struct so it makes sense to you. There's a similar discussion here: #281 (comment)
Seems like the format should be documented in the relay-ng docs; it currently isn't.
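For illustration (not part of the original thread), here is a minimal Go sketch of decoding one such payload, assuming the raw message bytes have already been read from Kafka. The struct fields mirror the names visible in the dumped messages (Id, OrgId, Name, Interval, Value, Unit, Time, Mtype, Tags); the generic github.com/vmihailenco/msgpack/v5 decoder is used here purely as an example rather than the schema package the relay itself uses.

package main

import (
	"fmt"

	"github.com/vmihailenco/msgpack/v5"
)

// MetricData mirrors the field names visible in the dumped Kafka messages.
type MetricData struct {
	Id       string   `msgpack:"Id"`
	OrgId    int      `msgpack:"OrgId"`
	Name     string   `msgpack:"Name"`
	Interval int      `msgpack:"Interval"`
	Value    float64  `msgpack:"Value"`
	Unit     string   `msgpack:"Unit"`
	Time     int64    `msgpack:"Time"`
	Mtype    string   `msgpack:"Mtype"`
	Tags     []string `msgpack:"Tags"`
}

// decode turns one raw kafkaMdm message value into a readable struct.
func decode(raw []byte) (*MetricData, error) {
	var md MetricData
	if err := msgpack.Unmarshal(raw, &md); err != nil {
		return nil, err
	}
	return &md, nil
}

func main() {
	var raw []byte // replace with one message value consumed from the topic
	md, err := decode(raw)
	if err != nil {
		fmt.Println("decode failed:", err)
		return
	}
	fmt.Printf("%s = %f at %d (%s)\n", md.Name, md.Value, md.Time, md.Mtype)
}

Once decoded, the metric can be printed, re-serialized to JSON, or handed to whatever downstream tooling expects readable values.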
Hi all,
The messages are
Thank you for your help @Dieterbe, and for the hint. Sorry also for the late reply. I'll check whether I can find a solution by looking into kafkamdm.go.
Hi all, maybe you're thinking now that I could try it myself :-) I'll do it, but it needs some time, because first I have to check how I can change this code and package it again; all of this is new to me. Also, my colleague who owns this server and installed carbon-relay-ng is out of office.
I'm not sure what the intention is in changing the serialization at this level. The messages can be pulled from Kafka without converting them to JSON. The fix could be somewhat unmaintainable if implemented as proposed. Here's a preview of the code I've submitted to the go-carbon repository to add support for msgpack: https://github.com/go-graphite/go-carbon/blob/6ad5b88eb4c5489aad5109bc06a2b2b543031445/receiver/parse/msgpack.go If I can be of help getting things moving on your setup, feel free to reach out to me via email. @gmail.com
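To make the "pull from Kafka and unmarshal" path concrete, here is a hedged sketch of a bare-bones consumer loop (not from the original thread). It assumes the Shopify/sarama client and the topic/partition from the output shown further down in the issue, and it only reports the size of each msgpack payload, leaving the actual unmarshalling into a struct to the earlier decode sketch.

package main

import (
	"log"

	"github.com/Shopify/sarama"
)

func main() {
	// The broker address is a placeholder; topic and partition are taken
	// from the output shown in the issue.
	consumer, err := sarama.NewConsumer([]string{"localhost:9092"}, sarama.NewConfig())
	if err != nil {
		log.Fatal(err)
	}
	defer consumer.Close()

	pc, err := consumer.ConsumePartition("monitoring-eventmgmt.stage.icingametrics-json", 0, sarama.OffsetOldest)
	if err != nil {
		log.Fatal(err)
	}
	defer pc.Close()

	for msg := range pc.Messages() {
		// msg.Value holds the msgpack-encoded metric. Unmarshal it into a
		// struct (as in the earlier decode sketch) instead of printing the
		// raw bytes, which is what produces the unreadable output below.
		log.Printf("offset %d: %d bytes of msgpack", msg.Offset, len(msg.Value))
	}
}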
Thanks a lot for your help @zerosoul13
I guess when you say "proper" maybe you mean "human readable". MessagePack is a binary format, but it's a nice, compact serialization format with libraries in many languages :)
Hi @Dieterbe, yes, for sure I meant "human readable" :-)
Hi all,
I hope I'm in the right place with this question.
Sorry, my know-how with Icinga and carbon-relay-ng in general is very limited :-(
I've tried to set up carbon-relay-ng on my colleague's Icinga server, and so far I was able to send the data streams over to Kafka, but I receive them in a badly encoded format on the Kafka side.
They look like this:
Key: Value:��Id�"1.60bbd43ae111c32e6e25dade557ef919�OrgId�Name�<icinga2.test-mock08_test.host.icmp.perfdata.pl.max�Interval<�Value�@y�Unit�unknown�Time�a�5=�Mtype�gauge�Tags�
Key: Value:��Id�"1.7ef266878bd7f299c9bf4a12272fe3e9�OrgId�Name�Aicinga2.test.host.icmp.perfdata.rtmax.value�Interval<�Value�?o�˯�[!�Unit�unknown�Time�a�5=�Mtype�gauge�Tags�
Key: Value:��Id�"1.2bca5085a41bb2760ab3958cea37d17e�OrgId�Name�Aicinga2.test.host.icmp.perfdata.rtmin.value�Interval<�Value�?j��U����Unit�unknown�Time�a�5=�Mtype�gauge�Tags�
% Reached end of topic monitoring-eventmgmt.stage.icingametrics-json [0] at offset 545145
Any idea where the problem could be?
My carbon-relay-ng.conf settings that could be important for that part are:
[[route]]
#which compression to use. possible values: none, gzip, snappy
codec = 'none'
#possible values are: byOrg, bySeries, bySeriesWithTags, bySeriesWithTagsFnv
partitionBy = 'bySeries'
schemasFile = '/etc/carbon/storage-schemas.conf'
I've tried several different settings for partitionBy and codec, with no improvement.
I would appreciate any help.
Thanks a lot and cheers
Markus