Replies: 1 comment
-
Hey @atennapel! 👋 I'm here to help you with any bugs, questions, or contributions you have. Let's get to the bottom of this together! I found a similar closed issue where Jaeger was experiencing out-of-memory (OOM) problems when using Badger as a storage backend. The suggested solution was to use Cassandra or Elasticsearch for persistent data, as they are recommended for scaling deployments [1]. Additionally, there are configuration options in Jaeger that can help manage memory usage when using Badger. You might consider adjusting the following settings:
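The specific settings were not captured in this thread, so as an illustrative sketch only: Jaeger's all-in-one binary exposes a handful of Badger-related flags that affect on-disk growth and maintenance. The flag names below are from recent Jaeger 1.x builds; verify them with `jaeger-all-in-one --help` for your version, and the paths are placeholders.

```shell
# Hypothetical sketch: Badger-backed Jaeger with persistent directories,
# a span TTL to bound the dataset, and regular maintenance (GC) runs.
# Verify flag names against your Jaeger version before relying on them.
SPAN_STORAGE_TYPE=badger ./jaeger-all-in-one \
  --badger.ephemeral=false \
  --badger.directory-key=/data/badger/key \
  --badger.directory-value=/data/badger/value \
  --badger.span-store-ttl=72h \
  --badger.maintenance-interval=5m
```

A shorter `--badger.span-store-ttl` keeps the store smaller, which directly limits how much data a broad UI search can pull into memory.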
These adjustments might help improve the performance and stability of your Jaeger setup with Badger.
-
I have Jaeger set up with Badger and am running a performance test that creates traces. This results in about 5 GB of Badger data (measured as the Badger directory size minus the .vlog file). While the test is running, Jaeger's memory usage increases, but not by much, and app performance is barely affected; I am very happy with this.
The problem: after the test is done, when I try to look at the created spans in the Jaeger UI, Jaeger's memory usage keeps climbing, up to 10 GB, and then the UI crashes. I cannot view any spans.
The only way I have found to avoid this is to enable the trace ID ratio sampler and set it to 15% (20% still causes crashes). However, if I do that, the search results become uninteresting.
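For context on the workaround above: if the application is instrumented with an OpenTelemetry SDK, the trace ID ratio sampler can be selected through the spec-defined SDK environment variables (this assumes an OTel SDK; other clients configure sampling differently). A 15% ratio would look like:

```shell
# OpenTelemetry SDK sampler configuration via spec-defined env vars:
# sample a trace with probability 0.15 based on its trace ID.
export OTEL_TRACES_SAMPLER=traceidratio
export OTEL_TRACES_SAMPLER_ARG=0.15
```

Because the decision is derived from the trace ID, all spans of a sampled trace are kept together, so the traces that do appear in the UI remain complete.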
Is 5 GB of span data already too much for the Jaeger search UI to handle? Or am I doing something wrong?