
Kamon is creating thousands of PassivateIntervalTick #1254

Open
Symbianx opened this issue Jan 27, 2023 · 5 comments

@Symbianx (Contributor)
Hello,

We're using Kamon with our Akka Cluster applications, and after updating from 2.4.1 to 2.6.0 we noticed tens of thousands of spans named tell(PassivateIntervalTick$) being created, one every 20 milliseconds.

The spans start appearing right after a tell(ShardStarted) span is created, which leads me to believe this is some sort of internal Akka poller.

See the screenshot for what I mean:

[screenshot: trace view showing repeated tell(PassivateIntervalTick$) spans]

The list keeps growing, and the trace is now 11 hours long.

Because there are so many (useless) spans, the trace becomes unreadable and very hard to analyse. This also puts a strain on trace storage, and at some point the trace becomes too big to retrieve at all.

Is this the intended behaviour? If so, is there a way to disable these spans?

@ivantopo (Contributor)

Hey @Symbianx, this is definitely not intended to happen! Did you manage to pinpoint whether any of the intermediate versions introduces this problem?

Also, can you please share your Kamon-related settings? This looks as if a trace got started in a Scheduler thread and then kept growing as ticks were sent and generated spans, but I've never seen this happen before.

Pay special attention to this setting: https://github.com/kamon-io/Kamon/blob/master/instrumentation/kamon-akka/src/common/resources/reference.conf#L92-L95 and ensure the actors generating these spans are not included there.
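For reference, the linked setting is the actor trace filter in kamon-akka. A sketch of what an exclusion could look like (the actor path pattern is illustrative, not taken from this issue):

```hocon
kamon.instrumentation.akka.filters.actors.trace {
  # Exclude sharding-internal actors from trace span generation.
  # "MyActorSystem" and the path pattern below are placeholders.
  excludes += "MyActorSystem/system/sharding/**"
}
```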

Also, could you share what is the first Span in this trace?

@Symbianx (Contributor, Author)

Hey @ivantopo, thanks for getting back to me. We had tried updating Kamon before the 2.6.0 release and hit the same issue, so it must have been introduced in one of the earlier releases.

The trace starts as part of an HTTP POST, so not a scheduler thread :/

We use a pretty much default kamon config:

kamon.environment.service = ${?OTEL_SERVICE_NAME}

kamon.instrumentation.kafka.client.tracing.propagator = "w3c"

kamon.instrumentation.logback.mdc.copy {
  entries = ["REDACTED", "REDACTED", "REDACTED"]
}

kamon.trace {
  sampler = "always"
  identifier-scheme = double
}

kamon.propagation.http.default.entries {
  incoming {
    span = "w3c"
  }
  outgoing {
    span = "w3c"
  }
}

kanela.modules.annotation {
  within += "taxonomy.*"
}

kamon.prometheus {
  embedded-server {
    hostname = 0.0.0.0
    port = 9095
  }
}

And here is the trace just before the ticks start:

[screenshot: trace view immediately preceding the PassivateIntervalTick spans]

@ivantopo (Contributor)

My guess is that v2.5.0 is the one introducing this problem for you, because that's the release that added support for context propagation in the Akka Scheduler.

Could you please verify that v2.5.0 is the version causing the problems? And do you have any scheduler-related calls in your codebase while handling the HTTP endpoint that starts the trace?

@Symbianx (Contributor, Author)

Symbianx commented Feb 3, 2023

Yes, 2.5.0 is the version that brings the problems.

We don't use the scheduler directly, so we dug deeper into the actor system and determined that this is caused by the automatic passivation of entities.

By default, Akka uses an idle entity passivation strategy, which appears to use the scheduler to keep track of which entities are idle.

We switched to the recommended Active Entity Limit strategy, which no longer generates the PassivateIntervalTick spans, so I guess that's the workaround for this issue.
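For anyone else hitting this, a sketch of the configuration change, based on the Akka Cluster Sharding passivation settings (the limit value is illustrative and should be tuned for your workload):

```hocon
akka.cluster.sharding.passivation {
  # Switch from the idle-timeout default to the active-entity-limit
  # based strategy, which does not rely on periodic scheduler ticks.
  strategy = default-strategy
  default-strategy {
    active-entity-limit = 100000
  }
}
```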

I think it is still undesirable to get so many spans from Kamon while running with the default strategy. What are your thoughts on this?

@ivantopo (Contributor)

ivantopo commented Feb 6, 2023

@Symbianx I totally agree with you that Kamon shouldn't be generating these spans out of the box. I'll look into this and let you know when there is something to test.

@ivantopo ivantopo self-assigned this Feb 6, 2023