data race in the OTel tracing integration #960

Open
andreimatei opened this issue Jan 31, 2025 · 1 comment

@andreimatei

Problem Statement

I've seen the data race below, seemingly coming from the Sentry tracing code invoked through the OTel tracer (via the Sentry span processor), which is in turn used through the otelgrpc auto-instrumentation. Judging by the two stacks, the Sentry span created in OnStart is written on the goroutine that creates the gRPC stream attempt (goroutine 90) and later read, while the finished transaction is being captured, on the goroutine that completes the attempt (goroutine 63).

WARNING: DATA RACE
Read at 0x00c000688000 by goroutine 63:
  github.com/getsentry/sentry-go.DynamicSamplingContextFromTransaction()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/dynamic_sampling_context.go:55 +0x118
  github.com/getsentry/sentry-go.(*Scope).ApplyToEvent()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/scope.go:413 +0x15c4
  github.com/getsentry/sentry-go.(*Client).prepareEvent()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/client.go:685 +0x834
  github.com/getsentry/sentry-go.(*Client).processEvent()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/client.go:612 +0x6ec
  github.com/getsentry/sentry-go.(*Client).CaptureEvent()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/client.go:444 +0x91
  github.com/getsentry/sentry-go.(*Hub).CaptureEvent()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/hub.go:225 +0x7b
  github.com/getsentry/sentry-go.(*Span).doFinish()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/tracing.go:372 +0x2b8
  github.com/getsentry/sentry-go.(*Span).doFinish-fm()
      <autogenerated>:1 +0x33
  sync.(*Once).doSlow()
      GOROOT/src/sync/once.go:78 +0xe1
  sync.(*Once).Do()
      GOROOT/src/sync/once.go:69 +0x44
  github.com/getsentry/sentry-go.(*Span).Finish()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go/tracing.go:212 +0x28f
  github.com/getsentry/sentry-go/otel.(*sentrySpanProcessor).OnEnd()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go_otel/span_processor.go:85 +0x238
  go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End()
      external/gazelle~~go_deps~io_opentelemetry_go_otel_sdk/trace/span.go:417 +0x9b6
  go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.(*config).handleRPC()
      external/gazelle~~go_deps~io_opentelemetry_go_contrib_instrumentation_google_golang_org_grpc_otelgrpc/stats_handler.go:192 +0x89e
  go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.(*clientHandler).HandleRPC()
      external/gazelle~~go_deps~io_opentelemetry_go_contrib_instrumentation_google_golang_org_grpc_otelgrpc/stats_handler.go:113 +0x64
  google.golang.org/grpc.(*csAttempt).finish()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:1211 +0x672
  google.golang.org/grpc.(*clientStream).finish()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:1030 +0x23c
  google.golang.org/grpc.newClientStreamWithParams.func4()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:397 +0x114

Previous write at 0x00c000688000 by goroutine 90:
  github.com/getsentry/sentry-go/otel.(*sentrySpanProcessor).OnStart()
      external/gazelle~~go_deps~com_github_getsentry_sentry_go_otel/span_processor.go:54 +0x706
  go.opentelemetry.io/otel/sdk/trace.(*tracer).Start()
      external/gazelle~~go_deps~io_opentelemetry_go_otel_sdk/trace/tracer.go:48 +0x369
  go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.(*clientHandler).TagRPC()
      external/gazelle~~go_deps~io_opentelemetry_go_contrib_instrumentation_google_golang_org_grpc_otelgrpc/stats_handler.go:96 +0x32c
  google.golang.org/grpc.(*clientStream).newAttemptLocked()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:418 +0x817
  google.golang.org/grpc.(*clientStream).withRetry()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:781 +0xad
  google.golang.org/grpc.newClientStreamWithParams()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:363 +0x172b
  google.golang.org/grpc.newClientStream.func3()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:220 +0x191
  google.golang.org/grpc.newClientStream()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:255 +0xc47
  google.golang.org/grpc.(*ClientConn).NewStream()
      external/gazelle~~go_deps~org_golang_google_grpc/stream.go:170 +0x2b4
  github.com/DataExMachina-dev/side-eye/data/exclient.(*tenantWrappedClientConn).NewStream()
      data/exclient/ex_client.go:571 +0x17b
  github.com/DataExMachina-dev/side-eye/data/expb.(*exClient).GetUpdates()
      data/expb/ex_grpc.pb.go:47 +0xeb
  github.com/DataExMachina-dev/side-eye/data/exclient.runLoop.func2.1()
      data/exclient/ex_client.go:191 +0x243

Using:

github.com/getsentry/sentry-go v0.31.1
github.com/getsentry/sentry-go/otel v0.31.1
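
For context, the tracing stack is wired up roughly as in the sketch below. This is a minimal reconstruction rather than our exact code; the DSN, the target address, and the use of grpc.NewClient are placeholders/assumptions, but the Sentry span processor and the otelgrpc stats handler are the pieces that appear in the race report.

package main

import (
	"log"

	"github.com/getsentry/sentry-go"
	sentryotel "github.com/getsentry/sentry-go/otel"
	"go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc"
	"go.opentelemetry.io/otel"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Initialize Sentry with tracing enabled.
	if err := sentry.Init(sentry.ClientOptions{
		Dsn:              "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder
		EnableTracing:    true,
		TracesSampleRate: 1.0,
	}); err != nil {
		log.Fatal(err)
	}

	// Route OTel spans into Sentry via the span processor from sentry-go/otel.
	tp := sdktrace.NewTracerProvider(
		sdktrace.WithSpanProcessor(sentryotel.NewSentrySpanProcessor()),
	)
	otel.SetTracerProvider(tp)
	otel.SetTextMapPropagator(sentryotel.NewSentryPropagator())

	// gRPC client using the otelgrpc stats handler; it starts an OTel span
	// (and, through the span processor, a Sentry span) when a stream attempt
	// is created and ends it when the attempt finishes.
	conn, err := grpc.NewClient("localhost:50051", // placeholder target
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithStatsHandler(otelgrpc.NewClientHandler()),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	// Streaming RPCs issued on conn exercise the OnStart/Finish paths shown
	// in the race report above.
}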

Solution Brainstorm

I don't know.

@cleptric

cleptric commented Feb 5, 2025

@ribice can you please take a look?
