
prometheus exporter convert instrumentation scope to otel_scope_info metric#3357

Merged
MrAlias merged 6 commits into
open-telemetry:mainfrom
fatsheep9146:prometheus_exporter_add_scope_info
Nov 2, 2022

Conversation

@fatsheep9146
Contributor

@fatsheep9146 fatsheep9146 commented Oct 18, 2022

fix #3273
related to #3161

Since #3285 already implements adding the target_info metric to the Prometheus exporter, I think I can wait for that PR to be merged and add the otel_scope_info metric on the basis of that work.
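For context, this is roughly what the new metric would look like in the Prometheus exposition format (the label values and HELP text below are illustrative, not taken from the exporter's actual output; scope name/version label names match the sample output later in this thread):

```text
# HELP otel_scope_info Instrumentation Scope metadata
# TYPE otel_scope_info gauge
otel_scope_info{otel_scope_name="http",otel_scope_version=""} 1
```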

@fatsheep9146 fatsheep9146 force-pushed the prometheus_exporter_add_scope_info branch 2 times, most recently from 4ae23e1 to 6310a13 Compare October 23, 2022 06:54
@codecov

codecov Bot commented Oct 23, 2022

Codecov Report

Attention: Patch coverage is 81.81818% with 8 lines in your changes missing coverage. Please review.

Project coverage is 77.8%. Comparing base (49b62ae) to head (7efab63).
Report is 1849 commits behind head on main.

Files with missing lines Patch % Lines
exporters/prometheus/exporter.go 79.4% 7 Missing and 1 partial ⚠️
Additional details and impacted files


@@          Coverage Diff          @@
##            main   #3357   +/-   ##
=====================================
  Coverage   77.7%   77.8%           
=====================================
  Files        164     164           
  Lines      11457   11488   +31     
=====================================
+ Hits        8913    8942   +29     
- Misses      2342    2343    +1     
- Partials     202     203    +1     
Files with missing lines Coverage Δ
exporters/prometheus/config.go 100.0% <100.0%> (ø)
exporters/prometheus/exporter.go 82.5% <79.4%> (+1.4%) ⬆️

@fatsheep9146 fatsheep9146 force-pushed the prometheus_exporter_add_scope_info branch from 6310a13 to 653b28b Compare October 23, 2022 07:08
@fatsheep9146 fatsheep9146 marked this pull request as ready for review October 23, 2022 07:09
@fatsheep9146 fatsheep9146 force-pushed the prometheus_exporter_add_scope_info branch from e856489 to 1de3ce2 Compare October 25, 2022 01:38
Collaborator

@dashpole dashpole left a comment


Note that this does not fully fix #3161. I believe you also need to implement open-telemetry/opentelemetry-specification#2890

Comment thread CHANGELOG.md Outdated
@fatsheep9146 fatsheep9146 force-pushed the prometheus_exporter_add_scope_info branch from 768cdae to 2abb67e Compare October 25, 2022 10:58
@fatsheep9146
Contributor Author

fatsheep9146 commented Oct 25, 2022

Note that this does not fully fix #3161. I believe you also need to implement open-telemetry/opentelemetry-specification#2890

You are right. I mocked up a test as follows.

If I use two different meters to create two metrics with the same name and scope info enabled, no error occurs:

func TestMetricWithSameName(t *testing.T) {
	exporter, err := New()
	assert.NoError(t, err)

	provider := metric.NewMeterProvider(
		metric.WithReader(exporter),
	)

	httpCounter, err := provider.Meter("http").SyncInt64().Counter("error_count", instrument.WithUnit(unit.Dimensionless))
	assert.NoError(t, err)
	httpCounter.Add(context.TODO(), 1)

	sqlCounter, err := provider.Meter("sql").SyncInt64().Counter("error_count", instrument.WithUnit(unit.Dimensionless))
	assert.NoError(t, err)
	sqlCounter.Add(context.TODO(), 1)

	t.Logf("serving metrics at localhost:2223/metrics")
	http.Handle("/metrics", promhttp.Handler())
	err = http.ListenAndServe(":2223", nil)
	if err != nil {
		t.Fatalf("error serving http: %v", err)
		return
	}
}

curling the metrics port produces the following metrics:

# HELP error_count_ratio_total
# TYPE error_count_ratio_total counter
error_count_ratio_total{otel_scope_name="http",otel_scope_version=""} 1
error_count_ratio_total{otel_scope_name="sql",otel_scope_version=""} 1

But if I disable scope info like this

func TestMetricWithSameName(t *testing.T) {
	exporter, err := New(WithoutScopeInfo())
	... // the rest of the code is unchanged
}

the following error happens:

An error has occurred while serving metrics:

collected metric "error_count_ratio_total" { counter:<value:1 > } was collected before with the same name and label values

I think I can solve this problem in another PR. Is that OK? @dashpole @MrAlias

@dashpole
Collaborator

dashpole commented Oct 25, 2022

If I use two different meters to create two metrics with the same name and scope info enabled, no error occurs

Can you try with a different description on each? I think that is why you aren't reproducing the issue.

Can you also try with different Counter vs UpDownCounter?

@fatsheep9146
Contributor Author

fatsheep9146 commented Oct 26, 2022

Can you try with a different description on each? I think that is why you aren't reproducing the issue.

For two Counters from two different meters with the same name but different descriptions, an error happens. @dashpole

code

...
httpCounter, err := provider.Meter("http").
	SyncInt64().Counter(
		"error_count",
		instrument.WithUnit(unit.Dimensionless),
		instrument.WithDescription("http error count"))
httpCounter.Add(context.TODO(), 1)

sqlCounter, err := provider.Meter("sql").
	SyncInt64().Counter(
		"error_count",
		instrument.WithUnit(unit.Dimensionless),
		instrument.WithDescription("sql error count"))
sqlCounter.Add(context.TODO(), 1)
...

error

collected metric error_count_ratio_total label:<name:"otel_scope_name" value:"sql" > label:<name:"otel_scope_version" value:"" > counter:<value:1 >  has help "sql error count" but should have "http error count"

@fatsheep9146
Contributor Author

fatsheep9146 commented Oct 26, 2022

Can you also try with different Counter vs UpDownCounter?

For one Counter and one UpDownCounter from two different meters with the same name and same description, no error happens. @dashpole

code

httpCounter, err := provider.Meter("http").
  SyncInt64().Counter(
  "error_count",
  instrument.WithUnit(unit.Dimensionless))
httpCounter.Add(context.TODO(), 1)

sqlCounter, err := provider.Meter("sql").
  SyncInt64().UpDownCounter(
  "error_count",
  instrument.WithUnit(unit.Dimensionless))
sqlCounter.Add(context.TODO(), 1)

result

# HELP error_count_ratio
# TYPE error_count_ratio gauge
error_count_ratio{otel_scope_name="sql",otel_scope_version=""} 1
# HELP error_count_ratio_total
# TYPE error_count_ratio_total counter
error_count_ratio_total{otel_scope_name="http",otel_scope_version=""} 1
...

Comment thread CHANGELOG.md Outdated
Comment thread exporters/prometheus/config.go Outdated
Comment thread exporters/prometheus/config.go Outdated
Comment thread exporters/prometheus/exporter.go Outdated
Comment thread exporters/prometheus/exporter.go Outdated
Comment thread exporters/prometheus/exporter.go Outdated
Comment thread exporters/prometheus/exporter.go Outdated
Comment thread exporters/prometheus/exporter.go Outdated
Comment thread exporters/prometheus/exporter.go Outdated
Comment thread exporters/prometheus/exporter_test.go
@MrAlias
Contributor

MrAlias commented Oct 27, 2022

Note that this does not fully fix #3161. I believe you also need to implement open-telemetry/opentelemetry-specification#2890

You are right. I mocked up a test as follows.

If I use two different meters to create two metrics with the same name and scope info enabled, no error occurs:

func TestMetricWithSameName(t *testing.T) {
	exporter, err := New()
	assert.NoError(t, err)

	provider := metric.NewMeterProvider(
		metric.WithReader(exporter),
	)

	httpCounter, err := provider.Meter("http").SyncInt64().Counter("error_count", instrument.WithUnit(unit.Dimensionless))
	assert.NoError(t, err)
	httpCounter.Add(context.TODO(), 1)

	sqlCounter, err := provider.Meter("sql").SyncInt64().Counter("error_count", instrument.WithUnit(unit.Dimensionless))
	assert.NoError(t, err)
	sqlCounter.Add(context.TODO(), 1)

	t.Logf("serving metrics at localhost:2223/metrics")
	http.Handle("/metrics", promhttp.Handler())
	err = http.ListenAndServe(":2223", nil)
	if err != nil {
		t.Fatalf("error serving http: %v", err)
		return
	}
}

curling the metrics port produces the following metrics:

# HELP error_count_ratio_total
# TYPE error_count_ratio_total counter
error_count_ratio_total{otel_scope_name="http",otel_scope_version=""} 1
error_count_ratio_total{otel_scope_name="sql",otel_scope_version=""} 1

But if I disable scope info like this

func TestMetricWithSameName(t *testing.T) {
	exporter, err := New(WithoutScopeInfo())
	... // the rest of the code is unchanged
}

the following error happens:

An error has occurred while serving metrics:

collected metric "error_count_ratio_total" { counter:<value:1 > } was collected before with the same name and label values

I think I can solve this problem in another PR. Is that OK? @dashpole @MrAlias

@dashpole what's the expected behavior here? Just drop the second attempt to collect the second instrument and log an error?

@MrAlias MrAlias added this to the Metric v0.34.0 milestone Oct 27, 2022
@fatsheep9146 fatsheep9146 force-pushed the prometheus_exporter_add_scope_info branch from c5e22cf to c698aaf Compare October 30, 2022 13:28
…metric

Signed-off-by: Ziqi Zhao <zhaoziqi9146@gmail.com>
Signed-off-by: Ziqi Zhao <zhaoziqi9146@gmail.com>
@fatsheep9146 fatsheep9146 force-pushed the prometheus_exporter_add_scope_info branch from f20f64e to dbeea34 Compare November 1, 2022 05:41
Signed-off-by: Ziqi Zhao <zhaoziqi9146@gmail.com>
@dashpole
Copy link
Copy Markdown
Collaborator

dashpole commented Nov 1, 2022

what's the expected behavior here? Just drop the second attempt to collect the second instrument and log an error?

From open-telemetry/opentelemetry-specification#2890

Example approach

  • Keep a map of metric name -> Description when constructing a batch of metrics for a scrape.
  • Each time a new metric (with Description) is added, look up the name in the map:
    • If it isn't found, add it to the map
    • If it is found, and the desc is the same, do nothing
    • If it is found, and has conflicting type information, drop the metric + log a warning
    • If it is found, and has conflicting help or unit, use the stored help + unit instead of the provided one + log a warning.

Basically, we need the description of all metrics with the same name to match.
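The steps above can be sketched in Go. This is a minimal illustration of the dedup rule, not the exporter's actual internals; the `metricDesc` type and `dedup` function are hypothetical names, and the real exporter would work with Prometheus descriptor objects instead:

```go
package main

import "fmt"

// metricDesc holds the identifying information compared when deduplicating
// metrics within a single scrape batch (hypothetical type for illustration).
type metricDesc struct {
	help       string
	metricType string
}

// dedup applies the approach described above: the first description seen for
// a name wins; a conflicting type drops the metric; conflicting help text
// falls back to the stored help. It returns the description to use and
// whether the metric should be kept.
func dedup(seen map[string]metricDesc, name string, d metricDesc) (metricDesc, bool) {
	stored, ok := seen[name]
	if !ok {
		seen[name] = d
		return d, true
	}
	if stored.metricType != d.metricType {
		// Conflicting type information: drop the metric (and log a warning).
		return metricDesc{}, false
	}
	if stored.help != d.help {
		// Conflicting help: use the stored help instead of the provided one
		// (and log a warning).
		return stored, true
	}
	return d, true
}

func main() {
	seen := map[string]metricDesc{}

	a, okA := dedup(seen, "error_count", metricDesc{"http error count", "counter"})
	b, okB := dedup(seen, "error_count", metricDesc{"sql error count", "counter"})
	_, okC := dedup(seen, "error_count", metricDesc{"http error count", "gauge"})

	fmt.Println(okA, a.help) // first registration is kept as-is
	fmt.Println(okB, b.help) // conflicting help falls back to the stored help
	fmt.Println(okC)         // conflicting type drops the metric
}
```

With this rule, the "has help X but should have Y" collision from the earlier test would be resolved by emitting both series under the first help string seen, rather than failing the scrape.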

Comment thread exporters/prometheus/exporter_test.go Outdated
@MrAlias
Contributor

MrAlias commented Nov 1, 2022

what's the expected behavior here? Just drop the second attempt to collect the second instrument and log an error?

From open-telemetry/opentelemetry-specification#2890

Example approach

  • Keep a map of metric name -> Description when constructing a batch of metrics for a scrape.

  • Each time a new metric (with Description) is added, look up the name in the map:

    • If it isn't found, add it to the map
    • If it is found, and the desc is the same, do nothing
    • If it is found, and has conflicting type information, drop the metric + log a warning
    • If it is found, and has conflicting help or unit, use the stored help + unit instead of the provided one + log a warning.

Basically, we need the description of all metrics with the same name to match.

👍

@dashpole are you good with merging this and then adding that tracking in another PR? That seems to make sense to me.

Signed-off-by: Ziqi Zhao <zhaoziqi9146@gmail.com>
Signed-off-by: Ziqi Zhao <zhaoziqi9146@gmail.com>
@dashpole
Collaborator

dashpole commented Nov 2, 2022

@dashpole are you good with merging this and then adding that tracking in another PR? That seems to make sense to me.

Yes.

@MrAlias MrAlias merged commit e8023fa into open-telemetry:main Nov 2, 2022
@MrAlias MrAlias added the area:metrics Part of OpenTelemetry Metrics label Jun 24, 2025

Labels

area:metrics Part of OpenTelemetry Metrics

Projects

No open projects
Status: Done

Development

Successfully merging this pull request may close these issues.

Add Instrumentation Scope and Version as info metric and label in Prometheus exporter

3 participants