
fix ut in pypy #2809

Merged

Conversation

gongbenhua
Contributor

Description

Fix unit tests for PyPy.

Fixes #2410

Type of change

Please delete options that are not relevant.

  • [√] Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

The PyPy unit tests pass after this change.

Does This PR Require a Core Repo Change?

  • No.

Checklist:

See contributing.md for styleguide, changelog guidelines, and more.

  • Followed the style guidelines of this project
  • Changelogs have been updated
  • Unit tests have been added
  • Documentation has been updated


linux-foundation-easycla bot commented Aug 21, 2024

CLA Signed

The committers listed above are authorized under a signed CLA.

@gongbenhua gongbenhua closed this Aug 21, 2024
@gongbenhua gongbenhua reopened this Aug 21, 2024
@lzchen
Contributor

lzchen commented Aug 23, 2024

@xrmx

Is the original issue even valid anymore? I have not seen flaky fastapi tests for a while.

@xrmx
Contributor

xrmx commented Aug 25, 2024

@xrmx

Is the original issue even valid anymore? I have not seen flaky fastapi tests for a while.

Yeah, the delta has already been bumped in da78275.

Closing.

@xrmx xrmx closed this Aug 25, 2024
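The "delta" being discussed is the tolerance passed to `assertAlmostEqual` when the tests compare a wall-clock measurement against the duration recorded by the instrumentation. A minimal, self-contained sketch of that pattern (the class name, the stand-in workload, and the 40 ms tolerance are illustrative, not the project's actual code):

```python
import unittest
from timeit import default_timer


class TimingToleranceExample(unittest.TestCase):
    # Illustrative stand-in for the fastapi instrumentation tests:
    # measure elapsed wall-clock time in milliseconds and compare it
    # to a recorded value within a fixed tolerance (delta).
    def test_duration_within_delta(self):
        start = default_timer()
        _ = sum(range(100_000))  # stand-in for the instrumented request
        duration_ms = max(round((default_timer() - start) * 1000), 0)
        recorded_ms = duration_ms  # stand-in for the histogram point.sum
        # On a slow interpreter (PyPy JIT warm-up, a loaded CI host) the
        # gap between the two measurements grows, so the delta must be
        # generous enough to absorb scheduling jitter.
        self.assertAlmostEqual(duration_ms, recorded_ms, delta=40)
```

Bumping the delta trades assertion precision for robustness on slower runtimes, which is the fix the thread refers to.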
@gongbenhua
Contributor Author

gongbenhua commented Aug 26, 2024

Hi, @lzchen @xrmx @emdneto
Thanks for your reply, but I think there might be a misunderstanding.
This PR is intended to address issue #2410.
As I understand it, that issue is about making all the tests pass under PyPy, but the main branch still fails the PyPy tests.
You can try it with "tox -e pypy3-test-opentelemetry-instrumentation-fastapi".
I just tested it locally, and 9 tests still fail (this is exactly what my PR aims to address).
Since the PyPy tests still don't pass, I think my PR is needed. Could you run the tests and reopen it?

@gongbenhua
Contributor Author


If you consider issue #2410 fixed as originally scoped, can I instead open a new issue for the failing "pypy3-test-opentelemetry-instrumentation-fastapi" tests and a new PR to fix them? Thanks a lot.

@xrmx
Contributor

xrmx commented Aug 26, 2024

@gongbenhua tests are working fine here locally and we haven't seen any failure in the CI. The code that was failing has already been updated to take a bigger delta. What failures do you have?

@gongbenhua
Contributor Author

gongbenhua commented Aug 26, 2024

@xrmx Here are the failures I got. There are 9 failures in total for "tox -e pypy3-test-opentelemetry-instrumentation-fastapi". Due to length limitations, I've only pasted two of them.

_______________________________________________ TestFastAPIManualInstrumentation.test_basic_post_request_metric_success_both_semconv ________________________________________________

self = <tests.test_fastapi_instrumentation.TestFastAPIManualInstrumentation testMethod=test_basic_post_request_metric_success_both_semconv>

    def test_basic_post_request_metric_success_both_semconv(self):
        start = default_timer()
        response = self._client.post(
            "/foobar",
            json={"foo": "bar"},
        )
        duration = max(round((default_timer() - start) * 1000), 0)
        duration_s = max(default_timer() - start, 0)
        response_size = int(response.headers.get("content-length"))
        request_size = int(response.request.headers.get("content-length"))
        metrics_list = self.memory_metrics_reader.get_metrics_data()
        for metric in (
            metrics_list.resource_metrics[0].scope_metrics[0].metrics
        ):
            for point in list(metric.data.data_points):
                if isinstance(point, HistogramDataPoint):
                    self.assertEqual(point.count, 1)
                    if metric.name == "http.server.request.duration":
                        self.assertAlmostEqual(duration_s, point.sum, places=1)
                    elif metric.name == "http.server.response.body.size":
                        self.assertEqual(response_size, point.sum)
                    elif metric.name == "http.server.request.body.size":
                        self.assertEqual(request_size, point.sum)
                    elif metric.name == "http.server.duration":
>                       self.assertAlmostEqual(duration, point.sum, delta=40)
E                       AssertionError: 223 != 18 within 40 delta (205 difference)

instrumentation\opentelemetry-instrumentation-fastapi\tests\test_fastapi_instrumentation.py:896: AssertionError
________________________________________________ TestFastAPIManualInstrumentation.test_basic_post_request_metric_success_new_semconv ________________________________________________

self = <tests.test_fastapi_instrumentation.TestFastAPIManualInstrumentation testMethod=test_basic_post_request_metric_success_new_semconv>

    def test_basic_post_request_metric_success_new_semconv(self):
        start = default_timer()
        response = self._client.post(
            "/foobar",
            json={"foo": "bar"},
        )
        duration_s = max(default_timer() - start, 0)
        response_size = int(response.headers.get("content-length"))
        request_size = int(response.request.headers.get("content-length"))
        metrics_list = self.memory_metrics_reader.get_metrics_data()
        for metric in (
            metrics_list.resource_metrics[0].scope_metrics[0].metrics
        ):
            for point in list(metric.data.data_points):
                if isinstance(point, HistogramDataPoint):
                    self.assertEqual(point.count, 1)
                    if metric.name == "http.server.request.duration":
>                       self.assertAlmostEqual(duration_s, point.sum, places=1)
E                       AssertionError: 0.1789492000000017 != 0.006499199999998595 within 1 places (0.1724500000000031 difference)

instrumentation\opentelemetry-instrumentation-fastapi\tests\test_fastapi_instrumentation.py:864: AssertionError
================================================================ 9 failed, 46 passed, 2 skipped, 59 warnings in 28.23s ===============================================================
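The second failure above uses `places=1` rather than a `delta`. With `places`, `assertAlmostEqual` checks that `round(first - second, places) == 0`, so `places=1` only tolerates an absolute difference under about 0.05 s, which a PyPy run on Windows easily exceeds. A small sketch contrasting the two tolerance modes, using the numbers from the log above (the 0.25 s delta is an illustrative value, not the project's choice):

```python
import unittest


class ToleranceModes(unittest.TestCase):
    def test_places_vs_delta(self):
        measured = 0.1789  # wall-clock seconds, from the log above
        recorded = 0.0065  # recorded histogram sum, from the log above
        # places=1 requires round(measured - recorded, 1) == 0,
        # i.e. a difference under ~0.05 s -- too strict for this run.
        with self.assertRaises(AssertionError):
            self.assertAlmostEqual(measured, recorded, places=1)
        # An explicit delta states the intended tolerance directly.
        self.assertAlmostEqual(measured, recorded, delta=0.25)
```

Switching such assertions from `places` to an explicit `delta` makes the tolerance visible and tunable for slower interpreters.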

@xrmx
Contributor

xrmx commented Aug 26, 2024

Ah, you are on Windows. I've reopened the issue then.

@gongbenhua
Contributor Author

Ah, you are on Windows. I've reopened the issue then.

Thanks a lot. Can you reopen this PR so I can push my updated code here?

@xrmx xrmx reopened this Aug 26, 2024
@lzchen
Contributor

lzchen commented Aug 26, 2024

@gongbenhua

Please rebase with main and enable maintainer edits so we can rebase for you in the future.

@gongbenhua
Contributor Author

@gongbenhua

Please rebase with main and enable maintainer edits so we can rebase for you in the future.

Thanks for your reply. I have rebased onto main and pushed my branch. You can take a look when you're free. Thanks.

@gongbenhua
Contributor Author

gongbenhua commented Aug 28, 2024

@lzchen @emdneto Hi, sorry to bother you. Could you add the Skip Changelog label to my PR? Thanks.
I can't enable maintainer edits on my side for some reason, but I have rebased onto main.

@lzchen lzchen added the Skip Changelog PRs that do not require a CHANGELOG.md entry label Aug 28, 2024
@lzchen lzchen merged commit b0129ac into open-telemetry:main Aug 28, 2024
521 of 522 checks passed
Development

Successfully merging this pull request may close these issues.

Flaky test with pypy
4 participants