
In the "**Runs**" tab, the real status for tests with the **@mark.xfail** mark is not displayed #240

Closed
BohdanObruch opened this issue Nov 1, 2023 · 16 comments
Labels: bug (Something isn't working), priority medium, python

Comments

@BohdanObruch

Describe the bug
In the "Runs" tab, the real status for tests with the @mark.xfail mark is not displayed

Precondition
Create a test and add the @mark.xfail pytest mark to it
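For context, a minimal test carrying the mark might look like this (the test name and reason are illustrative):

```python
import pytest

# Expected-to-fail test: pytest reports XFAIL when it fails
# and XPASS when it unexpectedly passes.
@pytest.mark.xfail(reason="known bug, expected to fail")
def test_known_bug():
    assert 1 + 1 == 3
```

When the assertion fails, the console summary shows "1 xfailed"; if the bug is later fixed and the assertion passes, pytest reports the test as xpassed instead.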

To Reproduce
Steps to reproduce:

  1. Run the test with the mark @mark.xfail
  2. Check the test result in the console
  3. Go to the testomat.io website
  4. Go to the Runs tab
  5. Open the last launched test
  6. Check how the test result status is displayed

Expected behavior
The status of the test should be "xpassed", along with the reason it was expected to fail, similar to how the reason is displayed for failed tests.

Screenshots
(two screenshots attached)

Desktop (please complete the following information):

  • Python 3.10.4
  • Pytest 7.4.3
  • PyCharm 2023.2.3 (Professional Edition)
@BohdanObruch BohdanObruch added the bug Something isn't working label Nov 1, 2023
@poliarush poliarush transferred this issue from testomatio/app Nov 2, 2023
@poliarush
Contributor

poliarush commented Nov 2, 2023

@BohdanObruch what did you use to report the results to testomat.io? Please give a command example from the terminal.

@BohdanObruch
Author

I used the pytest-analyzer plugin.
mark.skip appears in the report, but mark.xfail does not. (screenshot attached)

@BohdanObruch
Author

I use the "pytest --analyzer sync" command

@BohdanObruch
Author

And what is somewhat interesting:
if I run the tests from the console with
set TESTOMATIO=api_key; pytest --analyzer sync
for some reason the TESTOMATIO variable is not picked up and an error is shown; it only works when I manually add it to the system environment variables.
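For what it's worth, the likely cause here is shell syntax rather than the plugin: in cmd.exe a semicolon is not a command separator (it becomes part of the variable's value), and in PowerShell "set" creates a PowerShell variable, not an environment variable. A sketch of the per-shell forms (api_key is a placeholder):

```shell
# POSIX shells (bash/zsh): a leading assignment applies to that one command
TESTOMATIO=api_key sh -c 'echo "$TESTOMATIO"'   # prints: api_key

# cmd.exe: chain with && ; with ';' the value would become "api_key; pytest ..."
#   set TESTOMATIO=api_key && pytest --analyzer sync

# PowerShell: 'set' makes a PowerShell variable, not an env var; use $env:
#   $env:TESTOMATIO = "api_key"; pytest --analyzer sync
```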

@Ypurek

Ypurek commented Nov 2, 2023

Will check and fix over the weekend.

@Ypurek

Ypurek commented Nov 20, 2023

@BohdanObruch sorry for the delay, I was sick. I've looked through your issue and cannot figure out how to help you. There is no status like "xpassed" in testomat.io; there are only 3.
@poliarush please correct me if I am wrong

@BohdanObruch
Author

Hello, thank you for your reply, Oleksii.
I think you mean these statuses, but what about this pytest mark? Displaying it with the "Passed" status in testomat.io is not correct. (screenshot attached)

@Ypurek

Ypurek commented Nov 20, 2023

Do you suggest failing this test in testomat.io?

@Ypurek

Ypurek commented Nov 20, 2023

(screenshot attached) PyCharm considers it as passed.

@BohdanObruch
Author

  1. In the screenshot, you ran the test by clicking the launch icon (the triangle next to the test name) or by clicking on the file; I will show you the command to run the tests through the console, "pytest ...".
  2. I added a screenshot of what this status looks like in Allure; there it is clear what kind of status it is and why exactly (the mark displayed is @pytest.mark.xfail). (screenshot attached)

@BohdanObruch
Author

With a launch like the one you showed, regardless of the pytest marks in the test, only a passed or failed status will be displayed.

@poliarush
Contributor

poliarush commented Nov 23, 2023

@Ypurek @BohdanObruch
from testomat.io's point of view there are only 3 statuses:

  • passed
  • failed
  • skipped

The tricky part is how to treat xfail and xpass:

  • xfail = failed or passed ?!
  • xpass = passed or failed ?!

I think, the mapping from pytest's statuses to testomat.io's statuses should be like this:

  • pass = passed
  • fail = failed
  • skip = skipped
  • skipif = skipped
  • xfail = failed
  • xpass = passed
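As a sketch of how the reporter could implement this mapping: pytest marks an xfailed test's report with a "wasxfail" attribute (the report outcome is "skipped" for xfail and "passed" for a non-strict xpass). The helper below is hypothetical, not part of the plugin:

```python
def to_testomatio_status(outcome: str, wasxfail: bool = False) -> str:
    """Map a pytest report outcome to one of testomat.io's three statuses.

    `outcome` is pytest's report.outcome ("passed"/"failed"/"skipped");
    `wasxfail` mirrors whether the report carries the `wasxfail` attribute.
    Strict xfail mode is out of scope for this sketch.
    """
    if wasxfail and outcome == "skipped":
        return "failed"   # xfail -> failed
    if wasxfail and outcome == "passed":
        return "passed"   # xpass -> passed
    return {"passed": "passed", "failed": "failed", "skipped": "skipped"}[outcome]
```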

However, I suggest putting a message on the test case for the following statuses so it's visible in the list:

(two screenshots attached)

  • skip, reason if available, example "pytest.mark.skip("all tests still WIP")"
  • skipif, reason if available, example "pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")"
  • xfail, reason if available, example "Expected failure with pytest.mark.xfail(sys.platform == "win32", reason="bug in a 3rd party library")"
  • xpass, example "Test passed but expected to fail with pytest.mark.xfail(reason="bug in a 3rd party library")"
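For reference, those marks with reasons would look like this in test code (the test names are illustrative):

```python
import sys

import pytest

@pytest.mark.skip(reason="all tests still WIP")
def test_skip(): ...

@pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")
def test_skipif(): ...

@pytest.mark.xfail(sys.platform == "win32", reason="bug in a 3rd party library")
def test_xfail_conditional(): ...

@pytest.mark.xfail(reason="bug in a 3rd party library")
def test_xfail(): ...  # if it passes anyway, pytest reports XPASS
```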

@DavertMik I think we need to introduce labels and custom fields for testrun

@BohdanObruch
Author

@poliarush thank you, I agree with you

@poliarush
Contributor

poliarush commented Nov 24, 2023

related to testomatio/app#881

@BohdanObruch also, I've created a new item to introduce labels and custom fields for test runs, so you can then assign the necessary labels to test runs and filter by them with some statistics.

@DavertMik
Contributor

Closed in favor of testomatio/app#881


4 participants