Take inventory of the existing acceptance / system tests in order to avoid redundancy. For this purpose we propose to create a test plan covering the following aspects:
- Do they require a special category instead of being in TestAcceptance?
- Which data do they use to run?
- Who maintains that data?
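On the question of a special category: pytest markers are one way to separate slow acceptance tests from the unit suite. A minimal sketch, assuming a marker named `acceptance` (the name is an assumption, not an existing Koswat convention):

```python
# Hypothetical sketch: register a dedicated marker so acceptance tests can be
# selected or excluded independently of the unit tests. The marker name
# "acceptance" is an assumption, not the project's actual configuration.
import pytest


# In conftest.py (alternatively, via the `markers` option in pytest.ini
# or pyproject.toml):
def pytest_configure(config):
    config.addinivalue_line(
        "markers", "acceptance: long-running acceptance / system tests"
    )


@pytest.mark.acceptance
class TestAcceptance:
    def test_example(self):
        # Acceptance tests would live here; run them with `pytest -m acceptance`
        # or skip them with `pytest -m "not acceptance"`.
        assert True
```

With this in place the CI pipeline can run the fast suite on every commit and the marked acceptance suite on a schedule or on demand.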
Determine the scope to test:
- Generation of reinforcement profiles? Currently not considered acceptance tests, as they run "fast".
"Sandbox" Koswat analysis (WITHOUT validation of expected results, only checks generation of files), for the default input profile and the following combination of cases:
With / without infrastructure,
With / without obstacles,
With / without grass and clay layer,
With Scenario default / 2 / 3
"Sandbox" Koswat analysis (WITH validation of expected results), default scenario, with obstacles and the combination of:
With / without grass and clay layer,
With Scenario default / 2 / 3
This test category should be replaced with a "curated" approach of the previous one.
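For the validated variant, one option is to keep curated reference output next to the test data and diff the generated results against it. A minimal sketch, assuming the expected results live in a reference directory mirroring the generated output (directory layout and helper name are assumptions):

```python
# Hypothetical helper: compare a generated output tree against a curated
# reference tree and report discrepancies, so domain experts can update the
# reference data without touching test code.
from pathlib import Path


def compare_against_reference(generated: Path, reference: Path) -> list[str]:
    """Return a list of discrepancies between generated and reference output."""
    gen_files = {p.relative_to(generated) for p in generated.rglob("*") if p.is_file()}
    ref_files = {p.relative_to(reference) for p in reference.rglob("*") if p.is_file()}
    issues = [f"missing: {p}" for p in sorted(map(str, ref_files - gen_files))]
    issues += [f"unexpected: {p}" for p in sorted(map(str, gen_files - ref_files))]
    # Content comparison (e.g. per-file tolerances for numeric results) could
    # be added here for the files present in both trees.
    return issues
```

A test would then simply assert `compare_against_reference(out_dir, ref_dir) == []`, and curating a case means editing the reference directory.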
"Sandbox" Koswat analysis with default layers, and all possible input profiles, checks for the correct generation of the reinforcement profiles.
The very first acceptance tests might by now be outdated or redundant; study whether we should keep them as a small test case or remove them instead.
Update the test bench according to what was agreed in the previous step.
Study the option of delivering a Docker version of the tool that can be used to run Koswat via the command line.
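A possible starting point for the Docker option, assuming the package installs from the repository root and is runnable as a `koswat` module (both assumptions, to be verified against the actual project layout):

```dockerfile
# Hypothetical sketch only: base image, install step, and entry point are
# assumptions about the project layout, not its actual packaging.
FROM python:3.11-slim

WORKDIR /app
COPY . .
RUN pip install --no-cache-dir .

# Run the tool as a CLI; arguments passed to `docker run` are forwarded here.
ENTRYPOINT ["python", "-m", "koswat"]
```

Usage would then look like `docker build -t koswat .` followed by `docker run --rm -v "$PWD/data:/data" koswat <args>`, where the mounted volume and the CLI arguments are placeholders for whatever input the tool actually expects.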
Use case
Both the product owner and the developers want to understand what we are currently testing and to verify whether the generated results adhere to domain expectations.
Both the PO and the dev team should be able to modify / extend the test data collected by the acceptance tests without having to create new tests (or modify the existing ones). This would enable testers, domain experts, and the PO to easily validate the current status of the tool.
Kind of request
Modifying existing functionality
Enhancement Description
Existing acceptance / system tests inventoried so far:
- TestAcceptance
- ?TestKoswatHandler.test_koswat_handler_run_analysis_given_valid_data
- tests.test_main.TestMain.test_given_valid_input_succeeds
Additional Context