Regression Testing SOPs

General Observations

  • Test file names containing spaces are currently not supported. Substitute a dash "-" or an underscore "_" instead.

  • Comparing report files can be problematic. To help minimize false test failures, use the following settings in the test case [REPORT] section; they suppress the summary tables and node/link time-series listings that commonly cause spurious differences between runs.

Recommended settings for the [REPORT] section when adding new tests

[REPORT]
 Status             Yes
 Summary            No
 Nodes              None
 Links              None

Adding Tests

  1. Create a test configuration JSON file using the test-config script.

  2. Add the new input and configuration files to the test repo.

  3. Run nrtest to generate new test artifacts (see the command sketch after this list).

  4. Create a benchmark archive containing the new test artifacts.

  5. Create a new release and attach the new benchmark archive.

  6. Trigger a build on the CI server and check that the new tests are running properly.
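
The commands below sketch steps 2 through 4, assuming nrtest's documented execute subcommand. The application config path, test file names, and archive format are illustrative assumptions, not the repo's actual layout.

 # Stage the new input and test configuration files (step 2).
 git add tests/new-case.inp tests/new-case.json
 git commit -m "Add new-case regression test"

 # Generate artifacts for the new test (step 3). nrtest execute takes
 # an application config, one or more test configs, and an output
 # directory for the benchmark artifacts.
 nrtest execute apps/swmm.json tests/new-case.json -o benchmark/

 # Package the artifacts as a benchmark archive for the release (step 4).
 zip -r benchmark.zip benchmark

Attaching the archive to a release and triggering CI (steps 5 and 6) are done through the GitHub UI or the GitHub CLI, as sketched in the next section.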

Updating Rolling Benchmark

  1. Retrieve SUT Benchmark artifact from Actions (see the command sketch after this list).

  2. Inspect SUT Benchmark for anomalous results. If the results check out, proceed to Step 3; otherwise, do not merge the changes. Instead, create an issue describing the problem.

  3. Create a new release in the swmm-nrtestsuite repo.

  4. Attach SUT Benchmark to the new release.

  5. Determine BUILD_ID from SUT Benchmark manifest.

  6. Trigger a new Actions build.

  7. Check that nrtests are passing on Actions.

  8. Merge the PR. The SUT Benchmark is now the new REF Benchmark.
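
A minimal sketch of the retrieval and release steps, assuming the GitHub CLI (gh) is used; the run ID, artifact name, release tag, and manifest path are hypothetical placeholders.

 # Download the SUT Benchmark artifact from an Actions run (step 1).
 gh run download <run-id> --name sut-benchmark --dir sut-benchmark

 # Create a new release and attach the benchmark archive (steps 3-4).
 zip -r sut-benchmark.zip sut-benchmark
 gh release create <tag> sut-benchmark.zip --title "SUT Benchmark <tag>"

 # Read BUILD_ID from the benchmark manifest (step 5); the manifest
 # file name is an assumption.
 grep -i build_id sut-benchmark/manifest.json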
