Comparison for Cases (Results) report


The Comparison for Cases (Results) report shows all the test cases you have executed across an array of test runs and the results from those tests. You can select specific sections of test cases to report on or filter test cases displayed in the report by various fields like test type, priority, and more.

The report gives you insight into the status of tests across multiple test runs. It also highlights potential issues with the tests in your test suite, or areas of your application that might carry a higher level of risk in a given release, as indicated by clusters of non-passing tests.

Creating a Comparison for Cases (Results) report


Please make sure you have the required report permissions enabled for your user role.

For more details, please check out the managing user roles and permissions guide.

Steps to create a Comparison for Cases (Results) report

To generate the Comparison for Cases (Results) report in TestRail, you need to configure a few settings.

Navigate to the project for which you would like to create a test report. Then, navigate to the Reports tab.

On the right side, you will see a pane titled Create a Report. Click on Comparison for Cases under the Results section.

On the Add Report page, you will find three main sections:

  • Name and Description
  • Report Options
  • Access & Scheduling

Follow the steps below to configure each section:

  1. Name and Description

    Enter the report's name and description in their respective fields.

  2. Report Options

    On the Add Report page, scroll down to Report Options. Here, you will see two tabs: SECTIONS & TEST RUNS and TEST CASES.


    You can do the following under the SECTIONS & TEST RUNS tab:


    1. Choose to include all sections of test cases or specific sections only. This allows you to show the results for specific sections of your test case repository that you would like to display in your report. For instance, if you would like to focus on particular features like Login and Messaging, you can select those two sections alone. Hold down the Ctrl/Cmd key and click to select multiple sections or sub-sections.


    2. Choose the test runs for which you would like to view test results. This can be done in two ways. The first option is to apply one or more filters to select the test runs that match the applied filter(s). You can filter the test runs using parameters like Assigned To, Completed On, Created By, Created On, Is Completed, Milestone, and Test Plan. This can be helpful if you want to compare results for tests across multiple test runs that fit certain criteria, e.g. all the test runs in a given test plan, or across a milestone that you are using to track testing for a specific release.


    3. Afterwards, click Add Test Runs; then click Apply Filters.


      The second option is to make individual selections of the test runs, one by one. 


      Select the maximum number of test runs to include: 5, 10, 25, 50, or 100.


      You can do the following under the TEST CASES tab:

      1. Filter the test cases to be displayed in the report by any case field—including custom case fields—except text fields, like Assigned to, Created By, Deletion Status, Estimate, Priority, etc. For instance, in the figure below, the Priority filter is applied and set to High.


      2. Choose the column(s) to include in the table to display information about test cases. The default columns include ID and Title. Depending on the information you want to display, you can add additional columns such as Type (of the test case), Priority, Created On, etc.


      3. If you would like your report to show a comparison between the test results for each selected test run, check the box provided for the option, as shown in the figure below. The same applies if you want to show the latest/combined test result for the selected test runs. Afterwards, set the maximum number of test cases to include in the report: 100, 250, 500, 1000, 2500, or 5000.



  3. Access & Scheduling

    This section enables users to set access and scheduling options. For more details, please refer to the General configurations guide.

Finally, scroll down and click Add Report to generate the report.
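If you want to cross-check which test runs the report will pick up, you can also query runs programmatically through TestRail's API v2 `get_runs` endpoint, which supports filters such as `milestone_id` and `is_completed`. A minimal sketch of building the request URL; the base URL, project ID, and milestone ID below are placeholders:

```python
def get_runs_url(base_url, project_id, milestone_id=None, is_completed=None):
    """Build a TestRail API v2 URL for fetching a project's test runs.

    TestRail uses query-style API paths: index.php?/api/v2/get_runs/{project_id},
    with optional filters appended as &name=value pairs.
    """
    url = f"{base_url}/index.php?/api/v2/get_runs/{project_id}"
    if milestone_id is not None:
        url += f"&milestone_id={milestone_id}"
    if is_completed is not None:
        url += f"&is_completed={int(is_completed)}"  # TestRail expects 1 or 0
    return url

# Example: completed runs for milestone 4 in project 1 (IDs are hypothetical)
url = get_runs_url("https://example.testrail.io", 1, milestone_id=4, is_completed=True)
```

You would then issue a GET request against this URL with your TestRail email and API key as basic-auth credentials.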

Reading a Comparison for Cases (Results) report

It may take a few minutes to generate your report, depending on the amount of data included. Once the report is ready, you can view and interpret it.

To view the report, go to your Reports tab and click the appropriate report in the list.

The first chart on the report page is a stacked bar chart showing you the total number of test results recorded for each selected test run and the breakdown of the test results by status. To the far right of the chart is a bar that shows the latest/combined test results for any test case you have tested in the selected test runs.

Hover over each bar to see more details.


Next, you will see a list of the test runs contained in the report.

Immediately following that is a table that provides the configuration, test plan, and milestone associated with the test run (if applicable), the total number of tests in the run, and a summary of the number of tests with the Passed, Blocked, Untested, Retest, and Failed status for each selected test run. If you created the test run outside of a test plan, or have not linked the test run to a milestone, these fields will appear blank.
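The per-run summary row is simply a count of tests in each status. A sketch of the same tally, assuming test data shaped like TestRail's `get_tests/{run_id}` API response (a list of dicts with a `status_id` field; IDs 1-5 are TestRail's built-in statuses):

```python
from collections import Counter

# TestRail's built-in status IDs; custom statuses use higher IDs.
STATUS_NAMES = {1: "Passed", 2: "Blocked", 3: "Untested", 4: "Retest", 5: "Failed"}

def summarize_run(tests):
    """Count tests per status, mirroring the report's per-run summary row."""
    counts = Counter(t["status_id"] for t in tests)
    return {name: counts.get(sid, 0) for sid, name in STATUS_NAMES.items()}

# Hypothetical run data: two passed, one untested, one failed
tests = [{"status_id": 1}, {"status_id": 1}, {"status_id": 3}, {"status_id": 5}]
summary = summarize_run(tests)
# summary -> {"Passed": 2, "Blocked": 0, "Untested": 1, "Retest": 0, "Failed": 1}
```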


Finally, the report presents different tables listing all the test cases for which you have added tests in the test runs included in the report.

You will see information about each test case configured during the report setup, like Case ID, Title, and any other columns you added. Also, you will see the current status for any tests you have added against the test cases for each test run. If a cell in the table is blank, it means you have not added a test for that case in the specific test run. Similarly, if an entire row is blank, it means you have not added any tests for the case in any of the test runs included in the report. The figure below shows the results for an example set of test cases in a section called “Download and Installation”.
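The comparison table is essentially a case-by-run grid, with a blank cell wherever a case has no test in a run. A sketch of how such a grid could be assembled, using hypothetical run names and case IDs (the per-run `{case_id: status_id}` maps could be derived from `get_tests/{run_id}` API responses):

```python
def build_comparison(case_ids, runs):
    """Build a case-by-run grid of status IDs.

    None marks a blank cell: no test was added for that case in that run.
    `runs` maps a run name to a {case_id: status_id} dict.
    """
    return {
        case_id: {run: results.get(case_id) for run, results in runs.items()}
        for case_id in case_ids
    }

# Hypothetical data: case 204 tested everywhere, case 205 not tested on iOS
runs = {
    "Android": {204: 1, 205: 5},
    "iOS": {204: 1},
    "Windows": {204: 5, 205: 1},
}
grid = build_comparison([204, 205], runs)
# grid[204] -> {"Android": 1, "iOS": 1, "Windows": 5}
# grid[205]["iOS"] is None  (blank cell: no test for C205 in the iOS run)
```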

As seen in the figure, the test case with ID C204 (User can install app on mobile device) has a priority set to Critical, and the type of test carried out for the test case was a smoke test. Also, it can be seen that the smoke test passed for Android and iOS, while it failed for Windows.

