Runs (Summary) report


The Runs (Summary) report provides a high-level overview of progress and testing activity for one or more test runs in your TestRail project. It shows the current progress and test results across all of the runs included in the report, a summary of each run's progress, and a detailed breakdown of the status of each test executed within that run. You can use this report to view a custom set of test runs in a single report and increase visibility into the status of testing across multiple test runs.

Creating a Runs (Summary) report


Please make sure you have the necessary report permissions enabled for your user role. For more details, please check out the managing user roles and permissions guide.

Steps to create a Runs (Summary) report

To generate the Runs (Summary) report in TestRail, you need to put a few configuration settings in place.

Navigate to the project for which you would like to create a test report, then open the Reports tab.

On the right side, you will see a pane titled Create a Report. Click on the Runs report under the Summary section. 


On the Add Report page, you will find three main sections: 

  • Name and Description
  • Report Options
  • Access & Scheduling

Follow the steps below to configure each section:

  1. Name and Description

    Enter the report's name and description in their respective fields.

  2. Report Options

    On the Add Report page, scroll down to Report Options. Here, you will see four tabs: TEST RUNS, DETAILS, ACTIVITY, and TESTS.


    On the TEST RUNS tab, you can set the report to include test runs by filter or by name.

    To select runs by filter, click Change and use the dropdowns to select your criteria. To select specific test runs instead, click Add Test Runs and select one or more named runs.

    On the DETAILS tab, you can select the general sections to include in the report: Status and test statistics, Activity (results over time), Progress and remaining estimate/forecast, and Tests and test results.



    On the ACTIVITY tab, you can configure the following:

    1. Select the time frame for the report. You can select between today and past dates.
    2. By default, all test cases are included. To show only test cases with specific statuses, hold Ctrl and click one or more of: Passed, Blocked, Untested, Retest, Failed, In Progress.
    3. Select the maximum number of activities to display on the report (from newest to oldest): 10, 25, 50, 100, 250, 500, or 1000 (maximum).


    You can do the following under the TESTS tab:

    1. Filter the tests to be displayed in the report by any case field (including custom case fields) except text fields, such as Assigned To, Estimate, Priority, etc. 

    2. Choose the column(s) to include in the tables displaying test information in the report. The default columns are ID and Title. Depending on the information you want to display, you can add further columns such as Type, Priority, etc.
    3. Set the maximum number of tests to display per group. The maximum number can be 10, 25, 50, 100, 250, 500, or 1000.



  3. Access & Scheduling

    This section enables users to set access and scheduling options. For more details, please refer to the General configurations guide.

Finally, scroll down and click Add Report to generate the report.
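If the report was saved with API access enabled under Access & Scheduling, saved report templates can also be listed programmatically through TestRail's REST API (`get_reports/:project_id`). Below is a minimal sketch using only the Python standard library; the instance URL and credentials are placeholders you would replace with your own:

```python
import base64
import json
from urllib.request import Request, urlopen

def report_endpoint(base_url: str, project_id: int) -> str:
    """Build the TestRail API v2 get_reports endpoint URL for a project."""
    return f"{base_url}/index.php?/api/v2/get_reports/{project_id}"

def list_reports(base_url: str, user: str, api_key: str, project_id: int):
    """Fetch the project's saved report templates that have API access enabled."""
    token = base64.b64encode(f"{user}:{api_key}".encode()).decode()
    req = Request(report_endpoint(base_url, project_id),
                  headers={"Authorization": f"Basic {token}",
                           "Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)  # list of dicts: id, name, description, ...
```

Note that reports must still be created and configured in the UI as described above; the API only exposes templates that were saved with API access allowed.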

Reading a Runs (Summary) report

Depending on the amount of data included, your report may take a few minutes to generate. Once the report is ready, the next step is to view and interpret it. To view the report, go to the Reports tab and click on the appropriate report in the list.
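For saved report templates with API access enabled, generation can also be triggered on demand through the REST API's `run_report/:report_template_id` endpoint, whose response includes links to the finished report. A sketch under the same assumptions as above (placeholder URL and credentials, standard library only):

```python
import base64
import json
from urllib.request import Request, urlopen

def run_report_endpoint(base_url: str, report_template_id: int) -> str:
    """Build the TestRail API v2 run_report endpoint for a saved report template."""
    return f"{base_url}/index.php?/api/v2/run_report/{report_template_id}"

def run_report(base_url: str, user: str, api_key: str, report_template_id: int):
    """Trigger report generation; the response carries report_url/report_html/report_pdf links."""
    token = base64.b64encode(f"{user}:{api_key}".encode()).decode()
    req = Request(run_report_endpoint(base_url, report_template_id),
                  headers={"Authorization": f"Basic {token}",
                           "Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)
```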


If you enabled Status and test statistics under Report Options > DETAILS during configuration, this section appears at the top of the report. 

A pie chart shows a status-by-status breakdown across the runs included in the report. This gives users a quick view of overall health across the project, along with the ratio of untested to completed tests at the bottom.
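The same status-by-status breakdown can be reproduced from raw test data, for example from the API's `get_tests` results. A minimal sketch; the mapping below covers only TestRail's built-in `status_id` values, and a real instance with custom statuses would extend it:

```python
from collections import Counter

# TestRail's default system statuses; custom statuses would extend this map.
STATUS_NAMES = {1: "Passed", 2: "Blocked", 3: "Untested", 4: "Retest", 5: "Failed"}

def status_breakdown(tests):
    """Count tests per status and derive percentages, as in the report's pie chart."""
    counts = Counter(STATUS_NAMES.get(t["status_id"], "Other") for t in tests)
    total = sum(counts.values()) or 1  # avoid division by zero on an empty run
    percentages = {name: round(100 * n / total, 1) for name, n in counts.items()}
    return dict(counts), percentages
```

For example, two passed, one failed, and one untested test yield a breakdown of 50% Passed, 25% Failed, 25% Untested.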


A table of test runs follows, separated by run date and displaying the columns configured earlier. Users can click the names of test runs, test plans, or milestones to open their detail pages.


In the list, each test run created for a different device appears as a separate item. Alternatively, users can click the test plan column to see a list of all associated runs.


The activity section shows a chart of activity within the configured timeline, as well as a breakdown of test case percentages according to status.


Underneath the chart is a list of test cases sorted by run date (newest to oldest), each marked with a badge showing its current status. Users can click the test case names to open a details page.


To the right of the test case list, the name of the user who ran the test is shown:

  • Tested by – Indicates the user who ran the test.
  • Marked by – Indicates the user who made a manual change to the status or results.
  • Deleted by – Indicates the user who canceled or removed a test case; the test name is displayed with strikethrough formatting.


At the top of the Progress section is a chart of the remaining tests to complete and the remaining effort needed, plotted against ideal progress. This gives a look at overall test trends by date.


Forecasts & Estimates gives users a side-by-side hours estimate of progress, breaking down completion ratios and the number of weeks since TestRail started collecting data on past and current tests. At a quick glance, users can view the projected completion date based on these estimates.
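TestRail's exact forecasting model is not spelled out here, but the underlying idea of projecting a completion date from remaining effort and observed velocity can be sketched as follows (a simplified illustration, not TestRail's implementation):

```python
from datetime import date, timedelta

def projected_completion(today: date, remaining_hours: float, hours_per_week: float) -> date:
    """Project a completion date by dividing remaining effort by weekly velocity."""
    if hours_per_week <= 0:
        raise ValueError("velocity must be positive to forecast a completion date")
    weeks_left = remaining_hours / hours_per_week
    return today + timedelta(days=round(weeks_left * 7))
```

For example, 40 hours of remaining work at 20 hours per week projects completion two weeks out.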


Underneath Forecasts & Estimates is a list of all the test runs contributing data to this report. The device type is listed to the right of the titles, and the titles link to more details.


Tests & Results

This section gives a view of all the test cases that were run, along with some context about their health. Test cases populate according to the filters and columns set on the TESTS tab during configuration. Test runs are sorted by date, and each test run lists the IDs that were tested for a specific device.
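The grouping and the per-group cap configured on the TESTS tab can be mimicked on raw test data. A minimal sketch, where `section` stands in for whichever grouping field was configured and `limit` corresponds to the "maximum tests per group" option:

```python
from collections import defaultdict

def group_tests(tests, key="section", limit=100):
    """Group test records by a field and cap each group, as in the Tests & Results tables."""
    groups = defaultdict(list)
    for t in tests:
        if len(groups[t[key]]) < limit:  # drop tests beyond the per-group maximum
            groups[t[key]].append(t)
    return dict(groups)
```

With `limit=2`, a group containing three tests would display only its first two, matching the truncation behavior described above.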


The test cases are separated according to the groups configured during test run creation. Users can click the Title column to open a details page for each test case listed.

