Template · Test Tracking Workbook

One workbook for everything you track during test execution.
Team, environment, coverage, status, metrics.

A multi-sheet workbook that gives a test manager a single pane of glass over an execution cycle. Designed for the volume of data a real program produces, not the volume a demo produces.

Sheets
6+
Variants
Basic + Advanced
Signals
Team, Env, Coverage, Status

Dashboards without source data are theater. This workbook is the source data — the chart is a byproduct.

Key Takeaways

Four things to remember.

01

One book, six sheets

Test Environment, Test Team, Quality Risk Coverage, Test Case Summary, Bug Metrics, Release Readiness. Each sheet feeds the others; none stands alone.

02

Every test case ties to a risk

The Quality Risk Coverage sheet cross-tabulates test cases against the FMEA risks. An uncovered high-RPN risk shows up immediately.

03

Estimate vs. actual, side by side

The Test Case Summary sheet tracks estimated and actual effort for every case. Velocity signals emerge from the difference — not from guesswork.

04

Track the environment explicitly

The Test Environment sheet names every host and its availability status. A "blocked" cycle has a root cause traceable to a named system.

Why this exists

What this template is for.

Most test tracking we inherit is a README, a shared doc, and three Slack channels. This workbook replaces all three with a structured record the whole team edits together.

The Basic variant is four sheets; the Advanced variant adds historical snapshots and a release-readiness view. Pick the one that matches how many releases you run per year — both are useful.

The columns

What each field means.

Sheet 1 — Test Environment

One row per test host or environment. Track system role, hostname, IP, OS, installed software, and current availability.

Example: DB1 / Kursk / 192.168.6.10 / Solaris / Oracle 9i / Available

Sheet 2 — Test Team

Roster of testers, including initials (used elsewhere in the book), full name, title, and working shift.

Example: JHB / Jamal Brown / Test Manager / Split

Sheet 3 — Quality Risk Coverage

Matrix cross-tabulating test suites / cases against the FMEA risks. Cell values indicate coverage strength (e.g., 9 = full, 3 = partial, 1 = incidental, 0 = none).

Example: Test 1.001 (File) covers risk 1.001 with weight 9 (full coverage).
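The "uncovered high-RPN risk shows up immediately" check can be sketched in a few lines. This is an illustrative sketch, not part of the workbook: the risk IDs, RPN values, coverage matrix, and the RPN threshold of 100 are all invented for the example; only the 9 / 3 / 1 / 0 weights come from the template.

```python
RPN_THRESHOLD = 100  # assumed cutoff for "high risk"; tune to your FMEA

# risk_id -> RPN (illustrative values)
risks = {
    "1.001": 120,
    "1.002": 45,
    "1.003": 200,
}

# coverage[test_id][risk_id] -> 9 full, 3 partial, 1 incidental, 0 none
coverage = {
    "Test 1.001": {"1.001": 9, "1.002": 3, "1.003": 0},
    "Test 1.002": {"1.001": 0, "1.002": 9, "1.003": 1},
}

def uncovered_high_rpn(risks, coverage, threshold=RPN_THRESHOLD):
    """Return (risk_id, rpn, best_weight) for high-RPN risks lacking full coverage."""
    gaps = []
    for risk_id, rpn in risks.items():
        if rpn < threshold:
            continue
        # Best coverage weight any test case gives this risk
        best = max(row.get(risk_id, 0) for row in coverage.values())
        if best < 9:
            gaps.append((risk_id, rpn, best))
    return sorted(gaps, key=lambda g: -g[1])  # worst (highest RPN) first

print(uncovered_high_rpn(risks, coverage))  # [('1.003', 200, 1)]
```

Sorting gaps by descending RPN is the programmatic equivalent of step 4 in the how-to: the worst uncovered risk surfaces at the top.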

Sheet 4 — Test Case Summary

Row per test case. Tracks developer, executor, Test ID, name, status, type (Auto / Manual), phase, estimated effort, actual effort, estimated duration, and comments.

Example: ATE / TT / 1.001 / File / Update / Auto / IS / 2 est / 6 act / 4 duration
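The velocity signal the sheet enables is just the ratio of actual to estimated effort across completed cases. A minimal sketch, with invented case records (field names are assumptions, not the template's column headers):

```python
# Illustrative test-case records; "est" and "act" are effort in hours
cases = [
    {"id": "1.001", "name": "File",   "est": 2, "act": 6},
    {"id": "1.002", "name": "Update", "est": 4, "act": 4},
    {"id": "1.003", "name": "Report", "est": 3, "act": 9},
]

def effort_ratio(cases):
    """Total actual effort divided by total estimated effort."""
    est = sum(c["est"] for c in cases)
    act = sum(c["act"] for c in cases)
    return act / est if est else None

# A ratio well above 1.0 means estimates are optimistic; use it to
# re-forecast the remaining cycle instead of guessing.
print(f"actual/estimate ratio: {effort_ratio(cases):.2f}")  # 2.11
```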

Sheet 5 — Bug Metrics

Running counts of bugs by severity × priority, opens vs. closes per cycle, fix velocity, and regression rate. Feeds the dashboard charts.

Example: Severity 1 × Priority 1 = 3 open; 2 closed this cycle; regression rate 4%.
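The severity × priority tally and regression rate are simple aggregations once bugs are structured records. A hypothetical sketch; the bug records and field names are invented for illustration:

```python
from collections import Counter

# Illustrative bug records mirroring the Bug Metrics sheet's inputs
bugs = [
    {"id": "B-101", "sev": 1, "pri": 1, "state": "open",   "regression": False},
    {"id": "B-102", "sev": 1, "pri": 1, "state": "open",   "regression": True},
    {"id": "B-103", "sev": 2, "pri": 1, "state": "closed", "regression": False},
    {"id": "B-104", "sev": 3, "pri": 2, "state": "closed", "regression": False},
]

def sev_pri_counts(bugs, state="open"):
    """Count bugs in the given state by (severity, priority) cell."""
    return Counter((b["sev"], b["pri"]) for b in bugs if b["state"] == state)

def regression_rate(bugs):
    """Fraction of all logged bugs that are regressions."""
    return sum(b["regression"] for b in bugs) / len(bugs)

print(sev_pri_counts(bugs))        # Counter({(1, 1): 2})
print(regression_rate(bugs))       # 0.25
```

Updating these counts daily, as step 6 of the how-to suggests, is what makes the dashboard charts a byproduct rather than a separate chore.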

Sheet 6 — Release Readiness

Summary view rolling up coverage, status, and bug metrics into go/no-go release signals for leadership.

Example: Coverage: 78% of high-risk; Status: 62% passed; Bugs: 2 sev-1 open.
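The roll-up reduces to a handful of threshold gates. A sketch under assumed exit criteria (the 90% coverage, 95% pass-rate, and zero-sev-1 thresholds are illustrative; set them to your program's actual gate):

```python
def release_signal(high_risk_coverage, pass_rate, sev1_open):
    """Return 'go' only if every gate clears; otherwise 'no-go'."""
    gates = [
        high_risk_coverage >= 0.90,  # coverage of high-RPN risks
        pass_rate >= 0.95,           # fraction of test cases passed
        sev1_open == 0,              # no open severity-1 bugs
    ]
    return "go" if all(gates) else "no-go"

# The populated example above (78% coverage, 62% passed, 2 sev-1 open)
# fails all three gates:
print(release_signal(0.78, 0.62, 2))  # no-go
```

Keeping the gates explicit in one place is what makes the leadership conversation short: the argument is about thresholds, not about what the data says.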

Live preview

What it looks like populated.

Test Environment sheet from the Basic Sumatra Test Tracking workbook.

System | Name       | IP Address   | OS      | Other SW   | Status
Server Cluster East (SE)
DB1    | Kursk      | 192.168.6.10 | Solaris | Oracle 9i  | Available
Web1   | Leningrad  | 192.168.6.20 | Solaris | Netscape   | Available
App1   | Stalingrad | 192.168.6.30 | HP/UX   | Oracle 9AS | Available
Server Cluster West (SW)
DB2    | Dunkirk    | 192.168.6.11 | AIX     | Sybase     | Available
Web2   | Bulge      | 192.168.6.21 | AIX     | Domino     | Available

How to use it

7 steps, in order.

  1. Populate the Test Team sheet with every person who will log test results. Use initials consistently in downstream sheets.

  2. Populate the Test Environment sheet with every host the test lab will use. Mark availability before each cycle.

  3. Import your FMEA into the Quality Risk Coverage sheet as column headers. Create one row per test suite / case.

  4. Fill coverage values (9 / 3 / 1 / 0) for each test case against each risk. Sort by risk RPN to spot uncovered high-risk cells.

  5. In Test Case Summary, enter estimated effort and duration for every case before execution starts. Fill in actual effort as execution progresses.

  6. Update Bug Metrics daily during execution. Use the shift transition as the natural checkpoint.

  7. At each release gate, review the Release Readiness roll-up with stakeholders. If it cannot answer their questions, iterate the underlying sheets.

Methodology

The thinking behind it.

The Basic variant is enough for teams running 3–6 test cycles per release. The Advanced variant adds historical snapshot sheets and a read-only dashboard tab suitable for programs running 20+ cycles per year.

The coverage weights (9 / 3 / 1 / 0) come from QFD (Quality Function Deployment). They force a step-function judgment instead of arbitrary percentages.

Take it with you

Download the piece you just read.

We keep this library free. All we ask is that you tell us who you are, so we know who to follow up with if we release an updated version. It's a one-time form; this browser remembers you after that.

Need a QA program to back this up in your organization?

If a checklist is not enough and you want help applying it to a live engagement, we can have a call this week.

Related reading

Articles, talks, guides, and case studies tagged for the same audience.