Reviewer metrics live inside the Team Collaboration Network tab of the Performance Delivery dashboard. They quantify the reviewer side of the PR lifecycle — how quickly reviewers engage, how broadly review work is shared, and how fast PRs reach approval.
The four sub-metrics
Reaction Time — the average number of hours it takes a reviewer (or the team) to respond to PRs assigned for review. This is the headline review-pickup signal.
Sharing Index — measures how broadly review work is shared across the team by looking at who reviews whose PRs. A high Sharing Index means reviews flow across the team rather than concentrating on one or two people.
AVG Pickup Time — the average time in hours from PR creation until the first review interaction (a comment, approval, or reviewer commit). This is the gap between "ready for review" and "someone is actually engaging".
TT First Approval — the average time in hours between PR creation and first approval, rounded to the nearest tenth of an hour. It captures how long a PR takes to receive its first mergeable signal.
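The time-based sub-metrics can be illustrated with a small computation over PR timestamps. This is a minimal sketch, not the dashboard's implementation: the record fields (`created`, `first_review_event`, `first_approval`) and the sample data are hypothetical, and the Sharing Index is omitted because its exact formula is product-specific.

```python
from datetime import datetime

# Hypothetical PR records; field names and timestamps are illustrative only.
prs = [
    {
        "created": datetime(2024, 5, 1, 9, 0),
        "first_review_event": datetime(2024, 5, 1, 13, 30),  # first comment, approval, or reviewer commit
        "first_approval": datetime(2024, 5, 2, 10, 0),
    },
    {
        "created": datetime(2024, 5, 2, 10, 0),
        "first_review_event": datetime(2024, 5, 2, 11, 0),
        "first_approval": datetime(2024, 5, 2, 16, 0),
    },
]

def avg_hours(pairs):
    """Average the (start, end) gaps in hours, rounded to the nearest tenth."""
    pairs = list(pairs)
    deltas = [(end - start).total_seconds() / 3600 for start, end in pairs]
    return round(sum(deltas) / len(deltas), 1)

# AVG Pickup Time: creation -> first review interaction.
avg_pickup = avg_hours((p["created"], p["first_review_event"]) for p in prs)
# TT First Approval: creation -> first approval.
tt_first_approval = avg_hours((p["created"], p["first_approval"]) for p in prs)
```

For the sample data above, `avg_pickup` comes out to 2.8 hours and `tt_first_approval` to 15.5 hours, matching the "average hours, rounded to a tenth" definitions.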
How to interpret them
A long AVG Pickup Time is the most common review-side bottleneck. Until a reviewer picks the PR up, none of the downstream review work can start.
A low Sharing Index is a fragility signal — the team is dependent on a small set of reviewers. If those people get busy or leave, throughput collapses.
If Reaction Time and AVG Pickup Time look good but TT First Approval is high, reviewers are engaging but not finishing — the iteration loop is long.
What to do about it
Set a review-pickup SLA the team agrees to (a few working hours is typical for a healthy team).
Auto-assign reviewers from a rota instead of having authors hunt. This raises the Sharing Index almost immediately.
If TT First Approval is the long pole, look at PR size — large diffs slow first approval more than they slow pickup.
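The rota suggestion above can be sketched as a simple round-robin assigner. This is a minimal illustration, assuming a fixed reviewer list; the names and the `cycle`-based rota are placeholders, not a feature of the dashboard.

```python
from itertools import cycle

# Hypothetical team rota; cycling through it spreads reviews evenly,
# which is what raises the Sharing Index.
rota = cycle(["alice", "bob", "carol"])

def assign_reviewer(pr_author: str) -> str:
    """Pick the next reviewer from the rota, skipping the PR's own author.

    Assumes the team has at least two members, so the loop terminates.
    """
    for reviewer in rota:
        if reviewer != pr_author:
            return reviewer

assigned = [assign_reviewer(author) for author in ["bob", "bob", "alice"]]
# -> ["alice", "carol", "bob"]
```

Because the rota state persists across calls, consecutive PRs land on different reviewers instead of whoever the author would have asked by habit.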
Related
Submitter metrics
Team Collaboration Network — overview
Lead Time for Changes
