AI Governance metrics overview

How AI Governance measures AI tool adoption and impact across your engineering team.

The AI Governance dashboard combines several distinct signals about AI tool adoption and the impact of AI-assisted work on delivery outcomes. This article is the entry point: it explains what each metric category measures and how to read the categories together.

Metric categories

  • Adoption — who is using AI tools, how often, and which tools.

    • AI-assisted PRs

    • AI-assisted Commits

    • Distinct AI users per week (covered in AI User Engagement Segments)

    • AI User Engagement Segments — Power, Casual, Idle, New

  • Volume — how much code is flowing through AI-assisted work.

    • AI Lines of Code

    • Avg Lines per PR (AI-assisted vs non-AI) — covered in AI Lines of Code

  • Impact — does AI usage correlate with measurable delivery improvements?

    • AI Lead Time Comparison (with Coding, Waiting for Review, In Review, Ready to Deploy sub-stages)

    • AI Intensity — and its scatter plots against Cycle Time, Throughput, and Bug %

  • Investment — what AI usage costs and how seats are utilized.

    • AI Tool Cost

Data sources

AI Governance derives signals from connected AI tool integrations and overlays them on existing Git activity:

  • GitHub Copilot — usage and billing data from the Copilot API.

  • Claude Code — activity data from the integration; cost is estimated from active users × seat price.

  • Cursor — activity data from the integration; cost is estimated from active users × seat price.

  • Other AI tools as their integrations come online.

If a tool isn't connected, its panel on this dashboard stays empty.
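For tools without billing APIs (Claude Code, Cursor above), cost is estimated as active users times seat price. A minimal sketch of that estimate, using an illustrative function name and an assumed seat price, not real billing figures:

```python
def estimated_monthly_cost(active_users: int, seat_price: float) -> float:
    """Seat-based cost estimate for tools without a billing API:
    active users in the period multiplied by the per-seat price."""
    return active_users * seat_price

# e.g. 12 active Cursor users at an assumed $30/seat
print(estimated_monthly_cost(12, 30.0))  # 360.0
```

Copilot does not need this approximation, since usage and billing come directly from the Copilot API.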

How to read AI Governance

  1. Start with Adoption. Are AI tools being used at all? By whom?

  2. Look at Volume. Is AI-assisted work a small share of total code or a meaningful one?

  3. Check Impact. Are AI-enabled teams actually moving faster than peers?

  4. Round out with Investment. Are the costs justified by the impact you observed, and are the seats you pay for actually utilized?

Caveats

  • Correlation, not causation. AI Intensity scatter plots show patterns, not proof.

  • The "AI-assisted" label on a PR or commit means at least one commit was authored on a day the developer used an AI tool — not that the AI wrote the code.

  • Engagement segments (Power, Casual, Idle, New) are documented in AI User Engagement Segments.
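The second caveat above can be made concrete. A sketch of the day-level labeling rule, with hypothetical function and parameter names (the actual implementation is not documented here):

```python
from datetime import date

def is_ai_assisted(commit_dates: list[date], ai_usage_dates: set[date]) -> bool:
    """A PR or commit is labeled AI-assisted when at least one commit
    was authored on a day the developer used an AI tool. Note this is
    day-level overlap, not proof the AI wrote the code."""
    return any(d in ai_usage_dates for d in commit_dates)

usage = {date(2024, 5, 1), date(2024, 5, 3)}
print(is_ai_assisted([date(2024, 5, 2), date(2024, 5, 3)], usage))  # True
print(is_ai_assisted([date(2024, 5, 4)], usage))  # False
```

A PR whose commits all land on days without any recorded AI activity is never labeled AI-assisted, even if the developer pays for a seat.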

Related articles

  • AI Governance dashboard

  • AI Lead Time Comparison

  • AI-assisted PRs

  • AI-assisted Commits

  • AI Lines of Code

  • AI Intensity

  • AI Tool Cost

  • AI User Engagement Segments
