Goodhart's Law

The moment you mandate a metric, people optimize for the number instead of the result it was supposed to measure.

“When a measure becomes a target, it ceases to be a good measure.” Charles Goodhart articulated the underlying principle in 1975 while studying monetary policy (the pithy phrasing above is anthropologist Marilyn Strathern’s later paraphrase), but it applies everywhere metrics meet incentives.

The classic illustration is the Cobra Effect, a story likely apocryphal but instructive: British colonial administrators in Delhi offered bounties for dead cobras to reduce the snake population. Entrepreneurs started breeding cobras for the income. When the program was scrapped, the breeders released their now-worthless stock and the cobra population ended up larger than before. Not good.

The same dynamic plays out in engineering organizations.

Mandate a cycle time target and teams start splitting work into artificially small increments. Mandate code coverage and teams write tests that exercise lines without asserting behavior. The metric improves; the underlying problem doesn’t.

This is why the Discover-Option-Action Cycle insists on outcome-first measurement: teams that choose their own metrics to track progress toward a validated outcome have no incentive to game the number, because the number serves their learning, not someone else’s dashboard.

Outcomes Over Outputs makes the same argument at the product level: optimizing output metrics without connecting them to outcomes is activity masquerading as progress.

Resources

  • Discover-Option-Action Cycle — the antidote to top-down metric mandates
  • Output-Activity Trap — what happens when organizations optimize output metrics
  • Charles Goodhart, “Problems of Monetary Management: The U.K. Experience” (1975)

©2026 Nerd/Noir. All rights reserved.

Referenced sources and frameworks are copyrights of their respective owners. Fair use & attribution.