
4 April 2026

What Accounting Actually Measures

by Mark J Menger


In 1975, a British economist named Charles Goodhart made an observation while critiquing the British government’s approach to monetary policy. The government had chosen specific statistical targets — measures of money supply — to guide economic decisions. Goodhart’s point was precise: the moment you elevate a statistical measure to the status of a policy target, it stops being a reliable measure. The act of targeting it changes the behavior of the system being measured, which corrupts the measurement.

He stated it simply: “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”

The popular rendering, later crystallized by anthropologist Marilyn Strathern, is more compact: when a measure becomes a target, it ceases to be a good measure.

Goodhart was talking about monetary policy. But what he had identified is a general property of complex systems — one that applies with particular force to organizations and the metrics they use to navigate. It is, in retrospect, one of the most important and least-heeded insights in the management of human institutions.


What Financial Metrics Actually Are

Revenue. Margin. Earnings per share. Return on equity. These are the instruments most organizations use to assess their performance and direct their decisions. They have the considerable virtue of being quantifiable, comparable, and legible across time periods and organizational units. They are the common language of boards, investors, and executive teams.

They are also, without exception, backward-facing. Every financial metric measures what has already happened. Revenue this quarter reflects sales activity from weeks and months ago. Margin reflects cost structures that were established over years. Earnings per share is a function of profits already realized and capital allocation decisions already made.

Robert Kaplan and David Norton, in their landmark 1992 Harvard Business Review article “The Balanced Scorecard: Measures That Drive Performance” — and the subsequent 1996 book — named this property precisely. They described financial metrics as lagging indicators: measures that record the residue of prior activity. Their core observation was that companies managing by lagging indicators alone were, in their phrase, driving a car by looking in the rearview mirror.

Kaplan and Norton proposed supplementing financial measures with what they called leading indicators — measures from the perspectives of customers, internal processes, and organizational learning — that would give managers forward visibility into the conditions producing future financial outcomes. The financial results would remain as the ultimate scorecard. But they would be understood as consequences of getting the leading indicators right, not as targets to be managed directly.

This distinction — between what generates value and what records value after it has been generated — is the conceptual foundation of everything that follows in this series.


The Category Error

Here is the failure mode, stated plainly: an organization that optimizes its lagging indicators directly is making a category error. It is treating the measurement as the thing being measured. It is, to return to Goodhart, elevating a measure to the status of a target — which corrupts the measure and degrades the underlying reality it was supposed to represent.

This is not a minor navigational error. It is a reversal of causal order. Financial results are downstream of organizational health. When you manage the downstream metric rather than the upstream reality, you create the appearance of health while the conditions that generate health erode. You can sustain this for some time — the lag between cause and consequence in large organizations can be years or even decades — but the eventual reckoning is as predictable as compounding interest.

The Soviet nail factory is the archetype. Given a production target measured in number of nails, the factory produced enormous quantities of tiny, useless nails. When the target was changed to weight, it produced enormous, useless nails. The metric was being optimized. The purpose the metric was supposed to represent — useful nails — was not.

The analogy to organizational management is not perfect, but the structural error is identical: when the measure becomes the target, the system reorganizes itself around producing the measure, not around the underlying reality the measure was designed to track.
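The nail-factory dynamic can be sketched in a few lines of code. This is a toy model with invented numbers: assume each nail costs a fixed setup time plus time proportional to its weight, and that "usefulness" peaks at a mid-sized nail. The `usefulness` function, the candidate sizes, and the production model are all hypothetical, chosen only to make the structural point visible.

```python
def usefulness(size_g: float) -> float:
    """Hypothetical usefulness score: nails near 5 g are most useful."""
    return max(0.0, 1.0 - abs(size_g - 5.0) / 5.0)

HOURS = 1000.0                             # total production time
CANDIDATE_SIZES = [0.1, 1.0, 5.0, 100.0]   # grams per nail

def count_produced(size_g: float) -> float:
    # fixed overhead of 1 time unit per nail, plus 0.1 units per gram
    return HOURS / (1.0 + 0.1 * size_g)

def best_size(target: str) -> float:
    """Pick the nail size that maximizes the targeted metric."""
    def metric(size: float) -> float:
        n = count_produced(size)
        if target == "count":
            return n                    # number of nails
        if target == "weight":
            return n * size             # total grams of nails
        return n * usefulness(size)     # the actual purpose: useful nails

    return max(CANDIDATE_SIZES, key=metric)

print(best_size("count"))   # 0.1   -> tiny, useless nails
print(best_size("weight"))  # 100.0 -> giant, useless nails
print(best_size("useful"))  # 5.0   -> the nails anyone actually wanted
```

Each target is optimized faithfully; only the last one optimizes the thing the metric was supposed to stand for. The system does exactly what it is measured on.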


The Leading Indicators That Actually Matter

If financial metrics are the wake of organizational activity, what is the horizon? What are the conditions that generate financial outcomes rather than merely record them?

They are harder to see clearly and impossible to aggregate into a single number. But they are identifiable:

Customer trust and experience. The willingness of customers to continue buying, to pay a premium rather than defect to a cheaper alternative, to recommend without being asked. This is an asset of enormous value that does not appear on any balance sheet.

Product and service integrity. Whether what the organization produces actually works, actually solves the problem it is supposed to solve, actually delivers what was promised. This is both an ethical and a financial condition — customers who experience integrity tend to return; those who experience its absence tend not to.

Talent and institutional knowledge. The quality, motivation, and accumulated experience of the people doing the work. This is perhaps the most fragile of the leading indicators — easily degraded by the wrong incentives and nearly impossible to rebuild once it has dispersed.

Organizational capacity for adaptation. The ability to recognize changing conditions and respond effectively. This depends on the quality of information flowing through the organization, the willingness of people to surface uncomfortable signals, and the judgment of those who receive them.

None of these appear as line items. All of them ultimately express themselves as financial results — but only over time horizons longer than a quarterly reporting cycle, and only in ways that are difficult to attribute to any specific prior decision.
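The delay between leading-indicator erosion and its financial consequence can also be sketched. This is a minimal simulation with hypothetical numbers: customer trust is assumed to erode 10% per quarter under "harvesting," and revenue is modeled as a simple delayed function of trust. No real organization behaves this cleanly; the sketch only shows why the lagging metric can look healthy for years while the leading one decays.

```python
DELAY = 4       # quarters before lost trust shows up in revenue
EROSION = 0.9   # fraction of trust retained each quarter

trust = [1.0]   # leading indicator, starts healthy
for _ in range(12):
    trust.append(trust[-1] * EROSION)

# revenue this quarter reflects trust from DELAY quarters ago
revenue = [100.0 * trust[max(0, q - DELAY)] for q in range(len(trust))]

for q in range(len(trust)):
    print(f"Q{q:2d}  trust={trust[q]:.2f}  revenue={revenue[q]:.1f}")
```

Through quarter 4 the reported revenue is still a steady 100.0 while trust has already fallen by a third; by the time revenue reflects the damage, the leading indicator is several quarters further gone. Anyone steering by revenue alone sees the problem only after it is well established.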


Why This Matters

The distinction between leading and lagging indicators is not merely academic. It determines what an organization pays attention to, what it invests in, what it measures progress against, and therefore what it becomes.

An organization that treats financial metrics as the primary navigation instrument will, over time, optimize its lagging indicators while the leading indicators quietly erode. Customer trust will be harvested rather than compounded. Talent will be managed as a cost rather than a capability. Product integrity will be traded against margin at the margin.

An organization that treats financial metrics as the lagging consequence of getting the leading indicators right will do the opposite. It will invest in customer experience, product quality, and talent development not as acts of generosity but as acts of rational capital allocation — because these are the conditions that generate financial outcomes over the relevant time horizon.

The difference between these two orientations is the difference between steering by the wake and steering by the horizon.


Next: What happens when organizations harvest the leading indicators they should be compounding — and why the damage is so difficult to detect until it is irreversible.



tags: innovation - lean - fragile2agile - history - technology - complexity