
5 April 2026

Let Them Eat Cake

by Mark J Menger

The previous article described the strategic failure of accounting-led enterprises: they optimize lagging indicators directly, harvesting the leading indicators — customer trust, product quality, institutional knowledge — that would have generated future financial outcomes. This is the seed corn problem. It is serious. And it is, in principle, correctable. If the people making decisions could see what they were doing, they could choose differently.

The reason they don’t choose differently is the subject of this article. It is not, primarily, a failure of character. It is a failure of epistemology — of the conditions that make it possible to know what is actually happening.


The Quote That May Never Have Been Said

Marie Antoinette almost certainly did not say "let them eat cake." The remark, attributed to her in the context of a bread shortage, appears in no contemporaneous source and is almost certainly apocryphal. Rousseau had already recorded a version of it in his Confessions, attributed only to "a great princess," written when Marie Antoinette was still a child in Austria.

It doesn’t matter. What the quote captures is real, and it has survived precisely because the phenomenon it describes is recognizable across centuries and contexts: the decision-maker so thoroughly insulated from the conditions of the people the system depends on that she cannot perceive the gap between her model of reality and reality itself.

She was not, on the attribution’s logic, being cruel. She was being sincere. If there is no bread, eat cake — it did not occur to her that the absence of bread implied an absence of cake, because her information environment had never included either absence. The abstraction layer between her and ground truth was complete.

This is the epistemic failure of accounting-led enterprises. And it is the reason the strategic failure described in the previous article is so difficult to interrupt from the inside.


The Abstraction That Enables Extraction Also Prevents Its Detection

Financial abstraction does two things simultaneously. It enables extraction — by reducing complex organizational realities to line items, it creates the legibility required to act on them at scale. You cannot manage what you cannot see, and financial metrics make visible, in a single dashboard, the consolidated performance of thousands of people across many activities. This is genuinely useful.

At the same time, the aggregation process that produces that legibility destroys the texture of information that would allow decision-makers to perceive what their decisions are actually doing to the underlying reality. Customer satisfaction becomes a Net Promoter Score — a single number that contains none of the specific friction, disappointment, or distrust that produced it. Employee engagement becomes a survey metric — a number that contains none of the specific demotivation, sense of futility, or search for alternatives that produced it. Product quality becomes a defect rate — a number that contains none of the specific engineering compromises that produced it.
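The information loss is easy to see in miniature. The sketch below uses the standard NPS arithmetic (promoters score 9 or 10, detractors 0 through 6, NPS = %promoters minus %detractors); the two customer populations are invented purely for illustration:

```python
# Two very different customer populations that collapse to the same NPS.
# Standard NPS: promoters score 9-10, detractors 0-6, passives 7-8 are ignored.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Population A: polarized -- many advocates, but also many actively unhappy customers
a = [10] * 50 + [7] * 30 + [2] * 20

# Population B: lukewarm -- few angry customers, but few advocates either
b = [9] * 40 + [8] * 50 + [5] * 10

print(nps(a), nps(b))  # → 30 30
```

Both populations report an NPS of 30, yet they call for opposite responses: A has a churn problem hiding behind its advocates, B has a differentiation problem. The dashboard number cannot distinguish them.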

The executives reading these dashboards are not stupid. They are responding rationally to the information environment they have been placed in. But that information environment has a fundamental property: it represents the world through the categories the accounting framework can capture. And the accounting framework was designed to capture financial flows, not human experience.

The lived reality of customers — the friction of a degraded support experience, the disappointment of a product that no longer works as well as it once did, the quiet decision to look for an alternative — is progressively filtered out of the information that reaches people who could act on it. Not through deliberate concealment. Through the architecture of what gets measured and therefore what becomes real within the organization.

By the time the damage appears in the lagging indicators — churn rates rising, customer acquisition costs increasing, pricing power eroding — it is usually irreversible. The trust that took years to build has been spent. The engineering talent that took years to develop has dispersed. The institutional knowledge that took years to accumulate has left. These things do not reconstitute quickly.


The Unfalsifiability Problem

There is a further dimension to the epistemic failure, and it is the one that makes accounting-led worldviews particularly resistant to challenge.

The philosopher Karl Popper argued, in The Logic of Scientific Discovery (first published in German as Logik der Forschung, 1934), that the defining property of a genuine empirical claim is its falsifiability — the existence of some evidence that, if observed, would require revising or abandoning the claim. A framework that can accommodate any outcome — that can explain away every piece of counterevidence — is not an analytical framework. It is a closed belief system.

The “everything reduces to dollars” worldview has this property. Any outcome that contradicts the model gets explained away: “we didn’t cut enough,” “the market hasn’t caught up,” “those are externalities,” “the disruption would have happened anyway.” There is no result that would prompt someone deeply inside this framework to question the framework itself.

You can test this directly. Ask a committed adherent: what evidence would change your mind? Genuine analysts will engage with the question — they can describe the conditions under which they would revise their view. People inside the closed system will deflect, reframe, or express confusion about what the question is even asking. The diagnostic value of this question is not in the answer it produces but in the quality of engagement it receives.

This is not a character indictment. People operating inside closed epistemic systems are not necessarily dishonest or unintelligent. They are often confident, capable, and sincere. The confidence is, in fact, part of the signature of the closed system: when your framework always produces an answer, uncertainty is never experienced, and the absence of uncertainty feels like rigor.


What IBM’s Executives Could Not See

Return to IBM. Over the decade in which its strategic position was being hollowed — the 2000s and 2010s — its executives were not, by most accounts, ignorant of the trends in cloud computing, AI, and software-defined infrastructure. IBM’s research organization was tracking these developments closely, often leading them.

What the executives at the top could not perceive was the organizational consequence of their own resource allocation decisions. The engineers and researchers who saw the future were not adequately resourced or empowered. The financial engineering that sustained quarterly metrics was simultaneously defunding the leading indicators that would have made the metrics sustainable. But this dynamic was not visible in the dashboards the executives were navigating by.

They were not, to use the Marie Antoinette framing, being cruel. They were being sincere. Their information environment, organized around financial metrics, did not present the progressive erosion of organizational capability in a form they could act on. By the time it appeared in the lagging indicators, the talent had gone, the culture had changed, and the window for a different outcome had closed.

This is the compounding catastrophe: the strategic failure described in the previous article, and the epistemic failure described in this one, are not parallel problems produced by different causes. They are a single self-reinforcing dynamic produced by the same mechanism.

The financial abstraction layer enables the extraction of value from leading indicators. It simultaneously insulates decision-makers from the consequences of that extraction. The system that does the damage also prevents the damage from being legible to the people who could stop it. This is why accounting-led capture is so durable — and why the organizations that fall into it so rarely self-correct.


The Causal Chain, Complete

Stated whole:

Accounting-led enterprises mistake lagging indicators for the things those indicators were designed to measure. They optimize the measurement rather than the underlying reality. This causes them to harvest the leading indicators — customer trust, product quality, talent, institutional knowledge — that would have generated future financial outcomes. The extraction improves present metrics while degrading the conditions that produced them.

And the financial abstraction layer that enables the extraction also severs decision-makers from ground truth. The numbers replace the reality. Eventually no one in the room has independent contact with the reality the numbers were supposed to represent. The damage is neither visible nor felt until it manifests in the lagging indicators — at which point it is usually too late to reverse.

The strategic failure explains what is happening. The epistemic failure explains why it persists. Together they explain why IBM, Boeing, Oracle, and dozens of less dramatic cases follow the same pattern across different industries, different time periods, and different leadership teams.

It is not a coincidence. It is a mechanism.


Next: Why Google and Facebook are not exceptions to this framework — and what they reveal about the condition that actually matters.


tags: innovation - history - complexity