The Unaccountability Machine

Why Big Systems Make Terrible Decisions - and How The World Lost its Mind

Dan Davies 2024

Key Takeaways

  1. Modern institutions often behave irrationally not because individuals are foolish, but because large systems distort information and incentives. As organizations scale, feedback loops weaken and decision-makers become insulated from consequences. This creates environments where bad decisions can persist without correction. The result is systemic dysfunction rather than isolated error.

  2. Cybernetics—the science of control and communication—offers a framework for understanding why big systems fail. Effective systems require timely feedback and clear lines of accountability. When feedback is delayed, distorted, or ignored, decision quality deteriorates rapidly. Many modern institutions suffer from precisely this breakdown.

  3. Bureaucracies are designed to manage complexity, but they often end up amplifying it. Layers of management, reporting structures, and performance metrics can obscure reality rather than clarify it. Over time, these structures become self-preserving and resistant to reform. Decision-making becomes more about maintaining the system than solving problems.

  4. Metrics and targets frequently replace genuine understanding within organizations. When institutions focus excessively on measurable outputs, they incentivize gaming and superficial compliance. This is Goodhart’s Law in action: when a measure becomes a target, it ceases to be a good measure. Apparent success masks underlying failure.

  5. Financial systems are especially prone to unaccountability because of their abstraction and complexity. Risk is often distributed in ways that obscure responsibility. When crises occur, no single actor appears fully responsible. This diffusion of accountability enables repeated systemic failures.

  6. The corporate form separates ownership from control, weakening direct accountability. Shareholders, executives, regulators, and employees all have partial but incomplete authority. This fragmentation allows harmful decisions to persist without clear responsibility. The result is moral hazard embedded in organizational design.

  7. Large systems often optimize for internal coherence rather than external reality. Organizations may produce reports, dashboards, and narratives that reinforce their worldview. Dissenting information gets filtered out as it moves up the hierarchy. Over time, leadership becomes detached from ground truth.

  8. Public policy failures frequently stem from institutional feedback failures rather than ideological disagreement. Governments struggle to process complex information at scale. Civil servants and politicians operate within incentive structures that reward short-term optics over long-term outcomes. This leads to policy churn and reactive governance.

  9. Technological systems, especially algorithmic ones, can exacerbate unaccountability. Automated decision-making often hides behind technical complexity. When harms occur, responsibility is diffused between designers, operators, and institutions. This creates a new form of opaque authority.

  10. Restoring accountability requires redesigning feedback loops and simplifying structures. Smaller units, clearer responsibility, and more direct consequences can improve decision quality. Transparency and open information flows are essential. Without systemic reform, large institutions will continue to produce irrational outcomes.

Concepts

Cybernetics

The study of control and communication in systems, emphasizing feedback loops and adaptive regulation. It provides a framework for understanding how organizations maintain stability or spiral into dysfunction.

Example

- A thermostat adjusting heating based on temperature feedback
- A company adjusting strategy based on accurate customer feedback
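
To make the thermostat example concrete, here is a minimal Python sketch of a feedback loop: measure the state, compare it with a goal, act on the error, and let the environment respond. The function name and numbers are illustrative, not from the book.

```python
def thermostat_step(temperature, setpoint, heater_on):
    """One cycle of a feedback loop: measure, compare with the goal, act."""
    error = setpoint - temperature      # feedback signal
    heater_on = error > 0               # the regulator acts on the error
    # the environment responds: heating warms the room, otherwise it cools
    temperature += 0.5 if heater_on else -0.3
    return temperature, heater_on

temperature, heater_on = 15.0, False
for _ in range(30):
    temperature, heater_on = thermostat_step(temperature, setpoint=20.0, heater_on=heater_on)
print(round(temperature, 1))  # hovers near the 20.0 setpoint
```

If the feedback signal is delayed or distorted (a faulty sensor, say), the same loop overshoots or drifts, which parallels the book's point about organizations losing touch with accurate information.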

Feedback Loops

Mechanisms by which systems receive information about their performance and adjust accordingly. Effective feedback must be timely, accurate, and actionable.

Example

- Quarterly earnings reports influencing executive decisions
- Customer complaints leading to product redesign

Goodhart’s Law

The principle that when a measure becomes a target, it ceases to be a good measure. Overemphasis on metrics distorts behavior and undermines true performance.

Example

- Schools teaching to standardized tests rather than fostering learning
- Police departments manipulating crime statistics to meet quotas
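
As a toy illustration (illustrative numbers, not from the book), the gap between a proxy metric and the underlying goal can be sketched in a few lines: once the score becomes the target, effort shifts toward gaming it, and the score rises while real quality falls.

```python
def true_quality(effort_on_learning):
    """What we actually care about, e.g. genuine learning."""
    return effort_on_learning

def measured_score(effort_on_learning, effort_on_gaming):
    """The proxy metric: it rewards teaching to the test as much as real learning."""
    return effort_on_learning + 2 * effort_on_gaming

# Before the measure becomes a target: all effort goes to the real goal.
print(true_quality(10), measured_score(10, 0))   # 10 10
# After it becomes a target: effort shifts toward gaming the metric.
print(true_quality(3), measured_score(3, 7))     # 3 17  (score up, quality down)
```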

Diffusion of Responsibility

A condition in which accountability is spread across many actors, making it difficult to identify who is responsible for outcomes. This weakens corrective action.

Example

- Financial crises where banks, regulators, and rating agencies all share partial blame
- Corporate scandals with unclear executive accountability

Moral Hazard

A situation where individuals or institutions take excessive risks because they do not bear the full consequences of failure. It is common in financial and corporate systems.

Example

- Banks taking risky bets expecting government bailouts
- Executives pursuing short-term gains knowing shareholders absorb losses
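
A small expected-value sketch (hypothetical numbers, not from the book) shows why shielding an actor from losses changes its risk calculus: the same gamble flips from unattractive to attractive once most of the downside is borne by someone else.

```python
p_win, gain, loss = 0.6, 100.0, 200.0

# Bearing the full downside: the gamble has negative expected value,
# so a rational actor declines it.
ev_full_exposure = p_win * gain - (1 - p_win) * loss
print(ev_full_exposure)   # -20.0

# With a bailout absorbing 80% of any loss, the same gamble looks attractive.
ev_with_bailout = p_win * gain - (1 - p_win) * (0.2 * loss)
print(ev_with_bailout)    # 44.0
```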

Principal-Agent Problem

A conflict that arises when agents (managers) make decisions on behalf of principals (owners) but have different incentives. This misalignment weakens accountability.

Example

- CEOs prioritizing bonuses over long-term shareholder value
- Politicians acting for reelection rather than public interest

Organizational Insulation

The process by which decision-makers become shielded from frontline realities and negative feedback. Insulation leads to distorted perceptions and poor decisions.

Example

- Executives relying solely on curated reports
- Government leaders disconnected from local implementation challenges

Systemic Risk

The risk that the failure of one part of a system can trigger cascading failures across the whole. Large interconnected systems are especially vulnerable.

Example

- The 2008 financial crisis spreading through global banking networks
- Supply chain disruptions affecting multiple industries
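
A toy contagion model (hypothetical institutions and threshold, not from the book) illustrates how a single failure can propagate through an interconnected network until the whole system is affected.

```python
# Each institution fails once at least half of its counterparties have failed.
exposures = {
    "bank_a": ["bank_b", "bank_c"],
    "bank_b": ["bank_c", "insurer"],
    "bank_c": ["insurer"],
    "insurer": [],
}

def cascade(initial_failure, threshold=0.5):
    """Propagate failures until no further institution crosses the threshold."""
    failed = {initial_failure}
    changed = True
    while changed:
        changed = False
        for node, counterparties in exposures.items():
            if node in failed or not counterparties:
                continue
            failed_share = sum(c in failed for c in counterparties) / len(counterparties)
            if failed_share >= threshold:
                failed.add(node)
                changed = True
    return failed

print(cascade("insurer"))  # one failure drags all four institutions down
```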

Bureaucratic Drift

The tendency of organizations to prioritize internal procedures and survival over their original mission. Over time, process replaces purpose.

Example

- Regulatory agencies focusing on compliance paperwork over outcomes
- Corporations optimizing internal KPIs rather than customer satisfaction

Algorithmic Opacity

The difficulty in understanding how automated systems make decisions due to technical complexity or secrecy. This obscures accountability.

Example

- Credit scoring algorithms denying loans without explanation
- Content moderation systems making unexplained removals

Complexity Overload

A state in which the complexity of a system exceeds the capacity of its managers to understand or control it. This leads to reliance on simplified models and flawed assumptions.

Example

- Global financial derivatives too complex for regulators to fully assess
- Large IT projects that no single team fully comprehends

Accountability Gaps

Structural voids where no actor has both the authority and responsibility to correct problems. These gaps allow dysfunction to persist unchecked.

Example

- Outsourced public services with unclear oversight
- Multinational corporations operating across fragmented regulatory regimes