How Institutions Forget How to Move
A practical theory of process, variance, and organizational decay
In Silicon Valley, Netflix is well known for its distinctive management philosophy. The Netflix culture deck, later expanded into No Rules Rules by Erin Meyer and Reed Hastings, is still required reading at many startups. It describes a freedom and responsibility culture that replaces much of the process that defines corporate life with high talent density, radical candor, and broad employee autonomy.
Netflix’s success is beyond question, which makes one detail especially interesting: for many years, Netflix did not follow its own (no) rules everywhere. During the DVD era, the company operated a nationwide network of warehouses that functioned under an entirely different operating system. Warehouse employees worked inside strict, highly specified processes governing nearly every moment of their shifts. I’ve spoken with the people who designed those systems, and they’ll tell you the same thing: rigid processes and clear hierarchies were essential to delivering a reliable customer experience at scale.
The leaders at Netflix understood something that many organizations miss. While the danger of too much process is real, process has a legitimate and sometimes indispensable role, depending on context. This essay explores that tension. It’s an attempt at a practical theory of process: what purpose process actually serves, when adding it makes sense, and why removing process later is orders of magnitude harder than adding it in the first place.
What Process Actually Does
Process exists for one reason: to suppress variance.
Every procedure, review, and checklist narrows the range of possible outcomes. Adding requirements for code reviews and test coverage limits the types of bugs that can reach production. Checklists ensure minimum standards are met every time.
Described this way, suppressing variance sounds positive. If you asked most employees if they want unpredictable outcomes, who would say yes? There is, however, a catch: outperformance is also variance.
The exceptional outcome lives in the same distribution tails as the failures that process is built to prevent. When we compress the range of outcomes, we don’t just eliminate disasters; we also eliminate breakthroughs.
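The tails argument is easy to see in a toy simulation. All the numbers below are invented for illustration: most projects land near zero, a small share are disasters, and a slightly smaller upside tail of breakthroughs more than pays for them. Clipping both tails, a stand-in for process, collapses the variance and takes the expected value down with it:

```python
import random
import statistics

random.seed(42)

def project_outcome() -> float:
    """One project's hypothetical payoff: mostly routine results,
    plus rare disasters and rare-but-larger breakthroughs."""
    r = random.random()
    if r < 0.02:
        return -100.0           # disaster
    if r < 0.04:
        return 300.0            # breakthrough
    return random.gauss(0, 5)   # routine work

outcomes = [project_outcome() for _ in range(10_000)]

# "Process" modeled as clipping both tails: no outcome may fall
# outside [-20, +20].
clipped = [max(-20.0, min(20.0, x)) for x in outcomes]

print(f"mean payoff, no process:   {statistics.mean(outcomes):6.2f}")
print(f"mean payoff, with process: {statistics.mean(clipped):6.2f}")
print(f"variance: {statistics.pvariance(outcomes):,.0f} -> "
      f"{statistics.pvariance(clipped):,.0f}")
```

With these made-up payoffs the upside tail is fatter than the downside tail, so suppressing variance costs more than it saves; flip the payoff numbers and heavy process becomes the right call, which is the proportionality point the essay develops.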
Organizations fail to see this cost because the effects are asymmetric. New processes have an immediate, visible benefit: fewer errors. Innovation unfolds over longer time horizons, so the cost, fewer innovations, takes longer to be felt. The errors being addressed are concrete and traceable to a process. The innovations lost are at best theoretical; they never existed, so they leave no trace back to the process that killed them.
The Ratchet Effect
Processes accumulate because the incentives for adding and removing are fundamentally asymmetric.
New processes have a natural champion: the person making the proposal. This champion can directly link the proposed process to some acute issue, whether it’s a recent error or a known organizational problem. As long as the process has a story that makes it seem prudent, it is likely to become part of the organization, and the champion collects the credit.
Removing process has no natural champion. The beneficiaries are hypothetical future employees who would have shipped faster, thought more creatively, taken more initiative. They’re not in the room. They might not even work here yet. The victory condition for removal is diffuse and slow to materialize: maybe things get slightly faster, maybe innovation picks up over time. Hard to measure. Hard to take credit for. Process removal is also likely to run into political challenges. Process is often a means of control for people in the organization, and those people are likely to feel threatened by process removal when it would mean giving up control.
There’s also a career incentive asymmetry. The person who adds process and prevents a visible error looks prudent. The person who removes process and enables an error, even if that same removal also enabled three successful innovations, looks reckless. Diffuse organizational benefit doesn’t show up in performance reviews. Specific, attributable failures do.
This triggers a doom loop. Once the organization begins to treat judgment as a liability, talent begins to self-select out. Rule-followers come to dominate the employee base and talent density drops, “proving” the company needs more process. This is why Netflix places such a premium on talent density; low talent density is both cause and effect of heavy-handed process.
This is what drives the ratchet effect: once added, processes become nearly impossible to remove. The incentives in organizations heavily bias towards process accretion. The cumulative effect is that organizations become progressively more constrained, despite nobody seeking that outcome.
The ratchet usually turns one direction, but it operates on territory, not systems as a whole. What looks like freedom in the postwar period (nuclear plants in three years, polonium in cereal box prizes) wasn’t a result of deregulation. It was unregulated territory, a new frontier where the harms were not just unrealized but unforeseen. Process is scar tissue. No scars yet, no process.
This is why paradigm shifts appear to reset the clock. The technological revolutions of the early 20th century outran existing processes, which were built for an agrarian and early industrial economy. The new economy grew up in the gaps. By the 1970s, slowing technological growth allowed process to catch up to the frontier and fully enclose the territory.
Risks of Removal, Real and Perceived
While processes often address some acute issue, the people who advocated for them move on and the thinking that drove them is quickly forgotten. This results in a sort of organizational Chesterton’s fence: processes that are enforced rigorously, with no understanding of what problem the process was intended to solve.
With the purpose and reasoning behind a process lost to time, removal of that process now invokes a powerful psychological bias: the fear of the unknown. This creates circumstances where the perceived risk of removal far outpaces the actual risk.
This is loss aversion applied to organizational design. It’s why even well-intentioned process audits tend to produce modest results. The gravitational pull toward keeping things as they are is immense.
Everything above might suggest that process is almost universally bad. It isn’t. The problem lies not in process itself, but in process applied without regard to context. There are domains where suppressing variance is exactly correct, where the cost of error so dramatically exceeds the cost of rigidity that heavy process becomes a competitive advantage.
When Variance Is The Enemy
Netflix’s warehouse operations weren’t an exception to their philosophy; they were evidence that the leadership understood the need to right-size processes in a given context. The challenge is knowing when variance is your enemy and when it’s your only path forward.
Consider aviation and surgery. These are fields where a single mistake can have deadly consequences. The checklist revolution in both domains, documented extensively by Atul Gawande and others, has saved countless lives. The purpose of process in high-stakes domains is not to replace human judgment, but to ensure judgment is spent only where it adds value. By making the routine reliable, they reduce the need for constant improvisation and ensure that when deviation is required, it happens deliberately and from a stable base.
Innovation in aviation and surgery still happens, but it is quarantined to labs, simulations, and tightly controlled pilots. This is systems hygiene: funneling innovation to contexts that can absorb variance without catastrophe. Slower learning cycles are appropriate when mistakes are irreversible, public trust is fragile, and error costs dominate opportunity costs.
Intel’s famous “Copy Exactly!” methodology offers another example. When I was in graduate school for industrial engineering, we examined how Intel went to extraordinary lengths to replicate their manufacturing processes across facilities, matching equipment layouts down to the centimeter and even the orientation of buildings relative to the earth’s magnetic field. This seems absurd until you understand semiconductor manufacturing. Yield loss compounds relentlessly. A tiny variation in one step cascades into defect rates that destroy profitability. In this context, rigid process becomes a source of competitive advantage.
This suggests a general principle: Organizations should suppress variance in proportion to the cost of error, and nowhere else.
The first half of this law explains Netflix’s warehouses, Intel’s fabs, and the surgical checklist. The second half is where most organizations fail.
Most process failures stem from suppressing variance where error costs don’t justify it. This happens in two distinct ways:
The process is a self-referential loop: It creates the conditions that mandate its own existence. The solution is to recognize the loop and eliminate it.
One-size-fits-all governance: Process appropriate for one context gets applied uniformly across contexts with different error costs. The solution is structural separation.
Let’s take each in turn.
Breaking the Loop
We’ve established that unwarranted process is detrimental to long-term company health. We’ve also established that process in some circumstances isn’t just justified but advantageous. How can a company looking to escape bureaucratic hell discern when it is cutting fat and when it is cutting muscle?
The strongest tell is that processes are self-referential; they create the conditions that justify their existence. For example, a defensive hiring process:
Make hiring extremely rigorous and consensus-driven to avoid bad hires
Hiring slows
Slow hiring compounds the pain of being short-staffed
Which makes managers reluctant to exit underperformers
Which “proves” you need rigorous hiring to avoid bad hires
When you notice a process that feels impossible to remove because of problems the process itself created, you’ve found a self-referential loop. These are high-leverage points in any organization not because they’re easy to break, but because this is where intervention compounds rather than just improving one thing.
Innovating When You Have Something to Defend
The second categorical process error, one-size-fits-all governance, is a challenge every company that reaches scale must face. The company now has much to defend: significant revenue, an established customer base, and a growing reputation. Many of the employees who made their careers on the company’s growth will be eager to protect what they have built. As expectations rise, so does the cost of making errors. It is perfectly reasonable for companies in this position to want to flatten some variance.
When multiple groups operate under shared governance (e.g., the same parent company, division, or compliance function), organizations default to uniform processes to ensure consistency, auditability, and risk mitigation. Risk contagion occurs when the highest-risk unit’s controls become the de facto minimum standard for everyone under shared governance. Uniformity prevents the fragmentation in which siloed processes create enterprise vulnerabilities, but it sacrifices speed and effectiveness in the exact groups you need to sustain innovation.
If risk contagion pulls everything toward the highest-process context, the only way to preserve variance where it’s needed is to create enough distance that the contagion can’t spread. This naturally leads to the question: how much separation is enough?
Separation of Mandate vs. Separation of Means
You can have a different mission while still being subject to the same processes. True independence requires different means: different hiring, different accountability, different timelines. Separation isn’t one choice, but a series of choices along a spectrum: same division → separate division → subsidiary → external entity.
Greater levels of separation offer greater degrees of process freedom and a longer half-life of that freedom. The strategic question is: how much separation do we need for this particular mission? A six-month sprint to prototype something might survive with divisional separation. A multi-year effort to build a new business line probably cannot; the half-life is too short for the timeline.
“Startup culture with big company resources” is the lie organizations tell themselves to avoid real separation. They want the benefits of variance without paying the cost of actual independence. The resources come with strings: reporting requirements, budget cycles, HR policies, performance reviews calibrated to the parent’s norms. You can’t declare a different culture while remaining tethered to systems that enforce the old one.
Low levels of separation are harder to maintain because jealousy creates political pressure to close the gap, especially when the group subject to fewer rules begins to find success. Everyone sees who gets autonomy and who doesn’t. The people outside the protected space resent it. Political pressure builds to revoke the special treatment, killing innovation. This is exactly what happened to Saturn under General Motors. It started with genuine independence: different labor agreements, a different dealer model, and different manufacturing. Saturn’s success bred rancorous resentment in other divisions, and GM leadership starved it of resources to keep the peace, ultimately destroying it.
Persistent pressure from the parent organization explains the evolution of Alphabet’s X division, which develops “radical innovations”. Their strategy for incubating new technologies has evolved from independent units, to independent subsidiaries, to fully external entities.
The progression tells a story:
Internal units: Minimal separation; organizational antibodies win quickly.
Alphabet subsidiaries: Legal and organizational separation, but still sharing infrastructure, HR systems, cultural gravity.
Fully external entities: True independence, own board, own incentives, can develop genuinely different operating norms.
Over time, Alphabet’s X recognized that you cannot indefinitely fight human nature within a shared institution. Structural separation is necessary. Not just a different mandate, but different means: different hiring, different accountability, and different timelines. The progression from internal units to subsidiaries to fully external entities is evidence of how much distance it actually takes.
Fighting the Ratchet
The ratchet will keep turning unless someone actively fights it. Breaking loops requires someone with enough authority to absorb the political cost. Maintaining separation requires someone with enough altitude to protect the air gap.
Only senior leaders have the positional authority to do this. Everyone else is operating inside the local incentives that favor process accumulation. The person who removes process and enables an error is vulnerable; the person who keeps process and perpetuates sluggishness is safe. Only executives have enough standing to absorb the political cost of removal and enough time horizon to see the benefits materialize.
What does this look like in practice?
Executive attention and urgency. Julian Richer, founder of British retailer Richer Sounds, offers a model for how the process ratchet can reverse outside of a crisis. He created a “Cut the Crap Committee” to roll back process at a time when Richer Sounds was performing well. Process usually reverses only during crisis because that’s when the issue naturally becomes urgent. What Richer understood is that leaders have the power to create urgency by focusing sustained attention on problems that would otherwise stay invisible. Richer didn’t wait for a crisis to grant him permission. He gave himself permission.
Pruning tied to scale. Every process is calibrated to a moment in time: a particular scale, technology, and competitive landscape. When headcount or revenue crosses a major threshold, old processes don’t automatically become wrong, but many do. Tie mandatory process audits to scaling milestones: each significant growth threshold triggers review. The question isn’t “is this process working?” but “is this process still earning its cost at our current scale?”
Error budgets for overhead. Borrowed from site reliability engineering: define an acceptable threshold for errors alongside a budget for process burden, say, approval cycle times or the percentage of time spent on administrative work. If errors run consistently far under budget, you’re probably over-controlled. If process burden breaches its threshold, freeze new process additions until existing ones are pruned. This makes the tradeoff quantitative rather than political.
None of these work as one-time initiatives. The ratchet turns continuously; fighting it requires continuous effort. Ninety-day pushes can build habits, but without sustained executive attention, entropy reasserts itself.
These tactics matter because the alternative, waiting for a crisis to force elaboration, is how most institutions fail.
Process Kudzu
Most people who have joined a small company that succeeded and went on to become a much larger company can share a similar story: The early days are defined by utter chaos and relentless speed in the pursuit of building something customers want. After the company finds success, process starts to creep in. The changes are almost imperceptible at first. Then one day, they wake up and realize it now takes six weeks to execute on something that used to take six days.
This arc is predictable:
Startup: No process because there’s nothing to protect. Speed is the only advantage. You ship, sell, and market without asking permission.
Growth: Customers arrive. Revenue becomes meaningful. Then something goes wrong: a bug, a PR disaster, a compliance failure. The question gets asked: how do we make sure this never happens again? Organic structure begins to take shape, and informal processes arise to avoid the worst outcomes.
Formalization: As a company reaches maturity, informal structures and processes are replaced by explicit hierarchies, policies, and procedures. Extensive red tape introduced during this phase triggers a bureaucratic crisis, with process requirements so stringent that the business loses the ability to function effectively.
Elaboration: Attention shifts from the harms process prevents to the harms process creates. The organization takes a critical eye towards process and begins to prune, returning the organization’s ability to adapt and innovate.
This pattern is so common that an entire field, organizational life-cycle theory, has emerged to study it. The mystery lies in the last stage. Most organizations never reach elaboration, remaining stuck in bureaucratic crisis for years, even decades. The ratchet keeps turning until the capacity to move is gone.
We can see the same arc play out with nation-states:
Startup: The United States was a startup country, and westward expansion was its growth strategy. Manifest destiny pushed settlers into territory with minimal federal presence, ad hoc justice, and rampant land grabs. The Wild West wasn’t a policy choice; there was little to protect and no capacity to enforce process.
Growth: The frontier officially closed in 1890 when the Census declared no more contiguous unsettled land. Population concentrated in cities as railroads and industry scaled. Harms became impossible to ignore: Standard Oil’s monopoly, meat-packing horrors exposed by muckrakers, 146 workers killed in the Triangle Shirtwaist fire. The Progressive Era asked a question that would echo for decades: how do we make sure this never happens again? The regulatory state began with the FDA, the FTC, antitrust enforcement, and labor laws.
Formalization: By the 1970s, a new generation of harms demanded attention: smog-choked cities, polluted rivers, destroyed ecosystems. Laws like the National Environmental Policy Act (NEPA) built an apparatus designed to slow down building so abuses could be seen and stopped. The proceduralism driven by laws ushered in a crisis of red tape, slowing down all building indiscriminately. Wind farms that would replace coal plants spend over a decade in permitting.
Elaboration: In the late 1970s, regulations became tangible villains amidst the Winter of Discontent in the UK, gas lines in the U.S., and empty shelves. Think tanks focused attention on postwar interventions as the cause. Airline deregulation in the U.S. was the first domino to fall in the wave leading to the Thatcher/Reagan era, when deregulation was extended to trucking, energy, finance, and labor.
Whether these reversals were wise policy is fiercely contested and beyond the scope of this essay. The point is narrower: these are the rare cases when the ratchet actually turned backward, and they share a common structure.
Wartime poses an existential threat to the nation itself. Two world wars concentrated executive authority and created a “just get it done” culture that overrode peacetime proceduralism. Processes didn’t disappear; they were suspended by war powers.
Economic crises create an existential threat to political leadership. Stagflation didn’t threaten Britain’s or America’s survival, but it threatened the governing consensus. Thatcher-Reagan had permission to deregulate because the status quo was politically untenable.
Corporate near-death is the existential equivalent to wartime. Apple was 90 days from bankruptcy at the start of Jobs’s second tenure. IBM faced similarly dire circumstances when Lou Gerstner took over as CEO. The board grants emergency latitude (wartime powers) because the alternative is liquidation.
In all of these examples, existential threat provided the permission structure for reversing the ratchet. Nations have learned to formalize the exception for wartime. Economic crises and corporate near-death still rely on improvised permission; someone has to recognize the threat and act before the formal structures catch up.
This is why most organizations never reach elaboration: the threat has to be visible and credible enough to override the local incentives. Without a crisis, only sustained leadership attention can create the permission structure to reverse the ratchet.
Conclusion
Process is a tool for suppressing variance. It has legitimate, sometimes essential applications. Organizations handling money, lives, or safety-critical systems should suppress variance heavily. The cost of error justifies the cost of rigidity.
Process has a natural tendency to expand beyond its useful boundaries. The mechanism is seductive: each addition makes sense in isolation, prevents a visible harm, and carries only invisible costs. The ratchet turns. Turning it back requires fighting psychology, organizational incentives, and the accumulated weight of well-intentioned decisions.
The leaders who understand this recognize that process management is not administrative overhead; it’s strategic work. Dialing in the level right for each context and actively fighting accumulation is how organizations maintain the capacity for both reliability and innovation.
Most organizations fail at this. They add process and never remove it. They let the ratchet turn until the organization is optimized for preventing yesterday’s mistakes at the cost of discovering tomorrow’s opportunities.
The pattern is predictable, and the mechanism is understood. The question is whether leaders will treat it as their responsibility.
Systems that fear variance eventually fear learning, and systems that fear learning eventually stop adapting. The rest is just a matter of time.


