The following laws have been valuable to me over the course of my career.

Brooks’s Law (adding developers to a project)

Adding manpower to a late software project makes it later.

– Fred Brooks, The Mythical Man-Month (1975)

Original context: Software development.

Caveats: Oversimplified.

What it means: Software is not an assembly line. When you add developers and project managers to a project, you decrease the team’s average familiarity with the problem space and the existing code, which slows the team down until the new people are up to speed. Further, software development is highly collaborative, so it can’t be cleanly parceled out to independent contributors: every person you add increases the number of communication paths the team has to maintain.

Cargill’s Law / The Ninety-Ninety Rule (last 10 percent is the hardest)

The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.

– Tom Cargill, quoted by Jon Bentley, “Programming pearls: Bumper-Sticker Computer Science” (1985)

Original context: Software development.

Caveats: Tongue-in-cheek.

What it means: Every project looks easier than it is before you’ve gotten into the details, hence the cliché of software projects blowing past deadlines and budget projections.

See also: Hofstadter’s Law

Chesterton’s Fence (ripping things out before you understand them)

There exists […] a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”

– G. K. Chesterton, The Thing: Why I Am a Catholic (1929)

Original context: Christian apologetics.

Caveats: Carries some historical baggage.

What it means: A system is typically the result of several non-obvious constraints and long-considered decisions; that is, it has a history. A newcomer who doesn’t bother to learn that history is doomed to re-learn old lessons.
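In code, a “fence” usually looks like a line you’re tempted to delete. A minimal, hypothetical sketch (the scenario, the function, and its history are all invented for illustration):

    import unicodedata

    def normalize_username(raw: str) -> str:
        # This looks redundant for plain ASCII names, so a newcomer might
        # happily "clear it away." History: it was added after two visually
        # identical Unicode usernames let one account impersonate another.
        # Understand that before you remove it.
        canonical = unicodedata.normalize("NFKC", raw)
        return canonical.strip().lower()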

Conway’s Law (systems tend to mirror org charts)

Organizations which design systems … are constrained to produce designs which are copies of the communication structures of these organizations.

– Melvin Conway, “How Do Committees Invent?” (1968)

Original context: System design.

Caveats: None.

What it means: Teams tend to work on their own things, not on other teams’ things. When you form a development team, you incentivize the creation of a differentiated software component aligned with that team’s goals. Your system as a whole is destined to mirror the structure and communication patterns of your organization.

Corollary by yours truly: When two teams’ software components interact, the quality and reliability of the interaction largely depend on how easy it is for contributors on the respective teams to ask each other for favors.

Cunningham’s Law (the unreasonable effectiveness of wrong answers)

The best way to get the right answer on the Internet is not to ask a question; it’s to post the wrong answer.

– Ward Cunningham, quoted by Steven McGeady in The New York Times (2010)

Original context: Wry newspaper column.

Caveats: Cunningham says it’s a “misquote that disproves itself.”

What it means: People are naturally reactive, not proactive. The quote is meant literally, but it also describes a good approach to work: if you ask what to do, you might not get a response; if you loudly announce what you’re doing, people will correct or stop you as needed.

Eagleson’s Law (the limits of code authorship)

Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.

– Unknown, possibly Peter S. Eagleson (date unknown)

Original context: Software development.

Caveats: “Six months” is a very rough estimate; actual time may vary.

What it means: If your code is hard to read, it doesn’t matter how well you understand it right now. Any time you put into simplifying it will pay dividends later on.
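A small, hypothetical before/after sketch: the same filter written once for present-you and once for the you of six months from now.

    # Both functions do exactly the same thing over the same records.
    def f(xs, t):
        return [x for x in xs if x["due"] < t and not x["paid"]]

    def overdue_unpaid_invoices(invoices, today):
        """Return invoices whose due date has passed and that are still unpaid."""
        return [
            invoice for invoice in invoices
            if invoice["due"] < today and not invoice["paid"]
        ]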

Gall’s Law (complex systems must evolve from simple systems)

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.

– John Gall, Systemantics: How Systems Really Work and How They Fail (1975)

Original context: Systems theory.

Caveats: None.

What it means: You can’t build working systems without iterating. Start with the smallest working thing, then build out from there.

Goodhart’s Law (measures becoming targets)

When a measure becomes a target, it ceases to be a good measure.

– Marilyn Strathern, “‘Improving ratings’: audit in the British University system” (1997)

Alternately:

  • “[E]very measure which becomes a target becomes a bad measure.” – Keith Hoskin, “The ‘awful idea of accountability’: inscribing people into the measurement of objects” (1996)
  • “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” – Charles Goodhart, “Problems of Monetary Management: The UK Experience” (1975)

Original context: Economics.

Caveats: None.

What it means: Any metric can be gamed, so be careful what you wish for. Measures are proxies for what you actually want; when people see what you’re measuring, they’ll often deliver the number while disregarding, or even defeating, your underlying goal.

Hanlon’s Razor (assume stupidity rather than malice)

Never attribute to malice that which is adequately explained by stupidity.

– Robert J. Hanlon, quoted by Arthur Bloch, Murphy’s Law Book Two: More Reasons Why Things Go Wrong! (1980)

Original context: Humorous quotations.

Caveats: Doesn’t describe patterns of behavior as well as POSIWID.

What it means: Everyone’s incompetent sometimes, but not everyone is a jerk. If you assume incompetence first, you’ll be right most of the time.

Hofstadter’s Law (it takes longer than you expect)

It always takes longer than you expect, even when you take into account Hofstadter’s Law.

– Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid (1979)

Original context: Development of chess-playing computers.

Caveats: Tongue-in-cheek.

What it means: A complex task is easily underestimated. Assume it’s going to take longer than you think, then a little longer than that.

See also: Parkinson’s Law

Law of Resource Allocation (efficiency vs. effectiveness)

An efficient system always has need available when a resource becomes free, and an effective system always has a resource available when a need arises.

– Unknown, quoted by Dr. Matthew Russell @matteomics.bsky.social

Original context: Engineering.

Caveats: None.

What it means: Efficiency and effectiveness are in tension. A perfectly efficient system (one where resources like time, money, and energy are never wasted) can’t handle unforeseen problems, because there’s no spare capacity. A perfectly effective system (one where everyone’s needs are met and problems are resolved quickly) keeps more resources on hand than it is using at any given moment, so some capacity always sits idle.
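One way to make that tension concrete is with basic queueing arithmetic. Here is a rough sketch of my own (not part of the original law), using the textbook M/M/1 waiting-time formula: the busier a single resource is kept, the longer each new need waits.

    def average_time_in_system(service_rate: float, utilization: float) -> float:
        """M/M/1 queue: average time a need spends waiting plus being served."""
        arrival_rate = service_rate * utilization
        return 1 / (service_rate - arrival_rate)

    # One resource that can handle 10 needs per hour:
    for utilization in (0.5, 0.8, 0.9, 0.99):
        wait = average_time_in_system(10, utilization)
        print(f"{utilization:.0%} utilized -> {wait:.2f} hours per need")
    # 50% utilized -> 0.20 hours per need; 99% utilized -> 10.00 hours per need.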

Parkinson’s Law (work expands to fill time)

Work expands so as to fill the time available for its completion.

– C. Northcote Parkinson, “Parkinson’s Law” (1955)

Original context: Bureaucracy in civil service.

Caveats: More about the nature of human beings than the nature of work.

What it means: If you plan to take all week on something, you probably will.

See also: Cargill’s Law

POSIWID (systems do what they’re supposed to)

The purpose of a system is what it does.

– Stafford Beer, Diagnosing the System for Organizations (1985)

Original context: Cybernetics.

Caveats: Meant to aid our understanding of systems, not necessarily their creators.

What it means: Systems tend toward consistent behavior over time, and consistent behavior isn’t explainable by circumstance or accident. The effects of a given system should be assumed features, not bugs—that is, both fundamental and desirable to the maintainers.

Postel’s Law / The Robustness Principle (be consistent, expect inconsistency)

Be conservative in what you do, be liberal in what you accept from others.

– Jon Postel, TRANSMISSION CONTROL PROTOCOL (1980)

Original context: Computer networking (the original specification for TCP).

Caveats: “Conservative” and “liberal” are meant in the general sense, not the political.

What it means: You should write software to behave as predictably as possible (outputs), but not necessarily expect the same predictability from your consumers/users (inputs).
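A minimal sketch of the principle at an API boundary (a hypothetical example, assuming a service that exchanges dates as strings): tolerate several common formats on input, but always emit one canonical format.

    from datetime import date, datetime

    ACCEPTED_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%B %d, %Y")

    def parse_date_liberally(raw: str) -> date:
        """Be liberal in what you accept: tolerate a few common date formats."""
        for fmt in ACCEPTED_FORMATS:
            try:
                return datetime.strptime(raw.strip(), fmt).date()
            except ValueError:
                continue
        raise ValueError(f"unrecognized date: {raw!r}")

    def format_date_conservatively(value: date) -> str:
        """Be conservative in what you do: always emit ISO 8601."""
        return value.isoformat()

    print(format_date_conservatively(parse_date_liberally(" 03/02/2024 ")))  # 2024-02-03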

Zawinski’s Law (programs tend to expand)

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

– Jamie Zawinski, “jwzhacks” (1995)

Original context: Software.

Caveats: Most relevant to commercial software.

What it means: People are quicker to buy an application that does 10 things poorly than 1 thing well. Furthermore, everyone expects their use case to be accommodated. Thus successful apps are incentivized to grow into “platforms” with large numbers of features, crowding out smaller and more efficient tools.