Laws for software engineers
The following adages have been valuable to me over the course of my career.
- Brooks’s Law (adding developers to a project)
- Cargill’s Law / The Ninety-Ninety Rule (last 10 percent is the hardest)
- Chesterton’s Fence (ripping things out before you understand them)
- Conway’s Law (systems tend to mirror org charts)
- Cunningham’s Law (the unreasonable effectiveness of wrong answers)
- Eagleson’s Law (the limits of code authorship)
- Gall’s Law (complex systems must evolve from simple systems)
- Goodhart’s Law (measures becoming targets)
- Greenspun’s Tenth Rule (you can’t keep it simple for long)
- Hanlon’s Razor (assume stupidity rather than malice)
- Hofstadter’s Law (it takes longer than you expect)
- Information Hiding (protect your modules from each other)
- Law of Resource Allocation (efficiency vs. effectiveness)
- Locality of Behavior (code should be easy to understand in small doses)
- No Silver Bullet (all tasks involve irreducible complexity)
- Parkinson’s Law (work expands to fill time)
- POSIWID (systems do what they’re supposed to)
- Postel’s Law / The Robustness Principle (be consistent, expect inconsistency)
- XY Problem (ask about the problem, not what you think the solution is)
- Zawinski’s Law (programs tend to expand)
Brooks’s Law (adding developers to a project)
Adding manpower to a late software project makes it later.
– Fred Brooks, The Mythical Man-Month (1975)
Original context: Software development.
Caveats: Oversimplified.
What it means: Software is not an assembly line. When you add developers and project managers to a project, you decrease the team’s average familiarity with the problem space and existing code, which slows the team down until the new people are up to speed. Further, software development is a highly collaborative task, so it can’t be parceled out to independent contributors.
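Part of Brooks’s reasoning is pure arithmetic: a team of n people has n(n-1)/2 possible pairwise communication channels, so coordination overhead grows much faster than headcount. A quick sketch (the numbers are mine, not from the book):

```typescript
// Pairwise communication channels in a team of n people: n(n-1)/2.
function channels(n: number): number {
  return (n * (n - 1)) / 2;
}

for (const n of [3, 5, 10, 20]) {
  console.log(`${n} people: ${channels(n)} channels`);
}
// 3 people: 3 channels
// 5 people: 10 channels
// 10 people: 45 channels
// 20 people: 190 channels
```

Doubling the team from 10 to 20 more than quadruples the ways it can miscommunicate.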
Cargill’s Law / The Ninety-Ninety Rule (last 10 percent is the hardest)
The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.
– Tom Cargill, quoted by Jon Bentley, “Programming pearls: Bumper-Sticker Computer Science” (1985)
Original context: Software development.
Caveats: Tongue-in-cheek.
What it means: Every project looks easier than it is before you’ve gotten into the details, hence the cliché of software projects blowing past deadlines and budget projections.
See also: Hofstadter’s Law
Chesterton’s Fence (ripping things out before you understand them)
There exists […] a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”
– G. K. Chesterton, The Thing: Why I Am a Catholic (1929)
Original context: Christian apologetics.
Caveats: Carries some historical baggage.
What it means: A system is typically the result of several non-obvious constraints and long-considered decisions; that is, it has a history. A newcomer who doesn’t bother to learn that history is doomed to re-learn old lessons.
Conway’s Law (systems tend to mirror org charts)
Organizations which design systems … are constrained to produce designs which are copies of the communication structures of these organizations.
– Melvin Conway, “How Do Committees Invent?” (1968)
Original context: System design.
Caveats: None.
What it means: Teams tend to work on their own components, not on each other’s. When you form a development team, you incentivize the creation of a differentiated software component aligned with the goals of that team. Your system as a whole is destined to mirror the structure and communication patterns of your organization.
Corollary by yours truly: When two teams’ software components interact, the quality and reliability of the interaction largely depends on how easy it is for contributors on the respective teams to ask each other for favors.
Cunningham’s Law (the unreasonable effectiveness of wrong answers)
The best way to get the right answer on the Internet is not to ask a question; it’s to post the wrong answer.
– Ward Cunningham, quoted by Steven McGeady in The New York Times (2010)
Original context: Wry newspaper column.
Caveats: Cunningham says it’s a “misquote that disproves itself.”
What it means: People are naturally reactive, not proactive. The quote is meant literally, but is also a good approach to getting things done at work: if you ask what to do, you might not get a response. If you loudly announce what you’re doing, people will correct or stop you as needed.
Eagleson’s Law (the limits of code authorship)
Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.
– Unknown, possibly Peter S. Eagleson (date unknown)
Original context: Software development.
Caveats: “Six months” is a very rough estimate; actual time may vary.
What it means: If your code is hard to read, it doesn’t matter how well you understand it right now. Any time you put into simplifying it will pay dividends later on.
Gall’s Law (complex systems must evolve from simple systems)
A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.
– John Gall, Systemantics: How Systems Work and Especially How They Fail (1975)
Original context: Systems theory.
Caveats: None.
What it means: You can’t build working systems without iterating. Start with the smallest working thing, then build out from there.
Goodhart’s Law (measures becoming targets)
When a measure becomes a target, it ceases to be a good measure.
– Marilyn Strathern, “‘Improving ratings’: audit in the British University system” (1997)
Alternately:
- “[E]very measure which becomes a target becomes a bad measure.” – Keith Hoskin, “The ‘awful idea of accountability’: inscribing people into the measurement of objects” (1996)
- “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.” – Charles Goodhart, “Problems of Monetary Management: The UK Experience” (1975)
Original context: Economics.
Caveats: None.
What it means: Any metric can be gamed, so be careful what you wish for. Measures are proxies for what you actually want. When people see what you’re measuring, they’ll often deliver it by disregarding or defeating your underlying goal.
Greenspun’s Tenth Rule (you can’t keep it simple for long)
Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.
– Philip Greenspun, “10th rule of programming” (1993)
Original context: Software development.
Caveats: Tongue-in-cheek and confusing. The quote isn’t about Lisp’s literal syntax or semantics, but rather its built-in functionality and extensibility.
What it means: To take it literally: many large programs develop such a need for dynamic, spontaneously-defined behavior that they sleepwalk into owning a custom, half-baked script interpreter that barely works and is far more difficult to maintain than off-the-shelf Lisp. More broadly: simplicity is fleeting but complexity is forever. A good developer knows how to recognize solved problems and plug in robust, well-known technologies rather than saying “we don’t need all that” and writing a one-off class that inexorably grows until it can no longer be understood or maintained.
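As a hypothetical sketch of how this happens (the rule syntax and field names below are invented for illustration), consider a “simple” feature for user-configurable rules:

```typescript
// Version 1 of a "simple" configurable-rules feature. Today it handles
// "price > 100 and region = EU"; soon someone needs "or", parentheses,
// negation, string functions... and you're maintaining a half-specified
// language nobody designed on purpose.
function evalRule(rule: string, ctx: Record<string, string | number>): boolean {
  return rule.split(" and ").every((clause) => {
    const [field, op, value] = clause.trim().split(/\s+/);
    if (op === ">") return Number(ctx[field]) > Number(value);
    if (op === "=") return String(ctx[field]) === value;
    throw new Error(`Unsupported operator: ${op}`);
  });
}

evalRule("price > 100 and region = EU", { price: 120, region: "EU" }); // true
```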
Hanlon’s Razor (assume stupidity rather than malice)
Never attribute to malice that which is adequately explained by stupidity.
– Robert J. Hanlon, quoted by Arthur Bloch, Murphy’s Law Book Two: More Reasons Why Things Go Wrong! (1980)
Original context: Humorous quotations.
Caveats: Doesn’t describe patterns of behavior as well as POSIWID.
What it means: Everyone’s incompetent sometimes, but not everyone is a jerk. If you assume incompetence first, you’ll be right most of the time.
Hofstadter’s Law (it takes longer than you expect)
It always takes longer than you expect, even when you take into account Hofstadter’s Law.
– Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid (1979)
Original context: Development of chess-playing computers.
Caveats: Tongue-in-cheek.
What it means: A complex task is easily underestimated. Assume it’s going to take longer than you think, then a little longer than that.
See also: Parkinson’s Law
Information Hiding (protect your modules from each other)
[I]t is helpful if most system information is hidden from most programmers[.]
– D. L. Parnas, “Information Distribution Aspects of Design Methodology” (1971)
Alternately:
- “Approximately my entire strategy as a dev these days can be summed up like this: if you’re going to do something messy, complicated, dangerous, or unpleasant, put it in the smallest, tightest box you can find and give it the cleanest, simplest interface possible. Don’t let a moment of complexity warp your whole codebase.” – Yours truly, @isaaclyman@toot.cafe (2022)
- “good cut point has narrow interface with rest of system: small number of functions or abstractions that hide complexity demon internally, like trapped in crystal / grug quite satisfied when complexity demon trapped properly in crystal, is best feeling to trap mortal enemy!” – Carson Gross, “The Grug Brained Developer: A layman’s guide to thinking like the self-aware smol brained” (2022)
Original context: Software development.
Caveats: None.
What it means: If you’re solving a complex problem or using a complex API, create a simple interface for it and keep the complexity tightly contained. All codebases have messy parts; they might have to be someone’s problem, but they don’t have to be everyone’s problem.
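A minimal sketch of the principle (the payment vendor and its quirks are invented for illustration): the rest of the codebase sees only a small interface, and the mess stays in one file.

```typescript
// The narrow interface everyone else depends on.
export interface PaymentGateway {
  charge(amountCents: number, customerId: string): Promise<string>;
}

// The mess, trapped in its box: legacy payload quirks, vendor-specific
// error handling. If the vendor changes, only this file changes.
export function createGateway(apiKey: string): PaymentGateway {
  return {
    async charge(amountCents, customerId) {
      // Hypothetical vendor quirk: amounts are strings of tenths of a cent.
      const res = await fetch("https://api.payments.example/v1/charge", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          amt: String(amountCents * 10),
          cust: customerId,
          key: apiKey,
        }),
      });
      if (!res.ok) throw new Error(`Charge failed: ${res.status}`);
      const data = (await res.json()) as { receipt_id: string };
      return data.receipt_id;
    },
  };
}
```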
See also: Locality of Behavior
Law of Resource Allocation (efficiency vs. effectiveness)
An efficient system always has need available when a resource becomes free, and an effective system always has a resource available when a need arises.
– Unknown, quoted by Dr. Matthew Russell @matteomics.bsky.social
Original context: Engineering.
Caveats: None.
What it means: Efficiency and effectiveness are in tension. A perfectly efficient system (one where resources like time, money, and energy are never wasted) can’t handle unforeseen problems because there’s no extra capacity. A perfectly effective system (one where everyone’s needs are met and problems are resolved quickly) has to keep spare capacity on hand, which means some resources inevitably sit idle.
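A toy simulation of the tension, under assumptions I’ve made up for illustration (tasks arrive at random, averaging two per day): run a team at 100 percent average utilization and the backlog drifts upward without bound; give it slack and the backlog stays near zero.

```typescript
// Tasks arrive uniformly at random (0-4 per day, average 2). A team
// finishes up to `capacityPerDay` tasks a day. Leftovers pile up.
function finalBacklog(capacityPerDay: number, days: number): number {
  let backlog = 0;
  for (let d = 0; d < days; d++) {
    const arrivals = Math.floor(Math.random() * 5); // 0..4, mean 2
    backlog = Math.max(0, backlog + arrivals - capacityPerDay);
  }
  return backlog;
}

// "Efficient": capacity exactly matches average demand. The backlog
// random-walks upward because there's no slack to absorb bad days.
console.log(finalBacklog(2, 1000));
// "Effective": one task per day of slack. The backlog stays near zero,
// but the team is idle part of the time.
console.log(finalBacklog(3, 1000));
```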
Locality of Behavior (code should be easy to understand in small doses)
The primary feature for easy maintenance is locality: Locality is that characteristic of source code that enables a programmer to understand that source by looking at only a small portion of it.
– Richard P. Gabriel, Patterns of Software: Tales from the Software Community (1996)
Original context: Software development.
Caveats: None.
What it means: The only way to make your code easy to read (and therefore easy to maintain) is to keep each logical piece of it as topical and self-contained as possible. If a programmer can’t predict the behavior of a method without hunting down ten related files, or if it depends on implicit or “magical” behavior not suggested in the code, it’s not maintainable.
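A small sketch of the contrast (all names are invented). In the version below, everything the function does is visible where it’s written; a reader doesn’t need to know about decorators, global hooks, or configuration registered elsewhere:

```typescript
type User = { id: string; name: string };

const cache = new Map<string, User>();
const auditTrail: string[] = [];

// Stand-in for a real database call.
async function fetchUserFromDb(id: string): Promise<User> {
  return { id, name: `user-${id}` };
}

// Low locality would look like:
//   @Cached @Audited getUser(id) { ... }
// with caching and auditing implemented by machinery in other files.
// High locality: the behavior reads top to bottom, right here.
async function getUser(id: string): Promise<User> {
  const cached = cache.get(id);
  if (cached) return cached; // caching is visible at the call site,
  const user = await fetchUserFromDb(id);
  auditTrail.push(`getUser:${id}`); // and so is auditing.
  cache.set(id, user);
  return user;
}
```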
See also: Information Hiding
No Silver Bullet (all tasks involve irreducible complexity)
There is no single development, in either technology or management technique, which by itself promises even one order of magnitude improvement in productivity, in reliability, in simplicity.
– Fred Brooks, “No Silver Bullet: Essence and Accidents of Software Engineering” (1986)
Original context: Software development.
Caveats: None.
What it means: Software development involves two kinds of work: “essential” and “accidental” (per Brooks). The essential work is inherent in the real-life problems we’re trying to solve, and no amount of technology will make it disappear or become conceptually simple. The accidental work can be simplified by better tools, but it can never be eliminated completely, and it makes up less than 90 percent of our workload anyway, so even erasing all of it would fall short of a tenfold gain. That’s why better tools will never give us an order-of-magnitude improvement.
Parkinson’s Law (work expands to fill time)
Work expands so as to fill the time available for its completion.
– C. Northcote Parkinson, “Parkinson’s Law” (1955)
Original context: Bureaucracy in civil service.
Caveats: More about the nature of human beings than the nature of work.
What it means: If you plan to take all week on something, you probably will.
See also: Cargill’s Law
POSIWID (systems do what they’re supposed to)
The purpose of a system is what it does.
– Stafford Beer, Diagnosing the System for Organizations (1985)
Alternately:
- “[There is] no point in claiming that the purpose of a system is to do what it constantly fails to do.” – Stafford Beer, quoted by David Benjamin and David Komlos, “The Purpose Of A System Is What It Does, Not What It Claims To Do” (2021)
- “[T]he machine is never broken.” – Anil Dash, “Systems: The Purpose of a System is What It Does” (2024)
Original context: Cybernetics.
Caveats: Meant to aid our understanding of systems, not necessarily of their creators’ intentions.
What it means: Systems tend toward consistent behavior over time, and consistent behavior isn’t explainable by circumstance or accident. The effects of a given system should be assumed features, not bugs—that is, both fundamental and desirable to the maintainers.
Postel’s Law / The Robustness Principle (be consistent, expect inconsistency)
Be conservative in what you do, be liberal in what you accept from others.
– Jon Postel, TRANSMISSION CONTROL PROTOCOL (1980)
Original context: Computer networking (the original specification for TCP).
Caveats: “Conservative” and “liberal” are meant in the general sense, not the political.
What it means: You should write software to behave as predictably as possible (outputs), but not necessarily expect the same predictability from your consumers/users (inputs).
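A minimal sketch (the flag format is invented for the example): accept the spellings consumers actually send, but always emit one canonical form.

```typescript
// Liberal in what we accept: tolerate common representations on input.
function parseFlag(raw: string): boolean {
  const v = raw.trim().toLowerCase();
  if (["true", "yes", "1", "on"].includes(v)) return true;
  if (["false", "no", "0", "off"].includes(v)) return false;
  throw new Error(`Unrecognized flag value: ${raw}`);
}

// Conservative in what we do: always produce one predictable output form.
function serializeFlag(value: boolean): string {
  return value ? "true" : "false";
}

parseFlag(" YES ");   // true
serializeFlag(false); // "false"
```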
XY Problem (ask about the problem, not what you think the solution is)
Q: How can I use X to do Y?
A: If what you want is to do Y, you should ask that question without pre-supposing the use of a method that may not be appropriate.
– Eric Steven Raymond and Rick Moen, “How To Ask Questions The Smart Way” (2004)
Original context: Software development.
Caveats: There are often several different layers of Y.
What it means: Before you ask for help with something, make sure you’re including all the context, including the original problem you’re trying to solve. If you ask “Why does my spoon keep breaking?”, you’ll waste a lot of time buying stronger spoons and reinforcing them. If you ask “What’s the best way to dig a trench?”, someone will teach you about shovels.
Zawinski’s Law (programs tend to expand)
Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.
– Jamie Zawinski, “jwzhacks” (1995)
Original context: Software.
Caveats: Most relevant to commercial software.
What it means: People are quicker to buy an application that does 10 things poorly than one that does 1 thing well. Furthermore, everyone expects their use case to be accommodated. Thus successful apps are incentivized to grow into “platforms” with large numbers of features, crowding out smaller and more efficient tools.