Old but… still applies: Laws of Software Development
I ran across a new eponymous law while reading “Understanding
Enterprise SOA” by Eric Pulier and Hugh Taylor.
Pulier’s rule of reuse:
A programmer will look at another programmer’s output, no matter
how brilliant, and declare it garbage.
The best laws and rules are named by others who admire the author’s
work, but this one seemed pretty worthy to me. It got me reminiscing
about some laws of software development that perhaps have the same
rigour as Newton’s Laws, Boyle’s
Law or Hooke’s Law.
The rules I have admired during my career are:
Brooks’s Law: adding manpower to a late software project makes it later.
Hofstadter’s Law: It always takes longer than you expect, even when you take
Hofstadter’s Law into account.
Codd’s Rules <http://www.webopedia.com/TERM/C/Codds_Rules.html> of
relational database management systems, which I won’t quote here because
there are 13 of them (numbered 0 to 12).
Moore’s Law:
The number of transistors
<http://en.wikipedia.org/wiki/Transistors> that can be inexpensively
placed on an integrated circuit
<http://en.wikipedia.org/wiki/Integrated_circuit> is increasing
exponentially <http://en.wikipedia.org/wiki/Exponential_growth> ,
doubling approximately every two years.
This has a number of applications outside of transistors such as
processing speed, memory and disk capacity.
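Moore’s doubling can be sketched numerically. The starting point below (roughly 2,300 transistors for the Intel 4004 in 1971) is an illustrative assumption, not a claim from this posting:

```python
def transistors(year, base_year=1971, base_count=2300):
    """Estimate transistor count assuming a doubling every two years,
    per Moore's Law. base_count ~ Intel 4004 (an assumed starting point)."""
    return base_count * 2 ** ((year - base_year) / 2)

# Ten doublings over twenty years -> roughly a thousandfold increase.
for year in (1971, 1981, 1991):
    print(year, f"{transistors(year):,.0f}")
```

The exponent `(year - base_year) / 2` is all the law amounts to: one doubling per two-year period.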
Clarke’s 3rd Law:
Any sufficiently advanced technology is indistinguishable from magic.
The reason I class this as a software development law is that it points
out the futility of trying to explain the technical details of software
to business users. If a technical decision does not have any business
impact then it may as well be magic.
I thought I would check what other software development laws there are
on the net. Here is a good sample from some useful sources:
Amdahl’s Law <http://en.wikipedia.org/wiki/Amdahl%27s_law>
The speedup gained from running a program on a parallel computer
is greatly limited by the fraction of that program that can’t be
parallelized.
Asimov’s Laws (yes, it’s SF, but one day a software developer might have
to implement them):
1. A robot may not injure a human being or, through inaction, allow
a human being to come to harm.
2. A robot must obey orders given to it by human beings, except
where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such
protection does not conflict with the First or Second Law.
Anytime you wish to demonstrate something, the number of faults
is proportional to the number of viewers.
Clarke’s 1st Law:
When a distinguished but elderly scientist states that something
is possible, he is almost certainly right. When he states that something
is impossible, he is very probably wrong.
Clarke’s 2nd Law:
The only way of discovering the limits of the possible is to
venture a little way past them into the impossible.
Clarke’s 4th Law (He’s not often credited with this one):
For every expert there is an equal and opposite expert.
Conway’s Law <http://en.wikipedia.org/wiki/Conway%27s_Law> :
Any piece of software reflects the organizational structure that
produced it.
You cannot apply a technological solution to a sociological problem.
Greenspun’s Tenth Rule of Programming (there actually are no rules 1 to 9):
Any sufficiently complicated C
<http://en.wikipedia.org/wiki/C_%28programming_language%29> or Fortran
<http://en.wikipedia.org/wiki/Fortran> program contains an ad hoc,
informally-specified, bug-ridden
<http://en.wikipedia.org/wiki/Computer_bug>, slow implementation of half
of Common Lisp <http://en.wikipedia.org/wiki/Common_Lisp>.
This can also be worded “Those who do not understand Lisp are doomed to reinvent it.”
Gustafson’s Law <http://en.wikipedia.org/wiki/Gustafson%27s_Law> (also known as Gustafson-Barsis’ law)
Any sufficiently large problem can be efficiently parallelized
<http://en.wikipedia.org/wiki/Parallel_computing>.
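Gustafson’s scaled speedup is the optimistic counterpart to Amdahl’s: the problem grows with the machine. A minimal sketch, with the 5% serial fraction an illustrative assumption:

```python
def gustafson_speedup(serial_fraction, processors):
    """Gustafson's scaled speedup, S = N - s * (N - 1), where s is the
    serial fraction of the run and N the processor count. Unlike
    Amdahl's fixed-size view, the workload scales with the machine."""
    return processors - serial_fraction * (processors - 1)

# With a 5% serial fraction, 1024 processors still deliver a
# near-linear scaled speedup (a little under 973x).
print(round(gustafson_speedup(0.05, 1024), 2))
```

The contrast with Amdahl’s Law is the point: grow the problem along with the processor count and the serial fraction stops being a hard ceiling.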
Kerckhoffs’ law <http://en.wikipedia.org/wiki/Kerckhoffs%27_principle>
on secure cryptography:
A cryptosystem should be secure even if everything about the
system, except the key, is public knowledge.
Lehman’s Laws:
1. Systems that are used must change or automatically become less
useful.
2. Through changes the structure of a system becomes ever more
complex, and more resources are needed to simplify it.
Linus’s law <http://en.wikipedia.org/wiki/Linus%27s_law> – named for
Linus Torvalds <http://en.wikipedia.org/wiki/Linus_Torvalds> , given enough eyeballs, all bugs
<http://en.wikipedia.org/wiki/Computer_bug> are shallow.
Lubarsky’s law of Cybernetic Entomology:
There is always one more bug.
McLuhan’s Law:
If it works it’s obsolete.
Metcalfe’s law <http://en.wikipedia.org/wiki/Metcalfe%27s_law>
In communications <http://en.wikipedia.org/wiki/Communication>
and network <http://en.wikipedia.org/wiki/Network_theory> theory, the
value of a system grows as approximately the square of the number of
users of the system.
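The n² growth comes from counting possible pairwise links, which a few lines make concrete:

```python
def possible_connections(users):
    """Distinct pairwise links in a network of n users:
    n * (n - 1) / 2, which grows as roughly n**2 / 2."""
    return users * (users - 1) // 2

# Doubling the user count roughly quadruples the possible links,
# which is Metcalfe's point about network value.
print(possible_connections(10))   # 45
print(possible_connections(20))   # 190
```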
Murphy’s Law:
If anything can go wrong, it will.
Occam’s Razor: <http://en.wikipedia.org/wiki/Occam>
There are many wordings for Occam’s razor and debate about how it is
interpreted, but for software development I think the best wording is
“The simplest solution is usually the best”. This is very nearly the
same as the KISS principle: “Keep it simple, stupid”.
Putt’s Law:
Technology is dominated by two types of people: those who
understand what they do not manage, and those who manage what they do
not understand.
Sturgeon’s Revelation (sometimes referred to as his second law):
Ninety percent of everything is crap.
Weinberg’s Second Law:
If builders built buildings the way programmers wrote programs,
then the first woodpecker that came along would destroy civilization.
Wirth’s law <http://en.wikipedia.org/wiki/Wirth%27s_law>
Software gets slower faster than hardware gets faster.
Zawinski’s law <http://en.wikipedia.org/wiki/Jamie_Zawinski>
Every program attempts to expand until it can read mail. Those
programs which cannot so expand are replaced by ones which can.
Here are still more great laws from Joey DeVilla’s blog posting, listed
as the law, who said it, and what it says:
Ellison’s Law of Cryptography and Usability
Carl Ellison <http://world.std.com/~cme/> The user base for strong
cryptography declines by half with every additional keystroke or mouse
click required to make it work.
Ellison’s Law of Data
Larry Ellison <http://en.wikipedia.org/wiki/Larry_Ellison>
Once the business data have been centralized and integrated, the value
of the database is greater than the sum of the preexisting parts.
Flon’s Axiom <http://www.cs.iastate.edu/~leavens/ComS541Fall98/hw-pages/comparing/index.html>
Lawrence Flon <http://libra.msra.cn/authordetail.aspx?id=403157>
There does not now, nor will there ever, exist a programming language in
which it is the least bit hard to write bad programs.
Gilder’s Law <http://www.netlingo.com/lookup.cfm?term=Gilder>
George Gilder <http://en.wikipedia.org/wiki/George_Gilder>
Bandwidth grows at least three times faster than computer power.
Grosch’s Law <http://en.wikipedia.org/wiki/Grosch>
Herb Grosch <http://en.wikipedia.org/wiki/Herb_Grosch> The cost of
computing systems increases as the square root of the computational
power of the systems.
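Grosch’s economy of scale is a one-line formula; sketched here as stated above (cost growing with the square root of power):

```python
import math

def relative_cost(power_ratio):
    """Grosch's Law as stated: cost grows with the square root of
    computational power, so 4x the power costs only 2x as much."""
    return math.sqrt(power_ratio)

print(relative_cost(4))    # 2.0
print(relative_cost(100))  # 10.0
```

Read the other way round, the law says computing power grows as the square of the cost, which was the mainframe-era argument for buying bigger machines.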
Hartree’s Law
Douglas Hartree
Whatever the state of a project, the time a project-leader will estimate
for completion is constant.
Heisenbug Uncertainty Principle <http://en.wikipedia.org/wiki/Heisenbug>
Jim Gray <http://en.wikipedia.org/wiki/Jim_Gray_(computer_scientist)>
Most production software bugs are soft: they go away when you look at
them.
Hoare’s Law of Large Programs
C. A. R. Hoare <http://en.wikipedia.org/wiki/C._A._R._Hoare>
Inside every large problem is a small problem struggling to get out.
Jakob’s Law of the Internet User Experience
Jakob Nielsen
Users spend most of their time on other sites. This means that users
prefer your site to work the same way as all the other sites they
already know.
Joy’s Law <http://www.smallworks.com/archives/00000368.htm>
Bill Joy <http://en.wikipedia.org/wiki/Bill_Joy>
smart(employees) = log(employees), or “No matter who you are, most of
the smartest people work for someone else.”
Lister’s Law <http://www.sauria.com/blog/2004/05/25>
People under time pressure don’t think faster.
Nathan’s First Law <http://research.microsoft.com/ACM97/nm/sld026.htm>
Tom Cargill <http://www.profcon.com/profcon/cargill/index.html>
The first 90% of the code accounts for the first 90% of the development
time. The remaining 10% of the code accounts for the other 90% of the
development time.
The Pesticide Paradox
Boris Beizer
Every method you use to prevent or find bugs leaves a
residue of subtler bugs against which those methods are ineffectual.
Reed’s Law <http://en.wikipedia.org/wiki/Reed> David P. Reed
<http://en.wikipedia.org/wiki/David_P._Reed> The utility of large
networks, particularly social networks, scales exponentially with the
size of the network.
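Reed’s exponential claim comes from counting the subgroups a network can form, which is easy to sketch:

```python
def possible_groups(members):
    """Reed's Law counts the subgroups a network of n members can form:
    2**n - n - 1 (all subsets, minus the n singletons and the empty set)."""
    return 2 ** members - members - 1

# Each new member roughly doubles the number of possible groups,
# hence the exponential scaling of group-forming networks.
print(possible_groups(4))   # 11
print(possible_groups(10))  # 1013
```

Compare with Metcalfe’s n² count of pairwise links: Reed argues that group-forming value, 2ⁿ, eventually dwarfs it.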
The Sixty/Sixty Rule
Robert Glass
Sixty percent of software’s dollar is spent on maintenance, and sixty
percent of that maintenance is enhancement.
Lincoln Spector <http://www.thelinkinspector.com/>
The time it takes your favorite application to complete a given task
doubles with each new revision.
Spafford’s Adoption Rule
George Spafford <http://www.spaffordconsulting.com/>
For just about any technology, be it an operating system, application or
network, when a sufficient level of adoption is reached, that technology
then becomes a threat vector.
Some unattributed “laws” that are worth mentioning:
* Build a system that even a fool can use, and only a fool would
want to use it.
* Any program over 100 instructions can be simplified by 3
instructions (without losing any functionality).
* Any idiot can learn to use computers, and many do.
* There’s never time to do it right in the first place, but
there’s always time to do it over when it doesn’t work.
This posting started with Pulier’s Law: “A programmer will look at
another programmer’s output, no matter how brilliant, and declare it
garbage”. At the risk of repeating Pulier’s conceit of naming a law
after himself, I propose:
No amount of documentation is ever sufficient to completely
understand a system.