Sunday, October 7, 2012

The science of software ...


Back in 2002, I had my own form of Buccaneering.  As a developer on a UNIX aircraft project, I'd been in the business for over 5 years, and had noticed a few parallels between some fundamental laws of physics and software engineering.

I'd written these up and pinned them to my desk as "The Talks Physical Laws Of Software Engineering".  Unfortunately I've lost all copies of them, so I'm recreating them from memory ... as you can see they're not comprehensive, and were meant more for fun, written from a development rather than a testing perspective.

But they're an example of the concept of Buccaneering: taking a parallel idea from (in this case) physics and bringing it into the software engineering context.  Of course none of them are particularly ground-breaking (although we were caught out by rule 6 at the time) ...

1)  Problems in code (defects) will remain, unless work is done on them [Newton's First Law Of Motion]  Really, defects just don't go away ...

2)  As you are doing work to remove defects, you are also doing work to potentially introduce new defects [Newton's Third Law of Motion]

3) Software that is developed without any monitoring will tend towards chaos [Second Law Of Thermodynamics]  You are just going to hope that everything is good?  Let me know how that works out for you ...

4) Small deviations applied over long enough distances cause massive errors [Trigonometry]  So try and find any defects early.

5) Any system will have levels of uncertainty [Heisenberg's Uncertainty Principle]  Of course you want to minimise uncertainty (which is part of what testing's about), but you cannot remove it altogether!  It will always be there in unknown finite levels.

6) The more closely you observe an event, the more likely you are to be impeding it [Heisenberg's Uncertainty Principle]  Coined as we learned that trying to reproduce defects under a debugger could prevent the problems recurring - mainly because many debuggers of the time would initialise data stacks which would otherwise contain random data.


Maybe you've read these, and come up with a couple more examples in your head?  Congratulations!  You're thinking outside the box ...

1 comment:

  1. Hazen's Law - To err is human, but to really F it up you need a computer.

    (Not really mine, but can be relevant at times.)
