

The Minefield Myth (Part 1)

At university I studied anthropology, and several of the courses I took surveyed folklore and its relevance in modern society. Many people mistakenly believe that most folklore (folktales, legends, myths, ballads, etc.) is purely fictional, just fanciful tales. However, folklore is usually based on some grain of truth, or is used to instill societal or religious mores and values. For example, social scientists have found that many ancient civilizations have folklore about a massive “flood” in the distant past which wiped out huge populations of people. Did this actually occur? Well, we don’t know for certain, but geological evidence does suggest that at one time coastal waters rose significantly. Was this caused by a cyclical change in the earth’s temperatures, or by a series of earthquakes sending tsunamis to ravage coastal villages? We don’t know; but the folklore may indicate that at some point many societies suffered a devastating tragedy caused by rising waters. Was the story embellished over time? Certainly. Another example is the “Cinderella” story, of which there are over 450 versions around the world. The story is about overcoming adversity and oppression, and avoiding self-pity and selfishness. Basically, it is much more than a Disney animation; in traditional folklore it has been passed down through the generations to reinforce societal values.

The first time I read about a minefield analogy was in the context of sampling. Later, Brian Marick used a similar analogy to suggest that repeating tests (regression testing) is not likely to reveal new bugs. Marick’s analogy is perpetuated by Bach, Kaner, and others, who tend to diminish the value of regression testing (especially automated testing) because we are simply traversing a minefield by following a previously cleared path.

The Marick minefield analogy is simply an alternate perspective on Beizer’s pesticide paradox, which states, “Every method you use to prevent or find bugs leaves a residue of subtler bugs against which those methods are ineffectual.” Basically, no single approach to software testing is effective in identifying all categories of defects, so we must use many approaches in software testing and vary our tests. In that context I absolutely agree with the analogy.

However, a basic problem with Marick’s minefield analogy, as it is often misrepresented, is that it seems to treat the software under test as a static, unchanging field of easily exposed mines.

If you were hired as a consultant to come in and perform a rapid evaluation of a software product using a sampling approach such as exploratory testing, then Marick’s minefield analogy is a wonderful strategy. In that context, re-running a test provides no new value and has little probability of exposing new information. However, for the rest of us who work in iterative software development lifecycles (including agile lifecycles), building complex systems with interdependent components, the minefield analogy may not be as useful.

For example, in complex systems with interdependent modules, we know that a change in one module can adversely affect the functional behavior of other modules that depend on it. In layman’s terms, activating a mine while traversing one path through the minefield may reactivate an already cleared mine in another part of the minefield, or even plant a new mine in a previously traversed path. A small sketch of this idea follows.
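Here is a minimal Python sketch of that dependency effect. The module and function names (format_price, price_to_cents) are hypothetical and purely illustrative, not from any real system: a change inside one module silently rearms a “cleared mine” in another module that depends on its output format.

    # pricing.py -- hypothetical module A
    def format_price(cents: int) -> str:
        # A later change added a thousands separator: 123456 -> "$1,234.56"
        return f"${cents / 100:,.2f}"

    # reporting.py -- hypothetical module B, written against A's original format
    def price_to_cents(price: str) -> int:
        # float() cannot parse the new comma-separated format
        return round(float(price.lstrip("$")) * 100)

    print(price_to_cents(format_price(1234)))    # 1234 -- the old path still looks clear
    print(price_to_cents(format_price(123456)))  # ValueError -- a "cleared" mine is rearmed

Nothing in module B changed, yet a previously passing path now fails, which is exactly the kind of reactivated mine a static-minefield view misses.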

In iterative development lifecycles, the minefield is in constant flux (at least until the code-complete stage, and even then the code keeps changing as issues are addressed). Features are added, changed, and possibly removed throughout the process. Depending on the length of your product lifecycle, the changes can be massive. The PDC release of Windows 95 ‘looked’ very different compared to the final release. The build verification/acceptance test suite for Windows 95 was a relatively static baseline regression suite, and it continued to find ‘regression’ problems up to the final weeks of the project due to code churn.

Also, not all mines are equal! Some mines are quite easy to detect while others are very hard to find, which is why systematic probing is still used by professionals to clear latent minefields. Similarly, an exploratory approach to testing software will reveal some bugs very quickly, but without ‘systematic probing’ we could just as easily overlook other types of issues, as the sketch below illustrates.
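As one small illustration of ‘systematic probing’, consider a boundary-value check. The discount_rate function and its off-by-one fault are hypothetical examples invented for this sketch:

    def discount_rate(quantity: int) -> float:
        # Intended rule: 10% off for orders of 100 units or more.
        # Hypothetical off-by-one: the boundary value 100 gets no discount.
        return 0.10 if quantity > 100 else 0.0

    # A few ad hoc probes can easily miss the fault...
    print([discount_rate(q) for q in (3, 57, 250)])    # [0.0, 0.0, 0.1]

    # ...while systematic boundary-value probes step right onto it.
    print([discount_rate(q) for q in (99, 100, 101)])  # [0.0, 0.0, 0.1] -- 100 should get 0.1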

There are also different types of mines which may be activated differently: traversing a minefield with a size 10 boot may not activate a mine, but someone with a size 12 boot, or who weighs more than the previous person, may in fact set it off. Likewise, traversing the same path through software using different data, or applying a more systematic analysis of a path, may reveal interesting information or expose anomalies that were not previously discovered. For example, throwing simple ASCII characters at a text input control is not likely to expose any bugs (or, restated, it is likely to show us a clear path through the minefield). However, when we take that same exact path using Unicode characters, or Unicode surrogate pair characters, we are very likely to expose problems not revealed previously.
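A minimal Python sketch of the same-path/different-data point: the truncate_for_legacy_field helper is a hypothetical example (not from the original post) of a routine that survives ASCII input but fails the moment non-ASCII data walks the same path.

    def truncate_for_legacy_field(text: str, max_bytes: int = 10) -> str:
        # Hypothetical helper: naively trims input to fit a fixed-width byte field.
        raw = text.encode("utf-8")[:max_bytes]
        return raw.decode("utf-8")  # fails if the cut lands mid-character

    # Same path, plain ASCII data -- the field looks clear.
    print(truncate_for_legacy_field("ascii text"))   # 'ascii text'

    # Same path, non-ASCII data -- the cut splits a multi-byte character.
    print(truncate_for_legacy_field("😀😀😀"))         # raises UnicodeDecodeError

The test steps are identical in both runs; only the “weight of the boot” (the data) changed, and that is what set off the mine.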

In part 2 I will discuss regression testing and specific situations where regression testing is very valuable.

Comments

  • Anonymous
    January 28, 2009
    Great posts, keep them coming! (Please...with sugar on top ;-)

  • Anonymous
    January 29, 2009
    Thanks Chai! I hope you like part 2