Human-oriented software design is error-oriented

Alternate title: Cognitive dissonance in software design.

How do we know whether a design process is a help or a hindrance to good decision making?

As the design activity consists of a myriad of decisions, all the way from idea to released bits and back, the overall quality of the outcome is the sum of the quality of those individual and interrelated decisions. One bad decision can spoil an entire design even if every subsequent decision is a better one. The real fix is to go back and remove the decision subtree that branched off at the wrong choice. If a given design process enables that ability, then it seems to be a help.

The ability to go back in the decision tree often depends on how far we have traveled from the wrong choice. How many other decisions depend on the wrong turn? Costs in terms of time, effort, money, and so on can prevent us from going back. Hence, a good design process helps us limit the number of dependencies on a given choice until it has been revisited and corroborated.
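To make that cost concrete, here is a minimal sketch in Python (the Decision class and the sample decisions are invented for illustration, not a prescription): going back means pruning the whole subtree that hangs off the wrong choice, and the price of the revert is the number of decisions pruned with it.

    from dataclasses import dataclass, field

    @dataclass
    class Decision:
        """A design decision; children are the decisions that depend on it."""
        name: str
        children: list["Decision"] = field(default_factory=list)

        def depend(self, child: "Decision") -> "Decision":
            # Record a decision that builds on this one.
            self.children.append(child)
            return child

        def subtree_size(self) -> int:
            # How many decisions stand or fall with this one.
            return 1 + sum(c.subtree_size() for c in self.children)

        def revert(self, wrong: "Decision") -> bool:
            # Going back: prune the subtree rooted at the wrong choice.
            if wrong in self.children:
                self.children.remove(wrong)
                return True
            return any(c.revert(wrong) for c in self.children)

    # The longer a wrong choice goes unrevisited, the more it drags down with it.
    root = Decision("use a relational store")
    wrong = root.depend(Decision("model events as wide rows"))
    wrong.depend(Decision("denormalize for reporting"))
    wrong.depend(Decision("hand-rolled migration scripts"))
    print(wrong.subtree_size())  # 3: decisions lost by the revert
    root.revert(wrong)
    print(root.subtree_size())   # 1: only the root choice survives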

Such a design process has to consider (1) that humans are its main players —not computers or tools— and (2) the intellectual nature of the object under design: software solutions to human problems. Therefore, the design process must take into account the ways humans behave when making decisions and solving problems.

Making good decisions, solving the right problems, and solving them rightly are all human endeavors: error-prone and full of blind spots. A root cause of many software development failures is ignoring those very simple facts —note that I say simple, which can be quite different from easy—.

A human-oriented design process is error-oriented; it plans for the most likely event: human error. Consequently, along with other important considerations, a human-oriented design process must take into account the theory of cognitive dissonance from psychology, about how the human brain seems to be “designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any; about how and why people unintentionally blind themselves so that they fail to notice vital events and information that might make them question their behavior or their convictions” —Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts by Carol Tavris and Elliot Aronson. If such a theory describes, explains, and predicts a significant number of cases, then it can help to control the process of software design, where there is a plethora of decisions to make. For example, it can, on relatively sound grounds, help designers choose when to revisit a design decision: early and often (that recurrent tenet once again), at least for the most important decisions. Otherwise, if the revisit is postponed, more dependencies can accumulate, or the decision gets justified by the self-protection mechanism: “we cannot be wrong.”
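Continuing the toy decision-tree sketch above (the threshold and the corroboration counter below are assumptions for illustration, not a prescription), one way to encode “early and often” is to flag any still-uncorroborated decision once too many others start depending on it:

    def needs_revisit(decision: Decision, corroborations: int,
                      max_uncorroborated_dependants: int = 2) -> bool:
        """Flag an unverified decision before too much work piles on top of it.

        The threshold is illustrative; the point is to force the review
        while the subtree hanging off the decision is still small.
        """
        dependants = decision.subtree_size() - 1  # everything that would fall with it
        return corroborations == 0 and dependants >= max_uncorroborated_dependants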

Ego-preservation mechanisms are normal, human; and so is the self-awareness that comes with adult life, the one where we know what we are doing.

“Drivers cannot avoid having blind spots in their field of vision, but good drivers are aware of them... We cannot avoid our psychological blind spots, but if we are unaware of them we may become unwittingly reckless, crossing ethical lines and making foolish decisions” —Carol Tavris and Elliot Aronson.

“The greatest of faults, I should say, is to be conscious of none” —Thomas Carlyle

Another cause of blind spots in software designers is the performance metrics of the organizations they are part of. Oftentimes, those metrics are not error-oriented but error-punishing, and thus most designers have no incentive to look for disconfirming evidence about the accuracy of their design decisions. For more on this, see Measuring and Managing Performance in Organizations by Robert D. Austin.

The most important decisions in good software design are tentative, provisional —as in a serious scientific endeavor—, timid and bold at the same time. As the number of corroborations increases, so does the confidence in the goodness of those corroborated decisions. Q: How do you know it will work? A: Because I have been watching it work incrementally already.
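In the same toy model (reusing the hypothetical Decision class from the first sketch; the subclass and the sample decision below are likewise invented), a provisional decision simply accumulates corroborations, and the answer to “how do you know?” is the count of times it has been watched working:

    from dataclasses import dataclass

    @dataclass
    class ProvisionalDecision(Decision):
        """A decision held tentatively; confidence grows with each corroboration."""
        corroborations: int = 0

        def corroborate(self) -> None:
            # One more observation of the decision working in practice,
            # e.g. an iteration whose tests and demos pass.
            self.corroborations += 1

    choice = ProvisionalDecision("cache reads behind the API")
    for _ in range(5):
        choice.corroborate()
    print(choice.corroborations)  # 5 corroborations so far; still provisional, never final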