Contextual blindness, or how to take things completely out of context

Many testers are familiar with the concept of inattentional blindness (or at least they should be, in my opinion). Basically, inattentional blindness occurs when we are so visually focused on a task or object that we completely fail to see something out of the ordinary.

But I am going to introduce my own neologism, which I will refer to as contextual blindness. Contextual blindness occurs when someone is so restrictive in their thinking, or so biased by their own opinion, that they take references to a document or study completely out of context to support their own biased argument. In essence, they ignore the original context in which the statements were made, twisting a sentence or using a statement outside its original context to support an oppositional point of view.

I too have been guilty of this, when I made statements without completely researching available data or carefully reviewing empirical or factually substantiated evidence. (These days, if I don't have sufficient data or information to support a strong argument for or against something, I try to preface my statement with "I suspect..." or "in my opinion...".) However, some people seem to make a habit of making wild and often fallacious statements, often in seemingly juvenile attempts at one-upmanship. I suspect that sometimes people do this because they think they are beyond reproach; they assume they know more than others, or they consider themselves to be such experts that nobody should question anything they say.

As I have gotten older and a bit wiser, I have learned to question things and to reassess my position and my ideas from time to time. I often speak with recognized industry experts, read several books and studies (often presenting contradictory approaches or perspectives), review empirical data (and just to be clear, IMHO bug count as the only data point offered as empirical data is about as useful as nipples on men), and when I can, I try to experiment or experience new things in order to draw my own conclusions. By now you're probably asking where I am going with all this... there is a point... read on!

This evening a person sent me an email asking whether my keynote at EuroStar last year was "an attack on certification programs." She knew I gave one of the keynotes at EuroStar, but was a little shocked that I would attack certification programs. I told her that I gave one of five keynote addresses at EuroStar 2007, that my talk was entitled The Path to Professionalism: Skills of Star Performers, and I gave her my perspective on certifications. In her response she forwarded me some mail from a distribution list, which included excerpts from a rather lively debate between two other members of the list in which one of the participants stated,

"The Department of Defense has identified the failure of traditional testing processes, including the problem of over-documentation, as one of the top five problems in IT. The keynote speech at Eurostar, in December was an attack on certification programs such as the CSQE."

So, I reminded her that the second sentence should more appropriately read, "One of the keynote speeches at Eurostar...". I told her that Michael Bolton gave that talk. (I must say that it was the first time I had seen him speak, and his stage presence is excellent, but I wasn't overly impressed with his arguments against certification because he simply denigrated the existing programs without offering any other solutions.) Several delegates at the conference later asked me about his comments, and I deferred by stating, "when in Rome." (Certifications in Europe are highly valued by employers for a variety of reasons.) But that above-mentioned statement was merely misleading; it is actually the first statement that is the most fallacious, and taken completely out of context.

The statement "The Department of Defense has identified the failure of traditional testing processes, including the problem of over-documentation, as one of the top five problems in IT" used the following study as a reference. So, let's take a critical look at that statement and question its truth or validity. (Because, in our jobs as professional testers understanding the correct context, critical thinking and logical questioning are important skills!)

"All we want are the facts, ma'am"

  • Fact #1. The DoD did not identify anything. The study was conducted by the National Defense Industrial Association (NDIA), which is not in any way, shape, or form a part of the Department of Defense (DoD). NDIA offers "individuals from academia, government, the military services, small businesses, prime contractors, and the international community, the opportunity to network effectively with the government." There were 26 participants in this study, and one participant represented the DoD.

  • Fact #2. The study did not identify "the failure of traditional testing processes." The purpose of the study, conducted by those 26 participants, was to "Identify the top 5 software engineering problems or issues prevalent within the defense industry." The participants actually came up with 7 issues, and they determined that one of the top issues in software engineering was that "Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems."

  • Fact #3. The problem of over-documentation appeared in the context of a discussion of tests with a "disproportionate effort on detailed procedures." It was not one of the 5 (actually 7) problems identified; the participants offered "Tests are over-documented with disproportionate effort on detailed procedures" as a possible explanation for the 5th top issue, "Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems." I suspect this statement speaks to the fact that many documented tests I have seen from under-trained testers are simply prescriptive, regimented scripts based on some interpretation of an ambiguous requirements document, rather than well-formed tests designed from an in-depth analysis of the system under test.
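
To make that distinction concrete, here is a minimal, hypothetical sketch in Python with pytest. The FakeLoginPage class, the ten-digit account rule, and the error message are all invented for illustration; the point is the contrast between a prescriptive script that restates the single example from a requirements document and tests derived from analyzing the input domain.

    import pytest

    # Hypothetical system under test: a login page that must reject any
    # account number that is not exactly ten ASCII digits.
    class FakeLoginPage:
        def enter_account_number(self, value: str) -> None:
            self.value = value

        def submit(self) -> None:
            ok = self.value.isascii() and self.value.isdigit() and len(self.value) == 10
            self.error = "" if ok else "Invalid account number"

        def error_message(self) -> str:
            return self.error

    @pytest.fixture
    def app() -> FakeLoginPage:
        return FakeLoginPage()

    # Prescriptive script: one hard-coded walk through the steps, restating
    # the lone example named in the (imagined) requirements document.
    def test_invalid_account_prescriptive(app):
        app.enter_account_number("ABC123")
        app.submit()
        assert app.error_message() == "Invalid account number"

    # Analysis-driven tests: derived from examining the input domain
    # (length, character class, embedded whitespace), covering many failure modes.
    @pytest.mark.parametrize("account", [
        "",                # empty
        "12345",           # too short
        "123456789012",    # too long
        "12345abcde",      # non-numeric characters
        "12345 6789",      # embedded whitespace
    ])
    def test_invalid_account_analytical(app, account):
        app.enter_account_number(account)
        app.submit()
        assert app.error_message() == "Invalid account number"

Both are "documented" tests; the difference is the analysis behind them.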

What the study really said...

Not only did this person take parts of several statements in the report and munge them together in an attempt to use the findings completely out of context; this person also completely ignored (or purposefully omitted) other points discussed in relation to this specific problem, such as:

  • Over-reliance on testing alone rather than robust SW verification techniques.
  • Manual testing techniques are labor-intensive, scale poorly, and are unproductive relative to the large investment of resources.
  • Compliance-based tools do not adequately cover risks or failure conditions.
  • Tests are over-documented with disproportionate effort on detailed procedures.
  • Education, training, and certifications are inadequate to develop effective test skills.

The person also ignored (or purposefully omitted) the recommendations by the participants, which included the following (a short illustrative sketch follows the list):

  • Sponsor a study of state-of-the-practice verification and testing approaches.
  • Review/update testing policies and guidance to emphasize robust, productive approaches that maximize ROI.
  • Review adequacy of verification plans/approaches early in the acquisition life cycle.
  • Emphasize skilled investigation throughout the life cycle, based on coverage, risk mitigation, and high-volume automation.
  • Strengthen curricula, training, certifications, career incentives for testing roles.
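
The "high volume automation" mentioned in the recommendations above deserves a brief, concrete illustration; this is the sketch referred to before the list. It is a minimal, hypothetical Python example (the parse_amount function and its round-trip property are invented stand-ins) showing how a single invariant checked against thousands of generated inputs can exercise far more of the input space than pages of hand-documented procedures:

    import random

    def parse_amount(text: str) -> float:
        """Hypothetical system under test: parse a comma-formatted currency string."""
        return float(text.replace(",", ""))

    def test_round_trip_high_volume():
        random.seed(42)  # fixed seed so any failure is reproducible
        for _ in range(10_000):
            value = round(random.uniform(0, 1_000_000), 2)
            text = f"{value:,.2f}"  # e.g. "12,345.67"
            assert abs(parse_amount(text) - value) < 0.005, text

The point is not that this replaces documented tests, but that one well-chosen invariant covers orders of magnitude more cases than a prescriptive procedure ever could.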

Now, I really don't understand how someone can read this report and misconstrue the information to imply that the failure of traditional testing processes (and it is not clear what "traditional testing processes" refers to here) or over-documentation is one of the top 5 problems identified by the DoD, especially when the report also recommends strengthening policies or guidelines for full requirements traceability.

I have my suspicions as to why the person who made the fallacious remark above might omit all the facts and additional counterpoints in their argument. I suspect those details were not revealed because those points do not support (in fact, they completely dispute) the context-driven ideology that emphasizes manual GUI testing and the vehement opposition to documentation, test automation, coverage analysis, robust verification techniques, strengthening of certifications, or essentially anything that involves the need for greater technical know-how, logical analysis, or measurable advancement of the testing profession.

Comments

  • Anonymous
    February 12, 2008
    >>>I suspect those details were not revealed because those points do not support (in fact, they completely dispute) the context-driven ideology that emphasizes manual GUI testing and the vehement opposition to documentation, test automation, coverage analysis, robust verification techniques, strengthening of certifications, or essentially anything that involves the need for greater technical know-how, logical analysis, or measurable advancement of the testing profession. This sentence is a bit tough to read, and I suggest you rephrase it in 2-3 more sentences so that we can understand your viewpoint. I assume that the words "I suspect" or "in my opinion" apply to the entire sentence above. Meaning:
  1. You cannot be very sure that the context-driven community opposes test automation and the other things you mentioned.
  2. You are not likely to be sure that the context-driven community opposes anything that involves the need for greater technical know-how, logical analysis, or measurable advancement of the testing profession. Can you share a few examples to demonstrate this viewpoint of yours? As a "self-proclaimed" member of the context-driven testing community, I can say for sure that our community does not oppose test automation or anything that involves the need for greater technical know-how, logical analysis, or measurable advancement of the testing profession. Please do check the principles of the context-driven school... nothing there suggests that we oppose test automation or a need for technical know-how. http://www.context-driven-testing.com/ Please do let me know if I have misinterpreted your words "out of context".

  • Anonymous
    February 13, 2008
    I totally agree with your article; this is a problem that, frankly, occurs everywhere (politics comes to mind -- let me form an opinion, then massage/twist the facts to justify that opinion). Really, though, this is only one step removed from the pure dogmatic approach of just having an opinion and ignoring any facts. That said, though, I don't know if 'blindness' is quite the word I would use. Obstinateness (yes, that's a word) or just dogged ignorance mixed with hubris. Although that last one doesn't really make for a catchy phrase: "Contextual ignorance through hubris". Personally, I have found over the years that there are many people happy to say "This is the way we do it," but only a few people willing to ask "Is there a better way to do it?"

  • Anonymous
    February 26, 2008
Hi Shrini, thanks for reiterating the demo point Michael made above. I am not sure why staged demos would be considered testing in anyone's mind; perhaps that is why they call them demos instead of testing? The audit/compliance point makes perfect sense, and I agree, but see the point below, because I think testers 'explore' the requirement to understand what to audit or how to audit something. I find it quite interesting that you only list problems, issues, and bugs as the "kind of new information that would come out of a sapient human testing." A professional tester who is truly wise and uses sound judgement would of course realize that defects, problems, and issues are only a fraction of the valuable information testing should provide to the organization to reduce exposure to risk. So based on the above, I would suspect that you would consider structural testing mythical and elusive, especially if the only information one can provide is bug count, time, and 'feel-good' observations. In essence (according to your definition), everything which provides new information is exploratory. I guess we could say that even scripted tests are exploratory because they are derived from a 'sapient' person 'exploring' the requirements and designing tests! Unless, of course, you do not consider a person who applies critical thinking, uses their cognitive skills to analyze the requirements (since you state analysis is highly exploratory), and uses sound judgement (sapience) to design a set of both positive and negative tests (scripted tests) from the requirements to be doing exploratory testing.

  • Anonymous
    March 07, 2008
    I am not sure what to call it. I don't really suspect it was laziness, apathy, or even ignorance. I think some people have a habit of avoiding facts or taking things completely out of context to further some personal agenda.