Escape Analysis; or, 'how *did* that bug slip through testing?'
After the Indigo team's last milestone (M4), most of our feature test teams
did an analysis of the bugs in their feature area, and extracted some useful 'lessons
learned' to help us improve our testing in future milestones. Here are some
of the lessons we learned:
1. Do breadth-first testing, sanity testing, and test prototyping as early as possible.
A higher-than-expected number of our bugs were found by developers and casual users
of the release; these were bugs that we expected test to find, and that test later did find.
We could have saved those folks some wasted time if we had a basic level of testing
available in all feature areas as early as possible in the milestone.
2. Be very thorough in design and spec reviews; it pays off. Explore the alleyways
of the design, and ask obscure questions about threading and error handling.
You'll save yourself a lot of grief by working these issues out in design, rather
than discovering them through exhaustive investigation during later testing.
3. Be very clear about how testing of cross-team dependencies will be done. Documentation,
communication, and review across feature teams are important. If you ever find
yourself saying "we don't need to test that, team XYZ is doing it", stop yourself,
and go double-check that team XYZ agrees with your conclusion.
4. Participate in dev code reviews and set aside some time for white-box testing. My team tends
to go back-and-forth on the value of white-box testing. In our M4 milestone,
however, we did identify a number of bugs that could have been caught if we (test) had
a better concept of the internal product design. There were also several bugs
that would have been extremely difficult to find without using some kind of white-box
testing technique like fault injection.
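To make the fault-injection idea concrete, here is a minimal sketch in Python (not from the product; the function names and the use of `unittest.mock` are illustrative). The point is that by substituting a dependency that always fails, you can exercise an error-handling path that ordinary black-box testing almost never reaches.

```python
import unittest.mock

def load_config(read_file):
    """Load settings via the injected reader; fall back to defaults on I/O failure."""
    try:
        return read_file("settings.ini")
    except OSError:
        # This is the error path we want fault injection to exercise.
        return "<defaults>"

# Fault injection: replace the real reader with one that always raises,
# forcing the code down its failure-handling branch.
failing_reader = unittest.mock.Mock(side_effect=OSError("disk error"))
result = load_config(failing_reader)
print(result)
```

Because the failing dependency is injected rather than simulated with real hardware faults, the test is cheap, deterministic, and repeatable, which is what makes this kind of white-box technique practical in a milestone test pass.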