Test Code Must Be As Solid As Dev Code
All good development projects follow certain basic practices to ensure code quality: they use source control, get code reviewed, build daily, and so on. Unfortunately, even when the shipping product follows these practices, the test team sometimes doesn't. This is true even here at Microsoft. It shouldn't be the case, however. Test code needs to be just as good as dev code.
First, flaky test code makes it difficult to determine the source of an error. Tests that cannot be trusted make it hard to convince developers to fix issues. No one wants to believe there is a bug in their code, so a flaky test becomes an easy scapegoat.
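To make "flaky" concrete, here is a hypothetical C++ sketch (the function names and timeouts are invented): a test that guesses how long the product code takes, instead of waiting for it to finish, will pass on a fast machine and fail intermittently under load.

    #include <chrono>
    #include <future>
    #include <thread>

    // Stand-in for the product code under test (hypothetical).
    int ComputeAnswer()
    {
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        return 42;
    }

    // A flaky test: it assumes the work finishes within 100 ms instead
    // of simply waiting on the future. Under load the budget is blown
    // and the test fails even though the product code is correct.
    bool TestComputeAnswer()
    {
        auto result = std::async(std::launch::async, ComputeAnswer);
        if (result.wait_for(std::chrono::milliseconds(100)) !=
                std::future_status::ready)
            return false;  // spurious failure, blamed on the product
        return result.get() == 42;
    }

    int main() { return TestComputeAnswer() ? 0 : 1; }

When a failure like this shows up in a test run, the first suspect is the product, not the test.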
Second, spurious failures take time to triage. Test cases that fall over because they are unstable take a lot of time to maintain, and that is time that cannot be spent writing new test automation or testing new corners of the product.
Finally, poorly written test code will hide bugs in the product. If test code crashes, any bugs in the product after that point will be missed. Similarly, poor test code may not execute as expected. I've seen test code that returns a pass too early and never executes much of the intended test case.
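Here is a sketch of that last failure mode (the Document class is a made-up stand-in for real product code): an early return, perhaps left over from debugging, silently turns the test into a no-op.

    #include <string>

    // Minimal stand-in for the product class under test (hypothetical).
    struct Document
    {
        std::string text;
        void SetText(const std::string& t) { text = t; }
        bool Save(const std::string&) { return true; }  // details elided
        bool Load(const std::string&) { return true; }  // details elided
        std::string GetText() const { return text; }
    };

    // This test reports a pass too early: the stray "return true" means
    // the reload verification never runs, so bugs in Save or Load are
    // silently missed while the test suite stays green.
    bool TestSaveAndReload()
    {
        Document doc;
        doc.SetText("hello");
        if (!doc.Save("test.tmp"))
            return false;

        return true;  // BUG: the test ends here and always "passes"

        Document reloaded;  // everything below is unreachable
        if (!reloaded.Load("test.tmp"))
            return false;
        return reloaded.GetText() == "hello";
    }

    int main() { return TestSaveAndReload() ? 0 : 1; }

Notably, compiling this at a high warning level flags the unreachable code, which is exactly the kind of mistake the static verification discussed below will catch.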
To ensure that test code is high quality, it is important to follow procedures similar to those development follows (or should be following) when checking in code. This includes getting all non-trivial changes code reviewed, putting all changes in source control, making test code part of the daily build (you do have a daily build for your product, don't you?), and using static verification tools such as PCLint, high compiler warning levels, or the code analysis built into Visual Studio.
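As a minimal illustration of that last practice (the file name here is invented; /W4, /WX, and /analyze are real cl.exe switches, though /analyze is only available in editions of Visual Studio that include Code Analysis):

    rem Build test code at the highest standard warning level, with
    rem warnings treated as errors and static code analysis enabled.
    cl /W4 /WX /analyze mytests.cpp

PCLint can be run over the same sources for a second, independent opinion.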
Comments
- Anonymous
August 07, 2008
It should be done, but it sounds like a resource nightmare - "hey boss, we need some testers to test the test code..." How close are you to getting this to happen at MS? Any problems in a) getting a budget for this? b) finding enough testers capable of doing it?