Test-driven development
A lot has already been said about Test Driven Development (TDD) by a lot of people, but I'd still like to add my 0.02 paisa.
We have an internal requirement to check in unit tests along with the product code, and the code coverage for those tests needs to be high. Most of our developers have over 85% code coverage.
For my source code I decided to try out TDD. I used the following steps:
- Write the method's prototype so that it matches the design doc and have it throw a NotImplementedException (see the sketch after this list).
- Write unit tests in the Visual Studio Team System unit-test framework. I try to cover all the requirements, even the ones required for negative testing, like passing an invalid handle, catching the ArgumentNullException, and verifying that the correct argument is named in ArgumentNullException.ParamName.
- After that I run the tests. All the tests obviously fail with lots of X marks.
- Then I go about fixing each of the test failures by adding the functionality to the code.
- After each fix (or a couple of them) I run the tests, and the red marks keep turning into green "test passed" ticks.
- Once I'm done, I run with code coverage and then add more tests if required to cover the blocks which were not touched by the tests.
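To make the first two steps concrete, here is a minimal sketch, assuming a hypothetical FileStore class with an Open method (the class, method, and parameter names are illustrative, not from the actual product code), written against the Visual Studio Team System unit-test framework:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Step 1: the prototype matches the design doc but has no implementation yet.
public class FileStore
{
    public IDisposable Open(string path)
    {
        throw new NotImplementedException();
    }
}

// Step 2: tests for the positive case and the negative (argument validation) case.
[TestClass]
public class FileStoreTests
{
    [TestMethod]
    public void Open_ValidPath_ReturnsHandle()
    {
        FileStore store = new FileStore();
        IDisposable handle = store.Open(@"c:\temp\data.bin");
        Assert.IsNotNull(handle);
    }

    [TestMethod]
    public void Open_NullPath_ThrowsArgumentNullException()
    {
        FileStore store = new FileStore();
        try
        {
            store.Open(null);
            Assert.Fail("Expected ArgumentNullException was not thrown.");
        }
        catch (ArgumentNullException ex)
        {
            // Verify the exception names the right argument.
            Assert.AreEqual("path", ex.ParamName);
        }
    }
}
```

With only the stub in place, both tests fail, which is exactly the wall of X marks the next steps start from.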
Even though the system looks simple, it has helped me enormously by catching multiple bugs right at the beginning. Even trivial tests, like the ones for GetHashCode and operator overloads, found issues :) The fun of seeing all those X marks disappear one after the other brings a childish zeal to get them done even faster.
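For what it's worth, those "trivial" tests look roughly like this; the Point struct below is a made-up example rather than the real type, but it shows how a single test keeps Equals, the operators, and GetHashCode consistent with each other:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical value type with the members under test.
public struct Point
{
    public readonly int X;
    public readonly int Y;

    public Point(int x, int y) { X = x; Y = y; }

    public override bool Equals(object obj)
    {
        return obj is Point && this == (Point)obj;
    }

    public override int GetHashCode()
    {
        return X ^ (Y << 16);
    }

    public static bool operator ==(Point a, Point b) { return a.X == b.X && a.Y == b.Y; }
    public static bool operator !=(Point a, Point b) { return !(a == b); }
}

[TestClass]
public class PointTests
{
    [TestMethod]
    public void EqualPoints_AgreeOnEqualsOperatorsAndHashCode()
    {
        Point a = new Point(3, 4);
        Point b = new Point(3, 4);

        Assert.IsTrue(a.Equals(b));                         // Equals override
        Assert.IsTrue(a == b);                              // operator== consistent with Equals
        Assert.IsFalse(a != b);                             // operator!= consistent too
        Assert.AreEqual(a.GetHashCode(), b.GetHashCode());  // equal values must hash equally
    }
}
```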
The other goodness is that after every later code change I can easily fire off these tests as a sanity check.
Comments
- Anonymous
December 12, 2006
Generally with proper TDD you write a single test (that fails), then write the code until it passes. You then write the next test and modify the code until it passes. This means you never write more code than you need. But glad to see you're having fun with TDD.