Yes Virginia, Developers Really *Can* Test
I've been coding for something like twenty years, but I've only really been testing that code for about ten. Oh, sure, I would bang on my apps in ways I thought were similar to what my users would do, and I stepped through my code and stuff like that, but I didn't really know what I was doing.
I knew that something was wrong, though. Making the same mistakes and fixing the same bugs time after time tends to help a person come to that conclusion. <g/> Code Complete and Writing Solid Code and Testing Computer Software were like bolts of lightning directed straight at my brain. My code got better and my users got happier. I still didn't really know what I was doing, but at least I *knew* that now!
Then I discovered unit testing. Talk about your life-changing events! Now I could write consistent, repeatable tests that were easy to run. Once again, my code got better and my users got happier. That combined with a whole lot more reading meant I was starting to actually know what I was doing.
Fast forward to now. One of my primary responsibilities is helping my team move along this same path in rather a shorter amount of time than I took to travel it on my own. I'm happy to say everyone is making great progress -- which is a good thing, since we have a bloody lot of code to write! Our automation stack (and its unit tests!) makes for a good chunk of code, but it doesn't end there. We have all those test cases to write, too. Some of our pairwise test cases are turning out to be quite complex; certain of them are more complicated than some of the production code we're testing, I think! One of us estimated recently that we're writing twice as much code as Dev is; I don't think that statement is far off.
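For readers who haven't run into pairwise testing: the idea is to pick a small set of test cases such that every pair of parameter values appears together in at least one case, rather than testing the full cross-product. Here's a minimal, illustrative greedy generator in Python -- this is a sketch of the general technique, not our actual automation stack, and the parameter names are made up.

```python
from itertools import combinations, product

def all_pairs(parameters):
    """Greedily pick test cases until every pair of values (across any
    two parameters) appears together in at least one case."""
    names = list(parameters)
    # Every (param-index, value) pair that must be covered together.
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(parameters[a], parameters[b]):
            uncovered.add(((i, va), (j, vb)))

    cases = []
    while uncovered:
        # Pick the full combination covering the most still-uncovered pairs.
        # (Exhaustive scan of the cross-product: fine for a sketch, too
        # slow for big models -- real generators are much smarter.)
        best = max(product(*(parameters[n] for n in names)),
                   key=lambda combo: sum(
                       ((i, combo[i]), (j, combo[j])) in uncovered
                       for i, j in combinations(range(len(names)), 2)))
        cases.append(dict(zip(names, best)))
        for i, j in combinations(range(len(names)), 2):
            uncovered.discard(((i, best[i]), (j, best[j])))
    return cases
```

With three parameters of two values each, the full cross-product is eight cases but all-pairs coverage needs only a handful -- which is exactly why the cases that *do* get generated can each end up exercising so much at once.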
We have so much still to do, in fact, that Dev is pitching in for several weeks. How cool is that?!? We have pretty good communication between Dev and Test, but it's not as good as it could be, and having them working on our tests can't help but make that communication better.
Our developers write integration tests for the code they write using our automation stack, but these few weeks are immersing them in our world much further than they've gone previously. They are actually spending entire days doing nothing but writing test cases. Not application code. Test cases. And they're starting to understand why we do all the strange and weird things we do! Just this morning I overheard one developer talking to another about verification: "I just cleaned up a whole bunch of problems that Test has been complaining about for a while. I never understood why they cared -- they were minor issues that didn't have any visible effect. But when you have to verify all this data in all these test cases, and you can't just hard-code in 'expected' deviations like these, those minor issues become major issues! I guess I knew this before, but I hadn't really internalized it until now."
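The point that developer stumbled onto is the difference between hard-coding expected output and verifying against an independent oracle. A hypothetical Python sketch (the domain and function names are invented for illustration, not taken from our product):

```python
def expected_totals(orders):
    """Independent oracle: recompute what the application *should*
    report, straight from the input data."""
    totals = {}
    for order in orders:
        totals[order["customer"]] = totals.get(order["customer"], 0) + order["amount"]
    return totals

def verify_report(orders, report):
    """Compare the application's report field-by-field against the
    oracle. Any deviation -- however 'minor' -- is a failure; nothing
    gets special-cased as an expected difference."""
    expected = expected_totals(orders)
    return {customer: (expected[customer], report.get(customer))
            for customer in expected
            if report.get(customer) != expected[customer]}  # empty == clean
```

Once verification works this way, every one of those "harmless" data inconsistencies shows up as a red test, which is why they stop being minor.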
Oh. My. Gosh. Developers starting to get testing! They're doing a great job, too! Which just proves my long-held theory that developers can be trained. <g/>
*** Comments, questions, feedback? Can you test? Want a fun job on a great team? Send two coding samples and an explanation of why you chose them, and of course your resume, to me at michhu at microsoft dot com. I need a tester and my team needs a data binding developer (i.e., a developer that wants to play with databinding), program managers, and a product manager. Great coding skills required for all positions.
Comments
- Anonymous
September 08, 2004
With this large amount of complex test code, what's the error rate like in your tests? Do you keep any metrics for the number of bugs found in tests versus the number of bugs found in the actual application? I wonder how good the developer written tests are going to be compared to the tester written tests. - Anonymous
September 08, 2004
The only concern that arises based on what you have written is, shouldn't these unit tests already be written?
Testing should only have to do GUI/presentation and acceptance-type testing. - Anonymous
September 08, 2004
Nicholas:
We keep the error rate in our tests due to bugs in the tests themselves very low by doing all the work in helper methods that we unit test the daylights out of. We haven't kept hard metrics about bugs in our test cases versus bugs found in the application, but we don't turn a test on until it passes 100% except for known application bugs, so only application bugs should be found after that.
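To make the helper-method approach concrete, here's a minimal sketch of the pattern, assuming a trivial made-up helper -- a shared helper plus unit tests for the helper itself, so a bug gets caught once here rather than surfacing as mysterious failures in the many test cases that call it:

```python
import unittest

def normalize_whitespace(text):
    """Helper shared by many test cases: collapse runs of whitespace
    so comparisons ignore incidental formatting differences."""
    return " ".join(text.split())

class NormalizeWhitespaceTests(unittest.TestCase):
    """Unit tests for the helper itself -- bugs caught here never
    reach the test cases that rely on it."""

    def test_collapses_internal_runs(self):
        self.assertEqual(normalize_whitespace("a \t b\n\nc"), "a b c")

    def test_strips_leading_and_trailing(self):
        self.assertEqual(normalize_whitespace("  hello  "), "hello")

    def test_empty_string(self):
        self.assertEqual(normalize_whitespace(""), "")
```

The test cases themselves then reduce to thin sequences of calls into well-tested helpers, which is what keeps their own error rate low.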
The tests our developers are writing for us should be no different than what we would have written, for several reasons: 1) We code review before they check in, so if they skimp we call them on it; 2) We've pretty well laid out the tests they're helping on, so for the most part they just have to implement; 3) They've quickly realized how much they don't know and so are quick to call us in for brain dumps whenever they're at all unsure about what they're supposed to do or how they're supposed to go about it. - Anonymous
September 08, 2004
B:
Dev is required to write unit tests before checking in, and for the most part they have been doing this. The tests they're helping us write aren't unit tests by any means but rather are typical tester tests.
I completely agree (as do my devs and in fact my entire team) that developers should write whatever tests are necessary to ensure their code is correct before checking in. That includes unit tests but may include integration and UI tests as well. However, I've found that developers generally don't go as deep as testers do. This is partly due to lack of training -- they just don't know what to care about.
The tests our developers are helping us write are actual tester-type tests. That's what's so cool! Rather than going on ahead and writing more code they're helping us test the code they wrote!