Does YAGNI ever apply to tests?

I've been writing a small utility to help us do some configuration setup for testing. It needs to walk a directory structure, find all instances of a specific xml file, and then make some modifications to the file.
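
For context, that kind of utility can be sketched in a few lines. This is my own sketch, not the author's code; the target filename and the particular XML modification are made-up placeholders:

```python
import pathlib
import xml.etree.ElementTree as ET

def find_config_files(root, filename="app.config.xml"):
    """Yield every instance of the given XML file under root."""
    # pathlib's rglob walks the whole directory tree for us.
    return pathlib.Path(root).rglob(filename)

def update_config(path):
    """Hypothetical modification: set an attribute on the root element."""
    tree = ET.parse(path)
    tree.getroot().set("environment", "test")
    tree.write(path, encoding="utf-8", xml_declaration=True)

def update_all(root):
    for path in find_config_files(root):
        update_config(path)
```

The XML-editing half (`update_config` here) is the part the author TDD'd; the traversal half is what the rest of the post is about.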

I TDD'd the class that does the XML file stuff, and I'm confident that it's working well. I'm now going to write the class that walks the directory structure and finds the files.

And there is my dilemma. I don't know if I'm going to do TDD on that.

I know exactly how to write it, I've written it before, and in my experience that's code that never changes or breaks. And figuring out how to write tests for it is going to be somewhat complex, because I'll have to isolate out the live file system parts.
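
The isolation in question usually comes down to putting a seam between the pure matching logic and the live directory walk. A minimal sketch of that idea (the function names are mine, not the author's):

```python
import os

def matching_paths(walk_results, target_name):
    """Pure logic: pick out the paths whose filename matches.

    walk_results is any iterable of (dirpath, dirnames, filenames)
    tuples -- the shape os.walk yields -- so a test can pass in
    hand-built lists instead of touching the file system.
    """
    for dirpath, _dirnames, filenames in walk_results:
        for name in filenames:
            if name == target_name:
                yield os.path.join(dirpath, name)

def find_files(root, target_name):
    """Thin wrapper that binds the pure logic to the real file system."""
    return matching_paths(os.walk(root), target_name)
```

A test can then feed `matching_paths` a fake list of tuples and assert on the result, leaving only the one-line `find_files` wrapper untested. That's the extra design work the post is weighing against just writing the obvious loop.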

So, I've already decided what I'm going to do, but I'm curious what you think. Does YAGNI apply to test code, or is that the first step to the dark side?

Comments

  • Anonymous
    June 27, 2007
There is some validity to that logic.  If one of my developers asked me that, my response would be "write the tests"; in your case I'll give you the benefit of the doubt, trust you on "never changes nor[sic] breaks," and suggest it's not needed.  (Though if this code is that common, why hasn't it already been componentized for reuse, with this test question already covered?)  Unit tests are another such area: do you unit test your unit tests? (Ad infinitum...)


  • Anonymous
    June 27, 2007
    I don't think Michael's list is suggesting that code that accesses a database, accesses the file system, or accesses the network shouldn't be tested.  He's merely suggesting they shouldn't be called "unit tests".

  • Anonymous
    June 27, 2007
Indeed, the very next sentence in the article confirms Peter's belief: "Tests that do these things aren't bad. Often they are worth writing, and they can be written in a unit test harness. However, it is important to be able to separate them from true unit tests so that we can keep a set of tests that we can run fast whenever we make our changes." Ideally, I'd agree - but in my experience the separation can be costlier than writing the tests and living with the smell. Not always, but sometimes. Jon


  • Anonymous
    June 27, 2007
    "I know exactly how to write it, I've written it before, and my experience is that that is code that never changes nor breaks." Then why are you writing it again?

  • Anonymous
    June 27, 2007
It sounds like the problem is that the test is too obvious and hard to write without basically rewriting the code.  Kind of like writing a unit test to prove that a method that adds 2 and 2 and returns 4 is working right.  It's hard to write a test for an "embarrassingly obvious" function. Since you're confident the code is working right, maybe the thing to do at this point is refactor a bit so you can put the code in a library, and next time you won't have to write it again. And as you refactor, the code may get less obvious, to the point where you need to write tests for it anyway.  Two birds with one stone.

  • Anonymous
    June 28, 2007
    There is a development and maintenance cost for each test just as there is a cost for each feature. You have to balance the expense against the return on the investment. The return is future bug detection without integration testing. Ideally you assign your limited budget of time to the highest priority areas, including your budget for writing tests. If you don't have lots of time to maintain the tests as you change the tested code, or the tested code will never change, you should not write the tests. But you cannot use the TDD buzzword either. You have sinned inexcusably! I have seen a lot of redundant testing done with TDD, whereby unit tests re-test the same code tested by integration tests. I believe the philosophy is that too much testing is better than not enough. It is a blunt tool.