Software Testing 7: Dogfood and Automation
I got a lot of requests to write more about testing. There was also a request to hear more about my Bernese Mountain Dog (berner.org). What better way to kill two birds with one stone than to talk a bit about how we use dogfooding internally to test Visual Studio?
Testing Visual Studio is all about the automation effort. Here, automation is the largest deliverable a test team has. If it is possible to write code to do the testing for us… we do. Most of the testers here are actually “Software Design Engineers in Testing,” meaning we try to hire very competent developers who also happen to have a talent for testing. One benefit of this is a large library of tests we can hand off to sustained engineering to be run any time a service pack to VS, or to a component we depend on, is released. The other, more important, benefit is that writing all of this code means we are testing with the product itself. We get lucky here since, for example, you can’t use Excel to test Excel as effectively as you can use Visual Studio to test Visual Studio.
Some tidbits about this automation: you can think of it on two levels. The lower level is essentially an abstraction of the product itself, written mostly in managed code. (A lot of VB.Net, some C#, and some C++. It’s up to each team really.) If there is a tool window in the IDE, there is a class that represents that object and exposes its UI to the next level of automation, the test cases. We’ve gotten exceedingly good at developing this low-level abstraction, and for this release we merged almost all of the separate efforts into one massive internal project, so that only one team has to write, say, the Toolbox component and anyone else can use or update it. It’s been impressive to watch the entire internal community of testers create and grow this project from almost nothing in such a short time. The end product, at this level, is several class libraries, built around the components you might be testing, that you can leverage at the next level.
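To make that lower level a little more concrete, here is a rough sketch of the shape one of these abstraction classes might take. This isn’t our actual library; every name here (IIdeSession, ToolboxWindow, the command strings) is invented for illustration.

```csharp
// A hypothetical sketch of a lower-level abstraction class.
// All names are invented; the real internal libraries are much larger.
using System;

namespace TestLibrary.Shell
{
    // Whatever mechanism actually drives the IDE (command invocation,
    // accessibility, window handles) hides behind an interface like this.
    public interface IIdeSession
    {
        void ExecuteCommand(string commandName);
        bool IsWindowVisible(string windowName);
        void DragAndDrop(string sourceWindow, string item, string targetWindow);
    }

    // One class per tool window: it exposes the window's UI to the test
    // cases at the next level without them knowing how the UI is driven.
    public class ToolboxWindow
    {
        private readonly IIdeSession _ide;

        public ToolboxWindow(IIdeSession ide)
        {
            _ide = ide;
        }

        // Bring the Toolbox up the same way a user would.
        public void Show()
        {
            _ide.ExecuteCommand("View.Toolbox");
        }

        // State queries let test cases verify behavior, not just drive it.
        public bool IsVisible
        {
            get { return _ide.IsWindowVisible("Toolbox"); }
        }

        // A higher-level action composed of UI gestures, so a test case is
        // one readable line instead of a scripted series of mouse moves.
        public void DragItemToDesigner(string itemName)
        {
            _ide.DragAndDrop("Toolbox", itemName, "Designer");
        }
    }
}
```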
The next level of our automation is the actual tests themselves, which are run on some regular basis depending on the priority of each test. A test could be as simple as a line-by-line scripted test that calls into the lower level, does a verification, and reports the result, or as complex as a test model designed to cover all of the actions and states that a specific feature can be put through. It’s up to each tester, case by case, to determine the best approach for testing a given feature. For example: window management in the shell is ideally suited to model-based testing because the behavior of a tool window can be described cleanly with a finite state model that includes floating, docked, auto-hidden, and so on.
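To give a flavor of the model-based side, here is a toy version of what a window-management model could look like. The states and actions mirror the behaviors mentioned above; everything else (class names, the random-walk driver) is my own sketch, not our real test code.

```csharp
// A toy sketch of a model-based test for tool window management.
// The state names come from the post; the rest is invented for illustration.
using System;
using System.Collections.Generic;

namespace TestCases.Shell
{
    public enum ToolWindowState { Docked, Floating, AutoHidden, Closed }

    public class ToolWindowModel
    {
        // Each (state, action) pair maps to the state the product should end up in.
        private static readonly Dictionary<(ToolWindowState, string), ToolWindowState> Transitions =
            new Dictionary<(ToolWindowState, string), ToolWindowState>
            {
                { (ToolWindowState.Docked,     "Float"),    ToolWindowState.Floating   },
                { (ToolWindowState.Docked,     "AutoHide"), ToolWindowState.AutoHidden },
                { (ToolWindowState.Docked,     "Close"),    ToolWindowState.Closed     },
                { (ToolWindowState.Floating,   "Dock"),     ToolWindowState.Docked     },
                { (ToolWindowState.AutoHidden, "Dock"),     ToolWindowState.Docked     },
                { (ToolWindowState.Closed,     "Show"),     ToolWindowState.Docked     },
            };

        public ToolWindowState Current { get; private set; } = ToolWindowState.Docked;

        public IEnumerable<string> LegalActions()
        {
            foreach (var key in Transitions.Keys)
                if (key.Item1 == Current)
                    yield return key.Item2;
        }

        public ToolWindowState Apply(string action)
        {
            Current = Transitions[(Current, action)];
            return Current;
        }
    }

    public static class WindowManagementModelTest
    {
        // A real run would call into the lower-level library to perform each
        // action against the IDE and compare its actual state to the model.
        public static void Main()
        {
            var model = new ToolWindowModel();
            var rng = new Random(0);  // fixed seed so a failing walk can be replayed

            for (int step = 0; step < 20; step++)
            {
                var actions = new List<string>(model.LegalActions());
                string action = actions[rng.Next(actions.Count)];
                ToolWindowState expected = model.Apply(action);
                Console.WriteLine($"Step {step}: {action} -> expect {expected}");
                // verification against the product would go here
            }
        }
    }
}
```

The payoff of the model approach is that a random walk over the transitions exercises sequences of actions no one would bother to script by hand.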
Ok, that was slightly over “quick”, but I could offer a more detailed glimpse into all of this if you’d like. Back to dogfooding: the type of development described above clearly gets us great coverage on the basic features needed to develop and debug class libraries. So how does this dogfooding get us coverage on something like the WinForms designer? We don’t write WinForms apps as part of this test automation, but we find a ton of bugs in WinForms simply by making sure we can write the automation that executes a given test plan. As testers write test cases they are essentially executing the test plan that was written (ideally) before the feature was developed. The trade-off is that while we are writing this automation we are not going through that plan very quickly, or more than once. This is probably an acceptable risk, since each release has several test passes where every test is executed, and the bug regression rate is only around 5%. Having this automation also means that the time to execute all of our tests goes down significantly compared to having only manual tests.
Once a release is mostly stable we can begin using VS itself to write the tests and using the new framework to execute them, but it can also be effective to keep using the previous release for dogfooding, since that lets us constantly compare the new components against their older counterparts and make sure we are really making progress.
Dogfooding doesn’t always taste good. You can’t test effectively if you can’t use most of the features, so most people do not update their dogfood build to the nightly build every day; you look for a build that lets you make real progress with your tests. But if a bug prevents you from dogfooding, it obviously becomes a fairly high priority, since dogfooding works best when the food is as fresh as possible.
Dogfooding isn’t all about us testers. At a large team meeting yesterday there was a slide about dogfooding titled “This time we are serious.” A chuckle arose. Compared to other Microsoft products <Cough> Office <Cough>, Visual Studio is not as dogfooded around Microsoft as we would like. That really means two things: 1. We haven’t pushed it as hard as other teams <Cough> Exchange <Cough> have, and 2. We probably are not yet meeting all the needs of a Microsoft developer. The second part has frustrated me for a long time. There have been arguments that developing at Microsoft is somehow different from what our customers do. That is, at best, only a half truth. Sure, not every VB customer is writing an operating system, but I’m confident that if we are failing some internal customers we are most likely failing some external customers too. Similarly, if you can’t find a good number of people at your own company who think your product is worth using, it might be a sign you should change a few things.
Do what you can to incorporate your product into your job. If you are developing a word processor, use it any chance you get to create your own documentation. If you are developing blog software, blog about your progress. Do whatever you can to give yourself a different perspective on the software you intend to create; the more views of something you have, the more defects you will find. If you are developing a product that you can’t integrate into your day-to-day job, then create a dogfooding test plan where you and interested customers spend dedicated time using the product the way it is meant to be used. It’s especially important to get feedback from these different perspectives and incorporate it into your release, because if 1 of 20 people finds a problem with your software, you can imagine the issues that will be found once there are thousands of different perspectives on your product.
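To put a rough number on that last point (my own back-of-the-envelope math, not something we formally track): if each user independently has a 1-in-20 chance of running into a given problem, the chance that nobody among 1,000 users hits it is about 0.95^1000, which is effectively zero. Any issue that 5% of people can stumble into gets found almost immediately once thousands of different perspectives are on the product.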
And now I present Oliver. Someone who really likes to eat his dog food. :-)
Comments
- Anonymous
February 15, 2004
Those are some great shots of your newfies. I love meeting them with Oliver because it reminds him he is not the biggest dog around.