Automation tasks for the OneNote test team
Right now we have a little spare "bandwidth" to complete some automation tasks that have been building up over the last few months. (The lingo around here never lets us refer to spare time: someone with a few free hours is always said to have "extra bandwidth.") Like every other job in the world, though, we have about 15 tasks to complete and the time and people (i.e., "bandwidth") to complete four of them. Here's a broad overview of some of what we are up against for the next few weeks:
- We have a push to "backfill" legacy automation. Getting any area to 100% automation is a task that is never truly complete. In general, our highest-priority automation is the set of tests that verify a feature is included in the build. Next come acceptance tests: not only is the feature present, it actually works. After that comes a suite of deeper tests: comprehensive feature tests, performance, specific international tests, and so on. We know some areas that have been around for a while could use more automation to reduce the manual workload, so we can add this task to the list.
- Performance tests. We are working on a suite that uses our new UI-less system to measure OneNote's performance and track the statistics from month to month. Designing these tests and choosing the correct metrics for each one is not a trivial task. The automated script itself is usually fairly easy to create; the difficulty is knowing what to measure and how (see the first sketch after this list).
- We have a tool called "OneAuto" which starts OneNote and (among other things) modifies notebooks over and over again. We have one person dedicated to making it easier to deploy across multiple machines, run with designated configurations (like running against a notebook on a SharePoint server, on a UNC path, against Windows Home Server, and so on; see the second sketch after this list), and produce log files that are easier to parse.
- OneNote's version number changes from one release of Office to the next, and we noticed we had hard-coded that number in several different locations in our automation project. We have a work item to add code that reads the version number from the installed build of Office instead (the third sketch after this list shows the idea). This is just good engineering practice, and we need to get it implemented even though the current system is working.
- Visual Studio 2008 has finally been released, and we need to migrate from the final beta to the released version. We may get a deadline from Office to do this, or we may be able to set our own. At a minimum, our project and solution files need to be migrated, but there may be some other work to complete as well.
- Before we can get to item number 1 above, we need to analyze our current automated scripts to see what we have covered. We have reporting tools for this, but we changed some of the fields in the report, so we have to re-tag the existing scripts with the new data (the last sketch after this list shows the flavor of this chore). It's an easy enough task, but it could take 2-3 days to complete.
- (We have a whole slew of other work, but it's low priority relative to the items above and a little difficult to explain without becoming boring. Refactoring code, running it through naming-convention checking tools, and the like are examples of these tasks.)
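To make the performance item concrete, here is a minimal sketch of the kinds of decisions a harness forces on you: take several runs, throw out the warm-up, and record the median so one slow run doesn't skew the monthly trend. The operation, build string, and log format below are all placeholders, not our actual system.

```python
import csv
import datetime
import statistics
import time

def measure(operation, runs=5):
    """Time `operation` several times and return the median in seconds."""
    operation()  # warm-up run, excluded so caching doesn't skew the numbers
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)  # median resists one-off outliers

def log_result(test_name, build, seconds, path="perf_history.csv"):
    """Append one row per run so results can be compared month to month."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.date.today().isoformat(), build, test_name, f"{seconds:.3f}"])
```

Even in a toy like this, most of the choices (warm up or not, median or mean, wall-clock or CPU time) are about what to measure rather than how to script it.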
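For the OneAuto configuration work, the rough idea is a table of named notebook targets so the same run can be pointed at different storage back ends from the command line. A hypothetical sketch, with every path and URL invented for illustration:

```python
# Hypothetical configuration table: all paths and URLs here are made up.
NOTEBOOK_TARGETS = {
    "local":      r"C:\Notebooks\StressNotebook",
    "unc":        r"\\fileserver\share\StressNotebook",
    "sharepoint": "http://sharepoint.example.com/sites/test/StressNotebook",
    "homeserver": r"\\homeserver\Documents\StressNotebook",
}

def resolve_target(name):
    """Map a configuration name to the notebook location OneAuto should use."""
    try:
        return NOTEBOOK_TARGETS[name]
    except KeyError:
        raise SystemExit("Unknown configuration '%s'; known: %s"
                         % (name, ", ".join(sorted(NOTEBOOK_TARGETS))))
```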
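For the hard-coded version number, the fix is to ask the installed build instead of trusting a constant. Here is a sketch of one way to do it on Windows, assuming Python with the pywin32 package and that OneNote.exe is registered under App Paths; our real automation project is not written this way, this just shows the idea:

```python
import winreg
import win32api  # from the pywin32 package

def installed_onenote_version():
    # Find the full path to OneNote.exe via its App Paths registration.
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths\OneNote.exe")
    exe_path, _ = winreg.QueryValueEx(key, None)
    winreg.CloseKey(key)

    # Read the binary's version resource: major.minor.build.revision.
    info = win32api.GetFileVersionInfo(exe_path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return (ms >> 16, ms & 0xFFFF, ls >> 16, ls & 0xFFFF)

print("Installed OneNote version: %d.%d.%d.%d" % installed_onenote_version())
```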
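Finally, the re-tagging chore is mostly a mechanical search-and-replace over script metadata. Something like this sketch, where the old and new field names and the file extension are invented for illustration:

```python
import pathlib
import re

OLD_FIELD, NEW_FIELD = "TestArea", "FeatureArea"  # hypothetical report fields

def retag(root="."):
    """Rewrite the old metadata field name to the new reporting schema."""
    for path in pathlib.Path(root).rglob("*.cs"):
        text = path.read_text(encoding="utf-8")
        updated = re.sub(r"\b%s\b" % OLD_FIELD, NEW_FIELD, text)
        if updated != text:
            path.write_text(updated, encoding="utf-8")
            print("retagged", path)
```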
Before we start on any of these, we need to define what each of these tasks actually means (item 5 is a wildcard, but item 3 already has a good plan), cost out how long each will take, prioritize them, assign them to the people we have, and then hold ourselves accountable for getting them done.
Questions, comments, concerns and criticisms always welcome,
John