Thoughts on Exit Criteria

On occasion I write an email to the Team System team that steps back and explains a little of the background behind why we do something, rather than just what we’re doing and when it’s due.  I’ve been wondering recently whether these updates would be interesting to folks outside of our team.  So, I’m going to try publishing a slightly sanitized version of these emails and gauge the response.  Let me know what you think and we’ll see where this takes us.

~~~~~~~

Over the past week there’s been a concerted effort to review and clarify the exit criteria for upcoming VSTS10 milestones, including Beta and the final release.  At last count we had approximately 25 exit criteria that we’re managing for this release.  Some, such as Performance and Side-by-Side, are familiar to many of you, while others, such as App Isolation and Windows Logo, might not be.  As I sat through hours of meetings last week reviewing each of these in detail, I thought about why they’re important.  In the end, I realized these criteria help us pop above the din of day-to-day activity and see the broader fundamentals of the release.  For instance, it doesn’t matter if we have the world’s best Test Impact Analysis if installing VSTS10 causes other applications to stop working or if the performance of the overall system is horrible.  Customers will just back away before they ever realize the value we’ve added to the product.

So, as we make our way through what sometimes seems like unending test passes and bug triages over the next year, keeping these fundamentals in mind will help ensure that the thousands and thousands of decisions each of us will make are in line with a high-quality release that our customers will appreciate and value.

We’ve also established a ramp for most of these exit criteria such that the Beta criteria are relaxed compared to RTM.  In part, this recognizes that when we ship Beta we’ll still have time before we ship the final version, so we don’t have to be completely buttoned up at that point.  But it’s important to start practicing how we drive this kind of quality into the product.  Many of these things take time to get right, and we can’t assume we can make them all go green in the final weeks of the project unless we’ve taken a measured approach to getting there.

For various reasons, some teams will not be able to meet the graduated ramp for the exit criteria we’ve outlined.  When we roll these guidelines out you’ll see that some are identified as “critical” while others are labeled “important”.  The idea is to guide teams in how to apply their limited resources when they don’t believe they can meet all of the Beta1 criteria.  We’ll make exceptions for specific criteria on a team-by-team basis through a negotiation between the team, the divisional exit criteria owner, and the divisional release team.  These discussions will focus on ensuring that the right tradeoffs are made without endangering the overall integrity of the release.  This information will also have a strong influence on the DCR approval process.  Teams that are in good shape with regard to bugs and exit criteria will have a much better chance of getting their DCRs approved than teams that have considerable bug or exit criteria debt to pay.

One thing that’s not well captured in the exit criteria, and that I want to keep finding ways to shine a light on, is the state of integration between features.  A team that has nailed its features, exit criteria, and bugs might still not be ready to ship if those features aren’t well integrated with one another.  This is obviously more subjective than some of the other tests we run, but it’s no less important to our customers.  I think the end-to-end scenario effort that has been underway for the past six months is an important part of this discussion, and we’ll keep looking for ways to ensure that this aspect of our product is well understood by everyone.

Starting in December we’ll be tracking our exit criteria status on a common divisional scorecard in TFS.  I appreciate your efforts to help us keep it as up to date as possible so that we can view our progress and make good decisions as a team based on accurate information.  As we manage the VSTS10 endgame, the scorecard will make it easy to view trends for teams as well as for particular criteria and use them to guide our decisions.  If at any time you think the scorecard doesn’t represent the true state of things, please don’t hesitate to drop me a note so that we can rectify the situation together.

As always, I’m eager to hear from you on how we can improve our engineering process and systems, so please let me know what you think about exit criteria or any other aspect of how things are going with VSTS10.

Yours,

jeff
