Acceptance Test Engineering Guidance - project kick-off

Traditionally, p&p's primary audience has included developers and architects. Even though we have some coverage of the testing discipline (in particular the Testing .NET Application Blocks - Version 1.0 guide and Performance Testing Guidance for Web Applications), it is a small portion of the p&p assets [link to catalog]. In our interactions with customers, we hear requests for good guidance on testing – all kinds of testing.

So, back in the fall of 2007, I put a number of projects related to test engineering and test automation on the patterns & practices backlog. Several other important projects took precedence (GAT/GAX, Unity, Enterprise Library). Now, after shipping GAT/GAX 1.4 and Unity 1.0, I am happy to kick off this project with a focus on acceptance test engineering.

The core team consists of Michael Puleio, Jon Bach, and me, Grigori Melnik. Michael is not a tester but a great developer with a passion for testing, test automation, and test tools. Jon is a professional tester; he is the manager of corporate intellect at Quardev Labs and a co-inventor of session-based test management for managing and measuring exploratory testing. I have devoted a number of years to researching executable acceptance test-driven development (with FIT) and the relationship between software requirements and acceptance tests (see this article for my stance on this).

The topics we plan to focus on in this project include:

- test objectives and strategy,

- product readiness/acceptance,

- defining and reconciling good-enough criteria in various industrial contexts,

- working with customers and customer-proxies,

- supporting stories/requirements with acceptance tests.
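To make the last topic above concrete, here is a minimal sketch of what an executable acceptance test might look like, in the spirit of the FIT-style, table-driven approach mentioned earlier. The discount rule, its thresholds, and the example values are all hypothetical, invented purely for illustration:

```python
# Hypothetical business rule under test: 5% off orders of $100 or more.
# In FIT, the examples below would live in a customer-authored table;
# here they are plain tuples for a self-contained sketch.

def discount(order_total):
    """Return the discount for an order (assumed rule, for illustration only)."""
    return round(order_total * 0.05, 2) if order_total >= 100 else 0.0

# Each row mirrors an example a customer might attach to a story:
# (order total, expected discount)
examples = [
    (50.00, 0.0),
    (100.00, 5.00),
    (250.00, 12.50),
]

for total, expected in examples:
    actual = discount(total)
    assert actual == expected, f"order {total}: expected {expected}, got {actual}"

print("all acceptance examples passed")
```

The point is not the code itself but the shape: the customer supplies concrete examples, and the team wires them to the system so the story's acceptance criteria run as tests.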

We intend to support our guidance with case studies and exercises from the real world.

We are running this as an agile project, with weekly iterations, standups, a collocated team, etc. This deserves a separate blog post, which I'll probably write later this month.

In the meantime, feel free to post your comments and thoughts on any specific (painful) aspects of acceptance testing that you would like help with. Also, if you have an interesting experience with acceptance testing that you'd like to share and perhaps be profiled in our guide as a case study, we'd like to hear about it.

Comments

  • Anonymous
    April 08, 2008
    The other day, I mentioned a new project here at p&p, Acceptance Testing Guidance , Grigori has his

  • Anonymous
    April 09, 2008
    FIT is a great tool; I wish MS P&P could extend it (or create a similar tool). I would suggest incorporating it into MOSS or Reporting Services. My challenge today is: how would I test data quality in my database?


  • Anonymous
    May 19, 2008
    Hi Gregory, it's great to see that my proposed performance improvements have made it into the final Entlib release. http://geekswithblogs.net/akraus1/archive/2008/01/14/118543.aspx Did you profile the enhancements, or do you have any numbers at hand? Yours,  Alois Kraus

  • Anonymous
    May 31, 2008
    Tech-Ed 2008 Developers , June 2-6, Orlando, FL. I’ll be giving a talk on the Enterprise Library (DVP02-TLC).

  • Anonymous
    June 26, 2008
    In our quest to produce actionable acceptance testing guidance , we are looking for hard-to-test scenarios