Motley says: “Agile doesn't change testing - developers can just throw code 'over the wall' to the test team”
Summary

Motley: Developers should just throw the completed code "over the wall" to the test team when they are done. There is nothing different about testing in an agile world vs. a waterfall world.

Maven: Testing in an agile world is a completely different way of thinking about testing. "Testing" becomes more quality assurance, with testers involved in a development iteration from the start. The team rallies around test-first strategies and deals with any bug and test debt before the interest grows too high.
______________________________
[Context: Motley is used to a waterfall style of software development, where developers write code and pass it to a test team to validate once the code is complete. Now that Motley's team has taken an agile approach, he is a little confused about how the test team fits in and whether their role actually changes or stays the same.]
Motley: We have been practicing agile development for a couple of months now and the testers on the team are still confused as to what they should be doing. I don't see what the problem is - I tell them to just wait until I have something to check in near the end of the iteration and then they can have at it. While they are testing I go on to the next task.
Maven: So you throw it "over the wall" to the test team once the code is finished.
Motley: Basically, yeah. Got a problem with that?
Maven: Maybe that "worked" in a waterfall world, but that model is not in line with agile principles. Agile brings about a new way of thinking about delivering software. Developers no longer "throw it over the wall" to the testers when the code for a user story is checked in. Testing becomes a forethought rather than an afterthought, with quality built in at all stages of development.
Motley: How so? There is nothing to test until I finish the code! You're a little slow today, Mave.
Maven: Au contraire, mon ami. As we discussed previously, with agile we break up larger features into small vertical slices of functionality that deliver real end user value. The team is involved in setting exit criteria, or "done" definitions, for those slices and each piece gets implemented one at a time. The testers are involved from the beginning of the sprint-
Motley: But there is nothing to test until the code is written!
Maven: Agile presents us with a new way to think about testing. In a company like ours that has a tester career model, we need to shift our view of what a tester is from quality control - where testers validate the functionality the devs wrote, running test passes and filing bugs - to quality assurance - where testers help prevent bugs from occurring in the first place. We treat testers as fellow engineers with a focus on test practices, but they should also be capable of writing production code when required, and devs should be capable of developing test automation and pitching in to validate tasks. Testing becomes a team responsibility, and everyone focuses on building quality in early in the iteration rather than trying to test it in late in the cycle. The development of a slice of functionality is no longer split into an implementation phase and a test phase. Instead, the testers work in lockstep with the developers, even thinking through and writing tests before any code is written.
Motley: I am still confused - what exactly do they do before any code is written?
Maven: On a good agile team, every engineer is "test-infected". Testing becomes a primary focus for the team, with everyone (including developers) pitching in as required. Driving testability helps drive a good design and promotes solid engineering practices, such as ensuring unit tests are checked in with every change to product code. All that said, there can still be a dedicated test role if warranted. Testers help with requirements brainstorming, design brainstorming, prototyping, technical investigation, and automation infrastructure design. Perhaps most important, however, is the creation of acceptance tests as soon as a story is defined for an iteration.
Motley: How can a tester create an acceptance test when there is nothing yet to test?
Maven: In the same manner a developer writes tests before code. Remember our discussion on Test-Driven Development (TDD)? Think of this as Acceptance Test-Driven Development (ATDD). Ideally the tests are written in a form that can eventually be executed, such as in code or using a tool like FitNesse, which lets even customers write executable tests. However, writing tests in plain English in a team notebook can be just as valuable. The tests help drive the overall design and give the developers check-in criteria. Let's leave ATDD for another discussion.
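(For illustration only, here is a minimal sketch of what an executable acceptance test might look like when it is written before any product code exists. The "discount" story, the apply_discount function, and the numbers are hypothetical, and the test is expected to fail until the story is implemented - that failing state is exactly the point of writing it first.)

```python
# A minimal sketch of an acceptance test written before the code exists
# (Acceptance Test-Driven Development). The "discount" story, the
# apply_discount function, and the expected values are all hypothetical.
import unittest

def apply_discount(subtotal, coupon_code):
    # Placeholder for the not-yet-written production code; the real
    # implementation replaces this stub once the story is built.
    raise NotImplementedError("story not implemented yet")

class DiscountStoryAcceptanceTests(unittest.TestCase):
    """Acceptance criteria agreed on with the team when the story was defined."""

    def test_valid_coupon_reduces_subtotal_by_ten_percent(self):
        self.assertAlmostEqual(apply_discount(100.00, "SAVE10"), 90.00)

    def test_unknown_coupon_leaves_subtotal_unchanged(self):
        self.assertAlmostEqual(apply_discount(100.00, "BOGUS"), 100.00)

if __name__ == "__main__":
    unittest.main()
```

Until the developers implement the story, these tests fail; they double as the check-in criteria for the story.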
Motley: It still seems like testers would be idle while I write the code.
Maven: Not really. Testers can help with other activities in addition to those I already mentioned, such as writing unit tests, developing further acceptance tests (there can be more than one per story), participating in code reviews, doing private buddy testing prior to finalizing the code, developing integration tests, and even writing product code like any other engineer on the team. The important concept is that the team marches towards the same goal - the completion of a particular story. There are plenty of parallel and collaborative activities involved in story development, and the team should not move on until a story has met the exit criteria. Once the code is checked in, there is very little validation left for the testers to do because most of it was done up front.
Motley: There still needs to be some level of integration testing. The individual story may be functionally complete, but we typically have to put the developed components together with other pieces.
Maven: True, and a very astute point. The unit tests should validate the individual components, but the components still need to be glued together. Testers can validate the integration and file bugs as they come across them. Another difference with agile is that developers do not start working on new functionality at this point - their first priority is addressing the issues found during integration, preferably as they come up. Bugs are a sign of not being done. The team - not just the developers - should be embarrassed when bugs are found post-check-in. Whereas with waterfall you may have a big "stabilization"/bug-fix stage at the end, with agile you fix the bugs before declaring victory on a story. Bugs are debt, and should be dealt with before new features are added.
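(As a rough sketch of the kind of integration check a tester might add once individual pieces are glued together - the two components and their behavior below are invented for illustration. Unit tests would cover each component separately; this test exercises them wired together.)

```python
# A minimal sketch of an integration test: the unit tests cover each
# component on its own; this test checks that two hypothetical components,
# a parser and a price calculator, work correctly when combined.
import unittest

class OrderParser:
    """Hypothetical component A: turns a raw order string into line items."""
    def parse(self, raw):
        items = []
        for part in raw.split(","):
            name, qty = part.split(":")
            items.append((name.strip(), int(qty)))
        return items

class PriceCalculator:
    """Hypothetical component B: prices a list of (name, quantity) items."""
    PRICES = {"apple": 2, "pear": 3}
    def total(self, items):
        return sum(self.PRICES[name] * qty for name, qty in items)

class OrderIntegrationTests(unittest.TestCase):
    def test_parsed_order_is_priced_end_to_end(self):
        items = OrderParser().parse("apple: 2, pear: 1")
        self.assertEqual(PriceCalculator().total(items), 7)

if __name__ == "__main__":
    unittest.main()
```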
Motley: But it still may happen that a story may be coded but not necessarily completely validated before the end of an iteration. What then?
Maven: Yes, it happens. In that case, the team rallies around getting the story done before moving on. The story is carried into the next sprint, and you do not get credit for finishing it in the previous sprint (and thus you do not demo it). The team takes care of any test debt before taking on new work.
Motley: What about building up the set of automation infrastructure? Testers often get behind on that early.
Maven: If automation infrastructure is important to the team, then everyone pitches in to get it done quickly - including the developers. However, test infrastructure should be built the same way as product code - incrementally, in vertical slices. Don't build a huge amount of infrastructure up front and add tests later. Build enough infrastructure to support your first few tests, and then continue to add on and refactor from there. Remember, testers are engineers just like the developers, and there is no reason both roles cannot work the same way.
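(A sketch of what "just enough" test infrastructure might mean in practice. The TestSandbox helper and the scenario are hypothetical: the team writes only the small fixture its first tests need, then grows and refactors it as later tests demand more.)

```python
# A minimal sketch of incrementally built test infrastructure. Rather than
# designing a large framework up front, the team writes only the helper the
# first few tests need; later tests drive further additions and refactoring.
import tempfile
import unittest
from pathlib import Path

class TestSandbox:
    """Just enough infrastructure for the first tests: a throwaway work
    directory. Config files, fake services, etc. get added only when a
    new test actually needs them."""
    def __init__(self):
        self._dir = tempfile.TemporaryDirectory()
        self.root = Path(self._dir.name)

    def write_file(self, name, content):
        path = self.root / name
        path.write_text(content)
        return path

    def cleanup(self):
        self._dir.cleanup()

class FirstInfrastructureTests(unittest.TestCase):
    def setUp(self):
        self.sandbox = TestSandbox()
        self.addCleanup(self.sandbox.cleanup)

    def test_sandbox_provides_isolated_working_files(self):
        path = self.sandbox.write_file("settings.ini", "[app]\nmode=test\n")
        self.assertTrue(path.read_text().startswith("[app]"))

if __name__ == "__main__":
    unittest.main()
```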
Motley: I am skeptical, but I guess this change of thinking could work.
Maven: It can absolutely work! Of course, there are caveats. First, keeping work in small chunks as opposed to big changes helps this model. Second, for the team to succeed it requires time to focus on the task at hand. Randomizing the test team with, for example, test passes for other simultaneous releases will build up test debt and destroy this model. Third, some testing has to occur closer to the tail end of the iteration or even cycle, such as end-to-end integration testing, performance testing, and stress testing. Once again, the team rallies around those test goals and everyone focuses on completion of those tasks.
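(As a rough sketch of the sort of tail-end performance check the team might rally around - the operation, iteration count, and time budget below are all hypothetical.)

```python
# A minimal sketch of a late-iteration performance check: run a
# representative operation many times and assert it stays under an agreed
# budget. The operation, the 100 iterations, and the 0.5-second budget
# are all hypothetical.
import time
import unittest

def representative_operation():
    # Stand-in for a real scenario the team cares about under load.
    return sum(i * i for i in range(10_000))

class PerformanceSmokeTests(unittest.TestCase):
    def test_operation_stays_within_time_budget(self):
        start = time.perf_counter()
        for _ in range(100):
            representative_operation()
        elapsed = time.perf_counter() - start
        self.assertLess(elapsed, 0.5, f"took {elapsed:.3f}s, budget is 0.5s")

if __name__ == "__main__":
    unittest.main()
```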
Motley: There is always a catch, isn't there?
Maven: Those are minor caveats. The important concept is to act as a team fulfilling the same goals. The team builds one story at a time, fully completing each according to a "done" definition, integrating as it goes, and minimizing debt as much as possible. The product is always of shippable quality at the end of an iteration (with occasional exceptions - be pragmatic). To be successful, this model really requires a test-first line of thinking, but let's hit on that in more detail next time.
______________________________
James' Pointer: At Microsoft, the Software Development Engineer in Test, or SDET, is an important role. The best SDETs help hammer out bugs before they even hit the source control system. In fact, in my opinion, the best teams have testers who report very few bugs because they practice quality assurance instead of quality control. Measuring an SDET by the number of bugs they report is obviously wrong in this model. The SDET career model is also not directly aligned with agile, but we make it work. Agile tends to blur the lines between the Software Development Engineer (SDE) role and the SDET role. Perhaps in the future, engineers will become more generalized, with some taking on a test role and others a development role, but all being capable of switching depending on team need.
James' Double Pointer Indirection: Using this kind of model, where test is involved up front, typically produces fewer features in a given time period. The core reason is that features are developed to a "done" state that is far stricter than on a typical waterfall team. However, you save time in the long run because you avoid the lengthy bug tail. Think of it as "slowing down to go fast", a common theme in my everyday activities at Microsoft.
Maven's Resources:
- A selection of articles around agile testing: https://www.exampler.com/testing-com/agile/
- Decent book that describes what it is like to be a tester in an agile environment: https://www.amazon.com/Agile-Testing-Practical-Guide-Testers/dp/0321534468
- Very good magazine that gives several perspectives on testing in an agile world: https://www.testingexperience.com/testingexperience03_09.pdf