February 12th, 2013
The Geckos were coming towards the end of the sprint (day 28 of 30) and were having their daily scrum in the morning. The team were standing around the sprint backlog board and were taking it in turns to update the rest of the team on what they had been working on and their plan for the day. An extract from the conversation:
“Guy &amp; I have finished coding the photo upload feature and have handed it over to Roxie and Eve to test,” Christian said, “and that’s us done for this sprint, so we are probably going to make a start on the ratings feature from the product backlog to give ourselves a bit of a head start for the next sprint. Obviously we have no impediments because I’m finished! Hooray!”
Christian then threw the speaker’s ball to Roxie to indicate that he wished to hear from her next so Roxie took her turn:
“This will probably take us the last two days of the sprint to test. I was hoping for it a little earlier, so we were kind of hanging around a little yesterday waiting for Christian and Guy to finish, but now we have it we can get cracking on it. I am a bit worried about testing the photo upload feature as I think that might be a can of worms, but at the moment I have no impediments.”
The sprint burndown for the team was looking healthy and, if anything, indicating that they were likely to be finished early for the sprint, so there seemed little to worry about. In fact, with Christian &amp; Guy starting on some of the Sprint 2 stuff, they were actually in an even healthier position than the sprint burndown was implying.
It came as a bit of a surprise to the Product Owner Ericka, then, when she was told at the end of the Sprint that this important feature was not actually ready to deploy as it didn’t pass testing. The sprint burndown had been telling everyone that the team were absolutely on track, and some of the team had even started working on “bonus” features that hadn’t been planned into this sprint.
In the retrospective the team discussed what happened to cause the team to miss their commitment.
“Well we managed to get everything done our end” said Christian “it was just the testers that didn’t manage to do their stuff”
“Well if you had managed to get it to us quicker, we might have had a chance” Roxie replied “We only had two days to test it and I did say that the photo upload was likely to be a can of worms.”
Marcelle, the ScrumMaster, quickly tried to defuse the situation “Whose fault it was doesn’t interest me, and I am pretty sure it doesn’t interest Ericka. All she sees is the team failed to deliver her priority #4 feature this Sprint but somehow managed to find time to work on her priority #9 feature.”
“What makes you believe that the testing of that feature was purely the responsibility of Roxie & Eve?” Marcelle asked Guy & Christian
“The fact that they are testers” they quickly answered, in unison.
“How do you feel about the fact that the effort you put in on that feature returned zero credit?” he then asked
“That’s not true. The code is there. It is almost complete” Guy said, getting a little defensive
“Well from the point of view of the project, that feature is ‘not done’. It’s not classed as ‘almost done’. Just because you feel you have done ‘your bit’ doesn’t count for anything, because it’s not potentially deployable,” Marcelle said. “We all understood this when we talked about our definition of done in sprint planning.”
He then asked “And the ratings feature that you started while Roxie &amp; Eve were testing is not potentially deployable either, so that is potentially wasted effort as well, isn’t it?”
“Only if Ericka decides she doesn’t want it any more” Christian answered, sulking a little now
“What do you suspect she would prefer to have received? All of feature 4? Or part of feature 4, which can’t be deployed, plus some research/design on feature 9?”
“Well obviously she would prefer feature 4 but we’re not testers” Guy said
“So, because you aren’t testers you can’t be involved in testing?” Marcelle asked “Is there no other way you could organise yourselves to reduce the risk of this happening? Is this worth spending some time on in the retrospective?”
The “developers don’t test” syndrome is probably one of the most common role clashes in Scrum teams. Because most organisations are structured around functional areas, and because individual career paths are focussed on functional specialisms usually based around the stages of a typical waterfall lifecycle, the syndrome is incredibly commonplace and is even backed up with job descriptions. In Scrum, the role of developer is intended to mean a member of a development team whose primary responsibility, above all others, is to help the team turn product backlog items from requirements or user needs into “done”.
The default definition of “done” in Scrum is “potentially deployable” or “potentially releasable” and, while this will differ slightly from organisation to organisation and, potentially, even from team to team, things can rarely be described as “done” if they haven’t been tested. So testing must be part of a Scrum team’s work and this will, in most organisations, invariably mean that people whose primary responsibility and skillset is testing are part of the Scrum team. This does not, however, allow the other members of the development team to abdicate responsibility for testing to those “testers”. Testing is a whole-team responsibility in Scrum.
Funnily enough, it is not just the developers who are concerned or anxious about blurring this boundary between development and test. There are status concerns, quality concerns, job security concerns and career progression concerns. I have seen a lot of organisations that attach lower status to testers than to developers, where only when you have become “good enough” are you allowed to become a developer. This is absurd in my opinion but, nevertheless, it is a cultural reality (reinforced by pay rates etc.) that some ScrumMasters will encounter and may need to work hard at countering (with the help of management, HR etc. if necessary). Equally, I have met many who believe that you have to have a “special mindset” to be a good tester and that developers are a different breed who, as such, can’t be trusted to test. Again, this is a ridiculous position from my perspective, and one that is almost certainly going to become self-fulfilling: the more that developers are not trusted to test, the more they will act that way and shirk the responsibility of writing good code.
Will spreading my skills away from pure development into the testing arena reduce my marketability? Will I have to sacrifice my focus on being the best in my field in order to learn (and implement) new skills? Am I going to have to become a hybrid? Will I miss out on some sexy new languages or coding techniques just to do more testing? These are just some of the concerns over job security and career progression that people have when facing this situation. In contrast, the growing evidence suggests (unsurprisingly in my view) that the rates for developers with good testing/quality skills are much higher than those without.
Marcelle (or, more accurately, the team) has a number of options. They already seem to be used to the practice of pairing (two team members working on the same task together), as Christian &amp; Guy were paired up on the coding of the photo upload feature while Roxie &amp; Eve were paired up on the testing of it. Perhaps an alternative would be to pair Christian with Eve and Guy with Roxie. That way, testing can be undertaken during development, and the development can evolve in response to the feedback from the ongoing testing. It is much more collaborative, and more agile at heart, than pairing developers to feed testers.
Most Scrum teams find pretty quickly that, in order for them to make regular, ongoing progress, they need to invest in automation of their tests and this practice will most likely be helpful to the Geckos too. This usually leads to test-first or test-driven development, which will help reduce the bottleneck that the Geckos are feeling currently.
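As a sketch of what test-first development might look like for a feature such as the photo upload, here is a minimal, hypothetical example: the tests are written before (and alongside) the implementation, so testing happens during development rather than as a hand-off at the end. The validation rules, names and limits below are all assumptions for illustration, not details from the story.

```python
# Hypothetical test-first sketch for a photo-upload validator.
# The tests below would be written first; the function is then the
# minimal implementation that makes them pass.

ALLOWED_EXTENSIONS = {"jpg", "jpeg", "png", "gif"}  # assumed rule
MAX_SIZE_BYTES = 5 * 1024 * 1024                    # assumed 5 MB limit


def is_valid_upload(filename: str, size_bytes: int) -> bool:
    """Accept only supported image types within the size limit."""
    if "." not in filename:
        return False
    extension = filename.rsplit(".", 1)[-1].lower()
    return extension in ALLOWED_EXTENSIONS and 0 < size_bytes <= MAX_SIZE_BYTES


import unittest


class TestPhotoUploadValidation(unittest.TestCase):
    def test_accepts_supported_image(self):
        self.assertTrue(is_valid_upload("holiday.JPG", 1024))

    def test_rejects_unsupported_extension(self):
        self.assertFalse(is_valid_upload("notes.txt", 1024))

    def test_rejects_oversized_file(self):
        self.assertFalse(is_valid_upload("big.png", MAX_SIZE_BYTES + 1))
```

Run with `python -m unittest` (or pytest). The point is not the validator itself but the workflow: because the tests exist from the start, a developer and a tester can grow them together, and “done” is checked continuously rather than in the last two days of the sprint.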
Another option is for the team to implement some form of kanban type limits on the stages within a sprint. For example, the team would set a limit of a maximum of two items to be “in test” at any one time. No further items are then allowed to be worked on until there is testing capacity and the team then “swarm” around the items at the bottleneck stage to clear up capacity.
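A WIP (work-in-progress) limit like this is simple enough to sketch in a few lines. The following toy model is an illustration only (the stages, items and data structures are assumed); the limit of two items “in test” is the example from the text. When a stage is full, the pull is refused, which is the signal for the team to swarm on the bottleneck instead of starting new work.

```python
# Toy sketch of kanban-style WIP limits within a sprint.
# Stages, limits and board contents are illustrative assumptions.

WIP_LIMITS = {"in development": 3, "in test": 2}

# "in test" already holds two items, so it is at its limit.
board = {
    "in development": ["ratings"],
    "in test": ["photo upload", "profile page"],
}


def can_pull(stage: str) -> bool:
    """An item may enter a stage only while it is under its WIP limit."""
    return len(board[stage]) < WIP_LIMITS.get(stage, float("inf"))


def pull(item: str, stage: str) -> bool:
    """Try to pull an item into a stage; refuse if the stage is full."""
    if not can_pull(stage):
        return False  # stage at its limit: swarm on the bottleneck instead
    board[stage].append(item)
    return True
```

With this board, pulling a new item into “in test” fails until the testers (helped by the whole team) clear one of the two items already there, while “in development” still has capacity.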
However the team decide to move forward, they need to be comfortable with the choice, so Marcelle would do well to facilitate the discussion gently but thoroughly (possibly with an HR presence). He should also bear in mind that sometimes teams need to be pushed outside their comfort zone in order to develop, and this area is definitely outside most teams’ comfort zones.
“It is better for the developers to be surfing than writing code that won’t be needed. If they went surfing, they would have fun and I would have a less expensive system and fewer headaches to maintain” – Jeff Sutherland
The concept of “done” is critical to Scrum. Teams do not have the luxury of being able to just do the coding this sprint and test it next sprint. In Scrum, every feature that the teams take on needs to be potentially releasable at the end of the sprint.
Ultimately, there is no such thing in Scrum as a tester or a coder. There are only three roles in Scrum – ScrumMaster, Product Owner and Team. The team consists of the skills necessary to get items from the product backlog to a state of “done”, and it is everyone’s responsibility to make sure things get “done”. There is no point in any member of the team getting all of their tasks done if the features are not complete, and completing them involves testing. If there is a bottleneck in skills, the team have a responsibility to do what needs to be done, to the best of their collective ability. And sometimes doing nothing (not adding code that can’t be tested) is the best response.