In an earlier post, I made reference to Alistair Cockburn’s application of Shu Ha Ri to agile adoption. This comment summarizes a session at Agile 2008 in which the participants broke this agile maturity scale down even further:
0- The Agile Convert attempts to understand and learn all the Practices.
1- The Agile Purist follows all the Practices (this is the evangelical recent agile convert).
2- The Agile Pragmatist starts to realize that not all practices work in all situations and pursues the Agile Principles molding the Practices to their specific environment.
3- The Agile Purist follows all the Principles.
4- The Agile Pragmatist starts to realize that not all Principles work in all situations and pursues the Agile Values molding the Practices and Principles to their specific environment.
5- The Agile Purist follows all the Values.
6- The Agile Pragmatist starts to realize that not all Values work in all situations and pursues the ??? molding the Practices to their specific environment.
7- The self-actualized agile follower realizes that Agile is the embodiment of some higher understanding that can be applied in part or whole to any environment to help deliver more value… forget the boundaries of values, principles, or practices… these are just simple mechanisms to enable the education of agile to others.
On this scale, I would rate myself about a 4, with quite a bit of 7 thinking thrown in.
Where do you stand on this scale?
One of the Ten Tips for Agile Testing is “Use risk based testing.”
You can never test everything with the same (extensive) depth; even in a waterfall project you have to make choices. In an agile project all the activities are time-boxed, so you have to make choices about how extensively you want to test each feature. We use a risk-based approach to determine which test activities we are going to carry out for a feature during the iteration. The risk level of every feature is determined by the customer and the teams. It is a very transparent process, so the customer knows exactly which test activities are executed for every feature.
I’ve heard a lot about risk-based testing in the last couple of years. I don’t mean to sound like a know-it-all, but I’ve been employing risk-based testing for years (which goes along with risk management mentioned in yesterday’s post), only I didn’t know it had a specific name. What do others think of this?
In a new post, Scott Barber reminds testers that there may often be valid business reasons for making decisions that may run counter to the tester’s view of what it takes to build quality software:
Most testers I meet simply have not been exposed to the virtually impossible business challenges regularly facing development projects that often lead to decisions that appear completely counter to a commitment to quality when taken out of context. The fact is that there are a huge number of factors influencing a software development project that, at any particular point in the project, may rightly take precedence over an individual tester’s assessment of quality. Given their lack of exposure, it’s no wonder testers seem to habitually take a “my team doesn’t listen to me” point of view.
When I conduct job interviews with QA engineers, I often test the candidate’s awareness of these factors by asking this question: “Can you name a time when you just had to put your foot down with regard to quality? For example, declared that the software can’t ship due to quality concerns, etc.?”
It’s a little bit of a trick question. The answer that I hope to hear is: No; it’s not my job to make those decisions; it’s my job to provide risk assessment data to decision makers who do have to make these tough decisions. Secondarily, if I’m doing my job correctly throughout the dev cycle, there should not be any surprises of this type. If a situation is building that might result in such a confrontation, then I haven’t done my job in monitoring the situation, trying to solve it, or at the very least keeping management in the loop on the building crisis, so that they can make appropriate contingency plans. There’s nothing management likes less than getting into a crisis situation with no warning.
Down under, Dean Cornish has been having a hard time finding qualified QA engineers, and in his recent blog post, he ponders why that is.
In his post, Dean throws out a lot of possible reasons for this problem, but the end of the post gets to the heart of the matter for him:
Off the top of my head I cannot recall a single university in this country that talks about a career in testing as a career choice equally as viable as development, even though in the workplace, I’d argue, testers have a role just as important as developers’. This discrepancy contributes to our lack of growth in mature and capable candidates, leading us to see the same poor candidates going from shop to shop and always somehow getting through the front door.
It is as though testing has become the place for people who failed at being a dev, a systems analyst, or a business analyst, or for people who can pull a visa and need a field where the demand is so great that the quality of the screening is frequently waived to get “warm bodies” through the door.
Maybe the situation is different in Australia than in Austin, but I’m not sure I see the same dearth of qualified candidates. And as for Dean’s concern about testing not being seen as “an equally viable career choice . . . as development”, as far as I can tell, that’s always been the case. If anything, the situation might be better than it used to be as the software industry has matured.
I’d love to hear others’ thoughts and experiences.
Amr Elssamadisy, author of Agile Adoption Patterns, has published an ‘Agile Adoption Cheat Sheet’ on InformIT in which he outlines the steps to take in order to adopt agile.
One thing strikes me right away about this article: it makes no mention of the agile manifesto or the agile manifesto’s principles. Perhaps the author assumes that the reader already understands these basics, but I don’t really see how an organization can adopt agile without starting with the manifesto and principles. As I have mentioned before, you can follow all the steps in this guide but still not be agile unless the entire organization understands and buys into the values and principles of the manifesto.
Update: Amr Elssamadisy, the author of the article, replies in the comments.
I’ve spent a lot of time in the Borland booth the last few days, explaining our new BMS software to passersby. It’s actually been a lot of fun, as I’ve had the opportunity to talk with lots of people about our own agile experiences at Borland.
The most interesting conversation was with an agile coach from Sweden. If I understood him correctly, he believes that scrum has reached the point of being just another buzzword, and that many organizations are pursuing it just for buzzword-compliance, not understanding the fundamentals and value of scrum. I’m sure there’s a certain amount of that, but I don’t think it’s widespread enough to warrant his sharp opinion.
He did raise one interesting point, though: fork out a couple thousand dollars, attend two days of training, and you, too, can call yourself a certified scrum master. He believes that scrum masters should have much more training than that. He said that the agile coaches he’s working with have developed a year-long process for learning how to facilitate agile.
This guy’s thoughts on what it takes to be a qualified scrum master mesh with a conversation I had yesterday with some of my Borland coworkers. One of them was suggesting that scrum masters could benefit from facilitator training, since facilitation is an established discipline that’s very similar to the role of scrum master. I would love to get some facilitation training.
Unfortunately, I didn’t get to attend any conference sessions on Tuesday. Several Borland employees got hung up in Chicago due to bad weather, so I spent most of the day working the Borland booth. I plan to attend some sessions today.