Today, an Amerigas employee who is involved with the rollout of their self-service propane bottle exchange service contacted me regarding my poor experience with the service at my local Home Depot.
He asked me some specific questions about my experience in order to try to troubleshoot the problem.
That’s just so cool.
Author: Stan
Problem-solvers vs. task-doers
James Bach posted the following comment to my recent post about context-driven testing:
[Context-driven testing] is not self-evident, except to problem-solvers. To “task-doers” it is self-evident that they should memorize practices and ignore context, because they will say that context doesn’t really change or matter much.
I’ve been mulling that thought over for the last few days. In my jobs as QA lead/manager/architect, I’ve designed and deployed a lot of QA processes over the years: defect tracking, automated testing, test management, etc.
I would rate my success in this work as only so-so: many of the processes I’ve developed were not followed very well, not by very many people or not for very long. This apparent lack of success has troubled me.
But if you view my process-development work in the context of problem-solvers vs. task-doers, maybe those aren’t the right judgment criteria.
I’ve almost always worked in environments that valued problem-solvers (like me) over task-doers (or process-followers), and when I implemented those processes, my coworkers and I were clear that we would implement as much process as necessary and no more. Nothing bugs problem-solvers like me more than processes followed for their own sake, especially processes that don’t seem to serve any valuable purpose.
So, within a group of problem-solvers, processes arise, evolve and die based on need; processes don’t tend to live on if they don’t have clear value. In such an environment, then, the appropriate criteria for judging the success of a process would be: did the process serve its intended purpose? And based on the general success of the groups that used the processes that I implemented, I would say my work was fairly successful: the processes that I developed and implemented usually served the need at hand, and then evolved or died based on changing needs.
Self-service software fail
Over the holidays, I noticed that the Home Depot near our house had installed a self-service propane bottle exchange system. It consists of a computer kiosk and a rack of cages for propane bottles.
I watched the demo:
Step #1: You swipe your credit card and make the purchase.
Step #2: The door to an empty cage pops open.
Step #3: You insert your empty bottle in the cage and close the door.
Step #4: The door to a cage containing a full bottle pops open.
Step #5: You take your full bottle and go on your merry way.
As soon as I saw it, I knew I had to try it. One of the big advantages is that you can replace your propane bottle even when the Home Depot store is closed. As a QA engineer, however, I knew better than to try it after hours.
So, last Sunday afternoon, I took my empty propane bottle over to the store and tried it out. It worked great until the end of step #3. After I deposited my empty bottle and closed the cage door, the kiosk screen thanked me for my transaction and did not continue with steps #4 and #5. No error messages. Nothing. I was left standing in the parking lot with a $19.97 debit and no propane.
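The bug I hit is a classic missing-compensation problem: the payment in step #1 is committed immediately, but nothing undoes it if the dispense in step #4 never happens. Here’s a purely illustrative sketch (I have no knowledge of the kiosk’s actual software; the function names and the refund step are my inventions) contrasting the happy-path-only flow with one that has a recovery path:

```python
class KioskError(Exception):
    """Raised when a cage door fails to open."""

def exchange_naive(charge, open_empty_cage, open_full_cage):
    # Happy-path-only flow: if open_full_cage() fails after the charge,
    # the customer has paid and has no propane -- exactly what happened to me.
    charge(19.97)        # step 1: payment committed immediately
    open_empty_cage()    # steps 2-3: accept the empty bottle
    open_full_cage()     # steps 4-5: may fail with no recovery

def exchange_with_refund(charge, refund, open_empty_cage, open_full_cage):
    # Compensating-transaction flow: undo the charge if dispensing fails.
    charge(19.97)
    try:
        open_empty_cage()
        open_full_cage()
    except KioskError:
        refund(19.97)    # at least give the customer their money back
        raise
```

The point isn’t the specific design; it’s that a multi-step physical transaction needs a defined recovery path for every step that can fail after the money changes hands.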
I walked up to the service desk in the store and explained what had happened. The service desk employee told me that it had been doing that. He added, in a matter-of-fact tone, “It’s computerized, you know,” as if that explained everything. He grabbed his keys, took me out to a different rack of cages with padlocks on them, gave me a full bottle and said, “Good thing we kept these around.”
As a software developer, I’m proud to see that the products of my industry inspire such great confidence and excitement in people.
Context-based testing
Cem Kaner, James Bach and Bret Pettichord have been developing a concept that they’re calling context-driven testing. Here’s one version of the emerging definition:
Context-driven testers choose their testing objectives, techniques, and deliverables (including test documentation) by looking first to the details of the specific situation, including the desires of the stakeholders who commissioned the testing. The essence of context-driven testing is project-appropriate application of skill and judgment. The Context-Driven School of testing places this approach to testing within a humanistic social and ethical framework.
Ultimately, context-driven testing is about doing the best we can with what we get. Rather than trying to apply “best practices,” we accept that very different practices (even different definitions of common testing terms) will work best under different circumstances.
It’s good that Cem, Bret and James are trying to define this concept, but I had thought that context-based testing, like risk-based testing, was self-evident. It’s good to have principles to follow, but I would never dream of following a prescribed process without evaluating its appropriateness to the situation and adapting it accordingly.
One example where context-based testing is necessary is automated testing. I absolutely believe that as much testing as possible should be automated, but developing automated tests takes time and effort, and there are competing demands on those resources. With functional automation, this often results in a phased automation strategy: start small, then expand based on priorities over time.
Who to hire
A colleague mentioned to me the other day that he prefers to interview people who are currently unemployed. When I asked him why, he explained that they’re more eager to negotiate and more likely to accept an offer than someone who is currently employed.
You can most certainly fill an open position faster by interviewing unemployed candidates, but you’re less likely to hire the best candidate for the job. A candidate who is already bringing home a paycheck generally has a greater ability to consider the suitability of the job than a candidate who needs that paycheck.
In general, I think that hiring the best person is probably of greater importance to the company than filling an opening quickly.
What do you think?
Ends and means
James Shore, author of The Art of Agile Development, has a new blog post, The Decline and Fall of Agile, in which he declares that “the agile movement has been in decline for several years now,” and by “agile” he specifically means “scrum.” He writes:
There are a lot of teams right now failing with Agile. These teams are working in short cycles. The increased planning frequency has given them more control over their work and they’re discovering and fixing some problems. They feel good, and they really are seeing more success than they were before.
But they aren’t working in shared workspaces or emphasizing high-bandwidth communication. They don’t have on-site customers or work in cross-functional teams. They don’t even finish all of their stories by the end of each Sprint, let alone deliver releasable software, and they certainly don’t use good engineering practices.
These teams say they’re Agile, but they’re just planning (and replanning) frequently. Short cycles and the ability to re-plan are the benefit that Agile gives you. It’s the reward, not the method. These pseudo-Agile teams are having dessert every night and skipping their vegetables. By leaving out all the other stuff, the stuff that’s really Agile, they’re setting themselves up for rotten teeth, an oversized waistline, and ultimate failure. They feel good now, but it won’t last.
As I’ve said before, the thing that sets agile apart is that it is, at heart, a set of values and principles, not just a set of practices. Methods such as scrum or XP provide practices that have been demonstrated to support the agile principles. But unless you understand why you’re undertaking the practices, you’re setting yourself up for problems.
As agile gains wider acceptance and as organizations start to drive agile adoption from the top down, it’s inevitable that some groups will mistake the practices of scrum for a recipe for success. All we can do is continue to focus our education on the agile values and principles and ask people, “Why are you doing that?”
Survey says: 40% of CIOs are clueless
A new survey of CIOs found that “40% of CIO’s reported a general indifference towards the quality of the software they produce.” Also see the blog post about the survey with some good comments. Interesting. Not surprising, but interesting.
The ten commandments of agile
Blogger Kishore Kumar, who apparently has no direct experience of agile, recently looked at the agile manifesto and its related principles of agile and concluded that agile is ‘the new religion’ and that the agile principles constitute its twelve commandments.
Over at the Borland Agile Transformation blog, I wrote three blog posts in response to Mr. Kumar’s thoughts on the first ‘commandment’ (see here, here and here).
I thought I would move back to my blog for my thoughts on the second ‘commandment’: Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
Mr. Kumar lists three possible reasons why requirements change late in the process:
- The system analysts did a bad job of requirements elicitation,
- The business guys do not know their business or do not bother to explain things to the IT guys, or
- The business need itself changes frequently (i.e. competitive landscape changes frequently in an unpredictable fashion)
Mr. Kumar comments that while he has seen the first two happen, he has never encountered an example of the third, and spends the rest of his post explaining why.
But let’s look at the first two sources of changing requirements. Based on his wording, I have to assume that Mr. Kumar considers these to be problems with the process. Why should one adopt a methodology to deal with the results of these problems?
It is indeed a problem if, for instance, the business-oriented people don’t adequately communicate customer needs up front. But a conventional waterfall-type project doesn’t have any good mechanism to deal with such screw-ups. If such a SNAFU happens, then your project is delayed while you re-plan and re-work, or you take some other sort of shortcut in order to meet the schedule. As long as each step of the process goes according to plan, you’re good, but as soon as any of them encounters unexpected problems, you’re screwed.
But agile assumes that imperfection, not perfection, is the normal state of affairs. That’s the beauty of agile.
New blog
I’ve been asked to participate in a new Borland company group blog focusing on agile and other related issues. Check it out.
Agile Testing, the book
Agile Testing: A Practical Guide for Testers and Agile Teams, the much-anticipated book by agile testing gurus Lisa Crispin and Janet Gregory, is scheduled for publication in December, but apparently it’s already available as an ebook via Safari Books Online. Call me old-fashioned, but I think I’ll wait for the paper publication.