250 hours of practice – January

As mentioned in my post a couple of weeks ago, this year I would try to spend 250 hours on practicing and enhancing my testing skills. This post is a report on how I fared in January 2012. (Leaving my personal favourite until the end…)

I started enthusiastically on January 2nd by following up on a post about the “Follow the link exercise” by Jeff Lucas. In short, the exercise is to choose a blog post of your liking, read it critically and then follow every link mentioned in the post. You then pursue this with every post you read during a one-hour session.

In my session, which actually lasted two hours, I followed up on (among others) a link to Alan Page’s blog “Tooth of the Weasel”. The post contained an overview of posts Alan wrote in 2011, so there were enough links in there to follow up on:
My job as a Tester
What is Testing?
Test Design for Automation
Numberz Challenge
Beyond Regression Tests
Judgment in Testing
Lost in the weeds

Although I had heard of Alan Page, I was not yet familiar with his work. It pleasantly surprised me with some useful ideas and even some advice for my personal goals for this year. Let me give you some quotes I found interesting:

“What you do or don’t define as testing may differ per context.”

Automated testing “starts the same as always. Design your test first then automate where eligible. Coded tests do not replace, but enhance human tests.”

“Do not only use automated testing for regression. Vary the data, the sequence, randomize, to find new information” – data-driven testing.
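That last idea can be illustrated with a small sketch. The function under test and its value ranges below are made up purely for illustration; the point is that the data and the execution order vary on every run, instead of replaying one fixed regression script:

```python
import random

def apply_discount(price, percent):
    """Toy function under test (hypothetical): price after a percentage
    discount, clamped so it never drops below zero."""
    return max(0.0, price * (1 - percent / 100))

def data_driven_check(cases):
    """Run the same check over many (price, percent) pairs in random order."""
    random.shuffle(cases)  # randomize the sequence, not just the data
    for price, percent in cases:
        result = apply_discount(price, percent)
        # Assert properties instead of one memorized regression outcome.
        assert 0.0 <= result <= price, (price, percent, result)

# Fixed, human-designed cases plus freshly generated ones.
fixed_cases = [(100.0, 10), (19.99, 0), (5.0, 100), (0.0, 50)]
random_cases = [(random.uniform(0, 1000), random.uniform(0, 100))
                for _ in range(100)]
data_driven_check(fixed_cases + random_cases)
print("all varied cases passed")
```

Because the checks are properties (the result stays between zero and the original price) rather than hard-coded expected values, the same coded test keeps producing new information each run instead of only confirming yesterday’s behaviour.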

“Are testers second-class citizens? NO. Are they whiners? Yes. Figure out how to get and earn respect!”

My second (larger) series of practice sessions started with watching the 2011 GTAC keynote by Alberto Savoia with the ominous title “Test is dead”. You can read more about this in the blog post I wrote, “Is testing dead?”

My third endeavour entailed reading the hardcopy of the book “Essential Software Test Design” by Torbjörn Ryber. The e-book is free to download, but I liked the content enough to want to own a physical copy. A warning is in order, however: even the hardcopy contains a somewhat annoying number of typos, illogical sentences and even errors. Nevertheless, the concepts Ryber discusses are helpful for many a tester.

Early in January the DEWTs met up again, this time to discuss and prepare the TestNet event about context-driven testing. On January 18 some 150 testers visited the event to watch James M. Bach and Michael Bolton give a one-hour introduction to context-driven testing via GoToMeeting (which, by the way, worked brilliantly). After the break the DEWTs Zeger van Hese, Ruud Cox, Ray Oei and I gave a number of lightning talks followed by Q&A. Themes of the talks were “On being context-driven”, “Spin-Off”, “Context-Driven expert” and “Test Plan”.

All in all these activities got me some 20 hours of practice, putting me well en route to the 250 hours of testing practice. But to be honest, I am even more of a test nerd than that: I spent another 10–15 hours following Twitter feeds, with a peak while participating in a #Testchat led by Lisa Crispin, who asked the following questions:
Q1: Have you worked on a “test automation project” that succeeded? What helped it succeed?
Q2: What do you think upper management should know about testing? (not limited to automation)
Q3: related some to Q2: How do you keep your testing transparent to others on your team and in the organization?
Q4: Are testers on your team treated with the same respect as programmers?
Q5: Sometimes the tester is undone by the process: outdated documentation makes it look like a lack of knowledge.

The last practice activity however was, for me personally, the most engaging, emotional and gratifying one.

In December I contacted Markus Gärtner to ask him for a challenge, to see if I was worthy enough to enter the realms of the Miagi-Do school of software testing. This was actually the first step of the challenge: finding a member. Markus offered me the “light saber” challenge. Several times during the challenge I sent Markus my test investigation results, and as many times Markus answered. I used several heuristic approaches, tried to inform the customer based on his needs and eventually offered a solution using personas. Somewhat to my despair, Markus’s answers were getting shorter and more repetitive, so I asked Markus to debrief me.

We organized a one-hour Skype session and went over the challenge, discussing results, progress and feelings along the way. Eventually we came to the point where Markus would reveal whether I was allowed to enter Miagi-Do. The result left me stunned, silent and humbled for a moment… Not only was I a new member, I was one of the members to become a Black Belt, fully endorsed by the other instructors.

I can only say it again: thanks guys, I am honoured.


What do you put in your test plan?

Theme event

This evening, January 18, I will be on stage during the TestNet theme event about Context-Driven Testing. The evening will start with a duo presentation by James Bach and Michael Bolton, followed by a series of short presentations, lightning talks and discussion. In one of those lightning talks I will share a personal experience report describing how I changed the way I make test plans. Since most of you will not be able to be there (or may never have heard of TestNet*), I am sharing my experience with you in this post.


Even if a lightning talk lasts only five minutes, it still requires some preparation. So, to prepare a little extra, I put my experience into perspective and posted the following message on Twitter:

” @Arborosa: Question to my tweeps: What items do you put in a test plan?  I’ll put the results on my Blog. (please retweet)”

This resulted in the following responses:

Rob van Steenbergen (@rvansteenbergen)
Scope of testproject, context of product (with mindmap), product risks and qlty attributes and risk approach, planning, who tests, stakeholders, testing tools, explanations abt testing for orgs that are still learning. TP is also promotion material for testing.

Stephan Kämper (@S_2K)
Well, what to put in a plan? A (current) goal of what you’re planning. The major way you’ll follow to reach said goal. A ‘Plan B’. (Known) risks – What’s the risk of following the plan? …the risk of *not* following it? Tools & Techniques? Not sure about these.

Nitin Hatekar (@nhatekar)
Entry and exit criteria for each test phase and specific test approach for each phase. Scope of testing and the estimates for completion of in-scope test efforts. A section for assumptions, risks & blockers as well.

Rik Marselis (@rikmarselis)
For your testplans take IEEE829 (1998) as a starting point. And see tmap.net for templates 😉 (And after a reprimand by Huib Schoots to be more serious) Don’t start with making the testplan. First make the outline of the testreport. That’s your deliverable! The testreport outline must be discussed with stakeholders. Then you have startingpoint for your testplan.

Jesper L. Ottosen (@jlottoosen)
Generally answers and descriptions to “how” – to the level required of the context. ie #itdepends 😉

Jan Jaap Cannegieter (@jjcannegieter)
Write in your testplan the info your stakeholders need. So ask your stakeholders what kind of info they need. Write it for them!

Generally I see two trains of thought here. On one side there is the idea of having more or less fixed items in a test plan: things like scope, approach and (product) risks. On the other side, the idea of not starting with fixed items or a template, but asking the stakeholders what information they need in the test plan. As you will see, this roughly follows the change I made.
From old to new

Historically, my organization has approached software testing by following a standardized test approach based on TMap. Similarly, test planning is, or rather was, based on an extended TMap-style “Master Test Plan” template. The raw template itself counts 24 pages when empty, though that includes some examples and explanations. The idea is to fully fill in all items in the template (see the list below) and get it signed off by the principal stakeholders.
In short, the template was as follows (chapters, paragraphs and sub-paragraphs, originally laid out in two columns):

Colophon
Management summary
Goals
Preconditions
Budget and milestones
Assignment
Introduction
Assignment
Client
Assignee
Scope of the assignment
Test basis
Objective
Preconditions
Starting points
Release from assignment
Test strategy
Product Risk Analysis
Test goals
Component per characteristic
Test goal vs component matrix
Strategy
Test levels
Entry – exit criteria
Test objects
Scope
Dependencies
Project risks
Communication & Procedures
Reporting
Meetings
Procedures
Test product / Deliverables
Project documentation
Testware
Test Infrastructure
Workplace
Test environment
Budget, planning & organization
Budget
Planning
Team composition

I have to admit that all items are, in themselves, in some way relevant to testing software. But one can argue about the usefulness of some of them, and even more so about having them all together in one document.

The latter is best illustrated by a remark my mentor made when, after three months as a professional tester, I was writing my first Master Test Plan. He said: “Don’t waste time. Take one of my plans. Ctrl-H the project name, change the stakeholders, check whether it mentions any specifics not relevant to your project and change those. Everything else you can leave the same.” So, even though I resisted the idea, like my colleagues I learned to do the drill: fill in the template copy-and-paste style. Only occasionally did a stakeholder question what was in it and distract me from actual testing.

Some five years ago I changed departments and found myself in a place where I was not only free to use just those elements I felt were useful, but could also start changing the template, and the use of it, entirely. But there was resistance from both the testers and the stakeholders. The testers, I think, because some of them now had to think and communicate more; the stakeholders because this broke with the standard process and they too would have to get involved and think more. To break the deadlock I started with an experiment. I filled in the template not only completely but to the letter of the “law”. I ended up with a 36-page document, which I immediately sent out to all stakeholders with an invitation to meet the next week, asking them to check it thoroughly in the meantime and be ready to sign it off during the meeting.

At the meeting the stakeholders sat silently, sighing at the thought of having to go through all 36 pages. I didn’t put them through that. Instead I asked how many of them had read the document: 6 out of 8, which actually impressed me. I then asked how many of them had reviewed it: still 4 out of those 6. I then asked who had found the document a pleasure to read, who fully understood its content and who thought it was of value to the project. As I had hoped, all the attendees broke out commenting on the document: its length, its irrelevance, the difficulty of the content, and so on…

I then decided to pull the rabbit out of the hat and said: “I agree with you all. I too think it’s basically of no use, and there is no point in reviewing it. But we still need a test plan. So why don’t you tell me what you actually do want to know about testing your product?”

We spent an hour or so discussing what they wanted to know about testing. We agreed that, since we are a financial institution, we still have to follow certain rules, regulations and guidelines, and that I would deliver a new document that same week.

I ended up writing a document that was still 24 pages long. But now it not only adhered to the documentation standards; of those 24 pages, 11 were purely related to testing as a way to mitigate risks and provide information about the product for this project, and another 4 covered testing and test heuristics in general. The original document had no explanation at all and only 8 pages related to actual testing.


Approach writing a test plan as you would approach any test activity. Figure out what information your stakeholders need and whether there are other things to consider, such as rules, regulations or standards. Use your personal experience and any other references you find useful, then write a plan that suits your model of the context, and verify and confirm it with your stakeholders.