Saturday, 31 August 2013

I need feedback NOW!

Problem: Feedback cycle time causes loss in productivity.

One of the great productivity killers is long feedback time – the longer it takes to get feedback, the more time is wasted either waiting or context switching.

Consider this example: Development has completed fixes to several bugs, but these are not tested until a week later. During this week the developer has written 100 lines of new code and is no longer in the context of the original work. When feedback arrives from test, a customer demo or another source, the developer has to rebuild that context from scratch. The same goes for testers writing test cases, business analysts writing specifications, and so on. For the purpose of this article we will focus on the feedback cycle between developers and testers.

Solution: Exploratory testing and reviews.

The agile projects we run make the need for feedback even more urgent, as a week of waiting will not fit into a 14-day sprint. To facilitate quick feedback we use exploratory testing and reviews, and this is how it applies in a sprint.

When the sprint starts, each Sprint Backlog Item (SBI) is reviewed with a focus on code and testability. This forces participants to think before acting, and flushes out a series of questions. These questions reduce the chance of misunderstanding or misinterpreting the requirements.

While developer profiles write code, test profiles in the team write tests. These tests focus on the acceptance criteria of the SBIs under development and are often no more than test charters. The charters can then be executed as soon as a build is deployed to a test environment.

Formalized test cases can be recorded using Microsoft Test Manager (MTM) during exploratory testing, making up-front test case writing a second priority when preparing a team for fast feedback. For details on how to record bugs and test cases, and on general usage of MTM for exploratory testing, please visit:

Before anything ships, all code is reviewed using StyleCop, and on top of that pair reviews among the developers are encouraged. Another thing I have not mentioned is unit testing, but (I hope) it goes without saying that unit tests are the fastest feedback a developer can get – a must-have!
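To illustrate just how short that feedback loop is, here is a minimal unit test sketch in Python; the `parse_amount` function and its rules are purely hypothetical, not from any real code base:

```python
def parse_amount(text):
    """Parse a user-entered amount like '1,250.50' into a float.
    (Hypothetical example function for illustration only.)"""
    return float(text.replace(",", ""))

# These run in seconds, giving the developer feedback long before
# check-in, let alone before a test cycle a week later.
def test_strips_thousand_separators():
    assert parse_amount("1,250.50") == 1250.50

def test_plain_number():
    assert parse_amount("42") == 42.0
```

Run under any test runner (or plain Python), the whole cycle of change, run, and result takes seconds rather than days.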

Something that might be worth trying to reduce feedback time further: the developer runs the test cases directly related to the acceptance criteria of the SBI he is working on BEFORE check-in. We have not tried this in practice, as it would require detailed test cases to be written up front.

Happy testing!


Friday, 30 August 2013

Give me your entry criteria - please!

Problem: Testing can be started when the project plan says so?

If you've ever been involved with testing in projects, you know that they tend to hit an invisible "time's up" wall – and that's exactly when testing will start. Not because the product is complete or the test lab is ready, but because estimates, plan and reality meet, and the test starts.

As an experienced test professional you know that this will have a number of consequences: a defect opening rate that will break records, testers that will be demotivated (especially if they come from the business), project managers that will demand a million answers (their minds are always set to "green"), etc.

So what to do?

Solution: As early as possible create and maintain a list of entry criteria for the given test phase!

I know this is the most trivial suggestion in the world. It's so basic that it screams. So rudimentary it cannot be missed in any project. But why is it that so many projects tend to skip this exercise?

Sometimes because the list will be ignored anyway. Some organisations tend to turn a blind eye to formal testing activities and are thus bound to fail. So whether you like it or not, the first thing an entry criteria list can be used for is as a benchmark: does test matter here? If not, then adjust your ambitions and test approach.

Anyway, preparing an entry criteria list is fun. A certain kind of fun, yes, but it is – or should aim to be – operational at a fairly low level. Not to the extent that all activities have to be specified in minutes and seconds, but "Test lab ready" is usually too broad to act on.

If you find yourself in a project that can only come up with high-level entry criteria, you have another challenge. This is a clear indication that the project is immature: no one understands the goals, and constraints and ownership are missing. That problem needs to be addressed, or you can have the best test plan in the world and nobody will be able to exercise it. But missing project goals have to be handled at project management level. Back to the test lab, then, and some work that can be done while the PM takes care of the project goals.

"Test lab" is instead a good example of a category that can be used for grouping related activities on the list. You know you need some physical space, desks, hardware, network etc. This goes into the list. Documentation might be another.

The fun part is especially true when you find somebody with an interest in this area. Again, end users or the business are usually your friends. They don't know what "entry criteria" are, but they can be woken up at 2 am and will be ready to state exactly what they expect. Usually this comes as a one-liner: "Everything works". This is where the final fun part starts, because as a test professional you have the opportunity to challenge the "everything works" expectation and understand what is really important and what is not important at all. This will shape the entry criteria, and it can be used as leverage towards project management and in the prioritisation of important defects.

"Must have" and "nice to have" categories as well as a "done" mark is a great way to communicate to all relevant stakeholders that there is a list of known and understood activities that have to take place before it makes sense to start testing. Otherwise it will be a bad trip in the roller coaster.

Entry criteria lists as dead documents are boring and useless, but as support for living discussions and negotiations they are priceless and tangible: "This is what you get – will it work out for you?"

There is more inspiration to be found in this document, which I came across while searching for the generic entry criteria checklist. Such a list does exist, but you need to adjust it to your project's reality and goals.

Tuesday, 27 August 2013

Are we done yet?

Problem: Agile teams often struggle with their definition of done – either because it does not exist, or because it is not used.

Traditional thinking in project management is "plan the work, work the plan", and that really does not fit the agile world. This means agile teams need a Definition of Done (DoD), as they do not have a detailed plan to work. For some time we struggled with our DoD, causing all kinds of grief for the test effort, as everyone had their own idea of what it meant to be done.

Solution: "Done" must be defined, communicated, and agreed upon by the team.

The experience we gained from working with this problem in our sprint retrospectives was as follows.

The DoD should be simple, measurable and written down somewhere for easy reference. Simplicity is advisable, as a very large DoD is unwieldy and often ends up being ignored, because maintaining it becomes a project in itself. It needs to be measurable, or it will be impossible to determine whether it is fulfilled. Keep it in writing and stick it on your board – that will remind the team of the 'contract' the DoD defines.

Communicating the DoD is easy: read it out loud at each sprint planning meeting and ask, "Will this do for our sprint?" If the answer is no, discuss why and make the necessary changes. Agreement is established through this exercise, as the team signs off on the DoD as part of the sprint planning discussion.


Some advice that might be useful when defining your Definition of Done:
  • Set quality bars that ensure that business value grows with each release
  • Use common sense
  • KISS your DoD
  • Use the retrospective to evolve DoD


Our DoD:
  • Sprint Backlog Items (SBIs) included in the delivery are "done" when:
    • The SBI is documented, prioritized and broken into tasks
    • All development tasks have associated changesets
    • Code has been reviewed using StyleCop and unit test coverage is at least 85%
    • All test tasks have associated test charters or test cases
    • Functional and non-functional requirements outlined in the acceptance criteria are covered by test
  • Severity 1 bugs are fixed; severity 2+ bugs have been reviewed
  • Regression test has been executed
  • Acceptance test has been executed (if delivery to the client takes place)
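Because each item is measurable, "done" for an SBI can even be checked mechanically. A minimal sketch in Python – the item names and numbers below are illustrative, not our actual tooling:

```python
COVERAGE_BAR = 85.0  # unit test coverage threshold from the DoD

def sbi_done(checks, coverage_percent):
    """An SBI counts as 'done' when every checklist item holds
    and unit test coverage meets the bar."""
    return all(checks.values()) and coverage_percent >= COVERAGE_BAR

# Illustrative checklist state for one SBI.
checks = {
    "documented_prioritized_and_broken_into_tasks": True,
    "dev_tasks_have_changesets": True,
    "code_reviewed_with_stylecop": True,
    "test_tasks_have_charters_or_cases": True,
    "acceptance_criteria_covered_by_test": True,
}
```

A single failing item, or coverage below the bar, means the SBI is not done – no partial credit, which is exactly what makes the DoD useful.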

Have a nice day & Happy testing!


Friday, 23 August 2013

Microsoft Certification 70-498 Delivering Continuous Value with Visual Studio 2012 Application Lifecycle Management

Problem: Passing Microsoft Certification 70-498 Delivering Continuous Value with Visual Studio 2012 Application Lifecycle Management
Working with Application Lifecycle Management (ALM) is something that has always interested me – it means the world for quality, job satisfaction and stakeholder happiness. Following my latest Microsoft certification I decided to explore the ALM subject a little more and came across this little promo video for Visual Studio from Microsoft:
This video explains why it is worth looking into ALM.

Solution: Study ALM as a concept and explore Visual Studio.  


Microsoft Learning really has some nice pointers to study materials – check out the links under "Preparation options".

This is how I prepared myself for the exam:
  • I had a look at Microsoft Learning and printed the study guide (the "skills measured" section).
  • I watched the Channel 9 videos – approximately 6 hours of video with great educational value, not to mention fun, as the presenters are really great!
  • I browsed for information on how Microsoft runs ALM in Visual Studio.
  • I worked with projects in Visual Studio – we use the web version, and knowing your way around in there should do the trick. I printed a lot of screenshots and filled them with notes, which helped me remember details of the Visual Studio tooling.

Tips that I found useful:
  • Having just taken certification 70-497 helped – some of the questions revolve around testing and QA, making it possible to reuse that knowledge. Otherwise, spend a little time on the testing / QA topics; Channel 9 has video material if needed.
  • Waste reduction is important to understand, especially faster feedback and cycle time. Tools like PowerPoint Storyboarding and the Feedback Client can help with this.
  • Understanding Scrum as a process is essential – I got that for free, being a certified Scrum Master, but if you are not familiar with the process, check this out:
What I learned from completing the certification:
  • Reading cumulative flow diagrams, decoding things like cycle time and lead time, and understanding the impact of WIP items.
  • How to get the most out of the abundant information recorded when using Visual Studio as a tool for application development.
  • The link between software development methodologies and how each process is supported in Visual Studio.
  • Tools for tweaking our projects – something I look forward to using in the future.
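The relationship between the quantities you read off a cumulative flow diagram is captured by Little's Law: average cycle time equals average WIP divided by average throughput. A small worked example – the numbers are made up:

```python
def average_cycle_time(avg_wip, throughput_per_day):
    """Little's Law: cycle time = WIP / throughput."""
    return avg_wip / throughput_per_day

# A team keeping 12 items in progress while finishing 3 per day
# takes, on average, 4 days to turn an item around...
print(average_cycle_time(12, 3))  # 4.0 days
# ...and halving WIP halves cycle time at the same throughput.
print(average_cycle_time(6, 3))   # 2.0 days
```

This is why limiting WIP shows up so directly on the diagram: a thinner in-progress band means a shorter horizontal distance between the arrival and departure curves.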
Have a nice day & Happy testing!

Saturday, 17 August 2013

KISS your test plans, but not goodbye

Problem: Long and boring test plan documents are never read, and hence end up not being used.
On many of the larger programmes I have managed, test plans have been huge documents built around the IEEE 829 standard – documents of 50+ pages, so detailed that most readers got bored to death and skipped reading the plan.

I really like the structure of the IEEE 829 test plan, but not the result, which is unwieldy and hard to use in practice. The template holds a lot of valuable sections that provide the test team with just the right information, but communicating the plan to other stakeholders without causing information overload is very hard.

Solution: Apply KISS principle to the version of the plan you use for communication.

I have had great success replicating the plan in a digestible version in PowerPoint – I usually use 14 slides, one for each section of the IEEE test plan:
  • Introduction
  • Test items
  • Features to be tested
  • Features not to be tested
  • Approach
  • Item pass/fail criteria
  • Suspension criteria and resumption requirements
  • Test deliverables
  • Testing tasks
  • Environmental needs
  • Responsibilities
  • Staffing and training needs
  • Schedule
  • Risks and contingencies

Each slide holds the very essence of what is in the test plan document – like a management summary. Some of the slides hold items that certain stakeholders need to commit to, and these are the ones you need to pay very close attention to.
Test items, Features to be tested & Features not to be tested – this is your delivery contract with the project. Make sure the Project Manager and the client understand it, especially what the untested items are and why they are not being tested.
Item pass/fail criteria & Suspension criteria and resumption requirements – these are the consequences of poor deliveries, and they require buy-in from the Steering Group and Project Manager. When the fighting gets tough and time is running out, this will be challenged, and having had the discussion while people were not under pressure makes the arguments more constructive.

Test deliverables & Testing tasks – focus on the dependencies on other project deliveries. The critical path of the test effort is very important in mitigating risk and gives you a better chance of success later on.

Responsibilities, Staffing and training needs & Schedule – most important of all: who will show up, doing what, and when. Test often borrows, steals and fights for resources; the stealing and fighting in particular can be avoided with clear agreements with the other stakeholders.
Risks and contingencies – I like to call this the CMA (Cover My Ass) clause of the plan. This is where you can rant about judgement day, but it really does not matter much. What matters is having clear, formalized procedures for risk management, so spend the time getting those in place instead of listing every risk you can possibly think of.

Most important of all – remember that your plan, like reality, changes. These changes must be reflected in the plan, so keep it up to date and renegotiate agreements made obsolete by the changes. If you do not, I can assure you that you will have to compensate for the lack of planning with hard work.

Have a nice day & Happy testing!


Friday, 16 August 2013

Too many combinations to test...

Problem: Too many parameters to test all combinations thoroughly!

Ever been in a project that had to develop and test something that involved a large number of input variables as part of a business process?

Then you also know that testing all combinations is impossible, and if you are the test analyst or test manager dealing with it, that will give you sleepless nights.

Solution: Apply pairwise (all-pairs) testing as part of your test preparation effort!

Apart from the fact that the math behind it is a bit tricky to understand, the good news is that this actually works – in practice. A lot of free tools exist to support the process of figuring out which combinations to test in order to get high coverage.

There is an excellent site that holds an updated list of tools. My favourites include:

  • PICT from Microsoft
  • CTE-XL from Berner & Mattner
PICT, because it is command-line based, simple to download, and takes its input from a text file. It can hardly get any simpler, though the command line will be less appealing to some users. The learning curve is something like 10-15 minutes before you have your first result.

CTE-XL, because it has a GUI and the potential to create test case input that can be exported to Quality Center. It takes more time to set up and more time to understand the UI and usage, but it has great potential for graphical overviews of complex dependencies.

PICT also has a "GUI" now, because somebody built an Excel sheet that interfaces with the tool: PICTMaster, which can be downloaded from SourceForge. Well documented, by the way!

Pairwise is good, but it does not replace the business analyst who has the experience and knowledge to pinpoint important business test scenarios. You will, however, get good input for estimation and, in some cases, for pinpointing unclear or missing requirements. More on that in a later post.
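To get a feel for what the tools do, here is a small greedy all-pairs sketch in Python. It enumerates all full combinations, so it only scales to toy models – PICT and friends are far smarter – but it shows the core idea of covering every pair of parameter values at least once. The example parameters are made up:

```python
from itertools import combinations, product

def allpairs(parameters):
    """Greedy sketch: repeatedly pick the combination that covers the
    most not-yet-covered value pairs, until every pair is covered."""
    names = list(parameters)

    def pairs_of(test):
        # The parameter-value pairs exercised by one test.
        return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

    # All full combinations (exponential - fine for a toy model only).
    candidates = [dict(zip(names, vals))
                  for vals in product(*(parameters[n] for n in names))]
    uncovered = set().union(*(pairs_of(t) for t in candidates))

    tests = []
    while uncovered:
        best = max(candidates, key=lambda t: len(pairs_of(t) & uncovered))
        tests.append(best)
        uncovered -= pairs_of(best)
    return tests

model = {"browser": ["Chrome", "IE"],
         "os": ["Win7", "Win8"],
         "db": ["SQL Server", "Oracle"]}
suite = allpairs(model)
# 2 * 2 * 2 = 8 exhaustive combinations, but far fewer tests are
# needed to cover every pair of values at least once.
```

Even in this tiny model the suite shrinks below the exhaustive eight tests; with ten parameters of four values each, the exhaustive count is over a million while pairwise suites stay in the dozens – which is exactly why the tools are worth the 10-15 minutes.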


Tuesday, 6 August 2013

Testing usability

Problem: Usability cannot be tested with a binary result!

Ever raised a bug entitled "Rude and weird error message, if user …" or "System displays illogical behaviour, when …"? I have, and the case often sparks debate in the development team or with the product owner / customer. Why? Because usability is not measured as binary, true or false, like functional requirements. One could argue "if I understand it, then anyone will", while another argument could be "you are not the target audience for the application, hence this is not a problem". Guess what? Usability is very much my problem, as it will make or break the application.

Solution: Apply Heuristic evaluation as part of your test.

We frequently do usability studies of our applications, some of which are released to Mr & Mrs Denmark, meaning that the average citizen will be subject to any usability deadlocks we introduce. This is how I usually approach usability:

I prepare a test charter and run an exploratory test session. The aim of the session is to execute one of the user stories, with the heuristics as the driver for evaluation. There is not much rocket science in the approach, but it gives a nice structure and a well-defined method that ensures good coverage in your usability test. When I have run a session for every user story applicable to the usability study, I am done and can use the results for a nice usability report – or just a bucketload of bugs for someone to fix.
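As a sketch of how session findings can be recorded, here is an illustrative Python snippet; the heuristic names follow Nielsen's well-known set, and the findings themselves are made up:

```python
# Findings from a heuristic evaluation session, rated on the common
# 0 (not a problem) to 4 (usability catastrophe) severity scale.
findings = [
    {"heuristic": "Error prevention", "severity": 2,
     "note": "Date field accepts 31 February"},
    {"heuristic": "Visibility of system status", "severity": 3,
     "note": "No progress indicator while the form submits"},
]

def worst_first(findings):
    """Sort findings so the report leads with the worst problems."""
    return sorted(findings, key=lambda f: f["severity"], reverse=True)
```

Sorting by severity is what turns a pile of non-binary observations into something the product owner can prioritise, which defuses the "is this even a bug?" debate.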

If you want to go all in, look for templates using Dr. Google. There are many to pick from, but this one is nice:

Have a nice day & Happy usability testing!