Wednesday, 26 November 2014

Efficient reporting

Problem: Inefficient reports are not read, hence wasting effort in the development organization.
It has become evident to me that reporting, and especially efficient reporting, is becoming paramount in software development projects. Complexity is higher than ever and deadlines are tight, leaving little room in the organization for reading, understanding and reflecting on huge reports.
Solution: Write reports that deliver information refined for the receiver.
An effective report presents and analyses facts and evidence that are relevant to the specific problem or issue of the report, and does so briefly and precisely. All reports need to be clear, concise and well structured. The key to writing an effective report is to allocate time for planning and preparation.
I suggest applying the following steps:
Understand the purpose of the report and the recipient group - Consider who the report is for and why it is being written.
Gather information for the report – Your information may come from a variety of sources, so make sure that you know them. You also need to consider how much information you will need, depending on how much detail is required in the report. Keep referring to your report's purpose and recipient group to help decide the level of information.
Organize the materials – Organize your content so it makes logical sense; for a test summary report that could be test preparation, execution and finally test results.
Write the report – Start by drafting the report; take time to consider and make notes on the points you will make using the facts and evidence you have gathered. What conclusions can be drawn from the material?
Review your work – It goes without saying that a review is needed; like any other work product it pays off to read it twice and weed out the spelling mistakes and contradictions that entered the report while writing it.
Present it to the world – A report that does not reach an audience loses its meaning. Make sure that you communicate the report to the stakeholders, and make sure that you make it easily available.
Other tips:
·         Be careful with reporting templates; their generic nature often includes all kinds of information that might not serve you. Make sure that you critically review the template and remove anything that does not support your report.
·         Your management summary must be short and to the point – Often this is the only thing read by the recipient. The more manager-friendly your report is, the bigger your chance of creating awareness in the organization. If distributing the report by mail, consider including the management summary in the mail body text.
·         Be very clear about conclusions and recommendations – I often include these in the management summary as bullet lists, to ensure that they are communicated unambiguously to the reader.
·         Avoid Cover My Ass (CMA) clauses all over the report – CMA clauses blur your message and hurt your credibility with the reader.
·         Remember that testing is all about dealing with information about the delivery, meaning that the reports and the communication of them are paramount for documenting the results of your hard work.
Happy Reporting!

Monday, 24 November 2014

Happy reading

If you fancy reading a bit about the current state of affairs within testing, take a look at this excellent report. Yes, it is advertising, but it might underline some of the issues that you are dealing with in your own organisation.

"Testing" is after all "testing", and we can always use some extra voices to state our case. Whether it's more money, a broader scope for testing (and more money), the adoption of new techniques and tools (and more money) or something completely different, a little inspiration is always useful.

So click the link and fill in the form, and you have access to 30 minutes of written entertainment.

There are no major testing or QA revolutions mentioned in the report, and even though it might not apply to your organisation it is still interesting to compare your own expectations to the forecasts in the report. After all, there might be a thing or two your own organisation has overlooked.

Happy reading

Thursday, 20 November 2014

Strange error messages

I got the following error the other day, while working in a web-based tool.
It seems that the developers of this application did not foresee my actions and my 'inappropriate' use of the tool, and hence did nothing to help me understand what went wrong when presenting the error message. From my perspective the message might as well have been – Website exploded! Have a nice day!
This experience made me think of an excellent blog post I read a while back on writing meaningful error messages for the user. It was written by Ben Rowe, who advocates the use of the 4H's when communicating with the user: Human, Helpful, Humorous & Humble – Check it out, it is very informative:
Next time you encounter a message like the one I saw, include a link to the 4H's blog post in the defect report you file, to help get understandable error messages to the users.
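To make the point concrete, here is a minimal sketch of catching a raw failure and replacing it with a message in the spirit of the 4H's. The function, field and messages are my own invention, not taken from any particular tool:

```python
def save_profile(data: dict) -> str:
    """Save a user profile; validation and messages are illustrative."""
    try:
        data["email"]  # raises KeyError when the field is missing
        return "Profile saved."
    except KeyError:
        # Unhelpful version: "KeyError: 'email'" -- the "Website exploded!" feeling.
        # 4H-style version: human language, helpful next step, humble tone.
        return ("Sorry, we couldn't save your profile because the email "
                "field is missing. Please add an email address and try again.")

print(save_profile({"name": "Ada"}))
```

The point is not the specific wording but that the failure path gets the same design attention as the happy path.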
Happy Testing!

Monday, 17 November 2014

Combination testing strategies

Problem: Complete testing is impossible.
I had an interesting discussion about the test of a critical feature. The discussion was on test coverage and completeness in the planned test. The argument was that this feature was of a criticality that required "complete testing".
Solution: Acknowledge that complete testing is not possible and apply combination testing strategies.
There are enormous numbers of possible tests. To test everything, you would have to:
·         Test every possible input to every variable.
·         Test every possible combination of inputs to every combination of variables.
·         Test every possible sequence through the program.
·         Test every hardware / software configuration, including configurations of servers not under your control.
·         Test every way in which the user might try to use the program.
No test effort satisfies all of the above. How do we balance them? It is a matter of balancing effort and tradeoffs against risk, so that you get the highest coverage where it matters most. I came across a nice slide set that I need to share, as it details different combination testing strategies:
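One of the best-known combination strategies is pairwise (2-wise) testing: instead of running every combination of parameter values, you cover every pair of values from any two parameters at least once. A minimal sketch, with made-up parameters and a simple greedy selection (real pairwise tools use more sophisticated algorithms):

```python
from itertools import combinations, product

# Hypothetical test parameters for a web application (purely illustrative).
params = {
    "browser": ["Chrome", "Firefox", "IE"],
    "os": ["Windows", "macOS", "Linux"],
    "language": ["en", "da", "de"],
}

# Exhaustive testing: every combination of every value.
full = list(product(*params.values()))
print(len(full))  # 3 * 3 * 3 = 27 test cases

names = list(params)

def pairs_of(case):
    """All (parameter, value) pairs covered by one test case."""
    return {((names[i], case[i]), (names[j], case[j]))
            for i, j in combinations(range(len(case)), 2)}

# Greedy pairwise selection: repeatedly pick the case that covers
# the most still-uncovered pairs.
uncovered = set().union(*(pairs_of(c) for c in full))
suite = []
while uncovered:
    best = max(full, key=lambda c: len(pairs_of(c) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(suite))  # far fewer cases, yet every pair is covered
```

Even in this tiny example the pairwise suite is a fraction of the full 27 combinations, and the gap grows dramatically as parameters and values are added.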
Happy testing!

Thursday, 13 November 2014

What is your mission?

Problem: Expectations of the test team are not aligned with the test team's mission.
Do you know what your team's mission is? Has it been aligned with the expectations of the project and the customer? It is not unlikely that your answer is no and no, suggesting that there is a misalignment between your team's goals and the expectations of the project, or maybe even the customer's expectations.
Solution: Make and communicate your team’s mission statement.
The first step is to write a mission statement for the test team in the project. Do not confuse this with the mission statement that might exist for the testing services or QA department in general; this one needs to be for the project test team. There are numerous good guides to writing a mission statement – Google will help you with that.
Here are some bullets for inspiration when looking for your mission statement:
·         Find defects
·         Block premature product releases
·         Help customer make go/no-go decisions
·         Minimize technical support costs
·         Assess conformance to specification
·         Conform to regulations
·         Minimize safety-related risk
·         Find safe scenarios for use of the product
·         Assess and/or assure quality
·         Verify correctness of the product
Then it is time to start communicating it. Start with the stakeholders closest to you, project management etc., and ensure that you at some point show it to the customer and have a nice discussion on expectations for the outcome of the work being done by the test team. The mission statement is the means for a constructive dialogue on expectations, and once you have come to an agreement with your peers it will give your team a clear purpose.
Print the mission statement and stick it on the wall – Make sure that it catches the eye of those passing by, as it will lure them over for a chat about expectations and purpose. This will ensure that you are reminded about your purpose and get a chance to reflect on the mission statement every now and then – Like plans, your mission statement might need revision in case expectations, stakeholders or deliveries change.
Happy testing!

Monday, 10 November 2014

The curse of the notification e-mail...

I still remember one of the selling points from the Mercury sales pitch for TestDirector back in the day: "The system is so clever! It will send you a notification email with all the information you need every time you get something assigned to you!" Since then I have received thousands of notification emails from all kinds of ALM systems – Not all of them were especially clever or informative.

The notification email is a very popular feature, widely implemented in various ALM and defect/issue management solutions. For those not very involved in the project they are a joy, as you do not need to visit the ALM tool unless you get a mail. For those in key positions in the workflow they are a pain, as you are already in the tool daily and can find this information via your dashboard or queries in the tool. For those not understanding what they are used for, the mails become the curse that threatens the use of the ALM solution.

It is the administrator of the tool who carries the key to unleashing the curse, as (s)he can set the default notification rules in a way that does not support the users. If (s)he at the same time adopts a strategy where notification rules are global and not tied to user roles, then BAD things will happen.

The emails are often not understood, contain lots of irrelevant data, or are out of context, making them the equivalent of spam. This wastes the recipients' time and makes them angry, fuelling the curse even further. Angry users are not very constructive, and those familiar with change management theory know that this can provoke powerful negative reactions from stakeholders.

This is why it is important to consider the usage of notification emails carefully before firing the mail cannon at your organization. My advice would be to:

·         Identify and understand your stakeholders – Who will receive the mails?

·         Define who needs to be involved, who needs to be informed, and what the volume of notifications will be.

·         Make sure all stakeholders are appropriately informed about the ALM notifications, and ensure that you offer them a way to modify the mail subscriptions to fit their needs.

·         Consider doing daily, weekly or monthly notification roundup mails (if the feature is available in the tool).

·         Ensure that the template used for the mails contains only the required information, not everything.
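To illustrate the role-based idea, here is a sketch of what notification rules tied to user roles (rather than one global rule for everyone) could look like. The roles, events and rules below are purely illustrative, not the configuration of any specific ALM tool:

```python
# Role-based notification rules instead of one global rule for everyone.
ROLE_RULES = {
    "occasional_user": {"assigned_to_me"},   # only mails about their own items
    "key_user": set(),                       # already in the tool daily: no mails
    "manager": {"weekly_digest"},            # a roundup instead of single mails
}

def should_notify(role: str, event: str) -> bool:
    """Return True if a user with this role should get a mail for this event."""
    return event in ROLE_RULES.get(role, set())

print(should_notify("occasional_user", "assigned_to_me"))  # True
print(should_notify("key_user", "assigned_to_me"))         # False
```

The design choice is simply that the default rule set depends on who the recipient is, so nobody gets mails they cannot use.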

Otherwise you are likely to experience some of the following:

·         Users apply mail auto-filtering, sending the notification mails directly to the archive (or deleted items).

·         Users complain that the ALM system is a pain and stop using the tool.

·         Users get stressed by all the mails they receive.

·         Users start forwarding notification emails, bypassing the ALM workflow in the tool.

·         Users lose track of other items in their crowded inboxes.

All of the above is waste, injected and maintained directly in your development cost – Not as clever as the sales pitch originally stated…

Happy testing!


Monday, 3 November 2014

Tips for registering defects

We have just been through a couple of rounds of testing and gathered a lot of observations. Some of these are not defects; they are the result of tester wish-listing, misunderstandings, wrong test cases etc. This is, however, as expected, and as always calls for a bug triage and refinement BEFORE any fixing and distribution of the defects happens – In other words, no defect management before you are actually sure that it is a defect.
Luckily for us, the testers have been very careful when raising issues – Something that we benefit from now, saving vast amounts of time in the analysis. Despite being new to testing, they have reported the observations in a way that facilitates further analysis. Simply by following our tips for issue reporting!
There is little new in the tips for issue reporting below for all you professional testers, but for business reps who are invited to join a project as testers, everything is new, and that is when you will benefit from guides that explain key areas of your test process. We made such a guideline, consisting of a manual for using the test tool – stripped of everything but what the testers needed – and a list of advice to help them file issues while testing.
Make sure that you nurse your testers so they understand their role and the expectations towards their deliveries – They become a powerful asset if you do.
Happy testing!
Tips for registering issues:
Reproduce the issue before filing an issue report: Your issue should be reproducible. Make sure your steps are robust enough to reproduce the issue without any ambiguity by someone who is not you. If the issue is not reproducible every time, you can still file an issue mentioning the intermittent nature of the bug.
Report the problem immediately: If you find an issue while testing, do not postpone the detailed report until later; write the issue report immediately. This will ensure a good and detailed bug report. If you decide to write the bug report later, chances are that you will miss important steps in your report.
Spend 2 extra minutes on the description: When writing a description, please adhere to the following structure to ensure that sufficient information is captured:
·         Reproduce steps: Clearly state the steps to reproduce the issue, to facilitate reproduction by the developers and others who will work on fixing the problem.
·         Expected result: How the application is expected to behave for the steps above. Add a reference (if known) to the requirement that appears to be violated.
·         Actual result: The actual result of running the steps above. Include a screenshot of the scenario, or a clear description of the test scenario, including the test data used.
Read the issue report before hitting the SAVE button: Read all the sentences, wording and steps used in the bug report. Check whether any sentence creates ambiguity that can lead to misinterpretation. Misleading words or sentences should be avoided in order to have a clear bug report and ease communication with the receiver.
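The description structure above can even be turned into a simple completeness check before an issue is saved. A minimal sketch; the field names here are my own, not those of any particular test tool:

```python
# Required fields matching the description structure above (names illustrative).
REQUIRED_FIELDS = ["reproduce_steps", "expected_result", "actual_result"]

def missing_fields(report: dict) -> list:
    """Return the required fields that are empty or absent from the report."""
    return [f for f in REQUIRED_FIELDS if not report.get(f)]

report = {
    "reproduce_steps": "1. Open the profile page\n2. Click Save",
    "expected_result": "Profile is saved (REQ-123)",
    "actual_result": "",  # the tester forgot to fill in the actual result
}

print(missing_fields(report))  # ['actual_result']
```

A check like this could run before the SAVE button is honoured, nudging testers toward complete reports without extra review effort.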