Thanks for stopping by the Testing FTW! blog. We are not trying to reinvent the wheel, the world, or testing in general. Instead, we share a belief in solution-driven test management. In short, it is all about making quality problems go away while getting the most from your effort, and about creating visibility and room for test in projects. The blog focuses on solutions to problems encountered in the software development cycle, aiming especially at the activities that relate to test.
My first take on Test Driven Development (TDD) was many years ago, before
agile had really made an impact. We faced the following challenges:

- The customer wanted shorter development cycles.
- Development was outsourced.
- Test resources were scarce and had no coding skills.
- Offshore resources had little business domain knowledge.
To address the last two bullets in particular, it became evident
that we had to change strategy and focus on implementing a process that
supported the offshore development team and enabled the onshore resources to
assist the offshore team through reviews and guidance.
TDD was introduced as part of the low-level specification done by the
offshore team. One of the sections in the specification dealt with the
unit test cases that had to be written in order to cover the functionality
detailed in the spec.
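As a minimal illustration of the idea (Python and the discount rule are purely hypothetical stand-ins; the actual project used its own stack and specs), the unit test cases come straight from the low-level specification and are reviewed before any production code exists:

```python
import unittest

# Hypothetical example: suppose the low-level spec calls for a discount
# rule of 10% on orders of 1000 or more, and no discount below that.
# Under TDD, the test cases below are derived from the spec and reviewed
# first; the production code is implemented only after approval.

def calculate_discount(order_total):
    """Production code, written only after the tests below were approved."""
    return order_total * 0.10 if order_total >= 1000 else 0.0

class CalculateDiscountTest(unittest.TestCase):
    def test_discount_applies_at_threshold(self):
        self.assertEqual(calculate_discount(1000), 100.0)

    def test_no_discount_below_threshold(self):
        self.assertEqual(calculate_discount(999), 0.0)
```

Run with `python -m unittest`; the point of the procedure was that the reviewed test cases existed, and failed, before the solution code did.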
Another challenge was that the offshoring happened fast, and we had little
time to train the team. That meant we had to come up with ways of "testing"
the team's understanding before countless hours were spent coding a solution. We
found that testing the solution late in the development cycle often proved too
late to counter misunderstandings. So we came up with this very
simple procedure for handing over a development assignment to the offshore team:
- Walkthrough of the business requirements and related high-level specification, to empower the offshore team to take up development.
- The offshore team wrote a low-level detailed specification of the solution, including test cases, and sent it for review.
- Testers reviewed the test cases, architects reviewed the solution, and senior developers reviewed the pseudo code for completeness and compatibility with the current solution.

The interesting thing was that it was almost always the
test cases that gave the pointers on whether or not the proposed solution was
in line with expectations.
Once the low-level design was approved, the test cases were implemented. Once
done, they were checked in, added to the system codebase, and baselined as part
of the delivery. After development of the actual solution, all test cases became part
of the regression test suite, meaning that we soon had a lot of automated tests on
the project, leaving us with a high level of code coverage.
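A sketch of that last step (again Python/unittest as a stand-in for whatever stack a project actually uses; the test classes and names are invented): the baselined test cases can be bundled into a single automated regression run executed on every delivery.

```python
import unittest

# Hypothetical checked-in unit tests; on a real project these classes
# would live in the codebase and be picked up by test discovery.
class InvoiceTotalTest(unittest.TestCase):
    def test_total_includes_vat(self):
        self.assertAlmostEqual(100 * 1.25, 125.0)

class RoundingTest(unittest.TestCase):
    def test_rounding_to_whole_units(self):
        self.assertEqual(round(2.6), 3)

def run_regression_suite():
    """Bundle the baselined test cases into one suite and run it,
    as a regression gate would before accepting a delivery."""
    loader = unittest.TestLoader()
    suite = unittest.TestSuite([
        loader.loadTestsFromTestCase(InvoiceTotalTest),
        loader.loadTestsFromTestCase(RoundingTest),
    ])
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    return result.wasSuccessful()
```

On top of such a suite, a coverage tool (for example coverage.py in the Python world) can report which parts of the codebase the automated tests actually exercise, which is where the high code-coverage figures come from.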
The real challenge of introducing TDD was shaping the organization to
facilitate this new way of working, and enforcing the procedures. There was quite
a stir in the onshore organization: not only did they have to embrace the new
offshore colleagues, they also had to hand over some of their assignments to
them. At first the offshore colleagues were put off by the constant review and scrutiny
of their code; little code could be written before passing a series of quality
gates. These gates were not part of any of the standard development processes,
meaning that the teams were sailing uncharted waters. This meant that there was a
lot of explaining and discussion up front before the TDD approach could be adopted.
The biggest impact was on the testers: they had to abandon trivial
functional testing, as this responsibility now rested on the shoulders of the
developers. This was hard, as they were used to writing test cases one-to-one
against the functional requirements. Their scope now expanded to compiling the
test results of TDD into test coverage reporting, and then testing the areas
that looked a little weak. On top of this, they were now in charge of the
factory acceptance, calling for testing that was focused on the system as a
whole, and challenging their business domain knowledge far more than they were
used to.