There’s an email thread in our office discussing a problem with the unit test plans of our newbie testers. Our senior tester pointed out that the newbies’ test plans seem to have been ported directly from the functional specs. That didn’t sound good. There’s a saying that the devil is in the details, and our functional specs usually offer only the high-level, logical design. If the newbies had indeed simply transcribed the fspecs into test cases, then they might be missing critical test cases. Another problem raised was that the stringent documentation standards compromise the efficiency of test plan preparation. With the level of detail they are putting into the test plans, I worry whether they’ll actually use those plans during test execution.
I reckon that just as we try to make our programs usable, we should do the same for our test plans. We must remember that a test plan is a tool, a guide. It should help us testers rather than keep us from doing our jobs efficiently. On the other hand, we also have to consider that sometimes the test plan is a deliverable. But what exactly do the users want the unit test plans for? Knowing this would help us determine the level of detail to put into them. There’s got to be some middle ground where we satisfy user requirements AND still manage to come up with more maintainable, more usable, more efficient test documentation. Not finding that common ground can only be wasteful. Now if only I knew how. =/
Conveniently, after some googling, I came across a blog post by James Bach. In his post entitled “Fighting bad test documentation”, he mentions some ideas on how to shift to more concise test documents:
- Show management how much less testing we are able to do because we are spending so much time with documents.
- Show management how certain kinds of testing aren’t done at all just because they are hard to document (exploratory testing and complex scenario tests often fall in this category). This is perhaps the most chilling effect of over-documentation, especially in the realm of medical devices. I keep seeing medical device test doc that is simplistic, in all its obesity, to the point of near worthlessness.
- Examine closely what testers are doing and show that they aren’t even following the documentation (often they aren’t, in my experience as a consultant who audits test processes).
- Demonstrate the power of exploratory testing (a less heavily documented approach). One day of ET is often sufficient to find what would take a week to find when following detailed documented test procedures.
- Demonstrate the value of concise test documentation (matrices, outlines).
- Consider documenting at the level of test activities rather than test cases.
- Consider automatic documentation (via log files produced by the product under test or via an external logging tool such as Spector).
- Ask the question: what exactly are we getting from our documentation? Don’t accept any theoretical answers. For example, one typical answer is that documentation protects the company from the ill effects of staff turnover. But does it? Probably not, in my experience. That’s a theory based on ignorance about how people learn. In real life, new testers learn what they need to know by playing with the product itself, and talking to the people around them. In my experience, testers come up to speed in a few days at most. And in my experience, test documentation is often of such poor quality that it’s better ignored than followed. You have to go by your own experience, of course. I’m just suggesting that you ask the questions and make your own observations.
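Bach’s suggestions about concise test documentation (matrices, outlines) got me thinking about what that might look like in practice. Here is a minimal, hypothetical sketch in Python: a table-driven unit test where the matrix of cases *is* the documentation. The function under test (`classify_triangle`) and all the cases are invented purely for illustration; the point is that one readable table can replace pages of prose-style test cases.

```python
def classify_triangle(a, b, c):
    """Toy function under test: classify a triangle by its side lengths."""
    # Triangle inequality: any degenerate or impossible triangle is invalid.
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# The test matrix doubles as the test plan: one row per case,
# readable at a glance and cheap to maintain.
CASES = [
    # (a, b, c,  expected)
    (3, 3, 3, "equilateral"),
    (3, 3, 5, "isosceles"),
    (3, 4, 5, "scalene"),
    (1, 2, 3, "invalid"),   # degenerate: the sides are collinear
]

def run_matrix():
    """Run every case in the matrix; return (inputs, expected, actual, passed) rows."""
    results = []
    for a, b, c, expected in CASES:
        actual = classify_triangle(a, b, c)
        results.append((a, b, c, expected, actual, actual == expected))
    return results

if __name__ == "__main__":
    for row in run_matrix():
        print(row)
```

Adding a new case is one line in the table rather than a new section in a document, which is roughly what I take “documenting at the level of test activities rather than test cases” to mean: the activity (classification boundaries) is documented once, and the individual cases stay lightweight.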