Just recently, I found myself complaining about the lack of validation on a particular project’s email address fields. But on second thought, perhaps it just wasn’t obvious which email addresses were valid and which ones weren’t. Most of us would only know the format of an email address to be email@example.com. During newbie training, or whatever it is that passes as project training, we weren’t exactly taught to identify valid and invalid email addresses, even though we encounter email address validation in almost every project.
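To make that concrete, here is a minimal sketch of the kind of email test matrix I mean. The regex below is a deliberately simplified assumption, not the full rules of RFC 5322 (which also allow quoted local parts, comments, and more), so treat it as a starting point for discussion rather than a spec:

```python
import re

# Simplified pattern -- an assumption for illustration only.
# Real RFC 5322 address syntax is far more permissive.
SIMPLE_EMAIL = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def looks_like_email(address: str) -> bool:
    """Return True if the address matches the simplified pattern."""
    return SIMPLE_EMAIL.fullmatch(address) is not None

# A few rows from a would-be test matrix: (input, expected verdict)
cases = [
    ("email@example.com", True),             # the canonical happy path
    ("first.last+tag@example.co.uk", True),  # dots, plus-tag, multi-label domain
    ("@example.com", False),                 # missing local part
    ("email@", False),                       # missing domain
    ("email@example", False),                # no top-level domain
    ("two words@example.com", False),        # embedded space
]

for address, expected in cases:
    assert looks_like_email(address) == expected, address
```

The interesting part isn’t the regex itself but the rows of the matrix: each invalid example encodes a bug someone could have missed, which is exactly what makes the matrix worth keeping and sharing.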
Similarly, there are other functionalities that we encounter in nearly every project we work on: login, logout, registration, activation, file upload, file saving, and so on. For these, we have routine test cases (mostly, if not entirely, undocumented) that we usually try out. As we find more bugs across projects, our set of routine test cases grows. I reckon these reusable test cases ought to be kept and tracked in some sort of test matrix (a fancy way of saying a list or table of test cases), since they can help whenever you need to test or retest a similar function, and they’d be especially helpful to newbies who may never have tested such functions before.
I suppose preparing these test matrices could even be made into an exercise for newbie training. For instance, we could ask trainees to identify what values they would try out when testing a field that accepts only integers. Afterwards, whatever one misses and another finds can be shared among the group. Usually, it’s making mistakes and learning from them that etches things into memory best.
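For the integer-field exercise above, the trainee’s answer is essentially a small boundary-value matrix. Here is a sketch of what that could look like; the accepted range (0 to 100) and the whitespace-trimming behavior are assumptions made up for this example, not anyone’s actual requirements:

```python
# Hypothetical validator for a field that accepts only integers in 0..100.
# Both the range and the decision to trim whitespace are assumptions
# chosen just to make the exercise concrete.
def accepts(value: str) -> bool:
    try:
        n = int(value.strip())
    except ValueError:
        return False
    return 0 <= n <= 100

# Candidate inputs a trainee might propose, with expected verdicts
matrix = [
    ("0", True),      # lower boundary
    ("100", True),    # upper boundary
    ("-1", False),    # just below the range
    ("101", False),   # just above the range
    ("3.5", False),   # decimal, not an integer
    ("abc", False),   # non-numeric text
    ("", False),      # empty input
    (" 42 ", True),   # surrounding whitespace -- worth deciding deliberately
]

for value, expected in matrix:
    assert accepts(value) == expected, value
```

Comparing matrices afterwards is where the learning happens: one trainee remembers the boundaries, another remembers the empty string, and the merged matrix is better than anything one person would write alone.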
Anyway, this might be something worth suggesting or looking into. It would be nice if this could be crowdsourced over the interwebs, or even among the testers in the office.