Usability testing isn’t exactly standard practice here at [company name]. Sure, we can log usability comments in our bug lists, but an actual usability test is a different thing altogether. When I first spotted a certain book on the subject, my initial thought was to apply it to designing for usability rather than to testing for it.
Unexpectedly, the need to consider and emphasize usability, and to actually conduct a usability test, came up in JV. Fortunately, I came across that book on usability testing again. “Don’t Make Me Think” is quite comprehensive, and it’s only around 200 pages long. Although it mainly targets web applications and web sites, its main points can be adapted for JV’s shrink-wrapped product. After getting some ideas from the book and reading more about the topic on the net, I tried to apply the little I knew by running an actual set of usability test sessions.
I didn’t prepare a detailed, step-by-step script of how the sessions would go. I simply prepared an outline of the screens we needed to go through and the tasks that could be tried out, which allowed for flexibility. I essentially wanted to cover the book’s two prescribed methods. One is “Get it” testing, where you (the facilitator) show the screens to the usability testers and see whether they get it: What are their first impressions of each screen? Do they immediately get what the screen is for, what the available controls do, and how it all works? The other is “key task” testing, where you ask them to perform certain tasks using the product and observe how well they do and whether they run into difficulties: Which parts took more time than expected? Which parts were they prone to making errors in? Which parts frustrated them?
I went through 5 sessions, each with one non-JV participant and averaging about 2.5 hours. During the sessions, I encouraged the usability testers to think out loud and to be candid in giving their feedback, whether negative or positive. Afterwards, we sifted through the comments and chose which ones we could and should address before our deadline. We basically went for fixes requiring minimal effort but with obvious results, and for fixes to items we accepted as valid usability flaws that could frustrate our potential users. Frustration brought about by a difficult-to-use product is a deterrent: chances are you’d brand the provider as evil and think twice about using their products, or you’d complain a lot, or you’d look for something better.