Usability Testing – 15-year review

One of the perks of working at a university, particularly if you’re a geek, is free access to journal articles. When I have time, I try to take a look at what has been published recently in the field of usability and Human-Computer Interaction.

A recent find is Hollingsed and Novick (2007), who provide a review of usability testing practices over the last 15 years. They show that heuristic evaluations and cognitive walkthroughs are the most popular methods, with formal usability studies being reduced in ambition or abandoned altogether.

I can understand the attractiveness of heuristic evaluation as its ROI is relatively high. But one of the reasons I value usability testing with real volunteers is that you actually get to meet the customer. It’s often way too easy to avoid the people who use our website and, as a result, avoid tackling some of the major usability issues we may have. Obviously, the method you use must suit the questions you are asking but, for me, every opportunity to meet and understand the customer should be taken.

Hollingsed, T. and Novick, D. (2007). Usability Inspection Methods after 15 Years of Research and Practice. SIGDOC ’07: Proceedings of the 25th Annual ACM International Conference on Design of Communication.

The next SIGDOC conference will be held in September 2008.

Usability of the English website increases after relaunch…

Usability of the English website for Lund University has increased from 50% to 80% since we relaunched the pages in September 2007.

The 80% figure is based on the success rate of student volunteers attempting to complete key tasks on our website.
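For the technically curious, the arithmetic behind a figure like this is simple. Here is a minimal sketch in Python with entirely made-up session data – an illustration only, not the method or the numbers behind the report linked below. Each task attempt is recorded as completed or not, the success rate is the proportion of completed attempts, and the same records can give the average page views per task mentioned further down.

    # Hypothetical session records: (volunteer, task, completed, page_views)
    sessions = [
        ("P1", "find course information", True, 4),
        ("P1", "make an application", True, 6),
        ("P2", "find course information", True, 3),
        ("P2", "make an application", False, 9),
        ("P3", "find course information", True, 5),
    ]

    attempts = len(sessions)
    successes = sum(1 for _, _, completed, _ in sessions if completed)
    success_rate = successes / attempts

    # Average page views per task attempt
    avg_page_views = sum(views for _, _, _, views in sessions) / attempts

    print(f"Task success rate: {success_rate:.0%}")
    print(f"Average page views per task: {avg_page_views:.1f}")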

Usability report – March 2008

Much of this success comes from providing clear links to the education database, the increased completeness of that database, and the use of target-group-focused pages.

Tasks included finding course information or making an application.

The average number of page views per task has been halved – this is positive, and reflects a more user-centric design with clearer menus and links.

80% – fine, but what’s still not working? Our education database is still problematic, to say the least, and some of the new pages I introduced are simply not working.

The Future Students pages, for example, simply confuse students and add an extra layer of pages which they could clearly do without. Simplification is definitely needed.

The transition of moving students from our site to studera.nu is also not handled well: they are often dumped off our website by an ‘Apply Now’ link into a whole new environment – this is not good service.

Usability testing only tells us so much. We can identify the major problems, but the more subtle issues, such as the effectiveness of course descriptions or the role that student reviews play in the decision-making process, are hard to assess by this type of testing alone.

I also focused only on the central website – the sometimes complicated relationship between it and the university’s other websites, which is often a source of frustration (judging by many of the survey comments), was not assessed.

The report contains some suggestions for improvement which I’ll be implementing over the next few months and then testing with a further group of volunteers.