You suspect someone’s great idea for the website isn’t any better than what you have today, and you need to talk them out of spending money needlessly.
You’re worried that a competitor’s design is outperforming yours in some very important ways, and you need to build a case to fund site improvements.
You’re redesigning your website, and you want to make sure the new version is not only highly usable but also more efficient for users than the old version.
These are just three of the very real reasons to run a comparative usability test on a design.
Most in the digital community are already familiar with traditional usability testing, where five to ten users are recruited to think aloud while they use a website to accomplish tasks. But there are even more techniques in the UX professional’s arsenal for gathering usability metrics and insights. One very valuable (and very affordable) technique is the unmoderated first click test, which you can use to run a design comparison study.
Wait, What’s First Click Testing?
Research has shown that when users’ first click on a task is correct, their probability of completing the entire task scenario successfully is 87%. If the first click is incorrect, their chances of success drop to under 50%. To put that in concrete terms: if 100 users attempt a task and 70 of them click the right place first, you’d expect at most about 70 × 0.87 + 30 × 0.5 ≈ 76 of them to finish successfully, and every first click you fix pushes that number up. First clicks, in short, are a fantastic indicator of how usable a site is.
Back in the day, researchers would capture the first click a user made during a traditional usability test. The sample size was too small to be statistically meaningful, but the information gleaned was still useful. Enter tools like Optimal Workshop and Usability Hub (among others), which took the process online. For a modest fee, you can upload static images from any website, write a task, indicate where the successful first clicks ought to land, and distribute the test to users via social media or an email list. You’ll have dozens, if not hundreds, of responses within hours. These tools capture first click success rates and how long each user took to click.
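Under the hood, scoring a first click is simple geometry: did the click land inside the region you marked as correct, and how long did it take? Here’s a minimal sketch of that calculation in Python with made-up click data; the field names and record format are hypothetical, not any particular tool’s actual export schema.

```python
# Minimal sketch: score recorded first clicks against a target region.
# Click records and the target rectangle below are invented for illustration.

from statistics import median

# Each record: (x, y) of a participant's first click and seconds until that click.
clicks = [
    {"x": 412, "y": 88, "seconds": 3.2},
    {"x": 130, "y": 540, "seconds": 7.9},
    {"x": 405, "y": 95, "seconds": 2.6},
]

# The "correct" click area you drew on the screenshot, as a rectangle.
target = {"left": 380, "top": 60, "right": 460, "bottom": 110}

def is_hit(click, region):
    """True if the first click landed inside the target rectangle."""
    return (region["left"] <= click["x"] <= region["right"]
            and region["top"] <= click["y"] <= region["bottom"])

hits = [c for c in clicks if is_hit(c, target)]
success_rate = len(hits) / len(clicks)
print(f"First click success: {success_rate:.0%}")
print(f"Median time to click: {median(c['seconds'] for c in clicks):.1f}s")
```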
These results are insightful on their own, but they are even more powerful when turned into a comparison test.
Making a Design Comparison Test
To turn a simple first click test into a comparative usability test, you’ll run the same task questions against two different designs. That could be a screenshot of your current website versus an image of your proposed design change, your design against a competitor’s, or two potential design options against each other, to name a few possibilities.
It’s possible to run the comparison test with the same users answering questions on both designs or with a different set of users for each design (within subjects or between subjects, if you paid attention in stats class). The only difference is how you calculate your findings. Gather results from 30 or more participants, then compare the success rates and the time it took to complete the tasks successfully. You’ll know your winners and losers by success rate, time on task, or both. Remember to report the significance and margin of error. The tool will provide these, but if you do more slicing and dicing of the results you may need to calculate them on your own, as sketched below. Like all good statistical explorations, the key is to start with a good hypothesis and a solid plan, and to be willing to be proven wrong.
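If you do end up crunching numbers yourself, here’s a rough sketch of the between-subjects math in Python, using only the standard library. The participant counts and success numbers are invented for illustration. (A within-subjects comparison, where the same users see both designs, calls for a paired test such as McNemar’s instead, and time-on-task comparisons are usually handled with a t-test.)

```python
# Rough sketch of between-subjects comparison math: 95% margins of error
# and a two-proportion z-test. All numbers are hypothetical.

from math import sqrt, erf

# Successes out of participants for each design (made-up results).
success_a, n_a = 24, 30   # design A: 24 of 30 first clicks correct
success_b, n_b = 15, 30   # design B: 15 of 30 first clicks correct

p_a, p_b = success_a / n_a, success_b / n_b

# 95% margin of error for each success rate (normal approximation).
moe_a = 1.96 * sqrt(p_a * (1 - p_a) / n_a)
moe_b = 1.96 * sqrt(p_b * (1 - p_b) / n_b)

# Two-proportion z-test: is the difference bigger than chance would allow?
pooled = (success_a + success_b) / (n_a + n_b)
se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed

print(f"Design A: {p_a:.0%} ± {moe_a:.0%}   Design B: {p_b:.0%} ± {moe_b:.0%}")
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

With these hypothetical numbers, design A wins on success rate, and at p ≈ 0.015 the difference is unlikely to be chance.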
Aside: I’ve oversimplified this quite a bit. The out-of-the-box stats you’ll get from whatever tool you use are a good start. But if you’re interested in learning more about how to run different types of statistical analysis on your usability test results, I highly recommend this book and this book.
Evidence-Based Decision Making
With first click comparison testing, you’ll have evidence to support design decision-making. Gone are the days of guessing or deferring to the loudest voice in the room: you’ll know, quantitatively, which design is more successful or more efficient for users. You’ll be able to prove or disprove, with real numbers, that design changes are necessary and worth the effort. So spend your money and time wisely: run a design comparison test.