Krissy Scoufis / User Insights / July 1st, 2014

5 Usability Testing Lessons Learned the Hard Way

Last week I wrote about how easy it is to start a usability testing strategy, but while it’s easy to get started, it’s also easy to make mistakes. In my years of running and applying usability tests, I’ve hit some roadblocks and learned some hard lessons along the way. So this week, I want to share some lessons I learned through trial and error that will hopefully make your first venture into testing a little easier.

“Making mistakes simply means you are learning faster.” ~ Weston H. Agor

Do a Dry Run

Before you deploy your tests, do a dry run by previewing them exactly as your study participants will see the tasks. Reading through your tasks in order, or better yet, having someone else in your office take a look for you, will bring clarity. If you are also responsible for developing the product, you may be too close to it to catch an error in the user flow that would make it difficult for a participant to complete the tasks. Proofreading blindness is a common affliction. Our brains are extremely effective at auto-correcting, especially when we read something we are familiar with. For example, read the text in this triangle:

PARIS
IN THE
THE SPRING

Did you catch the typo? Most people do not see the mistake because it is a phrase we are familiar with. Our brains are always striving for efficiency and struggling over typos takes effort. So behind the scenes, our brains will slightly alter and auto-correct the phrase without losing any of the phrase’s meaning. It is this ability to fill in gaps and fix our errors subconsciously that makes it really important to run through the tests before we deploy the study.

Deploy One Test First and then the Rest

As a best practice, I always deploy one test before I order the remaining tests in the study. Whether you are a UX pro or a novice, inevitably there will be some part of the test that either does not make sense to the study participants or even contains an error. Frequently, no matter how many times you double-check the test plan, you will not identify the error until you are watching a user struggle with the task. After you have viewed your pilot test, if there are no bugs, great — move forward with the others. If you find something significant that you need to change, treat that first test as a beta: make the change and then launch the remaining tests.

Double Check Your Questions

Task creation is the trickiest part of usability testing. Read through each task and ask yourself: does this question lead the user to the action? Try to avoid naming the control you want them to use in the task — describe instead what you want them to accomplish. For example, if you are wondering whether users will be able to find resources on your site, don’t ask them to click the resources button. Instead, ask them where they would go if they wanted to find supplemental information about a topic in the form of a PDF.

Check for Password-protected Sites

Many sites that I work with are password protected on a soft-launch site while they are still in development. With cookies enabled, I do not need to continually log in, so passwords are not top of mind. A new test participant will have to log in though, so make sure you give them the necessary credentials if the site you are testing is locked behind a password. If sharing credentials is not an option, moderating the tests using a screen-sharing tool is an excellent workaround. I use GoToMeeting.com to take control of the screen when credentials need to be entered, then pass control back to the user to finish the test.

Avoid Running Too Many Tests

Last week, I wrote that the optimal number of tests to deploy is between 3 and 8. If you run more tests than that, you begin to see the same user observations over and over again, so you aren’t learning anything new from the additional tests. Not only is this a waste of money, but it takes time to carefully watch and annotate each user testing video. Test participants will typically spend between 10 and 20 minutes on the test. Make sure you have built in enough time to watch, annotate and create a highlight reel of the tests.

Having a test protocol in place is a great place to start forming good testing habits. So, ask yourself, are you doing everything you can to incorporate user feedback to improve the products and sites you create?