One of the great things about doing UX work these days is that there are excellent software tools available that help us learn about our users quickly. One of the nice things about working in the Sears UX shop is that we get access to these tools. I recently got a chance to try out Verify by Zurb to learn more about the findability of the ‘Refine’ buttons on our mobile results pages. I’d like to share some observations about what it was like to use Verify as a newbie, as well as what I think I learned through the testing.
Verify is part of a suite of tools by Zurb – an amazing company that designs websites in addition to building really useful products that help the rest of us design websites. The suite of apps they market includes Verify (for concept testing), Influence (for presentations), Solidify (for prototype testing), and Notable (for interface testing).
I liked Verify right off the bat, as soon as I visited the website. It’s well-designed and informative, and you quickly get the impression that these folks mean business!
Once on the site, I was able to quickly learn more about the Verify application and what it does. Just to be sure, I also consulted with one of our department user research experts, and we decided to run a “Click Test” on those troublesome buttons.
Verify Click Tests work by quickly collecting and collating user feedback based on screen mock-ups. Setting up a test is a snap: you craft a question based on whatever you’re trying to learn, upload a mockup, tweak a couple of settings (such as the duration of the test and the basic demographics to include), and then let it fly. Participants read the question on one screen and then click on your mockup to indicate what they would do in the scenario you gave them. Depending on the pool of participants (we used Amazon Mechanical Turk), the time of day you launch, and so forth, you might start getting feedback immediately.
I got feedback right away, and it was pretty thrilling.
Using Verify to Test Mobile Designs:
Luckily, it’s not every button that we were worried about…just the ‘Refine’ button that appears on various pages in our Deals and Coupons Center. In the current deals experience, ‘Refine’ appears as an inline element in the results toolbar (see Condition 1, below).
My business counterparts mentioned to me during a meeting a week or so ago that they were worried that users were having a hard time identifying where the button is (firstly) and what it does (secondly). Totally valid concerns. They usually have metrics to support their concerns, and they’re on the ball on the business side of the house, so I took this seriously and decided to investigate.
So, we have a two-part issue regarding ‘Refine’: 1) do users see it? and 2) do users know what it does in the context in question? With a Click Test, you can get answers to both questions from the click-map produced during testing and from the time it takes users to click on something. Ultimately, if they click somewhere besides the ‘Refine’ button and/or it takes a long time to click, then there’s a visibility issue or a functional issue.
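To make that decision rule concrete, here is a minimal sketch in Python of how a single click might be scored. The coordinates, bounding box, and 5-second threshold are my own illustrative assumptions, not anything from Verify’s product; Verify handles this scoring for you.

```python
# Hypothetical sketch of the click-scoring logic described above.
# The button bounds and the 5-second "slow" threshold are illustrative
# assumptions, not part of Verify's actual behavior.

def classify_click(x, y, seconds, button=(10, 40, 90, 70), slow_after=5.0):
    """Score one participant's click against the 'Refine' button.

    button is an (x1, y1, x2, y2) bounding box in mockup pixels.
    """
    x1, y1, x2, y2 = button
    hit = x1 <= x <= x2 and y1 <= y <= y2
    if hit and seconds <= slow_after:
        return "correct"   # button was seen and understood quickly
    if hit:
        return "slow"      # found eventually: possible visibility issue
    return "miss"          # clicked elsewhere: visibility or functional issue

print(classify_click(50, 55, 2.1))    # inside the box, fast -> correct
print(classify_click(50, 55, 8.0))    # inside the box, slow -> slow
print(classify_click(200, 300, 3.0))  # outside the box -> miss
```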
To create a comprehensive test, I prepared separate mock-ups of the Hot Deals page layout:
Then I worked on framing the test instructions properly. I didn’t want to be too obvious about where we expected the users to click or tap, because that would have biased the results, so I settled on:
“Where would you click or tap if you wanted to see only items under $100?”
Then we rolled the tests.
- When the ‘Refine’ button is placed inline with the number of results, it is clicked faster and more accurately.
- Correct clicks on ‘Refine’ are lowest when the button is not inline but the results bar is present.
- Time to click is also highest when ‘Refine’ is separate and the bar is present.
- This combination also produced the most clicks on OS controls (the back button, sharing, etc.).
- Separating the ‘Refine’ button, not displaying the results bar, and using a card UI for the individual results produced the highest number of correct clicks and no clicks on OS controls.
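The comparisons above boil down to two numbers per condition: the share of correct clicks and the time to click. Below is a hedged sketch, with entirely made-up sample data, of how those summaries might be tabulated; in practice the real numbers come from Verify’s click-maps.

```python
# Illustrative aggregation of click-test results per condition.
# Condition names, click targets, and timings are invented for this sketch.
from statistics import median

results = {
    "inline":         [("refine", 2.3), ("refine", 1.9), ("other", 4.0)],
    "separate_bar":   [("other", 6.1), ("refine", 5.5), ("os_control", 7.2)],
    "separate_cards": [("refine", 3.8), ("refine", 4.1), ("refine", 3.5)],
}

for condition, clicks in results.items():
    correct = sum(1 for target, _ in clicks if target == "refine")
    accuracy = correct / len(clicks)
    med_time = median(t for _, t in clicks)
    print(f"{condition}: {accuracy:.0%} correct, median {med_time:.1f}s to click")
```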
The third condition was clearly the winner in terms of the number of correct clicks on the ‘Refine’ button, but the time to click is still a bit worrisome. In any case, after evaluating all three conditions, it seems that users did not have as much trouble identifying and clicking ‘Refine’ as we were initially led to believe.
Besides the results for this particular question, though, I have to say that Zurb Verify is very easy to use and, perhaps more importantly, very fast. I found it amazingly easy to set up the Click Test, and the test itself was live, running, and collecting information before I could think twice about it. Now that’s a tool that I want to use frequently!
What are your experiences with Verify?