Archive for August, 2011

Choosing the Right Website User Testing Tools for the Job

August 31, 2011

I have a client with a new website that I’m in the process of finishing. They’re a small coffee shop, and the owner has a real passion for making the perfect cup. He’s the type of guy who regularly makes trips to the farms that grow the beans he roasts. At any time, you can walk into his shop and find 18 fresh-roasted coffees available for purchase. But as on top of their game as they are with coffee, like most busy business owners whose day-to-day operations consume all of their time, their current website looks like a pre-Web 2.0 relic. It performs like one too. That’s the reason I’m building a new one. It’s also why I’m confident that my new design will raise sales immediately. There’s practically nowhere to go but up.

Be that as it may, I’d rather sell my clients facts than confidence, so in the coming weeks we’re going to be doing some user testing. Essentially, the website is going into its beta release.

The purpose of the user test is to get feedback about the new design: to learn whether any parts of the website confuse users, and to hunt for hidden bugs that might prevent a user from getting the information they need or from making a purchase.

We will be recruiting testers who are already familiar with the brand. Some of them will come from the shop’s Facebook page; the rest will be users who order coffee regularly from the current website.

At our disposal are a plethora of tools to record and measure how a user interacts with the website, including:

  • videotaping the user (over the shoulder)
  • tracking mouse behavior
  • heatmaps
  • recording the user’s interaction with the website
  • interviewing the user
  • looking at web analytics

It’s also possible to have an eye-tracking study done but cost and time factors rule that out for this test.
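Most of the recording tools in that list rest on the same simple idea: log where users click, then aggregate those coordinates into grid cells to render a heatmap. Here’s a toy sketch of that aggregation step — the click coordinates are made up for the example, and real tools obviously do far more:

```python
# A toy illustration of how click-heatmap tools aggregate data.
# The click coordinates below are invented for the example.
from collections import Counter

def bin_clicks(clicks, cell_size=100):
    """Bucket (x, y) click coordinates into a grid of cell_size-pixel cells."""
    counts = Counter()
    for x, y in clicks:
        counts[(x // cell_size, y // cell_size)] += 1
    return counts

clicks = [(120, 80), (130, 90), (640, 400), (125, 85), (980, 60)]
grid = bin_clicks(clicks)
hottest = max(grid, key=grid.get)
print(hottest, grid[hottest])  # the (1, 0) cell collected 3 clicks
```

A real service then colors each cell by its count and overlays the result on a screenshot of the page — which is why even a non-technical stakeholder can read a heatmap at a glance.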

So what tools should I use? I’m not interested in trying them all. I just want to use the tools that are appropriate for my needs and that provide me the feedback in the most actionable way possible.

Before I start to look at the tools though, I think it’s wise to list out the questions we want to answer and constraints that we have to work under for this user test:

Questions:
  • Can users find the product they want to buy?
    • Are users confused by the choices in the main menu?
  • Can users buy the product easily? (examine the checkout process)
    • Did users easily find the information they expected in the places they expected (i.e., could they find out about returns, the privacy policy, and the security of the checkout process when they needed to)?
  • Do users have enough information to make a buying decision on a coffee they have never purchased before?
  • Does every part of the website function as it is intended to? (this concerns the mechanics of the website)
    • Can the users find any bugs in the site?

Constraints:
  • Two week time frame
  • Users limited to Facebook fans and current users
  • Want to do it as inexpensively as possible

From these questions and constraints we can make some immediate decisions:


  • We don’t need a recruiting service like Ethnio to find users to test the website.
  • We will want to do interviews with the users. The choices here are either to do a personal interview, which can be recorded, or to have the user fill out a questionnaire. I would prefer a videotaped interview so we can look at the user’s body language and facial expressions. However, it’s possible that we will need to use a questionnaire instead due to time limitations. In either case, we would also like to be able to follow up with the users if we have additional questions.
  • We need to record how the user uses the website. This could take the form of videotaping how they use the website in the over-the-shoulder style. Since we can expect to have users comb the site while we are not physically present, capturing their screen data seems to be a better option. We should look at programs that record the user’s session and mouse behavior. It would also be helpful to be able to retrieve that data as visual information, via heatmap or some other similar visual display.
  • Because we will be testing a limited number of users, it is not important to track analytics. We will have individual feedback from each user, which is more granular than analytics alone. There might be a tool out there that combines analytics with user testing tools. If so, it’s worth looking at, but realistically it’s not necessary.

That seems to about cover it. We’ve been able to distill our needs into two essential items:


  1. An interview/questionnaire
  2. Tools that record the user’s session on the website and provide quality reporting tools
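The questions from earlier will eventually need to become a scripted test plan the users can follow unmoderated. As a rough sketch of what that might look like (the task wording and follow-up questions here are hypothetical examples, not the final script):

```python
# A sketch of how the test questions might become scripted tasks.
# Task wording and follow-up questions are hypothetical examples.
test_plan = [
    {"task": "Find a medium-roast coffee and add it to your cart",
     "follow_ups": ["Could you find the product you wanted?",
                    "Was anything in the main menu confusing?"]},
    {"task": "Complete the checkout for the item in your cart",
     "follow_ups": ["Could you find the return and privacy policies?"]},
    {"task": "Pick a coffee you've never tried and decide whether to buy it",
     "follow_ups": ["Did you have enough information to decide?"]},
]

for step in test_plan:
    print(step["task"])
```

Structuring it this way keeps each observable task paired with the interview questions it’s meant to answer, which makes the later write-up much easier.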

Now that we finally know what we need, we can look at what’s available and decide which tools best fit. In the event that multiple tools can do the job, we’ll assess the pros and cons and then make a judgment call.

I’ve done some research already into what tools are available and they are all listed in the “Tools” section of links in the sidebar. I’m sure there are more tools than what I’ve listed but I’m also reasonably sure that we can find the right tools for the job from the websites listed. For the purposes of this article, and for our experiment, we’ll confine our set of possibilities to the 10 following choices:

Selection Set

We can use as many of them as necessary but since we’re also trying to be economical with our time and our money, ideally we’re looking for a one-stop shop that’s free. It’s unlikely we’ll find that. But hey, a guy can hope.

Let’s take a brief look at each tool:

ClickTale
ClickTale offers a suite of tools that look ideal for our needs. They include:

  • visitor recordings
  • mouse move heatmaps
  • mouse click heatmaps
  • conversion funnels
  • attention heatmaps
  • the ability to watch a user’s activity in realtime
  • form analytics
  • campaign tracking

Unfortunately, all of that great tracking doesn’t come for free. They offer three plans, starting at $99/month and going up to $990/month. The fact that there’s a monthly fee indicates that this product is meant for ongoing user testing. While this is a consideration – we plan to do additional user tests in the future – it’s $290/month for the plan that gives us access to everything in the above list. It’s possible that this might be the right solution for us, but it’s likely that it will prove to be prohibitively expensive.

CrazyEgg
CrazyEgg is primarily a click-mapping tool. I’ve had extensive experience with it in the past. It’s great for what it does: track mouse clicks and display the data in several easy-to-understand maps. It doesn’t record where the user moves their mouse. It doesn’t record a video of the entire session. And it costs money. Plans range from $19/month to $189/month, with the $49/month plan making the most sense since it updates hourly instead of daily like the cheaper $19/month package. For our purposes, CrazyEgg might be too limited in its capabilities, especially at $49/month. It’s likely that at that price we will find other tools that offer more functionality. We shall see.

Feng-GUI
Feng-GUI, if it works as advertised, is a great idea and could be very helpful to web designers. The idea is that designers can upload an image of a project they want analyzed, be it a web page, a print ad, or other design. Feng-GUI “looks” at the image using an algorithm that mimics the human eye and human attention to generate a range of visual-attention maps. At $25 for 10 images, this is a dirt cheap way to get early feedback on web designs.

You could make the argument that we would benefit from submitting an image of the front page, the category page, the product description page, the cart, and the checkout pages to Feng-GUI to see what it spits back out. For $2.50 an image it’s not the kind of thing that needs a lot of thought. Just do it already.

I can’t help but think that this service is much more helpful at the beginning of the design cycle than it is at the user testing phase. But I’m biased. It’s my design. And I can’t bear the idea of the algorithm telling me that I need to redesign significant portions of the website. I want to know that before I code it all up so that users are picking nits instead of tearing the entire site to shreds.

Regardless of my feelings as a designer who is attached to his design, the fact remains that Feng-GUI provides a way to get approximate eye-tracking metrics on a design for a very cheap price. As I said at the beginning of this post, I wanted to leave eye-tracking out of this discussion because of the cost and time factors. But Feng-GUI looks like such a great tool that we’re going to have to give it a try.

At a minimum, we’ll have some interesting feedback that we can compare with the feedback we get from our users. It will be enlightening to see whether it agrees with and can predict the users’ behavior and/or interview responses, or whether it contradicts them. For the price of a cup of coffee, we can’t say no.
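Feng-GUI’s actual algorithm is proprietary, but the general family of techniques it belongs to (visual saliency models) scores each region of an image by how much it stands out from its surroundings. Here’s a deliberately tiny sketch of that idea on a hand-made grayscale grid — the numbers are invented, and real models use far more sophisticated features than raw brightness contrast:

```python
# Toy "saliency" sketch: score each cell of a grayscale grid by how
# much it differs from the mean of its neighbors. All values invented;
# this only illustrates the contrast idea, not Feng-GUI's algorithm.
def saliency(grid):
    h, w = len(grid), len(grid[0])
    scores = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            neighbors = [grid[i + di][j + dj]
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)
                         if (di or dj) and 0 <= i + di < h and 0 <= j + dj < w]
            scores[i][j] = abs(grid[i][j] - sum(neighbors) / len(neighbors))
    return scores

# A mostly dark "page" with one bright element (say, a call-to-action).
page = [[10, 10, 10, 10],
        [10, 10, 200, 10],
        [10, 10, 10, 10]]
scores = saliency(page)
best = max((scores[i][j], i, j) for i in range(3) for j in range(4))
print(best[1], best[2])  # the bright cell at row 1, col 2 scores highest
```

High-contrast, isolated elements win attention in a model like this — which is roughly why these tools tend to reward a single strong call-to-action over a busy page.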

Inspectlet
Inspectlet is a site that lets you record visitor sessions, display mouse-click heatmaps, and track analytics in real time. In short, it does everything we need to satisfy point #2 on our two-point list of needs. Its heatmap reporting isn’t as extensive as what ClickTale and CrazyEgg provide, but because our test will be limited to 5-10 people, a heatmap of mouse movements is more of a luxury than a necessity. Unlike CrazyEgg, Inspectlet offers the additional features of recording the user’s session and access to real-time analytics.

Unlike ClickTale, Inspectlet is much more reasonably priced. Plans range from $7.99/month to $89.99/month.

Even better, for us, the $7.99/month plan looks perfect. It includes:

  • Up to 50,000 Pageviews
  • Full Real-time Analytics
  • 800 Screen Captures
  • Unlimited Heatmaps
  • 1 Custom Metric

To top it all off, they offer the first week for free! It seems likely that we will be using Inspectlet as part of our user test. It has the right features at the right price.

But just to be sure, let’s continue to see what the rest of the list has to offer. Maybe we’ll find something even better.

KISSmetrics
KISSmetrics calls itself “person-based analytics”. I wasn’t able to figure out exactly what that means by looking at their website. I believe it has something to do with showing analytics differently to different roles. For example, the web dev sees one thing, the sales guy another, and the stockroom sees a third. But it’s a little unclear.

KISSmetrics first popped up on my radar when WIRED did a story on them about how they track users in a way that they cannot delete. The hype was massively overblown but the company was forced to use a new method which users could disable.

What KISSmetrics really looks like they’re doing is that they’re tracking users all around the web to get a better understanding of just what exactly brought them to your website. In that sense, KISSmetrics looks like a great tool for ongoing analytics and testing. Prices start at $29/month.

Loop 11

Loop 11 almost owns this thing. There’s one big gaping assumption I made when I got all sweet on Inspectlet a few minutes ago: I assumed that the user will know what they’re supposed to do, or that somebody will prompt them to accomplish a task. Whether that’s the case or not, the reality is that the user is going to have to be prompted to take specific actions on the site so we can measure how they behave.

Loop 11 provides a way to do that. It’s a tool for creating a user test: it generates a link which can be sent to people to get them to participate in the test. This would be helpful to us because we could put the link out on the company’s Facebook page and get more than 5-10 users’ feedback on the site.

They have partnered with OpenHallway to allow their users to record the visitor’s session as well. This gives rich feedback in the form of Loop 11 reports and the individual video sessions.

The reason I said that Loop 11 almost owns the category is that a few features are missing. The biggest one is the lack of heatmaps. One could argue that it’d be nice to have Inspectlet’s analytics but, as we’ve already covered, analytics isn’t necessary for what we’re doing. The heatmaps, though, are a necessity. In my experience with CrazyEgg, everybody – even the dumbest guy in the room – understands heatmaps without instruction. It’s such a powerful communication tool that it’s a must-have for us.

The massive downside is the cost. It’s $350 per test. Is it worth it? That remains to be seen. If we can conduct our own tests, then probably not.

Morae
Morae is a user testing solution similar to Loop 11 but it’s sold as a software package, has more features, and is more expensive. The entire package costs $1,495. It seems appropriate for a usability business or for a company that is dedicated to ongoing usability testing. For our purposes, it’s too expensive and it seems limited to testing users on specific machines. It’s not appropriate, unless I’m misunderstanding, for conducting a user test via the Internet.

SilverBack
SilverBack is like Morae in that it is tied to a specific computer. It’s much more reasonably priced at $69.95, and while it doesn’t offer as complete a feature set as Morae, it does seem like a great tool for quick in-house user tests. A cool tool, but it’s not right for what we’re doing here.

Usabilla
Usabilla looks really cool. It’s essentially a less expensive version of Loop 11: a website that allows you to create and conduct user tests. Unlike Loop 11, it has great reporting tools, including heatmaps. But while Loop 11 has partnered with OpenHallway to offer video recording of user visits, Usabilla doesn’t offer this feature. However, after looking at OpenHallway’s website, I think it might be possible to feed the link that Usabilla generates into OpenHallway so that OpenHallway can record the user tests. OpenHallway costs $19/month for their smallest package, which will work fine for our needs.

Usabilla’s pricing ranges from free all the way up to $139/month. Depending on how much data my client is comfortable sharing, the free version could work just fine for us, since it includes 1 test with 10 participants. The downside is that the results are public. The next tier up, at $49/month, allows 1 test with 50 participants, and the results are private.

UserFly
Finally, we have UserFly. UserFly is like OpenHallway in that it’s a simple tool that records how a user interacts with your website for later playback. Unlike OpenHallway, which is priced based on storage space (their smallest plan allows for 90 minutes of video), UserFly is priced by the number of captures. They offer 10 captures a month for free, and prices top out at $200/month for 10,000 captures. Who would ever look at 10,000 captures is anybody’s guess.

My gut feeling here is that OpenHallway is a better value. They also seem to play nice with Loop 11 which makes me think they can play nice with Usabilla. While UserFly is definitely worth trying – especially since we can do so for free – it really seems like a good topic for a blog post but it’s not quite what we need for this user test.

The Final Results

After all of that, it seems clear that Feng-GUI, Inspectlet, and Usabilla are the tools we should use for our user tests. With Feng-GUI we will get data that approximates an eye-tracking study. With Usabilla we have a tool for actually conducting a user test: we can write the instructions for the user, and they can take the test without our needing to moderate it. It will provide us with great feedback based on the outcome of the tests. We will use Inspectlet to record the user sessions; it will generate additional data and heatmaps that show us more directly how the users behaved on the website.

The cost is reasonable too. For one month of testing it will cost:

  • $25 for 10 Feng-GUI tests
  • $7.99 for Inspectlet (the first week is free)
  • $49 for Usabilla (1 test with 50 participants, or 10 participants for free)

Total: $81.99
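For the record, the total above is just the sum of the three line items. Tallying it in cents sidesteps floating-point rounding:

```python
# Tallying the monthly testing budget from the list above,
# in cents to avoid floating-point rounding surprises.
costs = {"Feng-GUI (10 images)": 2500,
         "Inspectlet": 799,
         "Usabilla": 4900}
total_cents = sum(costs.values())
print(f"${total_cents / 100:.2f}")  # prints $81.99
```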

If we found ourselves on an extremely limited budget, we could conduct a test with 10 participants for free. We would have to do without the Feng-GUI analysis, and we’d have to allow the Usabilla reports to be made public, but the upside is it wouldn’t cost us a dime. However, in this particular case, $82 seems like a completely reasonable price to pay for a month of user testing.

Count me in.

Categories: articles

The Ben and Newman Show Podcast #002

August 29, 2011

If you had talked to me last week, you might have become convinced that I was a meteorologist. Hurricane Irene was tracking to come ashore within a few miles of Wilmington, NC, which is where we’re based. Rather than watch the news, where the general theme was “RUN! RUN FOR YOUR LIVES!”, I was up to my eyes in weather data from the Internet. As it happens, it did make landfall fairly close to here on Saturday morning.

In anticipation of the storm, Newman hightailed it to Georgia and I decided to wait it out (probably because I wasn’t listening to the people who wanted me to RUN! RUN FOR MY LIFE!).

But we didn’t want to leave you high and dry without a podcast today so we had to improvise.

We hooked up a Skype chat and had a nice long, rambling conversation.

Categories: podcasts

Guilt and Shame / or / The way things were

August 26, 2011

Perhaps you are like me and get little notes from your friends like this: “Could you read over this paper and tell me what you think?” As a student, I asked and answered that question all the time.

As a web builder, I do the same thing with the small sites I create for friends and family. I send an email to a few choice folks and ask them something like, “Hey, please look over this website I’m building and let me know what you think.”

User testing is a step in the design and construction process. I’m guilty of treating it like a small step. A very small step. The equivalent of proofreading: do all the links work? Are there any obvious misspellings? I think there are two reasons for this:

  1. The evaluation phase of the design process is listed at the end. Because of this, we treat it like the end of a linear process, and it’s generally rushed.
  2. We – designers, site-owners, and builders – don’t treat designing and building sites as an iterative process. We start and we stop.

But what if you don’t have a small site for a family member or friend?  What if this site is a business and the stakes are much higher?

I have to tell you that I feel a bit ashamed of my past flippant attitude towards testing and revision for business sites (sorry, former clients!). But now that I’ve turned my laser-like focus on user testing and the evaluation phase of the design process, I’ve changed.

Which brings into sharp relief the Facebook status update I saw this morning from a fellow web builder:

The most basic website user test

Now that I’m on the path to web user-testing expertise, I was really shocked by this. Briefly, here’s why: the request (“need opinions”) is too general, the testers (his Facebook friends) aren’t serious, and the results won’t improve the page.

It was more the shock, though, of realizing that I’ve made this kind of request part of my own site evaluations in the past, too.

I spoke with Tom later in the day. He knew he needed actual user testing. He said, “I’m 95% sure of the site. It will work. It’s that 5% of doubt that I’d like to erase with user testing.”

So I asked “What are you looking for in a user test?” He answered simply:

  • actionable suggestions that improve conversions
  • define confusing areas and ambiguities
  • how to make the site as simple as possible, but no simpler
  • how to make the site as easy to use as possible

We talked about some of the options available on the web.

None of them seemed to be a perfect match. Either they were too expensive (we agreed that $40 a test for a random tester seems high), or they give too much data and not enough “actionable suggestions”.

I hope to follow this thread and work with Tom more in future blog posts. He is actively looking for user-testing services now, so if you can recommend any or offer reviews, please do. Perhaps he can serve as a case study and illuminate the process for all of us.

(In order to keep myself on task and away from Google,  I wrote the questions I wanted to answer here at the bottom of the page. I would delete them, but I figure they may be interesting to you, my dear reader.)


  1. What’s the difference between quality assurance and user testing?
  2. What is the bare minimum you should do for a user test – or, I should say, a site evaluation and revision?
  3. What’s the difference between UI experts and ‘random testers’?
  4. Is there a standard website evaluation form for general site improvement? Visual design and layout? Color scheme? Functionality?
  5. Should you test for everything – all facets of a site – at once?

On Usability and Anagnorisis

August 24, 2011

A few years back, I was working at a local web development firm in the marketing department. I was responsible for maintaining our clients’ AdWords accounts and for a bit of SEO here and there. Invariably, when a client started to use AdWords, a three-step process would take place:

Step 1: They’d be highly excited about increasing traffic to their website. YAY the money train is a comin’!
Step 2: They’d watch with glee as their traffic numbers went up. HOORAY beer!
Step 3: They’d call me with a long face asking why it didn’t work.

There are only two possible explanations for why AdWords wasn’t providing a positive ROI for them.

1. Something’s wrong with the AdWords campaign.
2. Something’s wrong with the website.

That’s it.

Given the time frame in the above example, it’s obvious that the AdWords account was immature. It hadn’t reached its final cost-per-click value yet on the majority of its terms, it needed more A/B ad testing, and so on. There were definitely things that could be improved. But, in my experience, it’s also true that the website was under-performing too.

Why is that?

Why is it that a firm can build a website, be perfectly happy with it, and then, once they start spending money on marketing, be surprised to find that the website under-performs? Or better yet, why would a web development firm create a website that can’t do its job properly in the first place?

The reason, quite simply, is because it’s the first time you’ve built that particular website and it’s really hard to hit a home run on the first swing.

This can be an unsatisfactory thing for a client to hear when it’s coming from a web developer. We all want things to be simple. When we buy a blender, we expect it to blend. We don’t expect it to sort of blend but then we call the manufacturer and tell them how to make it blend better. That would be ridiculous.

But here’s a crucial distinction. Websites are not a one-size-fits-all commodity item. Websites are a space in which things happen. If an analogy must be made, it’s really a lot closer to opening a store than it is to building a blender. And we’ve all seen enough stores go out of business to know that being successful at it takes work.

This isn’t the Honda you bought to toodle around town with. This is the NASCAR franchise you own where you can get better and win money.

Are any of these analogies working for you?

Great. You understand that a website isn’t a one-time thing, it’s a process.  And that process has to start at the beginning.  Only by answering the big questions can we drill down to the specifics to get meaningful answers.

Here are the two questions I’d ask my long-faced unhappy AdWords client.

What is your website supposed to do?

And then the revealing follow-up: What does it actually do?

The juxtaposition of the two answers allows them to see problems right away. It’s what the Greeks called “anagnorisis”. The moment of discovery.

It’s when you realize that your e-commerce site is all pretty pictures on the front page with nary a link to an item to buy.  It’s when you realize that your cottage rental website is too heavy on selling the area and not heavy enough on selling your rooms.  It’s when you realize that your dentistry website says all the right things from your print literature but doesn’t make it easy enough for people to contact you or set up an appointment.

It’s when you see how things could be better.

And like kicking a bad habit, the first step to recovery is admitting you have a problem.  With a problem now identified, we can start to make changes to improve the website.

Categories: articles

The Ben and Newman Show Podcast #001

August 22, 2011

We are two web professionals with backgrounds in web marketing, SEO, design, and development who have been in the game for more than 30 years combined. In that time, we’ve realized that there are two main areas of concern when it comes to web development: getting people to your website and then getting them to do what you want.

Fundamentally, websites are machines that accomplish a specific task or set of tasks. It’s up to the website owner and the web developer to manage the operation of this machine and to make changes for the better, if desired. We have a personal interest in making the sites we design better. And that’s led, in a meandering way, to this website.

In this space, we will be exploring the topic of website usability – or, alternately, website UX, or just ‘UX’. The idea is for both of us to dialogue with each other and with you to arrive at new thoughts and insights into web usability.

What we have here is our first podcast. It tells you a bit about us – who we are and why we’re here – and offers a few brief thoughts on web usability and how business owners should think about the topic.

You can expect a new podcast from us every Monday and new posts on Wednesday and Friday.

We’re excited to be getting started and are glad you’re here to join us.

See you on Wednesday.

Categories: podcasts