Posts Tagged ‘simple’

Usability Observation with Inspectlet

November 25, 2011 5 comments

Inspectlet Homepage - This week's usability tool review

First Impression with Inspectlet

We place Inspectlet in the ‘Heatmaps / Mouse Tracking Tools’ and ‘Screen Capture Tools’ categories.  In the past, I had used Userfly, another tool in this category, and had been underwhelmed.  The code snippet slowed down the page and I eventually took it off.  I don’t want stuff loading from another site on my page.

So, I was a little leery of Inspectlet, but Ben was favorable in his survey of UX Tools and we decided on this review. Keeping an open mind, I started to use it.

Here it is: Inspectlet lets you observe people using your website – actual users on the actual site.  Like Coca-Cola, it’s the real thing.  Paste a code snippet into your site, and <magically> a recording shows up on the Inspectlet dashboard – a video along with key metrics: time on site, browser information (screen size and type), and number of pages viewed.
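Tracking tools like this typically work via an asynchronous loader snippet pasted into your pages. As a rough sketch only – this is my own hypothetical example, the URL is made up, and Inspectlet supplies its real snippet in the dashboard:

```html
<!-- Hypothetical async tracker loader, NOT Inspectlet's actual code.
     Copy the real snippet from your Inspectlet dashboard instead. -->
<script>
  (function () {
    // Build a script element so the tracker loads without blocking rendering
    var s = document.createElement('script');
    s.async = true;
    s.src = 'https://cdn.example-tracker.com/capture.js'; // placeholder URL
    var first = document.getElementsByTagName('script')[0];
    first.parentNode.insertBefore(s, first);
  })();
</script>
```

The async pattern is the point here: it addresses my earlier complaint about Userfly, because the page doesn’t wait on the third-party script before rendering.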

The numbers are good. It’s good to know that a visitor using IE found the site through a Google search and spent 2 minutes on this page.

The video is GREAT. It’s invaluable to know where they scrolled and clicked on the site.  It’s the thing you can’t get unless you discover a cloak of invisibility and a teleportation device.  (Hey, I could build a kick-ass business in England if I could hook up with the Chosen One.)

Inspectlet - It's Like MAGIC!... psst, it IS magic. (I'm two for two on chosen one references!)

What is the experience of using Inspectlet?

Inspectlet is surprisingly simple to use, considering its magical abilities.  It’s a straightforward process and nothing seemed counter-intuitive to me.  I was expecting to sign up, get the code, paste it into my site, and watch some videos.  Real-Time Analytics and Heatmaps are two other features.  These were not so valuable to me because of the low traffic on the testing sites.  Analytics would be more valuable if the sites got more traffic, and the heatmaps would be more valuable if the sites had more interaction / clicks.  I suppose over time the patterns would become clear, but in the few days I tested, not enough data was collected to make Real-Time Analytics and Heatmaps valuable.

The videos are ‘the show’ – the real value for me.  You get a table of all the captures for the site, which includes IP address, starting page, capture length, browser (type), screen size, referrer, and date.  Click, and you get a video display page which is very well thought out and easy to use.  Each page viewed by the user gets a new ‘chapter’.  You can pause the video or speed it up (up to 5x speed).  Unfortunately, you can’t scrub back and forth.  There is a nice feature where the video automatically skips parts with no interaction AND the sections with the most interaction (clicks, mouse movement, and scrolling) are highlighted in red.  All of this can be viewed in their demo – that’s what sold me on it.

The Process of setting up the test

  • Sign up / login / get to the dashboard
  • Add a new site
      • give it a name
      • Real-Time Twitter Query (‘Twitter’ should be capitalized in the form – new tool!)
      • skip some stuff that’s not documented (Exclude IP, screen capture method, and screen capture frequency)
  • Get and Install the code on each page you want to test
  • Watch the videos – scribble notes frantically, and look quizzically
  • Analyze the results – Come up with ideas for testing questions and changes to be made

This process is fairly straightforward, but I do wish for more documentation. Getting comfortable with stuff you don’t understand is key for using a lot of stuff on the webs – you can’t be an expert in everything.  Yet, the tool just works.

I did have a problem – no data was coming in – and it was quickly resolved by tech support via email – ON A SUNDAY.  Once again, the support is good.  The problem was that I had ‘Staggered Captures’ set up incorrectly. (Some more documentation would be good here.)

How to get the most value from ‘Usability Observation’

Here’s my thing:  Using Inspectlet will benefit your usability plan.  I think a tool like this should be in every UX toolbox and here’s why.

Like I’ve said before, user testing is about observing users with the intent of improvement – to make changes.  Inspectlet gets directly to the observation.  You are like a fly on the wall (great name, Userfly!).  You don’t disturb the user, and they are having an authentic experience with your site.  This bypasses many of my issues with user testing.  This is really Usability Observation.

We aren’t taking them out of the flow – they don’t know they are being watched.  It’s like security cameras in a retail store.  But we aren’t watching them for shoplifting.  We’re watching to see where they go, how they got there, where they click.

My big issue with testing – the thing I can’t get my head around – is what to test.  What questions to ask?  Observation is the answer.  Observation and testing go hand in hand. Observation leads to exact and specific test questions. Those test questions lead to more observations.

Here is the pitfall: you can’t observe and test at the same time.  We’ve talked about this many times, but now I’m having an ‘Inward Singing’ moment. Avoid this pitfall by observing first.  We listen first, correct?  So, listen to your users.  If you listen, they will tell you what to ask next.  Observe first. Test second. Repeat.  Hmmmm.  Or: Observe. Change. Test. Repeat.

Like a three step Waltz. 1,2,3 - Observe, Change, Test. ... And, let the user lead, plz.

My thoughts get unclear here, but bear with me a sec.  Maybe these are the three fundamentals, and they each relate to and rely on the others.  Observation is a form of measurement.  Inspectlet is both qualitative and quantitative measurement.  The point is that it’s dangerous to mix the elements or try to perform them together.

We tried to remove the observation from the testing in our testing script – by starting with the participant simply using the site.  Ben even suggested leaving the room while they complete the tasks: “Here, do these tasks and I’ll come back and we can talk about it.”  That’s good.  But Inspectlet is better.  They don’t know they are being watched.  They are thinking about their goals and needs – not about being a test subject or providing insight to you, the builder.

Natural users are better than un-natural test subjects.

So, how do you avoid messing up when writing a test question? Start with an observation or a measurement.   Then specifically ask / test about that measurement.  Bounce rate is a common metric we want to lower.  If it’s high, people are leaving your site within 10 seconds and are not going deeper into the site.  You see it in Google Analytics, and you SEE it in the Inspectlet video screen captures. That’s an observation.  Inspectlet would show you this – and more: you can see if the user did anything during those ten seconds.  Now, make a change to lower the bounce rate – put key content above the fold, make a clear call to action, make text bigger and bolder.  And, as the final step, make a question and test with a Usabilla-type tool: “What are you most interested in on this page?”; “Where would you click if you wanted to do [insert Critical Path step one]?”; “Which text would you likely read first?”.   These questions test whether your change made a difference. Or, you could measure again over time to see if the change made a difference.  Or, you could create a Usabilla test to ask about first impressions of the site – or 5-second or Feng-GUI it.
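The bounce-rate arithmetic is simple enough to sketch. This is my own illustrative example (not Inspectlet’s or Google Analytics’ API); a “bounce” here is a session with exactly one pageview:

```javascript
// Illustrative bounce-rate calculation – my own sketch, not a real tool's API.
// A "bounce" is a session where the visitor viewed only one page.
function bounceRate(sessions) {
  if (sessions.length === 0) return 0;
  var bounces = sessions.filter(function (s) {
    return s.pagesViewed === 1;
  }).length;
  return bounces / sessions.length;
}

// Example: 4 recorded sessions, 2 of them single-page bounces
var sessions = [
  { pagesViewed: 1, seconds: 8 },   // left within 10 seconds – a bounce
  { pagesViewed: 5, seconds: 240 },
  { pagesViewed: 1, seconds: 6 },   // another bounce
  { pagesViewed: 3, seconds: 95 }
];
console.log(bounceRate(sessions)); // 0.5, i.e. a 50% bounce rate
```

Analytics gives you this number in aggregate; the Inspectlet videos show you what each of those bounced visitors actually did before leaving.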

Okay, enough rambling.  The point that started that digression is good, I feel.  Here are my findings:

  • Observation is different than testing.
  • Both are important and relate to each other.
  • Inspectlet is an observation tool – it will provoke questions to test and changes to make.
  • The three general types of activities in usability or design are Observe -> Change -> Test, and they can be followed in that order.

A few final thoughts:

Using Inspectlet, I found myself wishing for an intercom button.  “Excuse me, website visitor.  Why are you scrolling up and down like a madman?”  I realize now that I wanted to switch from observer to tester.  And, of course, I wanted to make changes.  The big insight I had – do something for smaller screen sizes. Could I have seen that in Google Analytics? Yes.  Did I know we have 10% ‘small’ screen use? Yes.  Did it have a big impact seeing those numbers? No, not until I SAW it with my own eyes.

We talked about finding users to test in previous posts.  Inspectlet [because it is an observation tool] doesn’t have this problem.  The users are right there on the site now – right now.

I bet site owners get addicted to watching the videos, just like some are addicted to watching visitor counts.  There is data there – actionable data that will bring in more money.  Because of that, I think Inspectlet is a great value at 8 bucks a month.

By way of explanation of that digression into the process of design and testing, or website revision: I’ve just finished James Zull’s book on the brain’s learning cycle, and I think that’s where those ideas came from.  He basically says the brain learns in four stages: Gathering, Reflecting, Creating, and Testing.  Inspectlet is a gathering tool – a sensory tool.  Usabilla is a testing tool – an active probing tool.  Reflecting might be Analytics – where you integrate the data and decipher patterns.  Creating is where you make changes to your design and plan.

Thanks for reading and see you next time!

Simple Usability Tests with Usabilla

November 18, 2011 4 comments

Tool of the day, Usabilla!

First Impression with Usabilla

We have Usabilla listed in our ‘Conduct a User Test’ category along with Loop11.  In earlier posts, Ben and I reviewed Usabilla and concluded it would be a valuable resource for simple usability tests of websites.  Basically, if you are a business owner with a website, then Usabilla is a good place to start with user testing.  It’s simple. It has a usable free service plan and support resources to help you get started today.

Website testing, in its simplest form, is observing people using your site with the intent to make improvements.  Usabilla offers a way to test users completing tasks on the website. You give a task.  The user completes the task with a click and/or a note. You can see where they clicked, what they wrote, and the time it took.

What is the experience of using Usabilla?

Before I started with my own test, I studied the Usabilla features.  I first heard of Usabilla on a review-type blog post where the founder and CEO of the company, Paul Veugen, commented and addressed issues uncovered in the review. That shows engagement and interest.  This isn’t a dead tool or a side project for someone.  It makes me trust these folks.  Points for Usabilla, right off the bat.

My trust in the company is further supported by their blog.  The posts are relevant to the small business owner interested in user testing a website.  Five-things-you-can-test-under-five-minutes and Guerilla-usability-testing-tools-improve-conversion-rate-satisfaction are two insightful and practical posts I read.

Additionally, I liked that I could test sites [using the screen-capture tool] and/or uploaded images for my tests.  This allows me to follow the advice of Steve Krug and test the napkin sketch of my designs [maxim: start earlier than you think makes sense].  Collecting users for the test was made easier by a JavaScript widget to embed on the site to get actual site visitors to perform the test.  And, lastly, I want to give a shout-out to Usabilla Support.  I had a question about the widget placement on the page [it was being covered by the content]. I followed the ‘Chat Now’ link on the dashboard to quickly get in touch with Paul v. A., who looked at my site and showed me how to fix my problem. Try to find that kind of support in another free tool!

The Process of setting up the test

  • Sign up
  • Create a test
      • give it a name
      • custom logo – not sure where this was used
      • Add a page – the generated screenshot from a URL didn’t work on one page, so I had to use my own screenshot
      • Choose a task – create a new task
  • preview the test
  • activate the test
  • invite participants – via URL or Widget
  • Pause test and Edit
  • Analyze the results

The experience of using the tools was smooth and went as expected. No frustration or undue cognitive load. The real hard part is deciding what to test and how to test it.  Thankfully, the blog was very helpful, even if the standard questions were not. Meaning, I think those standard questions won’t reveal actionable insights. More on this later.

With great power - like chainsaw arm - comes great responsibility

Having the right tool means nothing if you don't know how to use it.

How to get the most value from remote user testing with Usabilla

Just because you have access to a powerful tool doesn’t mean you can produce powerful results.  The tool is only as good as the user.  If this discourages you, don’t let it.  Everyone has to begin somewhere [and we’ve chosen Usabilla as that tool].  Here are some of my thoughts based on my first experiences with remote user testing like this.

First and foremost, what do you test?  More specifically, what questions do you ask the user about your design in order to get actionable / profitable results?  I’ve talked about this before in my post, “It’s the Questions, Stoopid“.  Basically, everyone has impressions, and they are generally unique and varied. That makes them both hard to test and [mostly] un-actionable.

The tool, powerful as it is, allows you to test whatever you like. You can use the standard questions.  IMHO, this is likely to return superficial, un-actionable results. Or, dig a bit deeper into your design decisions and ask some targeted questions that will lead to concrete site changes, more conversions, and better [smoother? more elegant? less confusing?] critical paths for your site visitor through your site.

Developing this talent for question creation is the real challenge for UX experts.  It requires a scientific eye for data vs noise and the creativity to manipulate the tools to reveal actionable insights.  It’s hard – try it yourself and you’ll see.  Of course you could be the ‘chosen one’.  For the rest of us, I suggest these test questions as food for thought and a good starting point (borrowed from the Usabilla Blog post mentioned earlier).

There can be only one!

Where would you click to start to use this product?

A simple test of whether the user notices the call to action and the Critical Path of the user experience.  If they get it ‘wrong’ [or I should say, click where you don’t expect] or it takes a long time for them to click [a nice feature of Usabilla measures this time], then you can review / revise your design and retest.  Usabilla makes it easy to do this.

Which elements make you trust this website, and why?

Trust is paramount on the internet.  If I don’t trust a website, then I’m not likely to convert for them – buy stuff, give my info, sign up for anything.  Could be just me… I believe good designers put trust-building elements into sites – certification logos, personal pictures – not stock, quality content, well-known brands or imagery, etc. The question will reveal: Are users picking this up? Do they click on the elements you expect?  And, because of a nice contextual note tool, the user can leave a very specific note about the design.  Yep, that “…and, why?” is powerful.

hmmmm, I don't trust 'em. Been burned too many times in the past. Sorry, Ginger.

A few final thoughts:

Where you get / recruit your test participants matters, I feel.  If you are simply hoping to increase the efficiency of the ‘around the watercooler’ test – sure, go ahead and use your Facebook and social media to get participants.  They are returning users.  And, I guess, if your site is based on returning users, then this would be fine and good.  But, if you need first-time users, then use the widget or ask via some means other than existing contact lists.

I saw a banner ad for Usabilla where the tagline was ‘We give designers quantitative ammunition to go with their qualitative insight’.  I thought this was perfectly phrased.  We do test insight. We do test the assumptions that lead to our design decisions.  That’s why we test.  We ask ourselves: Will making this button red attract more attention? Will adding a recognizable logo increase the sense of trust in users? Until ‘the scientists’ develop a brain-scanning and opinion-deciphering device, we’ll have to be creative with the user-testing tools available to uncover the effect of our designs on the user.

I’m glad to add Usabilla to our tools and recommend it to anyone looking to quickly and easily start testing today.