
Posts Tagged ‘information theory’

Better User Experience Podcast #19 – The Top 10 Key UX Concepts We’ve Learned in the Past 4 Months

December 19, 2011

We’ve been doing this learning-about-UX thing for a little over 4 months now. On today’s podcast we take a look back at the Top 10 things we learned in that time. If you’ve been following the site over the past week you know that we’ve already covered this material in two posts. But like any good discussion, we dig a bit further into each topic on the podcast.

Top 10 Key UX Concepts We’ve Learned (So Far)

10. Proactive vs. Reactive
9. UX Testing Resistance
8. The Power of Process
7. The Difference Between User Experience and Usability
6. The Critical Path
5. CARP
4. Prove vs. Improve
3. Signal vs. Noise / Information Theory / Entropy
2. Quick Tests Can Be Valuable
1. There’s a Great UX Community

Remember you can subscribe to our podcast on iTunes – new episodes each week!

Follow us on Twitter: @BUXofficial

Friend us on Facebook: facebook.com/BetterUserExperience

View us on Youtube: youtube.com/user/BetterUserExperience

Looking Back: 10 Key Usability Concepts: Part 2

December 16, 2011

When Ben and I started learning about UX back in August, we committed to a period of learning that ended on December 15th. That doesn’t mean we’re going away, but it does mean that we’re at the end of the time we set aside to get our feet wet in the discipline. As a result, we’ve been doing some reflecting on what we’ve learned and what the most valuable take-aways are from the past 4 months.

On Wednesday, Ben covered the first 5 key UX concepts and today we polish off the list with numbers 5 through 1.

5. CARP

They say CRAP. I say CARP. Whatever... it's all good.

When you don’t know what you’re doing, watching experts is like watching magic happen. How in the world does somebody know how to do that? I mean, how did he know how to do that!?

For visual design and page layout, the saving grace for me was CARP.

  • Contrast
  • Alignment
  • Repetition
  • Proximity

We talk most about CARP during the Feng-GUI vs… posts. In those posts we would evaluate a page using the principles of CARP and then run it through the Feng-GUI tool. This gave us a way to reflect on the visual elements on the page and a way to actively test those reflections. CARP is important for UX’ers because it gives them a way to communicate design decisions to stakeholders who don’t have a visual eye. You can tell them “This is not ‘Nam. There are rules!” to quote The Big Lebowski. It gave form to the void – a method to the magic. When looking at a design, it gave me a way to know not just whether it was any good but why it was the way it was. It gave me a system for thinking about design.

4. Prove vs. Improve

Improve is easy. Prove? Somewhat harder.

In the beginning of learning about user testing, we thought user testing was essentially a scientific endeavor. And by that I mean that we thought that you had to have strict controls and that it really mattered how the tests were conducted. We felt a strong need to make sure that no bad data tainted the testing process.

But thanks to the wisdom of Steve Krug, we learned that user testing doesn’t have to be about proving an idea; it’s about improving the website. And because all websites can be improved, in a sense it’s fish in a barrel.

There’s more to that concept: we are all experts at filtering and grouping; it’s about getting points of view, not error bars; and most of the improvements to a website can be discovered by out-of-context users.

This means that quick, down-and-dirty testing can be effective because to a certain extent, it’s all about just measuring stimulus response. It’s a freeing concept.

3. Signal vs. Noise / Information Theory / Entropy

See? Easy. I see signal v noise everywhere now.

It’s important to remember that websites are, at heart, a communication medium. The computer is a communication device. The Internet connects us. But your web presence is responsible for communicating the message.

The ability to effectively communicate online is essentially what it’s all about.

There are two parts to communication: the communication channel and the message itself.

Information theory is concerned with the communication channel. The message, for all intents and purposes, is irrelevant. The reason the message doesn’t matter is that it’s generic. Web content is text, video, and audio. The specifics of those content types don’t have any bearing on how the Internet works.

The channel itself has a maximum capacity (or maximum bandwidth). That capacity breaks down into three parts:

  1. the amount of the channel used by the signal
  2. the amount of the channel used by noise
  3. the remaining unused channel capacity

If you assume the channel is running at 100% of its capacity, all you’re left with is signal and noise. Or, more abstractly, order and disorder.
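To make that breakdown concrete, here’s a tiny sketch with made-up numbers (they aren’t from the post, just an illustration of the three parts and of what happens when there’s no headroom left):

```python
# Hypothetical numbers, just to make the three parts concrete.
capacity = 100.0   # the channel's maximum capacity, treated as 100%
signal = 60.0      # the portion carrying the message
noise = 25.0       # the portion lost to noise
unused = capacity - signal - noise   # whatever remains is unused headroom

print(f"signal {signal:.0f}%, noise {noise:.0f}%, unused {unused:.0f}%")

# If the channel is running at full capacity (no headroom left),
# everything that isn't signal is noise -- order and disorder.
noise_at_full_use = capacity - signal
print(f"at full use: signal {signal:.0f}%, noise {noise_at_full_use:.0f}%")
```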

That’s why we dragged the concept of entropy into the discussion in the first place (Ben’s awesome entropy in UX podcast). At a practical level, defining a website’s critical path is the same thing as creating order from disorder.

We know that entropy is the tendency of things to go from order to disorder. But it’s also a measure of the system’s inability to do work.

A website’s job is to do work. “Work” in this case is defined as “the process of accomplishing the website’s goals”. If you lay out your critical path, you should be able to generate a number for each page in the process that is the probability that somebody will leave that page and not continue any farther along the critical path.

Lower that number and you will have reduced the website’s entropy. You will also have strengthened the website’s signal and lowered its noise. You will have a stronger critical path and additional leads or revenue to show for it.
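Here’s a minimal sketch of that idea in Python. The page names and leave-probabilities are hypothetical, not measurements from any real site; the point is just that multiplying each page’s continue-probability gives the odds a visitor finishes the critical path, so lowering any single page’s leave-probability lowers the path’s overall entropy.

```python
# Hypothetical critical path: for each page, the probability that a
# visitor leaves there instead of continuing to the next step.
critical_path = {
    "catalog": 0.40,
    "product": 0.50,
    "cart": 0.30,
    "checkout": 0.20,
}

# Probability that a visitor who starts the path completes it:
# the product of every page's "continue" probability.
p_complete = 1.0
for page, p_leave in critical_path.items():
    p_complete *= (1.0 - p_leave)

# Entropy in the post's sense: the likelihood a visitor does NOT finish.
p_abandon = 1.0 - p_complete

print(f"Completion probability: {p_complete:.1%}")      # 16.8%
print(f"Path entropy (abandonment): {p_abandon:.1%}")   # 83.2%
```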

2. Quick Tests Can Be Valuable


User testing in a Flash? User testing BY THE FLASH! Ben, I think we found our gimmick.

As we talked about a bit in #4, Prove vs. Improve, the real shocker about user testing is that it can be done really quickly and with all kinds of leading questions. Essentially, think of a professional yet scientifically invalid test and it’s probably good enough for a basic user test.

The takeaway is another Krug-ism: “Test early and test often”.

1. There’s a Great UX Community

When we started this site it was a selfish endeavor to learn about web usability. I only use the word ‘selfish’ to mean that we did all of this for our own gratification. Yes, we hoped that what we put together would resonate with others. Yes, we hoped we’d find a cool existing community of people who are into UX that we could become part of, learn from, and contribute to. And yes, we hoped to reach out to a web UX company here and there.

What we found exceeded our best expectations. Universally, the UX community seems to be made up of good folks who love to learn, are striving to get better, and who generally have a groovy outlook on things.

We’ve been lucky enough to talk to Rafael from Feng-GUI, Paul from Usabilla, and Rachit from Inspectlet. They were very generous with their time and their wisdom and we appreciate them taking the time to talk with us. Check out our Blogroll and our Twitter followees to get to know our community.

In the coming weeks and months we hope to talk to loads more. We want to talk to people passionate about building a better user experience. Learning how to build sites that give users a better experience is what motivates us to do this website, to write these posts and to do the podcasts that we do. We’re fighting entropy and we’ve learned that we’re not alone.

There’s a whole wonderful community of developers, designers, interface experts and UX tool makers who are all fighting the good fight. We’re honored to be a part of it.


Sell More on Your Website by Understanding a Bit About Entropy

October 19, 2011

If you listened to Monday’s podcast, you know that I’m on a bit of a kick about entropy. It has to do with this book I’m reading called The Information, by James Gleick. The book is a history of information and the rise of information theory. Really good stuff. And he spends a good bit of time going on about entropy.

Now, entropy is a bit of a scary word. It has a kind of intrinsic feel to it, like we have some intuitive sense of what it means, but when it comes to spitting out a specific definition, we all turn into karaoke performers trying to sight-read Snow’s song “Informer”.

Hello? 1992 called. They want their metaphor back.

Entropy, simply, is a measure of how much of a closed system’s energy is unavailable to do work.

At least that was its original definition. It was first proposed by Rudolf Clausius and described a specific property of thermodynamic systems.

Energy as Information

James Clerk Maxwell was the first to link this quantity to the idea of information. See, he looked at it as order and disorder. Order and disorder imply knowledge. To make order you must know something about the thing you are ordering. In his mind, it involved a little demon who controlled a very tiny door between two rooms. In one room were fast-moving molecules. In the other room were slow-moving molecules. And the demon decided which molecules got through the door. While he was sitting at the door he could choose to mix the molecules or keep them separate. But, because of the laws of thermodynamics, if he were to just open the door, after a period of time of fast molecules bumping into slow molecules, every molecule would more or less be moving at the same speed*.

Makes a great Halloween costume.

With the help of Maxwell’s Demon, entropy was now linked to information.

The second law of thermodynamics says that entropy is always increasing. This means that without intervention, everything moves from order to disorder. Or to put that another way, from specific to general.

If you’ve ever been to a business meeting, you’ve seen this before: Interesting, dynamic ideas often get presented at the start of a meeting and boring, mediocre ones often end them.  Sweet, sweet car designs are presented at automobile shows and then the same boring sedans are cranked out year after year. Windows XP was supplanted by Windows Vista.

Entropy is everywhere.

A case can be made that what made Steve Jobs great was his ability to fight entropy in the extreme. Before him, computers were for governments, science, and business. Because of him we can talk to our phone using natural language and it can respond to our information needs. For sure, he didn’t do this alone or in a vacuum. But it’s hard to deny that he brought information to the masses in a way that had never been seen or experienced before.

He gave people the tools to be able to manipulate information – to create order out of disorder. He created the technological environment that we are now living in.  Would there be an Android without an iPhone?  Would Windows 7 be half the OS it is now without having to compete with OSX?

Dig a little deeper into social behavior and two themes for how people deal with entropy begin to emerge.

People’s Relationship to Entropy

  1. They want to create order from disorder
  2. Not all the time

Now let’s run those two rules through a “customer” filter and see what happens.

Customer’s Relationship to Entropy

  1. People have a finite amount of energy to spend in a day
  2. As a result, people want to conserve their energy
  3. People want to expend energy on activities of their choice
  4. People do not want to spend their energy unnecessarily

And like that we’re out of thermodynamics and into the world of web design.

Common Sense Stuff

What a customer is saying is: I’ll buy your product or service if I like it and the price but I’m not going to spend a lot of energy to do it.

Now we can state the goal of web design in scientific terms:

The goal of web design is to produce a website with low entropy.

That is to say, a web design is successful when it makes it easy for a person to do what they want to with as little effort as possible.

And this can be measured.  Right now.  In fact, you may already be measuring it.

Entropy and Efficiency

Look at the number of visitors to your site. Look at the number of sales. Now, do that for the past six months, or year, or two years and get an average ratio of sales to visits. Whatever that percentage is, that’s how well your website has worked over that period of time.

Let’s say, for the sake of argument that you have 100 visitors to your website each month, on average, for the past year. And in that time, on average, you had 2 sales each month.  Simple math will tell you that your website has a 2% conversion rate.  That is to say, it is two-percent efficient.

The goal of user testing is to discover changes that can be made that will increase the number of sales on your website, given the same number of visitors.

If user testing is conducted and changes are made to the website in the above example and over the next several months the website averaged 4 sales per month, the website would have doubled its efficiency from 2% to 4%.
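Spelled out in code, using the same made-up numbers from the example above:

```python
# Worked version of the example above (illustrative numbers only).
def conversion_rate(visitors, sales):
    """Sales divided by visitors: the site's efficiency."""
    return sales / visitors

before = conversion_rate(visitors=100, sales=2)  # 0.02 -> 2% efficient
after = conversion_rate(visitors=100, sales=4)   # 0.04 -> 4% efficient

print(f"Before user testing: {before:.0%}")
print(f"After the changes:   {after:.0%}")
print(f"Efficiency improved by a factor of {after / before:.1f}")  # 2.0
```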

Efficiency is directly related to entropy. Entropy, remember, is about order and disorder. The more order we bring to the website, the less energy the visitor has to expend to buy the product or service and the more efficient it is.

The reason we as designers should talk about entropy rather than efficiency is that efficiency is a by-product of entropy, not the other way around. Entropy is by its nature probabilistic. The more knowledge you have about the pages of your website, the more effect you can have on reducing entropy – you become like Maxwell’s demon deciding which molecules to let through the door.

Practically Speaking

Every page on your website that somebody can find via search or a link is a potential entry point. Likewise, every page is a click away from being an exit point. It’s all very messy and random.

The job of web designers, programmers, interface designers, and SEO people is to give shape to those pages.

SEO is responsible for managing the website’s relationship with search engines. Another way to think about it is that the SEO person is responsible for getting traffic to the website. In a closed system, an SEO person wouldn’t be necessary. But the website exists in the larger ecosystem of the Internet, and so messaging extends beyond your website. The SEO person, because of that connection to traffic, is the first to set expectations for your website’s visitors.

Designers and programmers work to bring shape to the website. E-commerce sites have catalog pages, product description pages, a cart, and a checkout process – and they show up in that order.

Information sites like Google, YouTube, and Wikipedia are designed so that information can be easily found and accessed.

From disorder emerges order.

On this website we’ve spent a good bit of time talking about defining a website’s critical path. We believe that user testing should revolve around improving the efficiency of that path.  It’s important to remember that it’s a literal path. It is about energy flow.

Entropy, for a website, can be defined as the likelihood that a visitor to a website will NOT complete the critical path.

Fighting entropy on a website means giving form to and then reducing the resistance of the critical path.

This is why a conversion funnel is such a valuable web analytics tool.  It shows entry and exit points with respect to the critical path.  It points out to you places where user testing could reduce entropy.
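As a rough sketch of how you might read a funnel this way, here’s a bit of Python over hypothetical step counts (not real analytics data). It computes the drop-off between each pair of steps on the critical path and flags the biggest leak, which is where user testing is most likely to buy you an entropy reduction.

```python
# Hypothetical funnel: how many visitors reached each step of the
# critical path during some reporting period.
funnel = [
    ("landing", 1000),
    ("product", 450),
    ("cart", 180),
    ("checkout", 60),
    ("purchase", 40),
]

# Drop-off between consecutive steps: the share of visitors who exit
# the critical path instead of continuing.
drop_offs = []
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    drop = 1.0 - next_count / count
    drop_offs.append((step, next_step, drop))
    print(f"{step} -> {next_step}: {drop:.0%} drop-off")

worst = max(drop_offs, key=lambda d: d[2])
print(f"Biggest leak: {worst[0]} -> {worst[1]} at {worst[2]:.0%} -- "
      "a good place to point user testing.")
```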

On Friday, we’re going to take a look at the third rail of web design: pricing.

Nothing introduces entropy into a website quite like pricing. Money is really a physical manifestation of a person’s energy. They know that they have to expend a certain amount of energy to accumulate money. Money, like energy, is also finite for most people. Thus, pricing is directly related to energy, and thus to entropy.

We’re going to take a look at some pricing strategies that can reduce entropy and increase the odds that your site’s visitors will respond positively to your price point.

#####

* I say more or less because it’s not practical for the average person to know the behavior of every molecule. So what has risen in its place are laws of probability. That is to say, while a closed system tends towards maximum entropy, at the molecular level there will be exceptions to this rule. Extremely unlikely events still happen. But at the macro level, these probabilities are so low as to be practically non-existent.

Better User Experience Podcast #10: Entropy and Web Usability

October 17, 2011

Newman is, no kidding, on a five day canoe trip down the Suwannee River in Florida. It’s enough to make you want to break out in song, I swear…

In lieu of lining up another special guest this week I decided to pull a one-man show and talk about everybody’s favorite topic: entropy.

Wait! Hold that yawn! I swear, I’m going to clarify things for you and turn around and show you why it’s a big deal for web usability.

If you haven’t already, be sure to subscribe to our podcast on iTunes – new episodes are available every Monday!

And, if you’re interested, you can read more about entropy and information theory in the book The Information by James Gleick.

Podcast Corrections

  • I refer to Rudolf Clausius as Clausius Rudolf at least once in the podcast. My bad. His name is actually Rudolf Clausius.

Podcast #5: Information Theory and ‘It From Bit’

September 14, 2011

It’s late Wednesday – later than I’d like for my weekly blog post but I have an excuse! Newman and I ended up having a long conversation instead. We taped it, called it a podcast, and now, here it is for your aural enjoyment.

Regular readers know that new podcasts are a Monday thing and this is clearly not Monday.

And you are correct! Way to know your days of the week.

Today’s topic, though, is messy. It’s better explained through a series of ramblings than through a discrete article. In fact, the basis of the podcast is about that very thought. Newman and I talk about the book The Information by James Gleick, or at least about the prologue.

You know a book is going to be good if you can do 45 minutes on something you read from the chapter before the first chapter. If you want to “Take A Look Inside the Book” on Amazon, you can do that here.

We also refer to a few TED Talks, which you can find below.

TED Talk: David Christian: Big History

TED Talk: Aaron O’Connell: Making sense of a visible quantum object

Enjoy the podcast (subscribe!) and be sure to join us on Friday for Newman’s epic blog post where he compares fivesecondtest.com to Feng-GUI. It may be a tag-team match, but you’ll have to wait until Friday to know exactly what that means. In the meantime, hit us up in the comments.