UXasm

Some Stuff About User Experience, eCommerce, Social Media & etc.

Tag Archives: usability testing

A Framework for Site Reviews (with Examples)

Over the past decade, I’ve been part of many website reviews, both in person as a consultant (prior to 2009) and at many events. I’ve found that much of the time the reviews themselves lack structure (particularly those that happen “on the fly” during a conference panel or informal sit-down). Thankfully, my recent face-off with Distilled’s Will Critchlow in London gave me an excuse to noodle on that and work up some ideas.

The Searchlove conference had a unique concept for our classic presentation battle. We were each given three websites to review around 12:30pm and had to give 30-minute presentations, slide decks and all, four hours later. My will to win and avenge my depressing loss at Mozcon Seattle was stronger than my jetlag, and I gave the following presentation:

Read more of this post

Test Everything You Got, Regardless of Its Polish or Fidelity

Sketches, wireframes, et al. — all worthy of testing

Whether you test your work on a regular cadence or only once or twice per cycle, the inevitable question is what to actually test. We wrestle with the pressure of maximizing the time and money spent on testing and getting the most insight for that expense. Is it best to put a rough sketch of an idea in front of potential or existing customers, or to wait until there’s a more fleshed-out version to show? Should it be clickable (really clickable, i.e., working code) or a mocked-up experience created with Axure, PowerPoint, Fireworks, or any other tool?

Read more of this post

The Science Of Usability Testing

From unskippable cutscenes to galvanic skin response, we investigate the world of videogame user research.

Difficulty spikes, unreliable checkpoints, context-sensitive buttons that might open a door, but might bounce a grenade into your lap instead: these things matter. “Every moment in a game, you’re bleeding players,” says John Hopson, Bungie’s user research lead. “Hopefully, you’re bleeding them as slowly as possible. The most powerful thing I ever did on Halo was make a graph showing how many players we lost each mission. We had these people: they bought the game, they wanted to play, and we failed them.”
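To make Hopson’s graph concrete: the idea is simply to count, for each mission, how many players started it and how many never reached the next one. The sketch below is a hypothetical illustration (not Bungie’s tooling, and the telemetry format is assumed), written in Python:

    from collections import Counter

    # Hypothetical telemetry: the furthest mission each player reached.
    furthest_mission = [1, 3, 3, 2, 5, 1, 4, 5, 2, 3]

    stopped_at = Counter(furthest_mission)   # players whose last mission was m
    remaining = len(furthest_mission)        # everyone starts mission 1

    for m in range(1, max(furthest_mission) + 1):
        lost = stopped_at[m]
        print(f"Mission {m}: {remaining} started, {lost} lost "
              f"({lost / remaining:.0%} of those who reached it)")
        remaining -= lost                    # survivors go on to mission m + 1

Plotting those per-mission counts gives the kind of loss curve Hopson describes: each mission’s bar shows how many players you failed at that point.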

Usability testing didn’t start with videogames. It started with product development of a more domestic stripe: with teapots, toasters and car dashboards. Although designers have spared a thought for their audiences since the days of Jet Set Willy – it’s hard to make even the simplest videogame without thinking about what the player is going to do or see from one second to the next – usability has only become a serious issue in the games industry relatively recently. And with no bespoke track at GDC, no standardised terminology, and no agreed best practices, it may be gaining respectability, but it remains one of the least understood aspects of design. That poses some interesting questions. How does the industry approach user research today, and why has something so fundamental waited so long to be taken seriously?

Read more of this post

12 Website Usability Testing Myths

The internet is a wonderful, magical place filled with more amazing content than you could shake a stick at; it holds an almost unimaginable wealth of resources on a huge array of topics, and more or less anything you can think of exists somewhere on it.

The problem, though, is not that there is too much content, nor that there are too many sites; it’s that the vast majority of sites and services suffer from usability issues that make them anything from difficult and frustrating to downright unpleasant to use. I’m sure you can think of a number of sites off the top of your head that fit into these categories.

Unfortunately, a number of myths are floating about claiming that improving usability takes too long, costs too much, or doesn’t really do anything useful for these sites and services. As someone who works on a website usability testing tool, I hear these myths far too often, and I’d like to dispel them permanently.

Read on to see 12 Website Usability Testing Myths, and why they are wrong:

Read more of this post

Pairing Up Usability Testing with A/B Testing

Traditional usability testing involves several steps: creating screeners, sending out recruitment email messages, scheduling sessions, creating test scripts, conducting the test sessions, consolidating the findings, and making design recommendations. Large-scale usability testing can run for months, which is a big investment in time, money, and effort.

One of the challenges usability professionals constantly face is showing the value of usability testing through quantifiable results. Convincing a client to invest tens of thousands of dollars in usability testing often requires concrete numbers that show what the return on that investment will be. A client might say, “Yes, the usability testing will tell me how horrible the Web site is and how much users dislike filling out the forms on the Web site. So what?” From the client’s perspective, the real question is: How do I know the usability testing will help me meet my business goals of making more sales, lowering costs, and increasing conversions? Just saying that the user-satisfaction rate will go up is not enough.

A report from Forrester titled “Need to Cut Costs? Improve the Web Site Experience” lists a few easy-to-measure indicators that reliably show the benefits of usability testing. These indicators include fewer customer-service calls about products and Web sites and shorter calls regarding complex issues. We can link changes in such indicators to improvements in a Web site’s user interface. However, many other factors can lead to such changes, such as better product promotion, better support documentation, better internal communication, or simply a decrease in sales. As usability professionals, we need a clear way of showing what we’ve learned from usability testing, how our recommended changes have been implemented, and what positive outcomes have occurred because of those changes.
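This is exactly where pairing usability testing with A/B testing helps: the usability sessions tell you what to change, and the A/B test puts a defensible number on the effect of the change. As a rough illustration (the figures and the helper function below are hypothetical, not from this article or the Forrester report), a standard two-proportion z-test can show whether a redesigned form actually lifted conversions:

    from math import sqrt, erf

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Compare conversion counts for control (A) and redesign (B)."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return p_a, p_b, z, p_value

    # Hypothetical results: 10,000 visitors per variant.
    p_a, p_b, z, p = two_proportion_z_test(conv_a=320, n_a=10_000,
                                            conv_b=385, n_b=10_000)
    print(f"Control: {p_a:.2%}, redesign: {p_b:.2%}, z = {z:.2f}, p = {p:.3f}")

A result like that is the concrete number the skeptical client is asking for: not “users like the forms more,” but “the redesigned form converts 3.85% of visitors instead of 3.20%, and the difference is unlikely to be noise.”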

Read more of this post