News sites, large and small, can measure usability

Steve Stewart is Internet supervisor for The Decatur (Ala.) Daily. While earning a master’s degree from the University of Alabama, he conducted usability studies under the direction of Wilson Lowrey, associate professor of journalism.

What small news organization can afford the time or money needed to enlist readers in planning and evaluating a website? Usability testing is a grand idea, but isn’t it too much to take on while you’re struggling just to get out the daily paper?

Better questions might be: Who can afford not to measure usability? (Nobody.) And can it be done without requiring too much time and expense? (Yes — with both usability testing and less-formal techniques.)

Usability means user-friendliness, and a bedrock principle of Web usability is “Don’t make me think.”

That might seem odd when applied to a news site, where we expect readers to look for things to think about. And they will think about topics that interest them. But first we must catch and keep their attention — and on the Internet, attention is fleeting.

“Don’t Make Me Think” is the title of usability consultant Steve Krug’s book. “[W]hen I look at a Web page it should be self-evident. Obvious. Self-explanatory,” Krug wrote. “I should be able to ‘get it’ — what it is and how to use it — without expending any effort thinking about it.”

Another usability consultant, Jakob Nielsen, explained the consequences of making Web readers think.

“There’s no such thing as a user reading a website manual or otherwise spending much time trying to figure out an interface,” Nielsen wrote in his online “Alertbox” column. “… [L]eaving is the first line of defense when users encounter a difficulty.”

Laura Ruel and Nora Paul provided a guide to do-it-yourself usability testing of news sites in OJR in August. You can find plenty of other usability articles on the Internet and in print, but little that addresses news-site usability specifically. And, aside from the OJR article, there’s almost no help for the small paper with limited resources (85 percent of all the dailies in the country have a print circulation of fewer than 50,000).

Usable newspapers — and sites

In a 2006 survey, the Pew Research Center for the People and the Press found that 46 percent of readers considered print newspapers usable. They cited a newspaper’s convenience, the ability to read it any time and anywhere, and a preference for hard copy over audio or video. Readers also liked the fact that they can read the newspaper at their own pace, reread and let it sink in.

Krug has a generally high opinion of the usability of newspaper websites.

“I think by and large, they are pretty usable,” he said in an interview. “There’s not much that they have to do. They have to expose you to the list of headlines and organize them by sections so you can browse them the same way you browse the newspaper.”

Usability consultant Jared Spool said large papers generally do a better job on their websites than small papers. Each newspaper needs a vision for its own site, he stressed.

Using usability in a redesign

The newspaper where I work — The Decatur (Ala.) Daily, with a weekday print circulation of about 20,500 — set out to modernize its online edition and make it resemble the recently redesigned newspaper. We sought Web users’ help.

We asked them to send e-mails, we posted an online questionnaire, and we invited 16 users to come to the newspaper office and test our site. We also looked at metrics — statistics about site traffic. We learned a few things about how to measure usability, and we received practical advice to enlighten us and our consultants.

When the e-mails and questionnaire responses started coming in, the first shock was that many existing readers did not want our site to change. They already knew how to use it.

“Please don’t change your website,” one reader wrote. “So many companies think they must change every year or so. They don’t understand how mad people get when they are forced to relearn how to navigate a new site. If it ain’t broke, don’t fix it!!!”

That last sentence was a frequent refrain among about 185 e-mails we received, and it also showed up in questionnaire responses. But all three methods also produced constructive criticism — for example, impressing on us that we needed to improve our search feature, to reduce clutter, and to use screen space better.

We received about 1,400 questionnaire responses, and we enlisted 16 usability testers — a mix of current and potential readers — to visit our office and evaluate the site.

Usability testing

We asked eight readers to conduct usability tests on the old site. Months later, after developing a prototype of the new site, we asked eight additional readers to do back-to-back tests of the old site and the prototype.

Half of the testers were regular readers of the newspaper’s site, and half were not. Each came to the newspaper office for an hour or less and tried out certain features of the site at my request. I asked them to show me how they used the site and to think aloud, loosely following a procedure that Steve Krug described in his book.

How it helped

Our findings helped shape the development of our newspaper’s new site. The new design is largely the work of our consultants. But our interaction with readers helped us tweak the site, avoiding problems and omissions that otherwise might have shown up only after the new site went online.

For example, we changed some menu words, added items to the main menu and created additional sub-menus to make it easier for readers to find specific news and features.

We are already learning more from the ultimate usability test, which is what readers at large like and dislike on the new site. They are sending e-mails that have shown, among other things, that we needed a page summarizing all the news stories for a particular day, as well as easier access to the archives. These are features from the old site that regular readers liked.

We may find that additional usability testing will help us develop solutions to problems before making changes online. That’s what usability experts recommend: an endless cycle of testing and improving the site.

Comparing the methods

Each of the usability measurements tried here has its strengths.

The questionnaire gave us the most numbers, though not a statistical sample because the respondents were self-selected. The questionnaire and e-mails gave us insight into why users reacted the way they did, but the usability tests were the most helpful because they allowed us to watch people use the site, listen to them and talk with them. What they did sometimes shed light on what they said, and sometimes even contradicted it.

“I would probably rather do this than have to fill out forms, written responses, that kind of thing,” one usability tester commented.

Colors are one thing people might notice in a usability test but ignore in a questionnaire or e-mail. People didn’t even mention colors in the questionnaire results and e-mails, but usability testers mentioned them a lot, especially in comparing the bright colors of our old site with the pastels on our new one.

Usability tests let you involve people who don’t normally read your site; you can recruit them through mutual acquaintances and advertisements in other media. Watching and hearing people express delight with your work (if it happens) is one of the rewards of usability testing.

“Well, call me a taxi!” one user exclaimed to express pleasant surprise. “I like that…. Very good. Just very good.”

Evaluating usability tests is subjective, however, and different evaluators may reach different conclusions. Usability tests also require more time and preparation than the other methods.

The questionnaire and e-mails let us hear from many more people than the usability tests. They could respond at any time from anywhere. But these people already were readers of our site — not the non-readers whom we’d like to attract. Occasionally we received comments from people who were journalists, Web experts or computer experts, offering specialized advice.

We found that people tend to complain in questionnaire responses and e-mails, and sometimes they comment more on content than usability. They also resist change.

In a questionnaire, you can ask both open-ended and structured questions, and online-survey tools are easily available and flexible. But if respondents complain or ask a question, you can’t answer them unless they provide a name or e-mail address.

E-mail feedback provides at least the possibility of interaction; you can send readers a reply or follow-up question and hope they’ll respond.

We had just a little experience with site metrics — enough to see that metrics could add some science to usability measurement by providing hard numbers that would corroborate or discredit conclusions drawn from other usability measurements. But making this happen will require more experience and research with metrics, and perhaps better software for interpreting site traffic.

“The [metric] that’s probably the most useful is what people are typing into your search box,” Jared Spool said. When people search for something, it’s because they can’t find it on the current page. “If you know what page they were on when they typed it, they’re telling you what page it should have been on.”
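Spool’s suggestion can be prototyped with very little code. The sketch below assumes a simplified, hypothetical search log of (page, query) pairs; a real site would extract those pairs from its server logs or analytics export, but the tallying logic would be the same.

```python
from collections import Counter, defaultdict

# Hypothetical search-log records: (page the user was on, query typed).
# A real site would pull these from server logs or an analytics export.
search_log = [
    ("/news/local", "obituaries"),
    ("/news/local", "obituaries"),
    ("/index", "archives"),
    ("/news/local", "school board"),
    ("/index", "obituaries"),
]

# Tally which queries are typed from which pages: a query that keeps
# showing up on one page suggests a link that page should carry.
queries_by_page = defaultdict(Counter)
for page, query in search_log:
    queries_by_page[page][query] += 1

for page, counts in sorted(queries_by_page.items()):
    top_query, n = counts.most_common(1)[0]
    print(f"{page}: most-searched term is '{top_query}' ({n} searches)")
```

Run against real logs, a report like this points directly at missing links: in the made-up data above, readers on the local-news page keep searching for obituaries, so that page probably needs an obituaries link.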

Metrics can show what positioning works best for particular kinds of features, Krug said. If something is getting no clicks in one spot and you move it somewhere else, where it receives clicks, then metrics will tell you the move was a success.
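Krug’s before-and-after comparison boils down to a click-through rate: clicks divided by page views, computed for each placement. A minimal sketch with made-up numbers (real figures would come from your traffic software):

```python
# Hypothetical click metrics for the same feature in two positions.
# The numbers are illustrative, not real traffic data.
before = {"impressions": 12000, "clicks": 18}   # feature buried mid-page
after = {"impressions": 11500, "clicks": 161}   # feature moved near the top

def click_rate(stats):
    """Click-through rate: the share of page views that produced a click."""
    return stats["clicks"] / stats["impressions"]

print(f"Before the move: {click_rate(before):.2%}")
print(f"After the move:  {click_rate(after):.2%}")
```

Comparing the two rates rather than raw click counts matters, since the number of page views rarely stays constant between the two periods.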

Metrics are “getting a lot more valuable” because in the past, “any tool that would do a decent amount of analysis for you was expensive,” as well as unreliable, Krug said.

That’s changing with such tools as Google Analytics. It’s high-powered and free, Krug said, and can provide “enormous amounts” of information on what kinds of stories are being read and the paths people take through your site.

It’s not expensive

For a small newspaper such as ours, the techniques we used have the advantage of requiring not much money — mostly time.

For the usability tests, we found a small room that wasn’t being used every day and set up there. All we needed was a computer with an Internet connection and (optionally) a digital video camera, available for about $100, to keep track of the mouse cursor and the changing screen and to record the sound.

We obtained the names of possible usability testers by advertising (in the newspaper and online) for people to contact us and help us improve the website, and also by word of mouth. We phoned individuals and invited them to come into the office for an hour or less for the testing. At least half of those we called were willing; it did not take long to find enough testers. We rewarded the testers with pizza coupons and profuse thanks.

We set up the online questionnaire through SurveyMonkey.com (one of several available services) and paid about $20 a month. Using SurveyMonkey’s tools, it was easy to construct the survey with both multiple-choice and open-ended questions. It was also easy to retrieve and analyze the results. We attracted readers to the survey by placing a notice and a link at the top of our home page.

The e-mail feedback resulted from our online and print ads that asked for help in improving the website. Metrics software came as one of the services provided by our Internet host.

Get readers involved

There are other ways to get usability advice from readers, including interviews, focus groups, paper prototyping, card sorts, and eye-tracking (which has been done extensively on news sites by the Poynter Institute and its partners).

But the point is to get readers involved. And as Jared Spool says, it doesn’t need to be elaborate, and it doesn’t need to be expensive.

“Take a laptop and go some place where the users are easily found,” he advised. “Just go hang out in Starbucks and buy people coffee.”

Benefits beyond usability

Our usability testing provided many comments that ostensibly had little to do with usability, but which told us what readers thought about our content. And content is not exactly extraneous to usability, as both Krug and Spool said.

“I think they are merging,” Krug said. “Usability, particularly of home pages, suffers greatly because you run into the business issue of ‘What do we have to feature on our home page to stay in business?’” As a result, the home page becomes cluttered.

Content and usability also overlap on the issue of advertising, Spool said.

“The advertisements have always been traditionally designed to distract the readers from what they’re doing. So readers are trying to concentrate on the content, but the ads get in their way. … From a usability perspective, users would think the online paper is better if all the ads went away tomorrow. They wouldn’t miss them at all. And this is a problem. This is a huge problem.”

One thing that helps, Krug said, is to place ads next to related stories, letting the content draw readers rather than compete.

In addition to what you learn about usability and content, you will generate goodwill among readers and potential readers.

After readers spent two hours or so traveling to and from the newspaper office and testing the site for us, we owed them thanks. But almost all of them were happy to be asked for their opinions and happy to help us.

“I think you all are to be applauded for getting strangers and listening to individuals’ thoughts and having them do things,” one of them said.

One usability tester wrote us a thank-you note. And perhaps our most tangible reward came when another decided to subscribe to the paper.