The only metric that matters

In the nearly 15 years that I’ve been working online, I’ve watched the most popular metric among Web publishers change from “hits” to “page views” to “unique visitors” to “time on site.”

But none of those metrics really matter. I’ve seen sites post phenomenal numbers in each of those categories, and fail. There’s one metric, and only one, that truly matters in determining your website’s commercial success.

Revenue.

Your visitors can spend hours per month on your website, but a huge “time on site” value by itself won’t entitle you to a dime (see Twitter). I suspect that one reason various Web metrics fall into and out of favor over the years is that managers talk those metrics up or down based on their own website’s performance. Someone notices that people are spending more time, on average, on the website; then he or she gets on a panel at a news industry conference and – boom – “time on site” becomes the metric everyone needs to consider.

That’s nice for such sites, but, ultimately, nothing matters but money. (Some might publish for other reasons – to exert influence or win votes, for example – but for most of the news publishers who read OJR, the ultimate criterion for a publication is whether it makes enough money to justify remaining in business.)

Non-profits might consider themselves exempt from this rule, but non-profit doesn’t mean non-revenue. Try running a non-profit without donations, grant funding or a trust fund sometime. If content on a non-profit website doesn’t entice funders to continue funding it, that publication won’t last long, either.

Money matters. To all news publishers.

Other metrics can mislead you. “Hits” counts nothing more than requests to your server, a number that can be grossly inflated by the multiple images, style sheets, scripts and other elements on each webpage. “Page views” can be sharply reduced on sites that use advanced scripting and DHTML techniques, allowing pages to “change” in their users’ browsers without requesting another page from the server. “Unique visitors” favors sites with huge “drive-by” traffic, including those where readers don’t stay long enough to read or to click anything. And a large “time on site” value means that readers are spending, well, a lot of time on your site. Which sounds great, until you consider that those readers aren’t clicking away from your site via your advertisers’ ads.

So when you’re using metrics to examine the efficacy of your website and its various sections, features and pages, do so with an eye on how those elements contribute to your site’s overall revenue.

I look at the eCPM (earnings per thousand impressions) of each page and section of my websites. But I don’t consider that metric in isolation. The pages on your website are a team. They should be working together to attract readers to your site, engage them to become loyal return visitors, then deliver them efficiently to revenue-producing pages, whether those be advertisers’ websites, donation forms or e-commerce features.
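As a rough illustration of the arithmetic, eCPM is simply earnings divided by impressions, scaled to a thousand. All section names and dollar figures below are invented for the example:

```python
# Hypothetical illustration of eCPM (earnings per thousand ad impressions)
# per section of a site. All names and figures are invented.

def ecpm(earnings, impressions):
    """Earnings per thousand ad impressions."""
    if impressions == 0:
        return 0.0
    return earnings / impressions * 1000

sections = {
    "homepage": {"earnings": 420.00, "impressions": 350_000},
    "forums":   {"earnings": 95.50,  "impressions": 210_000},
    "archives": {"earnings": 12.25,  "impressions": 80_000},
}

for name, stats in sections.items():
    print(f"{name}: ${ecpm(stats['earnings'], stats['impressions']):.2f} eCPM")
```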

So I gladly accept low eCPMs on “gateway” pages that are bringing large numbers of readers to my site, either from search engines or other websites, so long as those readers eventually are continuing on to other, revenue-producing pages on the site.

What you don’t want to do, however, is waste ad impressions on pages where readers aren’t converting. I look through my metrics reports for sections with far-below-average click-through and eCPM rates, then remove the ads from those sections. Typically, I end up running registration and reader profile pages, comment and blog input forms, and most “rules and regulations”-type pages ad free. Removing low-performing pages from your ad inventory helps support higher click-through and eCPM values throughout the rest of the site, potentially improving your overall pay rates in third-party ad networks, such as Google’s AdSense.
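A minimal sketch of that pruning step, assuming a simple rule of thumb (the 50 percent threshold and the figures are my own illustrative assumptions, not industry standards):

```python
# Hypothetical sketch: flag sections whose eCPM falls far below the site
# average as candidates for running ad free. Threshold and figures are
# illustrative assumptions.

def sections_to_run_ad_free(section_ecpm, threshold=0.5):
    """Return sections earning less than `threshold` times the average eCPM."""
    average = sum(section_ecpm.values()) / len(section_ecpm)
    return sorted(s for s, v in section_ecpm.items() if v < threshold * average)

flagged = sections_to_run_ad_free({
    "news": 2.40,
    "registration": 0.10,
    "rules": 0.05,
    "blogs": 1.80,
})
print(flagged)  # the registration and rules pages fall below half the average
```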

I also remove ads from popular “gateway” pages with high bounce rates, meaning that new visitors aren’t clicking elsewhere on the site after arriving, but simply hitting the back button to return from where they came. In extreme cases, where bounce rates approach 100 percent, I’ve even deleted those pages from my site (or hidden them from search engines) to discourage “drive-by” traffic from people who clearly aren’t interested in my site, its content or advertisers.
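In code, bounce rate is just the share of entrances that end without a second page view; the 90 percent cutoff below is my own assumption for the example, not a standard:

```python
# Illustrative sketch: bounce rate and a crude rule for pulling ads from
# "gateway" pages. The 0.9 cutoff is an assumption for the example.

def bounce_rate(single_page_entrances, total_entrances):
    """Share of entrances where the visitor viewed no second page."""
    if total_entrances == 0:
        return 0.0
    return single_page_entrances / total_entrances

def should_drop_ads(single_page_entrances, total_entrances, cutoff=0.9):
    """Flag pages whose bounce rate approaches 100 percent."""
    return bounce_rate(single_page_entrances, total_entrances) >= cutoff
```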

Finally, I take a hard look at features that earn my highest eCPMs, and consider how I might replicate those successes. Was there something about the format or design of those pages, or the content upon them, that elicited a high percentage of clicks or well-paying ads? Did I hit upon especially lucrative keywords within that page’s topic? Is there high-paying advertiser demand for the topic of the page?

Answers to those questions help me decide what and how to create when building new pages and sections on the site. News publishers have been doing this for ages, even if that process isn’t typically made clear to reporters in the newsroom. Those new outdoor recreation or home decor sections didn’t appear in the paper just because an editor felt inspired. The publisher likely had some compelling metrics demonstrating advertiser demand in those markets. That’s what you should be looking for in your site’s readership data.

Of course, you need data in order to analyze it. That’s why smart news publishers ought to be experimenting, constantly. Try new topics, new writing forms, new functionality – then create new tracking channels to monitor those experiments, to build a database of information that can help guide you in making smarter decisions about the growth and maintenance of your website.

Staking out newspaper survival in Web analytics

This is part two in a two-part series on Web analytics and the future of news. [Part one]

The news industry is caught in a destabilizing position – each newspaper is going to have to come up with its own unique algorithm to give advertisers a sense of their audience.

The new metric that advertisers increasingly care about is something called “engagement” – how users actually interact with and spend time on the site. But because each newspaper website offers unique content, there’s no blanket measure for creating a uniform “engagement” score for the news industry from comScore or Omniture data.

“We can’t boil it down to X percent of unique users plus your time on site plus page views,” said Alan Segal, director of audience development at the Atlanta Journal-Constitution.

He explained that the formula in Atlanta would be different from elsewhere. “Engagement for us looks different for us versus the New York Times,” Segal said. “It depends on your market and what the goals are and how you interact with your community.”

Why engagement? Because it’s a more robust way of looking at the world than just uniques, page views, visits, or clicks per minute.

As Alex Langshur, president of the Web Analytics Association, said, “Measures that reflect audience engagement are more valuable than metrics that just measure raw numbers.”

He explained to me that looking at something like unique visitors alone is comparable to looking at Lehman Bros.’ balance sheets – everything looks the way you want it to, but it doesn’t tell you what’s really going on.

“In the media space, just looking at things like number of visits or number of pages viewed doesn’t tell you a lot about the level of engagement that people have,” he said.

The other problem, as we addressed here, is that uniques, page views, and visits can all be, as Segal said, “gamed.” Here’s a closer look.

Unique Visitors: Messy for a number of reasons. As USC Annenberg lecturer Dana Chinn said, “It’s a deeply flawed metric. It counts computers and not people, so it’s over counted and under counted.” That means, for instance, that one person can check the LA Times on three computers – still one user, but it registers as a unique visitor from each computer. Meanwhile, computers in Web cafes or libraries that are used by multiple people lead to undercounting.

Uniques also depend on a cookie being stored on the user’s computer. “All the research shows that people delete cookies or on a regular basis reject cookies,” Langshur said.

Page Views: Also problematic, but absolutely crucial for advertisers. Page views are literally the number of times someone loads a page of a website.

Langshur said page views are difficult to standardize because of the variety of ways of presenting content. In addition, newspapers can artificially inflate page view counts by breaking up text, or deflate them by loading a special project’s features onto a single page.

Nonetheless, page views are important to advertisers, so news organizations need to reconcile the fact that page views are not standardized with advertisers’ desire to know how many people are seeing their ads.

“From an advertiser’s perspective, page views are important, as this is tied to the number of impressions they might generate for their ad,” Langshur said. “Impressions are important and, depending on the type of ad space used and the ad type served, the first critical step to generating click-throughs.”

Visits: This is another confusing term, but the Web Analytics Association has tried to provide a clear definition to help people understand exactly what this means. This is their definition:

“A visit is an interaction, by an individual, with a website consisting of one or more requests for a page. If an individual has not taken another action (typically additional page views) on the site within a specified time period, the visit will terminate by timing out.”

At this point, 30 minutes seems to be the agreed-upon standard – especially by the Interactive Advertising Bureau, which is attempting to create consistent definitions for advertisers. However, you can configure your analytics software to use a different timeout.
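The timeout rule reads, in code, roughly like this – a sketch assuming you have a single visitor’s page-request timestamps, with the 30-minute figure as the default mentioned above:

```python
from datetime import datetime, timedelta

# Sketch of the visit-timeout rule: a new visit begins whenever the gap
# between a visitor's page requests exceeds the timeout.
SESSION_TIMEOUT = timedelta(minutes=30)

def count_visits(timestamps):
    """Count visits in one visitor's list of page-request timestamps."""
    if not timestamps:
        return 0
    ordered = sorted(timestamps)
    visits = 1
    previous = ordered[0]
    for t in ordered[1:]:
        if t - previous > SESSION_TIMEOUT:
            visits += 1
        previous = t
    return visits
```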

But here’s where things get complicated, according to the Web Analytics Association:

“However, in the case of sites where interaction consists solely of file downloads, streaming media, Flash, or other non-HTML content, a request for this content may or may not be defined as a ‘page’ in a specific Web analytics program but could still be viewed as a valid request as part of a visit. The key is that a visitor interaction with the site is represented.”

Content using Ajax, Flash or widgets isn’t captured as page views by most Web analytics programs, either. This leads to undercounts in page views – a definite problem for ad-supported online publishers.

Visits, according to Langshur, may get closer to engagement, but “a visit on its own does not define engagement,” and visits need to be taken into account with everything else a user does on the site.

Problems with the reporting agencies: A crop of agencies, new and old, has rushed to the Web analytics market for newspapers – some common names include Omniture, comScore, Nielsen, and Google Analytics.

Segal and Chinn point out some problems with the data collection. comScore and Nielsen rely on panel data – a sampling method that tracks the behavior of a selected group of individuals. This is problematic for news organizations for a few reasons, but primarily because most businesses aren’t keen on having employees install tracking software on work computers, and that’s where a lot of online news reading takes place.

Then these numbers don’t match up with Omniture or Google Analytics data – which don’t use panels – and those tools in turn may not take into account the idiosyncrasies of individual websites.

What’s a news organization to do?: If the answer lies in engagement, as we have suggested, this means that news organizations are going to have to come up with their own specific measures for tracking audience behavior and making a valid claim that resonates with advertisers. Gone are the days when a single measure could account for it all, and gone are the days when newsrooms could take a single snapshot of the industry all at once.

Engagement is only a starting point. But it gives news organizations and advertisers a sense of how people are actually using their sites – and it’s a uniquely customizable opportunity that allows news organizations to sell their strengths.

Chinn said, “The Web is trackable. Audiences are knowable. It is up to each advertiser and up to each Web publication to know what their audiences are and what to provide for advertisers.”

Engagement, then, will be a “measure of myriad things based on the kinds of products that a website has,” she said. “There’s all kinds of talk on what is the algorithm for engagement, and the truth is that ‘It depends.’”

Measuring user engagement: Lessons from BusinessWeek

Think about the traffic statistics you refer to when you look at Omniture or Google Analytics data for your site. Unique visitors? Pageviews? What do they actually tell you about your audience? The ubiquitous unique visitor metric treats your most passionate and thorough users exactly the same as those of the one-hit scan-and-scram variety. And pageview tallies are so apples-to-oranges in these days of Flash and AJAX that they’re rendered almost meaningless. If you really want to describe your audience, it’s time for some new metrics.

But what else is there? The folks at BusinessWeek think they have an answer, and it’s not about how much content users consume but rather what they do with it. I asked BusinessWeek’s online editor, John Byrne, about his team’s efforts to go beyond pageviews and visits to quantify something more inscrutable: user engagement.

What is BusinessWeek’s definition of user engagement and why is it important?

User engagement is how we nurture and build a community. Our reader engagement index is a comments-to-postings measure for a given month: So we will tally how many comments on X number of stories/blog posts that BusinessWeek.com published that month. This gives us a ratio figure that we track to determine our monthly reader engagement index and growth. In February of this year, we received from our community 28.2 perspectives and insights for every story or blog post we published. A year earlier, we received 23.7. So we know we’re moving in the right direction.

It’s important because we value, and so measure and gauge, all our interactions with our readers on BusinessWeek.com — including commenting on a story or blog post. The next level is how our writers and editors engage our readers in a conversation, and also welcoming our readers to write longer pieces for us, or to report (at least once a week) a reader-suggested story. We’re also engaging with BW readers on other sites, such as our Ning network that served as a forum to generate and debate stimulus spending priorities for the Obama administration, or interactions involving our 50+ staffers on Twitter. If we don’t listen to our readers and interact with them, and then act on the feedback and suggestions they’re giving us, we’re dead in the water. That applies to any media brand today, not just BusinessWeek. We’re just making it more of a priority, including featuring readers on an equal plane with our writers — on our home page, for example, our featured reader is given more prominence than even a Jack & Suzy Welch.
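The index Byrne describes is, at bottom, a simple ratio. A hypothetical sketch, with monthly totals invented to match the ratios he cites:

```python
# Sketch of a comments-to-postings engagement index. The monthly totals
# below are invented; only the resulting ratios come from the interview.

def engagement_index(total_comments, total_posts):
    """Average number of reader comments per published item."""
    return total_comments / total_posts if total_posts else 0.0

feb_2009 = engagement_index(2820, 100)   # hypothetical totals giving 28.2
feb_2008 = engagement_index(2370, 100)   # hypothetical totals giving 23.7
growth = (feb_2009 - feb_2008) / feb_2008 * 100
print(f"Index grew {growth:.0f}% year over year")
```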

What information sources and tools do you use in measuring engagement?

Our reader engagement index involves Omniture (for stories) and Movable Type to track numbers of blog comments and posts.

Beyond our reader engagement index, other measures include how much you’re retweeted — for instance, one of my tweets on March 25 was retweeted 130 times. We also look at referring traffic from blogs or Twitter on Omniture, or by running a Twitter search: http://search.twitter.com/search?q=businessweek or http://search.twitter.com/search?q=johnabyrne.

We also look at Google BlogSearch or BlogPulse (owned by Nielsen BuzzMetrics) for mentions of businessweek.com: http://www.blogpulse.com/search?query=www.businessweek.com

What do engagement metrics tell you that conventional metrics do not?

It shows, in quantifiable/measurable terms, how much our readers care about us. To post a comment or submit a suggestion is a strong indicator of a BW loyalist, someone we need to nurture and engage and reward. It also tells us how much (and how well) our staffers are interacting with readers. The problem with time spent on a site is that it also measures, in the case of a portal, email time, or in the case of a site heavy on video, time spent watching video, which can be like TV. I also argue that simple pageview metrics are heavily influenced by slideshows and email. There is no better sign of commitment or engagement than the act of reading a substantive piece of journalism, thinking about it and then forming a point of view on that story that you’re willing to write and share with others. That is true engagement.

How do you use this information to improve the site?

You can’t manage something if you don’t measure it. So having a point of reference for exactly how we’re doing drives other ideas and initiatives to increase engagement.

Does increased user engagement translate into benefits for advertisers?

Yes. Anecdotally, our sales team is selling our engagement story and using it to differentiate what we do versus our competition. It also helps to better position our Business Exchange, a new Web 2.0 product we launched last September, as a key component of our engagement efforts.

So, let’s look at some potential engagement metrics and what they might tell us. This is by no means a comprehensive list; it’s just what came to my mind. If you have other thoughts on ways to measure engagement or how you might use this data, please, um, “engage” in the comments below.

  • Internal metrics: Statistics about engagement that takes place on your site
    • Comments posted: Shows how much users are inclined to react to a topic, or supply insights of their own.
    • Return commenters: In other words, how many people comment multiple times on the same item? This is a measure of conversation around a topic. (Kudos to the Guardian’s Kevin Anderson for this idea.)
    • Times e-mailed: Reveals how often users are sharing this information with friends. This metric probably skews toward neophyte users, as more experienced users are presumably less likely to use an “e-mail this” feature.
    • Average time spent on page: Shows how thoroughly users are consuming the content, perhaps? Lots of asterisks, though, as John points out.
  • External metrics: Statistics about how people share and discuss your content elsewhere
    • Tweets/retweets: Measures how “viral” this content is in a social network. There’s also geographic information embedded in these tweets that could tell you where a topic resonates particularly strongly.
    • Diggs: Another measure of the viral nature of a topic. Given Digg’s audience, this metric might favor content that appeals to a techie crowd.
    • Delicious saves: Shows how many users stored this page with an eye toward returning to it. This metric could be particularly useful for ongoing features that you want to build a regular user base for.
    • Inbound links from blogs: Quantifies the discussion taking place in the blogosphere. This could help you identify the blogs that are most attuned to the content you produce — as opposed to just the ones that send you the most visitors (which are not necessarily the most engaged users).

Each of these metrics is easily available for a given URL at a given moment, but keeping track of all your stories over time would be impossible without some automated assistance — particularly with regard to the external metrics.

Here’s what I’d like to see: A web service that will track a URL across several services (Technorati, Delicious, Digg, and maybe internal analytics packages too) to see how it’s being referenced in each medium, then tabulate all those metrics into a single “engagement score”. (And I’d love to hear from any programmers who want to take a stab at building this!)
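I haven’t built it, but the scoring step of such a service might look something like this sketch – the metric names and weights are entirely my own assumptions, chosen only to show the shape of the idea:

```python
# Hypothetical "engagement score": a weighted sum of per-URL counts
# pulled from several services. The weights are arbitrary assumptions.

WEIGHTS = {
    "comments": 3.0,
    "retweets": 2.0,
    "diggs": 1.0,
    "delicious_saves": 2.0,
    "inbound_blog_links": 4.0,
}

def engagement_score(metrics):
    """Combine whatever counts we collected for a URL into one number."""
    return sum(WEIGHTS.get(name, 0.0) * count for name, count in metrics.items())

score = engagement_score({"comments": 12, "retweets": 130, "diggs": 40})
```

Fetching those counts from each service, and deciding on defensible weights, is of course the hard part.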

Meanwhile, anybody have any engagement metrics tips they’d like to share?