In the nearly 15 years that I’ve been working online, I’ve watched the most popular metric among Web publishers change from “hits,” to “page views,” to “unique visitors,” to “time on site.”
But none of those metrics really matter. I’ve seen sites post phenomenal numbers in each of those categories, and fail. There’s one metric, and only one, that truly matters in determining your website’s commercial success.
Your visitors can spend hours per month on your website, but a huge “time on site” value by itself won’t entitle you to a dime (see Twitter). I suspect that one reason various Web metrics fall into and out of favor over the years is that managers talk those metrics up or down based on their own website’s performance. Someone notices that people are spending more time, on average, on the website; then he or she gets on a panel at a news industry conference and – boom – “time on site” becomes the metric everyone needs to consider.
That’s nice for such sites, but, ultimately, nothing matters but money. (Some might publish for other reasons – to exert influence or win votes, for example – but for most of the news publishers who read OJR, the ultimate criterion for a publication is whether it makes enough money to justify remaining in business.)
Non-profits might consider themselves exempt from this rule, but non-profit doesn’t mean non-revenue. Try running a non-profit without donations, without grant funding or without a trust fund sometime. If content on a non-profit website doesn’t entice funders to continue funding it, that publication won’t last long, either.
Money matters. To all news publishers.
Other metrics can mislead you. “Hits” is nothing more than calls to your server, which can be grossly inflated by the use of multiple images, style sheets, scripts and other elements on each webpage. “Page views” can be sharply reduced on sites that use advanced scripting and DHTML techniques, allowing pages to “change” in their users’ browser without calling another page view from the server. “Unique visitors” favors sites with huge “drive-by” traffic, including those where readers don’t stay long enough to read or to click anything. And a large “time on site” value means that readers are spending, well, a lot of time on your site. Which sounds great, unless you consider that means those readers aren’t clicking away from your site via your advertisers’ ads.
So when you’re using metrics to examine the efficacy of your website and its various sections, features and pages, do so with an eye on how those elements contribute to your site’s overall revenue.
I look at the eCPM (earnings per thousand impressions) of each page and section of my websites. But I don’t consider that metric in isolation. The pages on your website are a team. They should be working together to attract readers to your site, engage them to become loyal return visitors, then deliver them efficiently to revenue-producing pages, whether those be advertisers’ websites, donation forms or e-commerce features.
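For readers who want the arithmetic: eCPM is simply earnings divided by ad impressions, times 1,000. A quick sketch (the section labels and dollar figures here are hypothetical, not from any real site):

```python
def ecpm(earnings, impressions):
    """Earnings per thousand ad impressions."""
    return earnings / impressions * 1000.0

# Hypothetical figures for two parts of a site:
gateway = ecpm(42.50, 18000)    # a high-traffic "gateway" section
reviews = ecpm(126.00, 9000)    # a smaller, better-converting section

print(round(gateway, 2), round(reviews, 2))
```

The point of the example is that the smaller section can earn several times the rate of the bigger one – which is why, as noted above, eCPM shouldn’t be read in isolation.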
So I gladly accept low eCPMs on “gateway” pages that are bringing large numbers of readers to my site, either from search engines or other websites, so long as those readers eventually are continuing on to other, revenue-producing pages on the site.
What you don’t want to do, however, is waste ad impressions on pages where readers aren’t converting. I look through my metrics reports for sections with far-below-average click-throughs and eCPM rates, then remove the ads from those sections. Typically, I end up running registration and reader profile pages, comment and blog input forms, and most “rules and regulations”-type pages ad free. Removing low-performing pages from your ad inventory helps support higher click-through and eCPM values throughout the rest of the site, potentially improving your overall pay rates in third-party ad networks, such as Google’s AdSense.
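The screening process above can be sketched in a few lines. The per-section numbers and the 25-percent-of-average cutoff are assumptions for illustration – pick a threshold that fits your own reports:

```python
# Hypothetical per-section stats: (ad clicks, impressions, earnings in dollars)
sections = {
    "news":         (420, 150000, 310.00),
    "registration": (3,   40000,  2.10),
    "reviews":      (510, 90000,  640.00),
}

def ecpm(earnings, impressions):
    return earnings / impressions * 1000.0

avg_ecpm = sum(ecpm(e, i) for _, i, e in sections.values()) / len(sections)

# Flag sections earning far below the site-wide average --
# candidates for running ad free.
flagged = [name for name, (c, i, e) in sections.items()
           if ecpm(e, i) < 0.25 * avg_ecpm]

print(flagged)
```

Run against these made-up numbers, only the registration section falls below the cutoff – exactly the kind of page the column suggests running without ads.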
I also remove ads from popular “gateway” pages with high bounce rates, meaning that new visitors aren’t clicking elsewhere on the site after arriving, but simply hitting the back button to return to wherever they came from. In extreme cases, where bounce rates approach 100 percent, I’ve even deleted those pages from my site (or hidden them from search engines) to discourage “drive-by” traffic from people who clearly aren’t interested in my site, its content or advertisers.
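Bounce rate itself is just the share of visits that view one page and leave. A minimal sketch, with made-up session counts and an assumed 90 percent threshold (the column’s own trigger point, “approaching 100 percent,” is a judgment call, not a fixed number):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Share of visits that viewed only one page before leaving."""
    return single_page_sessions / total_sessions

# Hypothetical gateway page: 4,700 of 5,000 visits bounced.
rate = bounce_rate(4700, 5000)

if rate > 0.90:  # assumed threshold for taking action
    print("candidate for pulling ads, or the page itself")
```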
Finally, I take a hard look at features that earn my highest eCPMs, and consider how I might replicate those successes. Was there something about the format or design of those pages, or the content upon them, that elicited a high percentage of clicks or well-paying ads? Did I hit upon especially lucrative keywords within that page’s topic? Is there high-paying advertiser demand for the topic of the page?
Answers to those questions help me decide what and how to create when building new pages and sections on the site. News publishers have been doing this for ages, even if that process isn’t typically made clear to reporters in the newsroom. Those new outdoor recreation or home decor sections didn’t appear in the paper just because an editor felt inspired. The publisher likely had some compelling metrics demonstrating advertiser demand in those markets. That’s what you should be looking for in your site’s readership data.
Of course, you need data in order to analyze it. That’s why smart news publishers ought to be experimenting, constantly. Try new topics, new writing forms, new functionality – then create new tracking channels to monitor those experiments, to build a database of information that can help guide you in making smarter decisions about the growth and maintenance of your website.