Search Engine Optimization is dead – Long live Plain English Optimization

So, how did your website fare in the great Google SEOcalypse last week?

Did you lose traffic? Gain it? Did you even notice?

Sistrix tracked the carnage among some of the top so-called content farms on the Internet, based on keyword positioning within search engine results pages [SERPs]. Among the losers in the Sistrix report were Associated Content, Mahalo and Examiner.com.

Personally, I don’t track keyword placement in SERPs for my websites. I track traffic and revenue. And I did see a drop in Google-directed traffic late last week on one of my websites, but a slight increase on the other. When I looked more closely at the loss in Google traffic, I didn’t see a decrease in referrals for the most popular keyphrases people were using to find my site, according to my Google Analytics report. All the loss seemed to be coming from the long tail – the all-but-forgotten, individually low-trafficked discussion threads and obscure listing pages on my site that I would just as soon Google ignored.

Well, consider that wish granted. The data does suggest to me, though, that Google’s not targeting entire sites with this latest algorithm change, but individual pages based on the thoroughness and uniqueness of their content.

Frankly, tracking keywords and obsessing about how highly your copy ranks in search engines provides one of the faster ways to go crazy in the online news business. With Google moving more toward highly personalized SERPs, chasing keywords is a fool’s pursuit.

It’s time to forget about SEO [Search Engine Optimization] and time to focus instead on PEO [Plain English Optimization].

Too many writers think of SEO as writing for computers, when their real focus should be writing to meet the needs of a human audience. Ask yourself these questions whenever you write:

  • Are you writing about something that people have personal experience with or personal interest in? Can you express that audience “need” in 10 words or less? Have you done that in the story?
  • Does your article do anything to provide a practical take-away that helps readers address this need, whether it be a to-do list (even a short one) or at least relevant, previously unknown information about the topic? Can you describe that take-away in 10 words or less? Have you done that in the story?
  • Are you writing with the words and phrases that normal readers – people who aren’t your sources and co-workers – use when they talk about this topic? Are you using the vocabulary of a 10th grader, or that of a 10-year professional in the field?
  • Describe your piece in three words. Do those three words appear in the headline, the title tag or at least within the opening paragraph? How long does the reader have to read your piece before he or she will know what you’re writing about?
  • Are you drowning your reporting under too many words?

These principles aren’t incompatible with SEO; in fact, they’re part of what many of us have long suggested as basic “white hat” SEO.

But with SERPs so variable these days, and with too many writers unable to get over the idea that SEO is writing for machines, I think that many of us would find it easier, not to mention far more productive, to think about Plain English Optimization instead.

Think about the people who will read what you write. What are their needs? What are you doing to help meet at least one of those needs in this piece? Are you keeping it clear and simple?

Write to PEO, and the SEO will take care of itself.

What if Google categorizes Patch.com as a 'content farm?'

Last Friday Google made a major announcement: Its focus on improving search results has shifted from “pure webspam” to “content farms.” The latter are sites with shallow or low-quality content – websites that try to cheat their way onto the first page of search results. Google sees these sites as junk.

In theory, this all sounds good – especially when one of the goals is to hit sites that copy others’ content and sites with low levels of original content. None of these “low quality” sites are named, but I can see smoke rising from Santa Monica: Demand Media is not happy about this. The company is in the middle of its rumored IPO, and Google may well lower the ranking of content-farm sites such as eHow.com. I would be angry, too, if most of my anticipated business value relied on writing stories based on popular search queries – i.e., farming content. The timing of the Google announcement is hardly an accident.

As tempting as it is to gloat over Demand Media’s misfortune, the Google announcement might have severe consequences for all of publishing. The company doesn’t identify the sites it considers “low quality.” One of the things Google will attack is sites and pages with “repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments.”

If you have hired a social media or search engine specialist, this is one of the key tricks you will be taught: go out onto the Internet, spread your links in comments and remember to include popular keywords in the title, lead and body text. But Google is trying to build a search engine that understands natural language and the true relationships between sites – an algorithm that is not fooled by clever cross-linking or keyword stuffing.

As a journalist, you have to support that. Otherwise the whole Web will look like the joke LAweekly published a few days ago: “So this SEO copywriter walks into a bar, grill, pub, public house, Irish bar, bartender, drinks, beer, wine, liquor.”

The big question is how Google will judge who is writing spammy, search-engine-inspired headlines and who is doing real customer research with Google Analytics.

Let’s take Patch.com – not because it’s evil, but because it’s probably one of the sites that could be affected by Google’s dislike of content farming and shallow content. I am not saying Patch.com is doing either, but computers might think differently. Patch.com sites create a lot of content about a wide variety of topics in their own neighborhoods – something an algorithm could read as an attempt to match long-tail queries in a given area. And Google emphasizes that there is no human judgment involved, just computers calculating the odds that content is junk.

Should you be worried if you are doing data-driven content innovation on your site – meaning that you get story ideas by following what people search for within your site, which keywords drive them to your site from Google, and what Google Zeitgeist tells you about the most popular searches at this time of year?

I would not be too worried. Just keep churning out good original content and pay less attention to eager SEO consultants. I hope Google is simply transforming the whole publishing industry by making copies obsolete and helping people find the original pieces of content.

Pekka Pekkala researches sustainable business models at USC Annenberg, is a partner at Fugu Media and a technology columnist. He used to be the head of development at Helsingin Sanomat, the largest Finnish newspaper.

Publishing tip: To earn more money, try showing fewer ads

Allow me to offer you a completely counter-intuitive piece of advice – one that’s nevertheless helped me to increase income from the networked ads I publish on my websites.

To earn more money, try showing fewer ads.

You might think that online advertising works linearly: More ads = more money. (This equation certainly seems to reflect the thinking behind many ad-laden newspaper websites I read.)

But placing more ads on your website might actually hurt your ad network earnings – and not just because you’d be driving readers away with a lousy site experience.

Ad networks, such as Google’s AdSense (the network I use most often on my sites), often use complicated proprietary algorithms to decide which ads to show on your website, and how much they’ll charge the advertiser for each click. AdSense uses what Google bills as a real-time auction system to determine which ads show in the AdSense slots on a publisher’s site.

But it’s not simply a case of the highest bidder winning the space. Google’s system is trying to determine:

  • what ads are relevant to the content of the webpage, or
  • what ads are relevant to the interests of the reader on the page, and
  • of those potential ads, which one would be most likely to elicit a click, then
  • do the math to figure out whether an ad less likely to elicit a click could actually earn the publisher more money, even factoring in the lower chance of getting clicked (I sketch this arithmetic just after this list).
  • And after all that, the system has to determine whether the selected advertiser is behind or ahead of schedule in spending its daily ad budget. If the advertiser is too far ahead of schedule, the system ignores its ads for a while and this selection process starts over again.
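
Nobody outside Google knows how this calculation actually runs, but here’s a minimal sketch, in Python, of the expected-value arithmetic described in the list above. Every field name and number here is invented for illustration:

    # Hypothetical sketch of the ad-selection auction described above.
    # This is NOT Google's actual code; all names and numbers are invented.

    def pick_ad(candidate_ads, page_topics, reader_interests):
        """Choose the ad with the highest expected publisher revenue."""
        best_ad, best_expected = None, 0.0
        for ad in candidate_ads:
            # Relevance filter: the ad must match the page content
            # or the reader's interests.
            if not (ad["topics"] & page_topics or ad["topics"] & reader_interests):
                continue
            # Budget pacing: skip advertisers spending ahead of schedule.
            if ad["spent_today"] > ad["daily_budget"] * ad["pace_target"]:
                continue
            # A cheap ad with a high click-through rate can beat a pricey
            # ad nobody clicks: expected revenue = bid x P(click).
            expected = ad["bid_per_click"] * ad["predicted_ctr"]
            if expected > best_expected:
                best_ad, best_expected = ad, expected
        return best_ad

    ads = [
        {"topics": {"travel"}, "bid_per_click": 2.00, "predicted_ctr": 0.002,
         "spent_today": 40.0, "daily_budget": 100.0, "pace_target": 0.5},
        {"topics": {"travel"}, "bid_per_click": 0.50, "predicted_ctr": 0.010,
         "spent_today": 10.0, "daily_budget": 100.0, "pace_target": 0.5},
    ]
    winner = pick_ad(ads, page_topics={"travel"}, reader_interests={"hotels"})
    # The 50-cent bidder wins: 0.50 * 0.010 = $0.005 expected per view,
    # versus 2.00 * 0.002 = $0.004 for the $2 bidder.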

Finally, even after a reader clicks on an ad and the advertiser is charged, if too few of your readers come through for those advertisers after they leave your site – they fail to buy something, register on the advertiser’s website or simply view enough pages there – Google might charge those advertisers less money for clicks from your site. It’s called “smartpricing,” and it can cripple your revenue.
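
The arithmetic there is brutal. Suppose – and these numbers are purely hypothetical – that smartpricing discounts clicks from your site by 75 percent: a click that would nominally have paid you $1.00 now pays you 25 cents, no matter how good the page that earned it was.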

That’s more variables than I ever had to account for in my high-school calculus class.

If you don’t understand the way your ad network’s software “thinks,” you can never expect to make a living-wage income from networked ads. But if you take the time to learn about these systems, you’ll come to realize that certain pages on your site actually can work against you.

If certain pages attract readers who aren’t likely to click on ads – or who are “flaky” and likely to click but never to do anything on the advertiser’s website – you’re likely better off without ads on those pages.

As a publisher, you should want to present the ad network with a series of pages that include content likely to attract readers who are both interested in advertisers related to that content and looking to buy or otherwise engage with those advertisers.

The more such readers you deliver, the more lucrative ads you will get from the ad network, earning you more money. The fewer you deliver, the less the ad network “thinks” of your website, which will lead to less lucrative and possibly less well-targeted ads, as the ad system casts about trying to find something that will engage your readers.

Here’s an experiment I’ve tried with great success:

I divide my websites into multiple channels in my AdSense reports. I’ve set up my content management system to publish different topics and services on the site into different URL paths, which can be tracked easily as channels by the AdSense reporting tool.

Then I look at the eCPM (earnings for every 1,000 pages viewed) for each of those channels, as well as the average eCPM for the site as a whole.

For every channel that earns less than 50 percent of the site’s overall eCPM rate, I take the ads off that channel.
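
Here’s a minimal sketch of that culling rule, assuming you’ve pulled per-channel earnings and page views out of the AdSense channel reports. The channel paths and figures are hypothetical:

    # Hypothetical per-channel figures from an AdSense channel report:
    # URL path -> (earnings in dollars, page views).
    channels = {
        "/news/":     (620.00, 180000),
        "/reviews/":  (310.00,  60000),
        "/forums/":   ( 45.00, 150000),
        "/listings/": ( 25.00, 110000),
    }

    def ecpm(earnings, pageviews):
        """eCPM = earnings per 1,000 page views."""
        return earnings / pageviews * 1000

    site_earnings = sum(e for e, _ in channels.values())
    site_views = sum(v for _, v in channels.values())
    threshold = ecpm(site_earnings, site_views) * 0.5  # half the site average

    for path, (earnings, views) in channels.items():
        verdict = "pull the ads" if ecpm(earnings, views) < threshold else "keep the ads"
        print(f"{path:12} eCPM ${ecpm(earnings, views):5.2f} -> {verdict}")

    # The site eCPM here is $2.00 ($1,000 over 500,000 views), so the
    # $1.00 threshold flags /forums/ ($0.30) and /listings/ ($0.23).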

The result? The last three times I’ve done this, the culling has yielded an average 15 percent increase in network ad revenue. That’s not a 15 percent increase in eCPM. That’s an increase in bottom-line income. (The eCPM, obviously, rises way more than that.)
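
To see why, run the hypothetical numbers from the sketch above. The site was earning $1,000 on 500,000 ad-bearing page views – a $2.00 eCPM. Pull the ads from the two weak channels, and a 15 percent bump in revenue means roughly $1,150 spread across the remaining 240,000 ad-bearing views: an eCPM of about $4.79, up nearly 140 percent, because the low-value page views drop out of the denominator while the network serves better-paying ads on what’s left.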

This analysis also helps you identify which sections and topics on your website are earning you the best return on your reporting, writing and development investment, giving you better information with which to decide how you’ll spend your time and resources in the future.

I’m not arguing that you should ditch all your public interest reporting in favor of chasing the highest eCPMs. Or that you should drop lower-performing content from the website. I love Slashdot’s analogy of a successful website as an omelet. You need the right mix of ingredients for everything to come together in an attractive, tasty whole.

But just as not every section and feature of your website serves the same editorial purpose, several sections and features of your site will serve different business purposes as well. Some sections earn money, and should display ads. Others don’t, and should not display ads. Placing ad slots on unproductive sections of your website can hurt networked ad performance elsewhere on the site.

Obviously, you can’t cull ads from sections of your site every day. Eventually, you’ll reach a point where you’ve cut too far, and overall ad revenue declines. If that happens, hurry to restore the most recently eliminated ad slots.

After a while, you might try returning ads to some of those borderline underproductive channels, too. In my experience, some sections that performed poorly in the past can begin to perform better in the future – thanks to your site attracting a different mix of readers, or audience tastes changing. Even if that’s not the case, you might still see a nice increase in ad revenue from returning ads to some of those channels for a few weeks before their presence begins to drag down revenue elsewhere.

If you’ve the skill for it, properly timing the addition and deletion of ads from marginally productive channels can help you maximize ad revenue. Personally, though, I prefer to leave ads off channels once I’ve yanked them. I’ll return ads only if I see a change in traffic coming to one of those channels, usually following some news development on that topic that’s bringing new visitors to the site. Even then, I’ll check the revenue figures after a month or so, and if the channel eCPM remains below 50 percent of the site’s average, the ads are coming off again.