Why the death of syndication is great news for hyperlocal and niche sites

Clay Shirky makes a wise prediction for 2011. He calls it widespread disruptions for syndication:

Put simply, syndication makes little sense in a world with URLs. When news outlets were segmented by geography, having live human beings sitting around in ten thousand separate markets deciding which stories to pull off the wire was a service. Now it’s just a cost.

If you happen to run a hyperlocal or niche publication, this prediction is good news. The Internet is built on the idea of having just one copy of everything, accessible to everyone. If you produce original pieces of content, there is no need to worry. If you’re in the business of aggregating others’ content, prepare for a rough ride.

The idea of one copy surfaced last winter along with Jaron Lanier’s book You Are Not a Gadget. Internet pioneer Ted Nelson originated the idea, and Lanier summarizes it well:

Instead of copying digital media, we should effectively keep only one copy of each cultural expression.

The Internet is the great antidote to the Gutenberg printing press: instead of enabling us to make copies cheaper and faster, it makes the whole idea of copying obsolete. Why copy when you can link to the original?

Anyone who has worked in an online newsroom knows the problem of copying. How much time should we spend following other news outlets, copying their breaking stories with a punchier headline and a quickly written comment? And how much effort should go into creating original content and breaking our own stories?

The idea of “do what you do best, link to the rest” is not new; Jeff Jarvis wrote about it as early as 2007. But for some reason, linking seems to be really difficult for news organizations. The idea of having everything on your site comes from the old editorial culture. A newspaper is the complete package of yesterday’s events; a TV newscast is today’s package of everything important. If you leave something out, people will probably change the channel or cancel the subscription. But on the Internet, there are no packages, channels or subscriptions. There is just one big mess of links.

When Ted Nelson made the first designs for something like the World Wide Web, it had no copies, just one giant, global file.

The whole of a user’s productivity accumulated in one big structure, sort of like a singular personal web page.

So the idea of the Internet — and the technology behind it — is exactly the opposite of traditional newspaper publishing. We are not creating our own publications or single ‘destination’ websites but building a giant, single web. Work against this principle and you’ll end up in trouble. This is why paywalls are failing on the Web and on mobile, and will fail in most cases on the iPad. Once you start blocking iPad users from your website to sell more apps, you are encouraging readers to make copies, not subscriptions.

But all this is great news for small publishers, such as hyperlocal news or niche sites. You can be a part of that single web page of Internet news. Concentrate on original content instead of copying; create the one copy only you or your organization can create. If you don’t believe me, listen to Gawker’s Nick Denton: a scoop generates audience, which in turn generates advertising.

The end of syndication is good news for journalists as well. When publishers start creating more original content instead of hastily made copies, the human element comes back to the process of journalism. The creator of the original content becomes more valuable, because it is still pretty difficult to make copies of people.

I might sound like a technophile, but the irony is that Google News is already helping original content surface above copies. The Google News algorithm knows who published the original story first. If your news site covers the same story and doesn’t include a link to the original story in the first paragraph, you can kiss the Google News front page goodbye.

And it was the Google News algorithm that made us aware of the syndication craze. Who could have imagined there were 12,000 copies of the ‘Somali pirates’ story without Google telling us? Now Google is punishing us for making those copies. Who saw that one coming?

Pekka Pekkala researches sustainable business models at USC Annenberg, is a partner at Fugu Media and a technology columnist. He used to be the head of development at Helsingin Sanomat, the largest Finnish newspaper.

Microsoft and News Corp. are pursuing yesterday's solution to today's challenges

News Corp.’s alleged plan to shield its online content from Google’s search engine in favor of having it indexed by Microsoft’s Bing is a brilliant content business strategy… for the 20th Century.

But today it is just the latest example of backward thinking by legacy media executives who’ve been left lost and clueless by the Internet revolution.

Microsoft needs to do something to distinguish Bing from market leader Google. (And simply renaming its Live search engine didn’t get that done.) News Corp., like any business looking for growth, wants to find a new source of revenue.

So, instead of making its content available for indexing on all search engines, News Corp. could decide to make it available only to one search index, in exchange for payment or some other consideration from that search engine’s owner. On the surface, the deal makes great sense for both sides: News Corp. gets cash (or some other payment of value) and Microsoft gets unique content in its search results – pages that readers can’t find elsewhere.

That’s the way many successful content deals have happened in the past. Think how sports leagues sell broadcast rights to their games to selected networks or channels. Or how cable and satellite companies have split popular channels across several packages, “encouraging” customers to move up to more expensive subscription tiers. It’s all about exchanging cash for access.

But that model is beginning to fail. The benefit to Microsoft in a potential deal with News Corp. is having News Corp. pages in its Bing search results that readers could not find elsewhere.

Except… they can. Tim Berners-Lee’s analogy of the Internet as a Web is complete. Content no longer exists solely within silos, accessed only through its defined channel. Online content is enmeshed with that web – linked to, and thereby accessible from, countless outside sources.

News Corp. can close its content to Google’s spiders. But Google will continue to index the millions of other webpages, including blogs, Twitter feeds, discussion forums and other news sites, that link to News Corp. webpages, from Fox News reports to Myspace profiles. Those linking pages will continue to provide access to News Corp. content through Google and other search engines, even when (or if) News Corp. changes its robots.txt file to block Googlebot.
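The robots.txt mechanism mentioned above is trivially simple. A minimal sketch of what such a file might look like follows; Googlebot and bingbot are the standard user-agent tokens for the two crawlers, but the file itself is hypothetical, not News Corp.’s actual configuration:

```
# Hypothetical robots.txt for a News Corp. site:
# ask Google's crawler not to fetch any pages...
User-agent: Googlebot
Disallow: /

# ...while leaving Microsoft's crawler unrestricted
User-agent: bingbot
Disallow:
```

Note that a Disallow rule only asks compliant crawlers not to fetch those pages; it does nothing about the countless outside pages that link to the content and keep it reachable through Google’s index.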

Some bloggers and other writers have reacted to the news of a possible deal with glee – hoping that exclusion from Google will render News Corp. content invisible on the Web. Heck, I’d love to see all references to the Fox News’ cynical Republican-propaganda-masquerading-as-news disappear from the Web, too. But even if News Corp. withdraws its pages from Google, the references to those pages will not disappear. Taking News Corp. out of Google won’t make sites like Red State and Real Clear Politics vanish.

Producers have attempted to close these backdoor channels for content before. Who remembers the battles that many sports franchises fought with bloggers and even newspaper reporters to prevent them from posting scores and descriptions during games? The teams feared that online “broadcasts” of their games would compromise lucrative deals with radio and TV partners.

Today, those battles are mostly over. If anything, sports teams have moved on to trying simply to keep their athletes from Tweeting in the middle of games. Forget about the folks in the press box.

So what’s the value to Microsoft? Perhaps if News Corp. websites used Bing for their internal site search, there’d be some value in driving more readers to Microsoft’s search engine. But that’s a different deal, one that does not require exclusion from Google’s index. Perhaps News Corp.’s conservative audience might be convinced to use Microsoft’s Bing out of ideological loyalty, but let’s not forget that Microsoft is the “MS” in the right wing’s hated MSNBC. So that’s likely a non-starter.

I’ve heard many legacy news managers complain that the advertising model is dying online. From personal experience, both in my own ventures and watching others, I can attest that’s certainly not the truth. The model remains viable, even if it can no longer deliver the level of profit margin at the sales volume to which the news industry’s grown accustomed. (In math terms, the “model” is the equation that describes a relationship. The numbers that the model spits out can change even as the model remains the same.)

The Internet is killing one legacy media business model, however, and that’s the supply-side model based on creating value by restricting access to content. That’s the model upon which the News Corp.-Microsoft deal is based. While it might have worked in a pre-Web, channel-driven world, the public has simply found too many ways around content-control deals to make this one worth Microsoft’s time or investment.

Bing will have to find another way to distinguish itself from Google.

And Rupert Murdoch will have to find other ways to make money off his crud.

Wanted: Less rhetoric, more critical thinking about 'The Reconstruction of American Journalism'

The new report “The Reconstruction of American Journalism” by Leonard Downie Jr. and Michael Schudson is one more example of what’s wrong with the debate about the future of journalism. The Columbia Journalism School-sponsored report shovels out overviews, conclusions and recommendations by the pound, but with barely a few grams’ worth of critical thinking. Jan Schaffer, in her reaction to Downie and Schudson, said it best: “Darts for the mile-high, inch-deep reportage.” Schaffer, who is executive director of American University’s J-Lab: The Institute for Interactive Journalism and a Pulitzer Prize-winning former reporter and business editor at the Philadelphia Inquirer, zeroes in on the report’s fatal weakness:

“If we really want to reconstruct American journalism, we need to look at more than the supply side; we need to explore the demand side, too. We need to start paying attention to the trail of clues in the new media ecosystem and follow those ‘breadcrumbs.’ What ailing industry would look for a fix that only thinks of ‘us,’ the news suppliers, and not ‘them,’ the news consumers? I don’t hear from any of those consumers in this report.”

Alan D. Mutter, whose Reflections of a Newsosaur blog provides a good share of the small amount of rigorous, economics-centered thinking that’s gone into the journalism crisis, also gave a mostly scathing review to “The Reconstruction of American Journalism.”

Downie and Schudson come to their drastic recommendation of a “National Fund for Local News” using the kind of sleeves-rolled-up but shallow analysis that typically informs newspaper editorials on big issues (e.g., health care reform and the U.S. role in Afghanistan). A typical sentence from the report: “With appropriate safeguards, a Fund for Local News would play a significant role in the reconstruction of American journalism.” What are “appropriate” safeguards? What are the cons as well as the pros of letting the federal government, through funding decisions made by appointed “national boards” and “state councils,” “play a significant role in the reconstruction of American journalism”?

Downie and Schudson focus, appropriately, on the threat that continued editorial staff downsizing poses to journalism’s “accountability reporting that often comes out of beat coverage and targets those who have power and influence in our lives—not only governmental bodies, but businesses and educational and cultural institutions.” But creating a spider-web-like network of grant-dispensing boards sets the stage for all kinds of abuses that, ironically, would provide fodder for accountability reporting.

Missing from the Downie-Schudson report are the basic elements of critical thinking:

  • Digging for causes instead of reacting to symptoms.
  • Measuring as well as marshaling evidence.
  • Recognizing all the stakeholders.
  • Asking “why” questions.
  • Testing conclusions and recommendations.

Perhaps it’s unfair to hammer the Downie-Schudson report too hard. It’s symptomatic of what passes for analysis of the crisis in American journalism. We get too much rhetoric. The rhetoric is often well phrased – after all, it’s usually written by journalists – but we don’t need more rhetoric, however polished it may be. What we need is more case-method and other critical examination. Journalist/teacher/consultant Jane Stevens pointed the way with her studies of three community sites: CapitolSeattle.com, QuincyNews.org and WestSeattleBlog.com. Stevens and her co-author Mark Poepsel, a University of Missouri School of Journalism Ph.D. candidate, take a close look at what the sites are doing on the journalistic, community and revenue fronts. The studies, if they are expanded to other websites, may lead to a flexible business model that can be tailored to work in a variety of communities – without federal money being doled out by national and state boards packed with patronage appointees.

(Stevens, by the way, gives Newsweek a well-deserved whack for its recent superficial take on the future of community journalism, which came to optimistic conclusions, but for the wrong reasons.)

Maybe the Downie-Schudson report will provoke enough tough reactions – on top of Schaffer’s and Mutter’s – that, cumulatively, they will finally prod journalism’s practitioners and thinkers to start thinking critically about a crisis that won’t be solved with rhetoric, no matter how elegantly and urgently it’s framed.