<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Online Journalism Review &#187; data journalism</title>
	<atom:link href="http://www.ojr.org/tag/data-journalism/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.ojr.org</link>
	<description>Focusing on the future of digital journalism</description>
	<lastBuildDate>Wed, 10 Apr 2013 03:17:23 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.5.1</generator>
		<item>
		<title>Data journalism jobs on the rise</title>
		<link>http://www.ojr.org/data-journalism-jobs-on-the-rise/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=data-journalism-jobs-on-the-rise</link>
		<comments>http://www.ojr.org/data-journalism-jobs-on-the-rise/#comments</comments>
		<pubDate>Wed, 20 Feb 2013 13:14:25 +0000</pubDate>
		<dc:creator>Michael Juliani</dc:creator>
				<category><![CDATA[The Repeater]]></category>
		<category><![CDATA[.net]]></category>
		<category><![CDATA[Columbia Journal]]></category>
		<category><![CDATA[data graphics]]></category>
		<category><![CDATA[data journalism]]></category>
		<category><![CDATA[Flowing Data]]></category>
		<category><![CDATA[free coding tutorials]]></category>
		<category><![CDATA[journalism jobs]]></category>
		<category><![CDATA[journalism]]></category>
		<category><![CDATA[journalism job opportunities]]></category>
		<category><![CDATA[Lansing State Journal]]></category>
		<category><![CDATA[Vermont Public Radio]]></category>
		<category><![CDATA[web graphics]]></category>
		<category><![CDATA[WNYC]]></category>

		<guid isPermaLink="false">http://www.ojr.org/?p=2358</guid>
		<description><![CDATA[As the Columbia Journalism Review reports, it behooves journalists to become literate in data coding, because that&#8217;s where jobs are opening up. It&#8217;s still a very small set of people who can combine the speed, ethics, understanding and fairness required of a journalist with the coding skills of a developer, says John Keefe, editor of [...]]]></description>
				<content:encoded><![CDATA[<div id="attachment_2449" class="wp-caption alignleft" style="width: 213px"><a href="http://www.ojr.org/wp-content/uploads/2013/02/data-journo-handbook-overview.png"><img src="http://www.ojr.org/wp-content/uploads/2013/02/data-journo-handbook-overview-203x300.png" alt="The Data Journalism Handbook at a glance, used under Creative Commons License." width="203" height="300" class="size-medium wp-image-2449" /></a><p class="wp-caption-text">The <a href="http://www.datajournalismhandbook.org/1.0/en/index.html">Data Journalism Handbook</a> at a glance, used under <a href="http://creativecommons.org/licenses/by-sa/3.0/">Creative Commons License</a>.</p></div>
<p>As the <a href="http://www.cjr.org/between_the_spreadsheets/between_the_spreadsheets_wnyc_jobs.php" target="_blank">Columbia Journalism Review reports</a>, it behooves journalists to become literate in data coding, because that&#8217;s where jobs are opening up. It&#8217;s still a very small set of people who can combine the speed, ethics, understanding and fairness required of a journalist with the coding skills of a developer, says John Keefe, editor of data news at <a href="http://www.wnyc.org/" target="_blank">WNYC</a>.</p>
<p>&#8220;[A]nybody who put any effort into being good at that and having those qualities is going to have a job probably before they can graduate,&#8221; Keefe told CJR.</p>
<p>Apparently, news sources as small as the Lansing State Journal and Vermont Public Radio are making space for data and design teams on their staffs. Publications can utilize coding-literate teams to produce graphics and surveys of demographics &#8212; census maps, for instance, in the case of WNYC&#8217;s coverage of Hurricane Irene, which brought them a record-setting amount of traffic.</p>
<p>Free tutorials like ones on <a href="http://flowingdata.com/category/tutorials/" target="_blank">Flowing Data</a> and <a href="http://www.netmagazine.com/features/top-20-data-visualisation-tools" target="_blank">.net magazine</a> teach journalists basic data coding skills that can help them become more employable, as outlets learn that the web offers them even more shots at being inventive and innovative and therefore more interesting to viewers.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ojr.org/data-journalism-jobs-on-the-rise/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>Journalism&#8217;s problem of scale demands a rethinking of the news product</title>
		<link>http://www.ojr.org/journalisms-problem-of-scale-demands-a-rethinking-of-the-news-product/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=journalisms-problem-of-scale-demands-a-rethinking-of-the-news-product</link>
		<comments>http://www.ojr.org/journalisms-problem-of-scale-demands-a-rethinking-of-the-news-product/#comments</comments>
		<pubDate>Mon, 24 Dec 2012 19:13:15 +0000</pubDate>
		<dc:creator>Gabriel Kahn</dc:creator>
				<category><![CDATA[Business Model]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[aggregation]]></category>
		<category><![CDATA[business models]]></category>
		<category><![CDATA[data journalism]]></category>
		<category><![CDATA[journalism education]]></category>
		<category><![CDATA[Management]]></category>

		<guid isPermaLink="false">http://www.ojr.org/?p=229</guid>
		<description><![CDATA[Digital journalists are already experimenting with and inventing news products. Here's why it's so critical they continue.]]></description>
				<content:encoded><![CDATA[<p><div id="attachment_232" class="wp-caption alignleft" style="width: 450px"><img src="http://www.ojr.org/wp-content/uploads/2012/12/telegraph-newsroom-scale.jpg" alt="The newsroom at The Daily Telegraph" title="telegraph-newsroom-scale" width="440" class="size-full wp-image-232" /><p class="wp-caption-text">The newsroom at The Daily Telegraph. | Credit: <a href="http://www.flickr.com/photos/victoriapeckham/">victoriapeckham</a>/<a href="http://creativecommons.org/licenses/by/2.0/deed.en">Flickr</a></p></div><br />
I spend an inordinate amount of time trying to untangle the mass of conflicting visions about the future of the news industry. But recently I heard a phrase of unusual clarity: “Traditional journalism, as a process, does not scale.”</p>
<p>The person who spoke this line was <a href="http://www.marketplace.org/people/matt-berger">Matt Berger</a>, the director of digital media at Marketplace. What he meant was there is no business model that will support an organization with 100 reporters writing 100 stories (or, as we used to refer to the newsroom, 100 monkeys at 100 typewriters).<span id="more-229"></span></p>
<p>When you are going up against a World Wide Web that has so much real-time content, it’s almost impossible to gain enough traction to adequately monetize the work of a single soul banging away at a single keyboard. This old model was only possible when information was scarce. And information was scarce because it was delivered on newsprint. (And yes, there are still a few places that can achieve the necessary scale in the digital realm, and we all know who they are.)</p>
<p>Of course, there is nothing earth-shattering about this concept. It’s blatantly obvious. And yet, when you stop to consider it, you wonder how anyone who cares about the future of the industry could be thinking about anything else. Or why so many news sites are still swimming upstream by trying to sell ads against work churned out by individual journalists.</p>
<p>The implications of this challenge are unsettling. The single “article” — journalism’s basic unit of commerce — will only rarely generate enough value to cover its cost of production. (Gulp.) But as I began to consider what scalable journalism meant, I also realized how many conversations I had had recently that were really about addressing this very problem.</p>
<p>I recently sat down with <a href="http://www.magnify.net/company/team">Steve Rosenbaum</a>, author of “<a href="http://www.amazon.com/Curation-Nation-World-Consumers-Creators/dp/0071760393/ref=sr_1_1?ie=UTF8&amp;qid=1355963921&amp;sr=8-1&amp;keywords=curation+nation">Curation Nation</a>” and founder of <a href="http://www.magnify.net/">Magnify.net</a>. His startup seeks to address this issue by helping news sites appropriately harness content that’s out there already, rather than attempt to produce it themselves. Plenty of people might want to visit the homepage of <em><a href="http://video.fieldandstream.com/">Field &amp; Stream</a></em> to watch a video about boat trailers or fishing lures. But it’s not realistic to think that magazine’s staffers can churn out enough quality video to satisfy the demand of either the audience or advertisers. Again, it’s a question of scale.</p>
<p>Yet the Internet is brimming with videos about these topics already. So Magnify reels in an array of relevant videos that editors can choose from. <em>Field &amp; Stream</em> provides the context (you’re watching this in the confines of their site’s video page) and the curation (they choose the content that they feel is most valuable). The best part: The magazine can sell pre-roll ads or ads on the site even though the content (the actual video) was created elsewhere. Depending on the arrangement, the magazine either pockets the revenue or shares it with whoever made the video. This last point marks an evolution of the concept of curation. Not long ago, showing someone else’s video on your site was considered “theft” by some. Now, many just call it “distribution.”</p>
<p>The issue of scale is also lurking in the background throughout the recent report from Columbia’s Tow Center for Digital Journalism on <a href="http://towcenter.org/research/post-industrial-journalism/">Post-Industrial Journalism</a> (though it weighs in at an industrial-length 122 pages). Much of the report discusses the need for a new workflow that is more open and responds to the ways in which information is currently assembled and consumed. (For a smarter, CliffsNotes version of this concept, read the <a href="http://structureofnews.wordpress.com/2012/12/13/in-praise-of-process/">post from my friend and former editor Reg Chua</a>.)</p>
<p>Obviously, the layers of editors that were once charged with policing copy have no place in the modern, distributed newsroom. But editing — the process of vetting, sharpening and enriching content — still holds tremendous value. I spoke recently with Roman Heindorff, one of the founders of <a href="http://www.camayak.com/">Camayak.com</a>, a browser-based product that helps organize a newsroom’s workflow. The founders were trying to address an increasingly common problem: how to bring sense to the news organization of the future, which will be made up principally of part-time contributors working on myriad projects, sometimes across vast geographies. Camayak has begun to gain traction with campus papers, which often have hundreds of occasional contributors who need a seamless way to collaborate with each other. The overall goal is to make the most efficient use of available human resources to produce greater amounts of content. The founders also believe there is a virtuous circle involved: The more people are able to use the platform to collaborate successfully, the better the content.</p>
<p>Marketplace’s Berger approaches the problem from the perspective of structured journalism. Achieving appropriate scale requires putting lots of up-front effort into building a digital product that doesn’t wilt with the day’s news. This means creating a database of content that the audience can dip back into multiple times and still draw new conclusions. The database can be regularly refreshed with new content to extend its life.</p>
<p>His Exhibit A is a Marketplace feature called <a href="http://www.marketplace.org/topics/sustainability/future-jobs-o-matic">Future Jobs-O-Matic</a>, an interactive tool that lets you browse hundreds of professions to see how many people are employed as welders or what the average salary of a machinist might be (Answer: $39,000). The database is updated every two years, but people keep coming back to it, sharing it, using it in the classroom, etc. Buried in the data, of course, are also nuggets that traditional “article-producing” journalists can use as building blocks for stories.</p>
<p>The implications of what this all means from where I sit are far reaching. Much of what I do involves teaching students the rudiments of how to produce an article — which has an ever-shrinking economic value. Clearly, this needs to be rethought. And those of us who inhabit journalism schools need to create an environment that pushes students to produce journalistic artifacts that have a shelf life, that draw content from the crowd and that still provide a platform for storytelling and for meeting the information needs of the public. Should be a snap.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ojr.org/journalisms-problem-of-scale-demands-a-rethinking-of-the-news-product/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Better reporting on computer models could dispel some of the mysteries of climate change</title>
		<link>http://www.ojr.org/p2095/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=p2095</link>
		<comments>http://www.ojr.org/p2095/#comments</comments>
		<pubDate>Wed, 21 Nov 2012 11:22:02 +0000</pubDate>
		<dc:creator>Larry Pryor</dc:creator>
				<category><![CDATA[Frontpage]]></category>
		<category><![CDATA[data journalism]]></category>
		<category><![CDATA[reporting]]></category>
		<category><![CDATA[science]]></category>

		<guid isPermaLink="false">http://www.ojr.org/?p=2095</guid>
		<description><![CDATA[Now that climate topics have been allowed back in the public arena, it’s time for the media to fill some serious gaps in the coverage of climate science. A good place to start would be to explain how computer models work. While a story on the intricacies of algorithms might seem to be a “yawner,” [...]]]></description>
				<content:encoded><![CDATA[<p>Now that climate topics have been allowed back in the public arena, it’s time for the media to fill some serious gaps in the coverage of climate science. A good place to start would be to explain how computer models work. While a story on the intricacies of algorithms might seem to be a “yawner,” if told from the point of view of a brilliant scientist, complete with compelling graphics, or, better yet, with the immersive technology of new media, stories on climate models could provide ways for non-scientists to evaluate the reliability of these tools as predictors of the future.</p>
<p>Equally important, social media and the virtual communities that websites are capable of forming can help to overcome a major barrier to the public’s understanding of risk perception: The tendency of citizens to conform their own beliefs about societal risks from climate change to those that predominate among their peers. This derails rational deliberation, and the herd instinct creates an opening for persuasion — if not deliberate disinformation — by the fossil fuel industry. Online communities can provide a counter-voice to corporations. They are populated by diverse and credible thought leaders who can influence peers to not just accept ideas but to seek out confirming evidence and then take action. Because social networks enable the rapid discovery, highlighting and sharing of information, they can generate instant grassroots activist movements and crowd-sourced demonstrations.</p>
<p>Studies show that a major cause of public skepticism over climate stems from ignorance of the reliability of climate models. Beyond their susceptibility to garbage in, garbage out, the algorithms on which models are based have <a href="http://cacm.acm.org/magazines/2005/5/6221-a-covenant-with-transparency/abstract">long lacked the transparency needed to promote public trust</a> in computer decision systems. The complexity and politicization of climate science models have made it difficult for the public and decision makers to put faith in them. But studies also show that the media plays a big role in why the public tends to be skeptical of models. An <a href="http://www.nature.com/nclimate/journal/v2/n9/full/nclimate1542.html">article in the September issue of Nature Climate Change</a> written by Karen Akerlof et al. slammed the media for failing to address the science of models and their relevance to political debate:</p>
<blockquote><p>Little information on climate models has appeared in US newspapers over more than a decade. Indeed, we show it is declining relative to climate change. When models do appear, it is often within sceptic discourses. Using a media index from 2007, we find that model projections were frequently portrayed as likely to be inaccurate. Political opinion outlets provided more explanation than many news sources. </p></blockquote>
<p>In other words, blogs and science websites have done a better job of explaining climate science than traditional media, as visitors to <a href="http://realclimate.org/">RealClimate.org</a>, <a href="http://www.skepticalscience.com/">SkepticalScience.com</a> and other science blogs can attest. But the reach of these sites and their impact on the broader public are debatable. Websites such as the U.S. Department of Energy’s <a href="http://science.energy.gov/">Office of Science</a> have a trove of information on climate modeling but, with the exception of <a href="http://www.jpl.nasa.gov/earth/">NASA’s laboratories</a>, most government sites on science make little effective use of data visualization. This void offers mainstream journalists an opportunity to be powerful agents in the climate learning process, to tell dramatic multimedia stories about how weather forecasts can literally save our lives and, by extension, why climate forecasts can be trusted.</p>
<p>Two recent events can be thought of as whetting the public’s appetite for stories about computer-generated versions of reality. The prediction that Hurricane Sandy would eventually turn hard left out in the Atlantic and pound the northeastern shore of the United States was <a href="http://www.livescience.com/24377-weather-climate-hurricane-sandy.html">made almost a week in advance by weather forecasters</a>.</p>
<p>This technology-driven prediction no doubt saved countless lives. In addition, <a href="http://www.carbonbrief.org/blog/2012/11/how-well-did-the-media-cover-hurricane-sandy-scientists-have-their-say">some media coverage of Hurricane Sandy</a> did much to enable non-scientists to understand why it is tricky to attribute specific storms to climate change but still gave the public the big picture of how warmer ocean waters provide storms with more moisture and therefore make them bigger and more damaging.</p>
<p>Simultaneously, in a different domain but using the same tools of analysis and prediction, Nate Silver’s FiveThirtyEight computer model, results of which were published in his <a href="http://fivethirtyeight.blogs.nytimes.com/">blog at The New York Times</a>, out-performed traditional political experts by nailing the November national election outcomes. How did he pull that off?  A story about his statistical methods, complete with graphics, could reveal how risk analysts create spaces between the real world and theory to calculate probabilities. This would help the public to become familiar with models as a source of knowledge.</p>
<p>Some reporters have produced text stories on climate models that are examples of clarity. Andrew Revkin, first as an environment writer for The New York Times and now as the author of the Dot Earth blog in nytimes.com’s opinion section, has for many years covered how climate models relate to a large body of science, including a <a href="http://dotearth.blogs.nytimes.com/2012/10/30/two-views-of-a-superstorm-in-climate-context/">posting on Oct. 30</a> that placed Hurricane Sandy in the context of superstorms of the past.</p>
<p>David A. Fahrenthold at The Washington Post wrote how “<a href="http://www.washingtonpost.com/wp-dyn/content/article/2010/04/05/AR2010040503722.html">Scientists’ use of computer models to predict climate change is under attack</a>,” which opens with a baseball statistics analogy and keeps the reader going. Holger Dambeck at SpiegelOnline did a thorough assessment of climate model accuracy in non-science language, “<a href="http://www.spiegel.de/international/world/modeling-the-future-the-difficulties-of-predicting-climate-change-a-663159.html">Modeling the Future: The Difficulties of Predicting Climate Change</a>.” But these stories are rare and often one-dimensional.</p>
<p>Effort is now being spent on <a href="http://www.centerforcommunicatingscience.org/">making scientists into better communicators</a>, but more might be accomplished if mainstream journalists, including those who publish on news websites with heavy traffic, made themselves better acquainted with satellite technology and its impact on science. Information specialist Paul Edwards explains in his book, “A Vast Machine: Computer Models, Climate Data and the Politics of Global Warming,” how climate modeling, far from being purely theoretical, is a method that combines theory with data to meet “practical here-and-now needs.” Computer models operate within a logical framework that uses many approximations from data that — unlike weather models — can be “conspicuously sparse” but still constitute sound science, much as a reliable statistical sample can be drawn from a large population. How statistics guide risk analysis requires better explanation for a public that must make judgments but is seldom provided context by news stories. The debate over cap-and-trade policy might be Exhibit A.</p>
<p>Depicting model-data symbiosis in such diverse fields as baseball performance, hurricane forecasts and long-range warming predictions would be ideally suited to web technology. Not only can climate models be reproduced on PCs and laptops, showing atmospheric changes over the past and into the future, but also the models’ variables can be made accessible to the web user, who could then take control of the model and game the display by practicing “what ifs” — how many degrees of heat by year 2100 could be avoided by a selected energy policy, how many people would be forced into migrations if this amount of food supplies were lost, how big would a tidal barrier need to be to protect New York City from another Sandy disaster? (If this sounds a bit like SimCity, the new version of the game due in 2013 includes climate change as part of the simulated experience.)</p>
<p>This narrative approach to news, including personal diaries and anecdotes of everyday lived experience, is what Richard Sambrook, former director of BBC Global News and now a journalism professor at Cardiff University, has termed “360 degree storytelling.” Mike Hulme, a professor of climate change at the University of East Anglia, provides this description of the new public stance toward science in his book, “Why We Disagree About Climate Change”:</p>
<blockquote><p>Citizens, far from being passive receivers of expert science, now have the capability through media communication “to actively challenge and reshape science, or even to constitute the very process of scientific communication through mass participation in simulation experiments such as ‘climateprediction.net’. New media developments are fragmenting audiences and diluting the authority of the traditional institutions of science and politics, creating many new spaces in the twenty-first century ‘agora’ … where disputation and disagreement are aired.”</p></blockquote>
<p>Today’s media is about participation and argumentation. A new rhetoric of visualization is making science more comprehensible in our daily lives. What goes around, comes around. One of the pioneer online journalism experiments in making the public aware of how technology, risk assessment and human fallibility can cross over was a project by MSNBC.com known as the “<a href="http://www.msnbc.msn.com/id/34623505/ns/us_news-security/t/can-you-spot-threats/#.UKZgt-Oe9FU">baggage screening game</a>.” Players could look into a simulated radar screen and control the speed of a conveyor line of airline passenger baggage — some of which harbored lethal weapons. Assuming you were at the controls, the program would monitor your speed and accuracy in detection and keep score, later making you painfully aware of missed knives and bombs. Adding to your misery was a soundtrack of passengers standing in line and complaining about your excessive scrutinizing, with calls of “Come on! Get this thing moving! We’re late!” It was hard to be impatient with the TSA scanners after that.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.ojr.org/p2095/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>National party conventions, graphic photos, social media&#039;s bull$#!t, open data, and a world stream</title>
		<link>http://www.ojr.org/national-party-conventions-graphic-photos-social-medias-bullt-open-data-and-a-world-stream/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=national-party-conventions-graphic-photos-social-medias-bullt-open-data-and-a-world-stream</link>
		<comments>http://www.ojr.org/national-party-conventions-graphic-photos-social-medias-bullt-open-data-and-a-world-stream/#comments</comments>
		<pubDate>Tue, 28 Aug 2012 08:31:54 +0000</pubDate>
		<dc:creator>webtech</dc:creator>
				<category><![CDATA[Frontpage]]></category>
		<category><![CDATA[data journalism]]></category>
		<category><![CDATA[online video]]></category>
		<category><![CDATA[political reporting]]></category>

		<guid isPermaLink="false">http://www.ojr.org/?p=2087</guid>
		<description><![CDATA[Here&#8217;s a quick roundup of stories and conversations that caught our attention in the past week, the first in what will gradually become a regular series. Convention City: For the next two weeks, we&#8217;ll be barraged with reportage from the Republican and Democratic national conventions. As MediaShift points out, a lot of attention among media [...]]]></description>
				<content:encoded><![CDATA[<p>Here&#8217;s a quick roundup of stories and conversations that caught our attention in the past week, the first in what will gradually become a regular series.</p>
<p><strong>Convention City:</strong> For the next two weeks, we&#8217;ll be barraged with reportage from the Republican and Democratic national conventions. As MediaShift points out, a lot of attention among media observers will be paid to how a variety of digital tools are deployed, much like it was during the Summer Olympics. The media industry blog has already put together a <a href="http://www.pbs.org/mediashift/2012/08/best-online-resources-for-following-the-gop-democratic-conventions240.html">helpful list of resources</a> for following the conventions. Meanwhile, the Washington Post has launched a <a href="http://www.washingtonpost.com/grid/republican-national-convention/">new feature it&#8217;s calling The Grid</a>, which is an interesting way to scan through all their various social media and reporting channels and get the latest on the RNC (and next week the DNC).</p>
<p><strong>Instagraphic:</strong> In case you missed it (which seems impossible), Instagram moved to the center of a century-old debate this weekend following the shootings at the Empire State Building. When user @ryanstryin posted a graphic photo showing one of the victims lying in the street, it prompted a lot of reflection from both the mainstream media and the public over whether it&#8217;s appropriate to publish or share such images. We&#8217;ve had these arguments since the advent of photography &#8211; in times of war, in times of peace &#8211; on whether to publish photos of the dead and wounded or withhold them out of respect for the victims and their families. But this was a special kind of wake-up call. The media no longer makes these decisions, now that witnesses have a publishing platform in their pocket. New media commentator and J-school prof Jeff Jarvis got a little hot under the collar <a href="http://buzzmachine.com/2012/08/24/without-mediation/">defending his own decision</a> to share the photo on his Twitter stream and offers a compelling argument on the side of keeping the news unfiltered. The point is, if you click this hyperlink <a href="http://www.slate.com/blogs/browbeat/2012/08/24/the_empire_state_building_shooting_photos_on_instagram_were_they_too_soon_.html">showing a victim with blood streaming down the sidewalk</a> (republished here by Slate), you&#8217;ve already been forewarned by the linked words. Since mainstream media still have the broadest reach, they will continue to find themselves at the center of this debate, but the audience is going to find it increasingly difficult to avoid such material. The decision will be not one for the &#8220;broadcaster&#8221; on whether to share, but a personal one on whether to click.</p>
<p><strong>Streaming the world 60 seconds at a time.</strong> The Wall Street Journal is now asking its reporters to file microvideo reports using the social media video platform <a href="http://www.tout.com/">Tout</a>. <a href="http://stream.wsj.com/story/world-stream/">They&#8217;re calling it WorldStream</a>. From Tampa to Syria, you can see snippets of life, the news, and everything else a reporter can capture with a mobile phone camera. A first dive leaves me with the impression that much, much work has yet to be done before WSJ&#8217;s WorldStream can be called a mature product. Rebels relaxing in a mosque in Syria might have been portrayed better with a photo, for instance. Thirty seconds watching a pan of the empty delegate center in Tampa would have been better spent reading an actual story about the convention. And I can&#8217;t help but wonder what you can expect to get out of a 60-second interview with a pol &#8211; the format seems more suited to TMZ celeb shots and gotcha journalism. It will be interesting to see how the service evolves. For now, my main impression is that we&#8217;re looking at the news equivalent of Romantic fragment poems &#8211; Coleridge&#8217;s &#8220;<a href="http://en.wikipedia.org/wiki/Kubla_Khan">Kubla Khan</a>&#8221; or Keats&#8217; &#8220;<a href="http://en.wikipedia.org/wiki/Hyperion_(poem)">Hyperion</a>.&#8221; They may work artistically, but are story fragments really the best approach for an industry devoted to informing and enlightening its audience?</p>
<p><strong>Social media is bull$#!t.</strong> Or so says <a href="http://bjmendelson.com/">B.J. Mendelson</a> in the title of his new book. The former social media marketer and contributor to Mashable <a href="http://slides.shortformblog.com/465373">boosts his own contrarian view</a> after serving the industry for years. Among some of the more common precepts of online journalism Mendelson disputes: the all-importance of pageviews, that Facebook really has 800 million users, and that we&#8217;ve learned much new about Internet marketing since Dale Carnegie&#8217;s &#8220;How to Win Friends and Influence People.&#8221; He tells journalist Ernie Smith that the biggest BS thing about social media is &#8220;the concept that what’s happening on these very different platforms, with their comparatively small and different audiences, has resonance with what’s happening with the rest of us. This false hope we’re giving people, which not coincidentally popped up around the same time the economy cratered. People needed something to believe in, and selfish and greedy marketers were ready to give that to them in the package of the myth of social media.&#8221; Incidentally, the interview is a nice display of what you can do with <a href="https://jux.com/">Jux</a>, yet another platform for quick blogging.</p>
<p><strong>The problem with open data</strong>. Is there one? Some interesting conversations on the topic this week. One started when the White House <a href="http://www.washingtonpost.com/blogs/innovations/post/white-house-launches-innovation-fellows-program-video/2012/08/24/b32375c0-ee03-11e1-afd8-097e90f99d05_blog.html">announced the selection of its &#8220;Innovation Fellows,&#8221;</a> members of the private and nonprofit sectors and academia whose job it will be to help develop five government programs, including one on open data. That announcement sparked some backlash from conservative commentators, including Michelle Malkin, who wondered whether this isn&#8217;t really just a waste of taxpayer money. Open government reporter Alex Howard <a href="http://gov20.govfresh.com/can-government-innovation-rise-above-partisan-politics/">captured some of that debate</a>, which unfolded in the social media sphere. Meanwhile, <a href="http://techpresident.com/news/wegov/22768/open-data-open-questions-unclear-action-where-do-we-go-here">techPresident&#8217;s David Eaves reported</a> on how a government spending scandal uncovered in the U.K. with the help of an <a href="http://openlylocal.com/">open data project</a> raises as many questions about how government collects and reports its data as it does about the suspect spending. So, what do you do if the government&#8217;s databases are poorly coded or managed &#8211; how do we get the government to change? And even if you discover these remarkable stories with the aid of open data sources, does it make it any easier to act? More questions like these are sure to present themselves as data journalism flowers into a discipline in its own right.</p>
<p><strong>Another decade of the Internet.</strong> I leave you with a fun look back at how much the Internet has changed in the past 10 years, courtesy of <a href="http://mashable.com/2012/08/22/the-internet-a-decade-later/">this Mashable infographic</a>. Enjoy. </p>
]]></content:encoded>
			<wfw:commentRss>http://www.ojr.org/national-party-conventions-graphic-photos-social-medias-bullt-open-data-and-a-world-stream/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>