Better reporting on computer models could dispel some of the mysteries of climate change

Now that climate topics have been allowed back in the public arena, it's time for the media to fill some serious gaps in the coverage of climate science. A good place to start would be explaining how computer models work. A story on the intricacies of algorithms might seem like a "yawner," but told from the point of view of a brilliant scientist, complete with compelling graphics or, better yet, the immersive technology of new media, it could give non-scientists a way to evaluate the reliability of these tools as predictors of the future.

Equally important, social media and the virtual communities that websites can form can help to overcome a major barrier to the public's perception of risk: the tendency of citizens to conform their own beliefs about the societal risks of climate change to those that predominate among their peers. This herd instinct derails rational deliberation and creates an opening for persuasion, if not deliberate disinformation, by the fossil fuel industry. Online communities can provide a counter-voice to corporations. They are populated by diverse and credible thought leaders who can influence peers not just to accept ideas but to seek out confirming evidence and then take action. Because social networks enable the rapid discovery, highlighting and sharing of information, they can generate instant grassroots activist movements and crowd-sourced demonstrations.

Studies show that a major cause of public skepticism about climate change stems from ignorance of the reliability of climate models. Beyond their susceptibility to "garbage in, garbage out," the algorithms on which models are based have long lacked the transparency needed to promote public trust in computer decision systems. The complexity and politicization of climate science models have made it difficult for the public and decision makers to put faith in them. But studies also show that the media plays a big role in why the public tends to be skeptical of models. An article in the September issue of Nature Climate Change by Karen Akerlof et al. slammed the media for failing to address the science of models and their relevance to political debate:

Little information on climate models has appeared in US newspapers over more than a decade. Indeed, we show it is declining relative to climate change. When models do appear, it is often within sceptic discourses. Using a media index from 2007, we find that model projections were frequently portrayed as likely to be inaccurate. Political opinion outlets provided more explanation than many news sources.

In other words, blogs and science websites have done a better job of explaining climate science than traditional media, as visitors to RealClimate.org, SkepticalScience.org and other science blogs can attest. But the reach of these sites and their impact on the broader public are debatable. Websites such as the U.S. Department of Energy’s Office of Science have a trove of information on climate modeling but, with the exception of NASA’s laboratories, most government sites on science make little effective use of data visualization. This void offers mainstream journalists an opportunity to be powerful agents in the climate learning process, to tell dramatic multimedia stories about how weather forecasts can literally save our lives and, by extension, why climate forecasts can be trusted.

Two recent events have whetted the public's appetite for stories about computer-generated versions of reality. Weather forecasters predicted almost a week in advance that Hurricane Sandy would turn hard left out in the Atlantic and pound the northeastern shore of the United States.

This technology-driven prediction no doubt saved countless lives. In addition, some media coverage of Hurricane Sandy did much to enable non-scientists to understand why it is tricky to attribute specific storms to climate change but still gave the public the big picture of how warmer ocean waters provide storms with more moisture and therefore make them bigger and more damaging.

Simultaneously, in a different domain but using the same tools of analysis and prediction, Nate Silver's FiveThirtyEight computer model, whose results were published on his blog at The New York Times, outperformed traditional political experts by nailing the November national election outcomes. How did he pull that off? A story about his statistical methods, complete with graphics, could show how analysts move between real-world data and theory to calculate probabilities. This would help the public become familiar with models as a source of knowledge.
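
To make that kind of story concrete, the core idea behind poll aggregation can be sketched in a few lines of code. The poll numbers, the assumed polling error and the simulation below are invented for illustration; this is a minimal sketch of the general technique, not Silver's actual model.

```python
# A toy poll-aggregation simulation. This is NOT Silver's actual model;
# the margins and the assumed polling error below are invented to show
# how averaging noisy surveys can be turned into a win probability.
import random

def win_probability(poll_margins, polling_error=3.0, runs=10_000):
    """Estimate the chance a candidate wins, given poll leads in points."""
    average_lead = sum(poll_margins) / len(poll_margins)
    wins = 0
    for _ in range(runs):
        # Draw a plausible "true" margin around the polling average.
        simulated_margin = random.gauss(average_lead, polling_error)
        if simulated_margin > 0:
            wins += 1
    return wins / runs

# A small but consistent lead translates into a fairly high probability.
print(win_probability([2.1, 0.5, 3.0]))  # roughly 0.7 to 0.8
```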

Some reporters have produced text stories on climate models that are examples of clarity. Andrew Revkin, first as an environment writer for The New York Times and now as the author of the Dot Earth blog in nytimes.com's opinion section, has for many years covered how climate models relate to a large body of science, including an Oct. 30 post that placed Hurricane Sandy in the context of superstorms of the past.

David A. Fahrenthold at The Washington Post wrote about how "Scientists' use of computer models to predict climate change is under attack," in a story that opens with a baseball statistics analogy and keeps the reader going. Holger Dambeck at SpiegelOnline did a thorough assessment of climate model accuracy in non-science language, "Modeling the Future: The Difficulties of Predicting Climate Change." But these stories are rare and often one-dimensional.

Effort is now being spent on making scientists into better communicators, but more might be accomplished if mainstream journalists, including those who publish on news websites with heavy traffic, made themselves better acquainted with satellite technology and its impact on science. Information specialist Paul Edwards explains in his book, "A Vast Machine: Computer Models, Climate Data and the Politics of Global Warming," how climate modeling, far from being purely theoretical, is a method that combines theory with data to meet "practical here-and-now needs." Computer models operate within a logical framework that uses many approximations from data that, unlike the data feeding weather models, can be "conspicuously sparse" yet still constitute sound science, much as a reliable statistical sample can be drawn from a large population. How statistics guide risk analysis requires better explanation for a public that must make judgments but is seldom given context by news stories. The debate over cap-and-trade policy might be Exhibit A.
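
Edwards's sampling analogy can be made concrete with a short, purely hypothetical calculation: a sparse random sample recovers a population average to within a quantifiable margin of error, even when almost all of the underlying data goes unmeasured.

```python
# Hypothetical illustration of the sampling analogy: a sparse random
# sample recovers a population average to within a quantifiable margin
# of error, even though almost all of the data goes unmeasured.
import random
import statistics

random.seed(42)

# Pretend this is a quantity we could never measure exhaustively,
# e.g. one temperature-like value per location.
population = [random.gauss(15.0, 5.0) for _ in range(1_000_000)]

# A "conspicuously sparse" sample: 500 points out of a million.
sample = random.sample(population, 500)

estimate = statistics.mean(sample)
std_error = statistics.stdev(sample) / len(sample) ** 0.5

print(f"sample estimate: {estimate:.2f} +/- {1.96 * std_error:.2f}")
print(f"true population mean: {statistics.mean(population):.2f}")
```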

Depicting model-data symbiosis in such diverse fields as baseball performance, hurricane forecasts and long-range warming predictions would be ideally suited to web technology. Not only can climate models be reproduced on PCs and laptops, showing atmospheric changes over the past and into the future, but the models' variables can also be made accessible to the web user, who could then take control of the model and game the display by practicing "what ifs": how many degrees of warming by the year 2100 could be avoided under a selected energy policy, how many people would be forced to migrate if a given share of the food supply were lost, how big would a tidal barrier need to be to protect New York City from another Sandy disaster? (If this sounds a bit like SimCity, the new version of the game due in 2013 includes climate change as part of the simulated experience.)
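
A minimal sketch of what such a "what if" control might compute behind the interface, assuming invented, purely illustrative sensitivity numbers rather than output from a real climate model:

```python
# A toy "what if" calculator of the kind a web feature might expose.
# Every number below is invented for illustration; a real projection
# would come from a full climate model, not a few lines of arithmetic.

def projected_warming_2100(annual_emissions_gt, cut_start_year=2030,
                           annual_cut_percent=0.0):
    """Very rough warming estimate (degrees C) for a simple emissions path."""
    warming_per_1000_gt_co2 = 0.45   # assumed sensitivity, illustrative only
    observed_warming = 1.0           # assumed warming already in the system
    total_emissions = 0.0
    emissions = annual_emissions_gt
    for year in range(2013, 2101):
        if year >= cut_start_year:
            emissions *= 1 - annual_cut_percent / 100
        total_emissions += emissions
    return observed_warming + warming_per_1000_gt_co2 * total_emissions / 1000

# "What if" comparison: business as usual vs. a 3 percent annual cut from 2030.
print(projected_warming_2100(40.0))                          # no policy
print(projected_warming_2100(40.0, annual_cut_percent=3.0))  # with policy
```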

This narrative approach to news, including personal diaries and anecdotes of everyday lived experience, is what Richard Sambrook, former director of BBC Global News and now a journalism professor at Cardiff University, has termed "360 degree storytelling." Mike Hulme, a professor of climate change at the University of East Anglia, provides this description of the new public stance toward science in his book, "Why We Disagree About Climate Change":

Citizens, far from being passive receivers of expert science, now have the capability through media communication “to actively challenge and reshape science, or even to constitute the very process of scientific communication through mass participation in simulation experiments such as ‘climateprediction.net’. New media developments are fragmenting audiences and diluting the authority of the traditional institutions of science and politics, creating many new spaces in the twenty-first century ‘agora’ … where disputation and disagreement are aired.”

Today’s media is about participation and argumentation. A new rhetoric of visualization is making science more comprehensible in our daily lives. What goes around, comes around. One of the pioneer online journalism experiments in making the public aware of how technology, risk assessment and human fallibility can cross over was a project by MSNBC.com known as the “baggage screening game.” Players could look into a simulated radar screen and control the speed of a conveyor line of airline passenger baggage — some of which harbored lethal weapons. Assuming you were at the controls, the program would monitor your speed and accuracy in detection and keep score, later making you painfully aware of missed knives and bombs. Adding to your misery was a soundtrack of passengers standing in line and complaining about your excessive scrutinizing, with calls of “Come on! Get this thing moving! We’re late!” It was hard to be impatient with the TSA scanners after that.

The news of the future

The problem of veracity and realism in digital graphics has challenged Web editors and designers since the outset of online journalism. Where do we draw the line between fact and fantasy? How much latitude can we give the audience to create its own realities?

One answer has been to define Virtual Reality and create immersive applications that meet journalists’ notions of epistemology – the grounding of knowledge in verifiable facts and information. In contrast to artists, online journalists do not put a high value on illusion. We are not in the deception business. Nor are we gamers.

On the other hand, digital technology gives online journalists a chance to experiment with multisensory presentations, and we have long favored giving the audience opportunities to participate in storytelling. Harking back to MSNBC’s baggage checking exercise and other early versions of hypothetical scenarios, we have given the audience increasing latitude to explore the possibilities of digital landscapes from a first-person point of view.

Over the last several years, more effort has been put into elaborate calculators, civic games and hypothetical scenarios. The goal has been to use the immersive techniques of gaming "as an amplifier of thought," in the phrase of design theorist Brenda Laurel. For journalists, this requires creating a new vocabulary, a new metalanguage. Another theorist, art historian Jonathan Crary, describes it as "a radically different practice about the possibility of presence within perception." To the print newsroom, it may seem more like Web journalists playing with dangerous toys.

A fresh example of where to draw the line in using Virtual Reality to tell the news comes from National Geographic in its documentary "Six Degrees." It is based on a book, has a Web version, appeared in mid-February on cable and satellite TV and is set to be released in IMAX theaters in a 3-D version.

Each of us will come away from seeing the various versions of “Six Degrees” with our own opinions. But here, for the sake of discussion, and in no particular order, are my thoughts about a high-minded and expensive effort to put the audience into a hypothetical alternative world of global climate change. What do we see?

  • Mixed realities to create an appearance of the real
  • A topic that is large and complex has been reduced to the representation of a natural force, the rise in temperature due to greenhouse gas emissions
  • A point of view from outer space – a metaphor of the space voyager looking down on Earth
  • The application opens with the expectation that something will happen – the beginning of a plot – with an ominous sound reminiscent of the opening of “Jaws.”
  • The presentation is not linear but has a design structure – the possible perspectives are not infinite
  • The ‘AS IF’ possibilities have been limited for the purposes of logical and affective clarity
  • It purposefully dissolves fixed limits on both time and space
  • It creates an ephemeral reality with an ontology that is founded on the process of global warming
  • The images are transient and malleable – they play upon memories and reinforce our experience (Memories of camping vs. civilization being reduced to tents on the Arctic Circle.)
  • The premise assumes shared information and a common ground – this is not a debate over whether human activities have provoked global climate change
  • It investigates problems but offers no solutions
  • The interface both enables and represents – it emphasizes action, raises alarms
  • The representations involve direct sensing and cognition (sounds of whale songs, melting ice, violent crowds)
  • Scenes are selected, arranged and represented so as to both intensify emotion and condense time (But are they hokey, especially the newscasts?)
  • The design has implicit restraints, but they arise naturally from our growing knowledge of the context
  • The explicit restraints – the temperature scale and Lighthouse Buttons – frame our actions
  • The multisensory experience creates empathy – we vicariously experience what the characters are experiencing
  • The overall impact is to give us a vision that changes our beliefs – our ways of doing things must change (or else…)
  • The application is built upon the storage and retrieval of information in a variety of media types to provide an organic experience that involves the whole sensorium.

    For what it’s worth, my favorite scene is the sidewalk café in Paris (Degree Four). It is reminiscent of “Last Year at Marienbad.”

Get your geek on

    LiveScience.com is like the smart-aleck in the back of the class. It's sharp and savvy, and it's not afraid to crack a sex joke about dinosaurs. For that, the irreverent 3-year-old science and technology news site took home a 2007 Online Journalism Award in the Specialty Journalism (large sites) category, beating out WebMD.com and Beliefnet.com.

    LiveScience “did a good job of keeping an often static subject fresh and new and you really had a sense they are on top of it,” according to the ONA press release. The judges also lauded the site’s “top-notch use of multimedia” and mix of user and expert voices on a topic that “can get static and old very quickly,” according to Ruth Gersh, co-chair of the 2007 awards.

    What’s their secret? OJR chatted on the phone with Anthony Duignan-Cabrera, Editorial Director of Consumer Media at LiveScience’s parent company, Imaginova, to find out. An edited transcript follows.

    OJR: What do you do at LiveScience?

    Duignan-Cabrera: I oversee what types of stories we're using and whether we can bring in new partners. I've been with Imaginova for eight years. The company launched as Space.com, and I'd always been interested in space.

    OJR: What’s LiveScience’s mission?

    Duignan-Cabrera: We make science not boring. We think science news should be relevant, funny and engaging. The reader should come to the site and leave smarter after five minutes.

    OJR: How are you better than your competitors?

    Duignan-Cabrera: Our competitors are the Associated Press, New York Times, Washington Post and other mainstream organizations that reach a broad audience, and their news lacks irreverence and that "gee whiz" factor. Some of the things they cover seem obscure to the public. We want readers to think: I should care about that – whether it be global climate change or how things work in the workplace.

    OJR: Can you give some specific examples of how you respond to your readers?

    Duignan-Cabrera: We looked at what science stories are very popular among our competitive set as well as at general subjects that are hot right now. We tweak our coverage and watch where our traffic goes.

    Last year, we gave our site a major redesign, which took six to eight months. We made accessing our stories easier, especially the top 10 lists and image galleries. We're expanding how our reporters use our blogs. We increased our environmental coverage because we realized there was an interest. We always pay attention to how people move through the site, tracking which stories they read and where they go from there.

    OJR: What’s your advice to journalists trying to improve their sites?

    Duignan-Cabrera: Pick a concept, and then observe what your users like and what your competitors are not doing. If there’s a need, fill it. If the readers want more of something, give it to them. Depending on the topic, they might want more pictures or video.

    We make fun of things that are dry. We have readers take quizzes. We cover the top 10 ancient capitals of the world, taboos and myths about sex.

    OJR: I recently covered a panel discussion at Annenberg that featured scientists and science journalists, and there was a lot of talk about how inadequate science reporting is in general. Some people put the blame on the public being science-illiterate. What are your views on the state of science journalism?

    Duignan-Cabrera: The scientists, the academics who live in their ivory towers, depend on public funding, and the only way to get support is to engage the public. That aside, not everyone's an astrophysicist, and there's a way of explaining the coolness of it that entertains as well as educates. Learning should be fun, not funny or goofy or exploitative. It shouldn't be like cod liver oil! It should be "neat" and "cool" and all those different adjectives.
