Stanford's computational journalism program offers distinct tools for better storytelling
Stanford's data-driven journalism program gives students powerful new ways to explore complex stories of public and social significance. The program blends computation and storytelling while upholding the journalistic values of accuracy and relevance.
There are important stories, Jay Hamilton says, that won't get told unless the research is done.
Communication Professor Jay Hamilton directs Stanford's journalism program, which is teaching students how to use 'big data' to tell stories with public policy impact.
"The underexplored area in journalism is big data," said Hamilton, the Hearst Professor of Communication and director of Stanford's journalism program, which this year launched five entirely new "data-driven" journalism courses for both undergraduate and graduate students.
"What we're trying to do is to discover the patterns in the data and then tell the stories," he said.
No longer the domain of number crunchers and bean counters, data-centric journalism is pushing the boundaries of how to tell deep and meaningful stories to the reading public. At Stanford, students can now learn how to create a mix of sophisticated, explanatory narratives and interactive graphics that weren't possible before.
For the 2014-15 academic year, Stanford hired two lecturers focused on data journalism, Dan Nguyen and Cheryl Phillips. And, this summer the university will open a new computational journalism lab.
"We're trying to do two things – lower the cost of delivering stories, and tell them in a different, more personalized way," said Hamilton, explaining that data and algorithms can help reporters search vast amounts of information more quickly than ever before.
An economist by training, Hamilton envisions Stanford, in the heart of Silicon Valley, becoming a leader in the field of data-driven journalism. Many major public policy issues – from transportation to political donations, and indeed any field that generates data – can be examined through computational journalism.
"We're empowering students to produce work that is multimedia, data-intensive, entrepreneurial and influenced by design thinking. This has the potential to change laws and policy," Hamilton said.
On April 16, the journalism program will host a symposium, "Corruption: Who Plays? Who Pays?" featuring an expert panel discussion on how to explore corruption in politics through big data and journalism. The symposium is free and open to the public.
Panelists include Zephyr Teachout, a law professor at Fordham University and author of Corruption in America; Derek Willis, correspondent for the New York Times; and Justin Grimmer, a Stanford political science professor.
This will be the second in a series of conferences on big data and journalism. In February, Stanford hosted "Data Driven: Coding and Writing Transportation's Future," which included discussions on the evolving landscape of vehicle and transportation data and its journalistic opportunities.
With big data, what can one learn about public policy? Already, the Stanford effort has highlighted databases on transportation issues, for example – New York taxi ride information, vehicle crash test results, bridge safety evaluations and traffic tracking.
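As a hypothetical illustration of the kind of analysis such datasets invite – the records and field names below are invented for this sketch, not drawn from any real dataset – a reporter might compute a simple aggregate over ride data to surface a lead worth deeper reporting:

```python
import statistics

# Hypothetical sample of taxi-ride records (invented for illustration;
# real trip-record datasets have many more fields and millions of rows).
rides = [
    {"borough": "Manhattan", "fare": 12.5, "miles": 2.1},
    {"borough": "Manhattan", "fare": 9.0,  "miles": 1.4},
    {"borough": "Brooklyn",  "fare": 21.0, "miles": 6.3},
    {"borough": "Brooklyn",  "fare": 18.5, "miles": 5.8},
    {"borough": "Queens",    "fare": 30.0, "miles": 11.2},
]

# Group fare-per-mile ratios by borough.
by_borough = {}
for r in rides:
    by_borough.setdefault(r["borough"], []).append(r["fare"] / r["miles"])

# Average fare per mile by borough -- the kind of quick summary that
# can prompt the question "why is one area so much more expensive?"
summary = {b: round(statistics.mean(v), 2) for b, v in by_borough.items()}
print(summary)
```

The point is not the arithmetic but the workflow: a few lines of code turn a raw public dataset into a question a journalist can then investigate through traditional reporting.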
And that is only the tip of the iceberg, Hamilton said. Sifting through all the data takes time and effort, but software systems now exist to do so more efficiently. A bigger obstacle is often "rational ignorance" – when the cost of educating oneself on an issue exceeds the potential benefit that the knowledge would provide.
Rational ignorance has always hindered coverage of public affairs by media companies, according to Hamilton. This was made worse by the downsizing of the journalism field over the past two decades, as budgets were depleted for the type of comprehensive, investigative journalism that delivers knowledge on key societal issues.
"But the recent explosion of data has given us many opportunities for analysis," he said. "People want to know more about products, entertainment and voting choices, for example."
Graduate student Carolina Wilson said the Department of Communication offers myriad data-journalism tools for student reporting projects. But no matter the amount of data, in the end a good story is a good story.
"I think that there is great value in finding or basing stories in data that is regularly available but rarely digested by the public. I believe that the job of a journalist is that of a storyteller," said Wilson, a recent graduate of the University of Notre Dame, where she served as the senior class president.
Sometimes to tell a story worth telling one needs to find the numbers buried deep in a complex report, she said.
"Data analysis also gives journalists an avenue through which they can become better public servants, by digesting normally incomprehensible data and telling our community a story that explains why it should matter," Wilson added.
For Tobin Asher, an undergraduate student in communication, journalism represents a field that should challenge people to critically evaluate what they think they know.
Such assessments usually come from a deeper understanding of the facts beyond who did and said what. "A data-driven approach to journalism aids in this endeavor by attempting to explain underlying causes," he said.
In investigative journalism, "data helps to connect the dots," Asher said. For example, he noted a recent newspaper story that used data trends on prisoners from the past few decades. It showed that increased incarceration rates no longer result in decreasing crime rates.
It is to this type of big-picture issue that Allison McCartney, a graduate student in journalism, plans to apply her newfound skills. She is especially interested in finding stories in government spending data.
"Someone told me when I first started working with defense contractor data that it's a window into what the Defense Department wants to do, as opposed to so many records that tell you what the government has already done," said McCartney, who worked for PBS NewsHour as an editor before coming to Stanford.
She noted that the challenge is drawing insight from these numbers, then "connecting them to the lives and work of real people to learn what they mean."
'A natural fit'
Dan Nguyen, one of the new lecturers, has experience as a newspaper reporter and in data journalism for media companies and nonprofits. The use of data in understanding the world is not exactly a novel way of thinking, he said, but it is increasingly becoming the language and foundation for how the world thinks and operates.
"That alone to me makes the case for being more computationally aware in order to make sense of the world, especially for everyone who wants to find and tell stories," said Nguyen.
He describes this approach as a natural fit for Stanford: "Not just because of Stanford's technical reputation, but because of the university's overall enthusiasm and discipline toward exploring new ways of learning and building."
Stanford is also home to the David and Helen Gurley Brown Institute for Media Innovation, which is a collaboration between Stanford and Columbia universities to support new endeavors in media innovation. At Stanford, the primary focus is on media technology, and the institute is located in the School of Engineering.
Jay Hamilton, Department of Communication: (650) 723-5448, firstname.lastname@example.org
Clifton B. Parker, Stanford News Service: (650) 725-0224, email@example.com