Why Investigative Journalists Should Report on Lax Oversight and Fraud in Research Data

Uri Simonsohn is an outspoken advocate for open science — adding transparency to the research process and helping researchers share what they’ve learned in greater detail with a broad audience.

Many people know Simonsohn for his data analyses on Data Colada, a blog about social science research he writes with two other behavioral scientists, Leif Nelson and Joseph Simmons. The three scholars, who co-direct the Wharton Credibility Lab at the University of Pennsylvania, occasionally use the blog to spotlight evidence of suspected fraud they’ve found in academic papers.

In his role at the Credibility Lab and as a professor at Esade Business School in Barcelona, Simonsohn travels to speak on issues around scientific integrity and data science. During his recent visit to Harvard University, The Journalist’s Resource asked for his thoughts on how journalists can improve their coverage of academic fraud and misconduct.

Here are three big takeaways from our conversation.

1. Before Covering Academic Studies, Ask Researchers About Preregistration

Preregistration is “the practice of documenting your research plan at the beginning of your study and storing that plan in a read-only public repository such as OSF Registries or the National Library of Medicine’s Clinical Trials Registry,” according to the nonprofit Center for Open Science. Simonsohn says preregistration helps prevent research fraud. When researchers create a permanent record outlining how they intend to conduct a study before they start, they are discouraged from changing parts of their study — for instance, their hypothesis or study sample — to get a certain result.

Simonsohn adds that preregistration also reduces what’s known as “p-hacking,” or manipulating an analysis of data to make it seem as though patterns in the data are statistically significant when they are not. Examples of p-hacking: Adding more data or control variables to change the result or deciding after the analysis is complete to exclude some data. (For more on statistical significance, read our tip sheet on the topic.)
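
To make this concrete, here is a minimal simulation sketch (in Python; the code, its parameters, and the specific scenario are illustrative assumptions, not from the article) of one such tactic: collecting more data and re-running the test until the result looks significant. Even when there is no real effect, this kind of "peeking" pushes the false-positive rate well above the nominal 5%.

```python
# Illustrative sketch of one p-hacking tactic: "optional stopping."
# Two groups of pure noise are compared, with the test re-run after each
# new batch of data and the study stopped as soon as p < 0.05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def p_hacked_experiment(max_n=200, start_n=20, step=10, alpha=0.05):
    """Run a two-group study with NO true effect, peeking after every batch."""
    a = rng.normal(size=max_n)   # group A: pure noise
    b = rng.normal(size=max_n)   # group B: pure noise
    for n in range(start_n, max_n + 1, step):
        p = stats.ttest_ind(a[:n], b[:n]).pvalue
        if p < alpha:            # stop as soon as the result looks "significant"
            return True
    return False

false_positives = sum(p_hacked_experiment() for _ in range(2000))
print(f"False-positive rate with peeking: {false_positives / 2000:.1%}")  # well above 5%
```

Preregistering the sample size and analysis plan in advance removes the researcher's freedom to keep testing until a result like this appears.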

Preregistration is particularly important when researchers will be collecting their own data, Simonsohn points out. It’s easier to alter or fabricate data when you collect it yourself, especially if there’s no expectation to share the raw data.

While preregistration is the norm in clinical trials, it's less common in other research fields. About half of psychology research is preregistered, as is about a quarter of marketing research, Simonsohn says. A substantial proportion of economic research is not, however, because it often relies on data collected by other researchers or by nonprofit organizations and government agencies such as the US Census Bureau.

Simonsohn urges journalists to ask researchers whether they preregistered their studies before reporting on them. He likens reporting on research that isn't preregistered to driving a car that hasn't been inspected. The car might be perfectly safe, but you can't be sure because no one has had a chance to look under the hood.

“If the person says ‘no,’ [the journalist] could ask, ‘Oh, how come?’” he says. “And if they don’t provide a compelling reason, the journalist could say ‘You know, I’m not going to cover work that hasn’t been preregistered, without a good rationale.’”

Research registries themselves can be a helpful resource for journalists. The Center for Open Science lets the public search for and read the thousands of preregistered research plans on its Open Science Framework platform. Researchers who preregister their work at AsPredicted, a platform Simonsohn helped create for the Wharton Credibility Lab, can choose whether and when to make their preregistered research plan public.

2. Report on the Lack of Oversight of Research Data Collection

Journalists and the public probably don’t realize how little oversight there is when it comes to collecting and analyzing data for research, Simonsohn says. That includes research funded by the federal government, which gives colleges, universities and other organizations billions of dollars a year to study public health, climate change, new technology and other topics.

Simonsohn says there’s no system in place to ensure the integrity of research data or its analysis. Although federal law requires research involving human subjects to be reviewed by an Institutional Review Board, the primary goal of these independent committees is protecting the welfare and rights of study participants.

Academic papers are reviewed by a small group of experts before a scholarly journal will publish them. But the peer-review process isn’t designed to catch research fraud. Reviewers typically do not check the authors’ work to see if they followed the procedures they say they followed to reach their conclusions.

Simonsohn says journalists should investigate the issue and report on it.

“The lack of protection against fraud is a story that deserves to be written,” he says. “When I teach students, they’re shocked. They’re shocked that when you submit a paper to a journal, [the journal is] basically trusting you without any safeguards. You’re not even asked to assert in the affirmative that you haven’t done anything wrong.”

Journalists should also examine ways to prevent fraud, he adds. He thinks researchers should be required to submit "data receipts" to organizations that provide grant funding, showing who has accessed, changed, or analyzed a study's raw data and when. This record keeping would be similar to the chain-of-custody process that law enforcement agencies follow to maintain the legal integrity of the physical evidence they collect.

“That is, by far, the easiest way to stop most of it,” Simonsohn says.
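
As a purely hypothetical illustration of that idea (the field names, roles, and hash-chaining below are assumptions for the sketch, not Simonsohn's specification), a data receipt could be as simple as an append-only log in which each entry records who touched the raw data, what they did, and when, with each entry hashed together with the previous one so later tampering is detectable.

```python
# Hypothetical sketch of a "data receipt" log: each entry records one access
# to the raw data and is chained to the previous entry by a SHA-256 hash,
# so any later edit to the history changes the hashes that follow it.
import hashlib
import json
import time

def add_receipt(log, person, action, data_file_hash):
    """Append a chained record of one interaction with the raw data."""
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "person": person,
        "action": action,               # e.g. "collected", "cleaned", "analyzed"
        "data_file_hash": data_file_hash,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return log

log = []
add_receipt(log, "research assistant", "collected", hashlib.sha256(b"raw.csv v1").hexdigest())
add_receipt(log, "principal investigator", "analyzed", hashlib.sha256(b"raw.csv v1").hexdigest())
```

A funder receiving such a log alongside a manuscript could verify, at minimum, that the analyzed file matches the one that was collected and that no undocumented hands touched it in between.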

3. Learn About Open Science Practices and the Scientists Who Expose Problematic Research

Nearly 200 countries have agreed to follow the common standards for open science that the United Nations Educational, Scientific and Cultural Organization (UNESCO) created in 2021. In December, UNESCO released a status report on initiatives launched in different parts of the globe to help researchers work together in the open and share what they’ve learned in detail with other researchers and the public. The report notes, for example, that a rising number of countries and research organizations have developed open data policies.

As of January 2024, more than 1,100 open science policies had been adopted by research organizations and research funders worldwide, according to the Registry of Open Access Repositories Mandatory Archiving Policies, which tracks policies requiring researchers to make their “research output” public.

In the US, the universities and university departments that have adopted these policies include Johns Hopkins University, University of Central Florida, Stanford University’s School of Education and Columbia University’s School of Social Work. Such policies also have been adopted at Harvard Kennedy School and one of its research centers, the Shorenstein Center on Media, Politics and Public Policy, which is where The Journalist’s Resource is housed.

Watchdog Nick Brown exposed problems with a Cornell nutrition professor’s research, leading to the retractions of six academic papers as well as the professor’s resignation. Image: Screenshot, Science magazine

Simonsohn recommends journalists learn about open science practices and familiarize themselves with research watchdogs such as Nick Brown, known for helping expose problems in published studies by prominent nutrition scientist Brian Wansink.

Retraction Watch, a website that tracks research retractions, maintains a list of more than two dozen scientific sleuths. Elisabeth Bik, a microbiologist and science integrity consultant who has been called “the public face of image sleuthing,” was a guest speaker in The Journalist’s Resource’s recent webinar on covering research fraud and errors.

Here are some of the open science organizations that journalists covering these issues will want to know about:

This story was originally published on The Journalist’s Resource and is reprinted here with permission. 


Denise-Marie Ordway joined The Journalist’s Resource in 2015 after working as a reporter for newspapers and radio stations in the US and Central America, including the Orlando Sentinel and Philadelphia Inquirer. Her work has also appeared in publications such as USA TODAY, The New York Times, the Chicago Tribune and The Washington Post. She has received numerous national, regional and state-level journalism awards and was named a Pulitzer Prize finalist in 2013 for an investigative series she led on hazing and other problems at Florida A&M University. Ordway was a 2014-15 Fellow of Harvard’s Nieman Foundation for Journalism. She also serves on the board of directors of the Education Writers Association.
