Emerging Researcher Perspectives: Replication as a Credible Pre-Analysis Plan

Guest post by Raphael Calel, Ciriacy-Wantrup Postdoctoral Fellow at the Department of Agricultural and Resource Economics at the University of California, Berkeley.


One of the most important tools for enhancing the credibility of research is the pre-analysis plan, or PAP. Simply put, we feel more confident in someone’s inferences if we can verify that they weren’t data mining, engaging in motivated reasoning, or otherwise manipulating their results, knowingly or unknowingly. By publishing a PAP before collecting data, and then closely following that plan, researchers can credibly demonstrate to us skeptics that their analyses were not manipulated in light of the data they collected.

Still, PAPs are credible only when the researcher can anticipate and wait for the collection of new data. The vast majority of social science research, however, does not satisfy these conditions. For instance, while it is perfectly reasonable to test new hypotheses about the causes of the recent financial crisis, it is unreasonable to expect researchers to have pre-specified their analyses before the crisis hit. To give another example, no one analysing a time series of more than a couple of years can reasonably be expected to publish a PAP and then wait for years or decades before implementing the study. Most observational studies face this problem in one form or another.


New York Times Covers TOP Guidelines

Yesterday in Science, the Transparency and Openness Promotion (TOP) Committee published the TOP Guidelines, referred to by the New York Times as “the most comprehensive guidelines for the publication of studies in basic science to date” (see here). The guidelines are the output of a November 2014 meeting at the Center for Open Science (COS), co-hosted with BITSS and Science Magazine.

Transparency and Openness Promotion Guidelines

By Garret Christensen (BITSS)


BITSS is proud to announce the publication of the Transparency and Openness Promotion Guidelines in Science. The Guidelines are a set of standards in eight areas of research publication:

  • Citation Standards
  • Data Transparency
  • Analytic Methods (Code) Transparency
  • Research Materials Transparency
  • Design and Analysis Transparency
  • Preregistration of Studies
  • Preregistration of Analysis Plans
  • Replication


Emerging Researcher Perspectives: Get it Right the First Time!

Guest post by Olivia D’Aoust, Ph.D. in Economics from Université libre de Bruxelles, and former Fulbright Visiting Ph.D. student at the University of California, Berkeley.


As a Fulbright PhD student in development economics from Brussels, my experience this past year on the Berkeley campus has been eye-opening. In particular, I discovered a new movement toward improving the standards of openness and integrity in economics, political science, psychology, and related disciplines, led by the Berkeley Initiative for Transparency in the Social Sciences (BITSS).

When I first discovered BITSS, it struck me how little I knew about research on research in the social sciences, the pervasiveness of fraud in science in general (from data cleaning and specification searching to faking data altogether), and the basic lack of consensus on the right and wrong ways to do research. These issues are essential, yet too often they are left by the wayside. Transparency, reproducibility, replicability, and integrity are the building blocks of scientific research.


Advisory Board Established for Project TIER

Guest post by Richard Ball and Norm Medeiros, co-principal investigators of Project TIER at Haverford College.


Project TIER (Teaching Integrity in Empirical Economics) is pleased to announce its newly-established Advisory Board. The advisors – George Alter (ICPSR), J. Scott Long (Indiana University), Victoria Stodden (University of Illinois at Urbana-Champaign), and Justin Wolfers (Peterson Institute/University of Michigan) – will help project directors Richard Ball and Norm Medeiros consider ways of developing and promoting the TIER protocol for documenting empirical research.

The guiding principle behind the protocol is that the documentation (data, code, and supplementary information) should be complete and transparent enough to allow an interested third party to easily and exactly reproduce all the steps of data management and analysis that led from the original data files to the results reported in the paper. The ultimate goal of Project TIER is to foster development of a national network of educators committed to integrating methods of empirical research documentation, guided by the principle of transparency, into the curricula of the social sciences.
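To make the guiding principle concrete, here is a minimal sketch of what a pipeline documented in this spirit might look like: one script that takes the original data file through data management to the results reported in the paper. This is an illustration only, not material from the TIER protocol itself, and all file, column, and function names are hypothetical.

```python
"""Minimal sketch of a TIER-style replication pipeline (hypothetical file and
variable names). Running this single script rebuilds the analysis data from
the untouched original data file and regenerates the reported results, so a
third party can reproduce every step without undocumented manual work."""

from pathlib import Path

import numpy as np
import pandas as pd

ORIGINAL_DATA = Path("original-data/survey_raw.csv")    # original file, never edited by hand
ANALYSIS_DATA = Path("analysis-data/survey_clean.csv")  # constructed by this script
RESULTS_TABLE = Path("output/table1.csv")               # table reported in the paper


def build_analysis_data() -> pd.DataFrame:
    """Data management: every transformation of the original data is scripted."""
    raw = pd.read_csv(ORIGINAL_DATA)
    clean = raw.dropna(subset=["income", "education"])  # drop incomplete observations
    clean = clean[clean["income"] > 0].copy()           # keep strictly positive incomes
    clean["log_income"] = np.log(clean["income"])       # constructed variable
    ANALYSIS_DATA.parent.mkdir(exist_ok=True)
    clean.to_csv(ANALYSIS_DATA, index=False)            # save the analysis data set
    return clean


def make_results(df: pd.DataFrame) -> None:
    """Analysis: regenerate exactly the numbers reported in the paper."""
    table1 = df.groupby("education")["log_income"].agg(["mean", "std", "count"])
    RESULTS_TABLE.parent.mkdir(exist_ok=True)
    table1.to_csv(RESULTS_TABLE)


if __name__ == "__main__":
    make_results(build_analysis_data())
```

The specific commands matter less than the property the protocol asks for: the entire chain from original data to reported results is scripted, so an interested third party can rerun it exactly.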


New Advisory Board Member: Paul Romer

BITSS is delighted to announce that we’ve added a new member to our advisory board: economist Paul Romer. Paul is a prominent economic theorist who has made major contributions to our understanding of economic growth, technological change, and urbanization. He is currently Professor of Economics at NYU, director of the Marron Institute of Urban Management, and director of the Urbanization Project at the Leonard N. Stern School of Business. He has previously taught at Stanford University’s Graduate School of Business, the University of California, Berkeley, the University of Chicago, and the University of Rochester. You can learn more about him and the other advisory board members here, or you can view Paul’s own website.

In work related to transparency, Paul recently published a paper in the Papers & Proceedings issue of The American Economic Review, along with several related blog posts, on “mathiness” in economic theory models. The paper and blog posts have attracted considerable interest and sparked a fascinating debate among economic theorists.


Influential Paper on Gay Marriage Might Be Marred by Fraudulent Data

Harsh scrutiny of an influential political science experiment highlights the importance of transparency in research.


The paper, from UCLA graduate student Michael LaCour and Columbia University Professor Donald Green, was published in Science in December 2014. It asserted that short conversations with gay canvassers could not only change people’s minds on a divisive social issue like same-sex marriage, but could also have a contagious effect on the relatives of those in contact with the canvassers. The paper received wide attention in the press.

Yet three days ago, two graduate students from UC Berkeley, David Broockman and Joshua Kalla, published a response to the study, pointing to a number of statistical oddities and discrepancies between how the experiment was reported and how the authors said it was conducted. Earlier in the year, impressed by the paper’s findings, Broockman and Kalla had attempted to conduct an extension of the study, building on the original data set. This is when they became aware of irregularities in the study’s methodology and decided to notify Green.

Reviewing the comments from Broockman and Kalla, Green, who was not involved in the original data collection, quickly became convinced that something was wrong – and on Tuesday, he submitted a letter to Science requesting the retraction of the paper. Green shared his view on the controversy in a recent interview, reflecting on what it meant for the broader practice of social science and highlighting the importance of integrity in research.


