The Persistence of False Paradigms in Low-Power Sciences

By Pascal Michaillat (Brown University). It is commonly believed that the lack of experimental evidence typical of the social sciences slows but does not prevent the replacement of existing theories by newer, better ones. A simple model of scientific research and promotion challenges that belief, however. In the model, scientists are slightly…
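
The excerpt stops before the model's details, so here is a purely hypothetical sketch, in Python, of the kind of mechanism the abstract describes: senior reviewers slightly favor juniors who share their paradigm, and low-powered evidence only occasionally converts a junior to the better paradigm. The promotion rule and every parameter value are my own illustrative assumptions, not Michaillat's actual model.

```python
import random

GENERATIONS = 200  # promotion cycles simulated
COHORT = 100       # senior slots filled per cycle
BIAS = 0.60        # reviewers' slight preference for their own paradigm
POWER = 0.02       # chance that weak evidence converts a junior

def next_share(old_share, rng):
    """Share of newly promoted seniors holding the old (false) paradigm."""
    promoted_old = 0
    for _ in range(COHORT):
        while True:  # draw candidate/reviewer pairs until one is accepted
            junior_old = rng.random() < old_share * (1 - POWER)
            reviewer_old = rng.random() < old_share
            accept = BIAS if junior_old == reviewer_old else 1 - BIAS
            if rng.random() < accept:
                promoted_old += junior_old
                break
    return promoted_old / COHORT

rng = random.Random(1)
share = 0.9  # the false paradigm starts dominant
for _ in range(GENERATIONS):
    share = next_share(share, rng)
print(f"Share holding the false paradigm after {GENERATIONS} cycles: {share:.2f}")
```

With these assumed numbers the old paradigm settles into a durable majority in this run rather than dying out; pushing BIAS toward 0.5 or raising POWER lets the new paradigm take over, which is the low-power point the title makes.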

Interpretation of study results (Part 2/2): A reproducible method

Guest post by Arnaud Vaganay (Meta-Lab). This post is the second of two dedicated to the reproducible interpretation of empirical results in the social sciences. Read part 1 here. In my previous post on the interpretation of study results, I contrasted the notions of: Analytic reproducibility, which is concerned with the reproducibility…

Transparency and Trust in the Research Ecosystem

CEGA launched the Berkeley Initiative for Transparency in the Social Sciences (BITSS) based on the argument that more transparency in research could address underlying factors driving publication bias, unreliable research findings, a lack of reproducibility in the published literature, and a problematic incentive structure within the research ecosystem. Meanwhile, an ever-increasing number…

Open Science and a Culture of Health: You Two Should Talk

Guest post by Sean Grant and Kathryn Bouskill (RAND Corporation). The Open Science movement aims to advance scientific progress by making research more open and accessible. The Culture of Health movement aims to advance collective well-being by making health a shared value and priority among all policy sectors, not just healthcare. Proponents…

Review of Stata’s dyndoc

Guest post: Tomas Dvorak is a Professor of Economics at Union College and a former Project TIER fellow. This is a repost. The original post can be found here. As a huge fan of Stata, I was super-excited about dynamic markdown documents newly available in the latest Stata 15 release. I played with…

R for Stata Users

Garret Christensen, BITSS Project Scientist. For whatever reason, economists use a lot of Stata. It does what we want to do (data cleaning, regression analysis, data visualization) well, and the $1,000 fees we pay every other version or so don’t seem to have stopped its widespread adoption. But is that changing, and…

DART Statement Pushback and Response

Garret Christensen, BITSS Project Scientist. Hopefully you’re familiar with the DART Statement, a set of recommendations on Data Access and Research Transparency from APSA. (If you’re not, it’s basically political science journals getting together and saying authors will have to share their data in a trusted repository in order to be published.) There’s…

Who Inspired the Leamer-Rosenthal Prizes? Part II – Ed Leamer

Guest post by Edward Leamer, UCLA Professor of Economics & Statistics. I became interested in methodological issues as a University of Michigan graduate student from 1967 to 1970, watching the economics faculty build an econometric macro model in the basement of the building (the Michigan Model), and comparing how these same faculty members described…

The BITSS Take on "wormwars" and Replication Writ Large

Garret Christensen, BITSS Project Scientist. If you’re a development economist, or at all interested in research transparency, I assume you’ve heard about the recent deworming replication controversy. (If you were lucky enough to miss “wormwars,” you can catch up on just about everything with this one set of links on storify.com). Here…

Emerging Researcher Perspectives: Replication as a Credible Pre-Analysis Plan

One of the most important tools for enhancing the credibility of research is the pre-analysis plan, or PAP. Simply put, we feel more confident in someone’s inferences if we can verify that they weren’t data mining, engaging in motivated reasoning, or otherwise manipulating their results, knowingly or unknowingly. By publishing a…

Emerging Researcher Perspectives: Get it Right the First Time!

Guest post by Olivia D’Aoust, Ph.D. in Economics from Université libre de Bruxelles, and former Fulbright Visiting Ph.D. student at the University of California, Berkeley. As a Fulbright Ph.D. student in development economics from Brussels, I found my year on the Berkeley campus eye-opening. In particular, I discovered…

P-values are Just the Tip of the Iceberg

Roger Peng and Jeffrey Leek of Johns Hopkins University claim that “ridding science of shoddy statistics will require scrutiny of every step, not merely the last one.” This blog post originally appeared in Nature on April 28, 2015 (see here). There is no statistic more maligned than the P value. Hundreds of papers and…
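
Peng and Leek’s “every step” warning is easy to demonstrate. The sketch below (my own illustration, not from their piece) simulates a study with ten outcome variables and no true effects; reporting whichever outcome yields the smallest p-value pushes the false-positive rate far above the nominal 5%, even though each individual test is perfectly valid.

```python
import math
import random

def z_test_p(x, y):
    """Two-sample z-test p-value (normal approximation)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    z = (mx - my) / math.sqrt(vx / nx + vy / ny)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

rng = random.Random(0)
SIMS, N, OUTCOMES = 2000, 50, 10
false_pos = 0
for _ in range(SIMS):
    # the treatment affects none of the ten outcomes
    pvals = []
    for _ in range(OUTCOMES):
        treated = [rng.gauss(0, 1) for _ in range(N)]
        control = [rng.gauss(0, 1) for _ in range(N)]
        pvals.append(z_test_p(treated, control))
    false_pos += min(pvals) < 0.05
print(f"Realized false-positive rate at nominal 5%: {false_pos / SIMS:.1%}")
```

Since the ten tests are independent, the chance that at least one clears 5% is 1 - 0.95^10, roughly 40%: the reported p-value is fine; the hidden outcome search is the iceberg.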

Scientists Have a Sharing Problem

On December 15th, Maggie Puniewska posted an article in The Atlantic summarizing the obstacles preventing researchers from sharing their data. The article asks: if “science has traditionally been a field that prizes collaboration […] then why [are] so many scientists stingy with their information?” Puniewska outlines the most-cited reasons scientists refrain…

Psychology’s Credibility Crisis

In a recent interview appearing in Discover Magazine, Brian Nosek, co-founder of the Center for Open Science and speaker at the upcoming BITSS Annual Meeting, discusses the credibility crisis in psychology. According to the article, psychology has lost much of its credibility after a series of published papers were revealed as fraudulent and many other…

Facilitating Radical Change in Publication Standards: Overview of COS Meeting Part II

Originally posted on the Open Science Collaboration by Denny Borsboom. This train won’t stop anytime soon. That’s what I kept thinking during the two-day sessions in Charlottesville, where a diverse array of scientific stakeholders worked hard to reach agreement on new journal standards for open and transparent scientific reporting. The proposed standards are intended…

What to Do If You Are Accused of P-Hacking

In a recent post on Data Colada, University of Pennsylvania Professor Uri Simonsohn discusses what to do in the event you (a researcher) are accused of having altered your data to increase statistical significance. Simonsohn states: It has become more common to publicly speculate, upon noticing a paper with unusual analyses, that a reported finding was…
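
For readers unfamiliar with the term, here is an illustration of one common p-hacking pattern (my example, not Simonsohn’s): “optional stopping,” where a researcher tests after every batch of subjects and stops as soon as p < 0.05. Under a true null, this inflates the Type I error rate well above the nominal 5%.

```python
import math
import random

def one_sample_p(xs):
    """One-sample z-test against a zero mean (normal approximation)."""
    n = len(xs)
    m = sum(xs) / n
    sd = math.sqrt(sum((v - m) ** 2 for v in xs) / (n - 1))
    z = m / (sd / math.sqrt(n))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

rng = random.Random(0)
SIMS, BATCH, MAX_N = 2000, 10, 200
stopped_early = 0
for _ in range(SIMS):
    xs = []  # the true effect is exactly zero
    while len(xs) < MAX_N:
        xs.extend(rng.gauss(0, 1) for _ in range(BATCH))
        if one_sample_p(xs) < 0.05:  # peek after every batch
            stopped_early += 1
            break
print(f"False-positive rate with peeking: {stopped_early / SIMS:.1%}")
```

The honest defense against such an accusation is evidence that the analysis was fixed in advance, which is exactly what pre-analysis plans, discussed elsewhere on this blog, provide.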

Scientific consensus has gotten a bad reputation—and it doesn’t deserve it

In a recent post, John Timmer, senior science editor at Ars Technica, defends the importance of consensus, opening with the following quote from author Michael Crichton: Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator…

The 10 Things Every Grad Student Should Do

In a recent post on the Data Pub blog, Carly Strasser provides a useful transparency guide for newcomers to the world of empirical research. Below is an adapted version of that post. 1. Learn to code in some language. Any language. Strasser begins her list by urging students to learn a programming language. As the limitations of…

Can Greater Transparency Lead to Better Social Science?

In a recent article on the Monkey Cage, professors Mike Findley, Nathan Jensen, Edmund Malesky and Tom Pepinsky discuss publication bias, the “file drawer problem,” and how a special issue of the journal Comparative Political Studies will help address these problems. Similar to a recent article by Brendan Nyhan, reposted on the BITSS blog, the university professors writing…

To Get More Out of Science, Show the Rejected Research

In a recent opinion piece on The Upshot, the New York Times news site, Brendan Nyhan, an assistant professor of government at Dartmouth College, comments on a host of transparency-related issues. Closely echoing the mission of BITSS, Nyhan identifies the potential of research transparency to improve the rigor and ultimately the benefits…

Africa’s Data Revolution – Amanda Glassman

Interview originally posted on the Global Poverty Wonkcast: Is the revolution upon us? When it comes to data, the development world seems to be saying yes, Yes, YES! To look beyond the hype, I invited Amanda Glassman, a CGD senior fellow and director of our global health policy program, to join me…

Can Post-Publication Peer-Review Increase Research Transparency?

Guest Post by Liz Allen (ScienceOpen). For the 3rd annual conference of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), ScienceOpen, the new Open Access (OA) research + publishing network, would like prospective and registered attendees to consider the role that Post-Publication Peer Review (PPPR) can play in increasing the transparency…

The Controversy of Preregistration in Social Research

Guest post by Jamie Monogan (University of Georgia). A conversation is emerging in the social sciences over the merits of study registration and whether it should be the next step we take in raising research transparency. The notion of study registration is that, prior to observing outcome data, a researcher can publicly…

Bias Minimization Lessons from Medicine – How We Are Leaving a $100 Bill on the Ground

By Alex Eble (Brown University), Peter Boone (Effective Intervention), and Diana Elbourne (University of London). The randomized controlled trial (RCT) now has pride of place in much applied work in economics and other social sciences. Economists increasingly use the RCT as a primary method of investigation, and aid agencies such as the World…

The Role of Failure in Promoting Transparency

By Carson Christiano (CEGA). You may wonder why a network of development researchers is taking the lead on a transparency initiative. The answer lies in the profound and omnipresent power of failure. Most would agree that risk-taking is essential to innovation, whether we’re talking about creating a simple hand-washing station or a…

Research Transparency in the Natural Sciences: What can we learn?

By Temina Madon (CEGA, UC Berkeley). As we all know, experimentation in the natural sciences far predates the use of randomized controlled trials (RCTs) in medicine and the social sciences; some of the earliest controlled experiments were conducted in the 1920s by R.A. Fisher, an agricultural scientist evaluating new crop varieties across…
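
Fisher’s agricultural experiments also gave us randomization inference, which needs no distributional assumptions at all: compare the observed treatment-control difference against the differences produced by re-randomizing the labels. A minimal sketch, with hypothetical plot yields invented purely for illustration:

```python
import random

# Hypothetical yields for 5 treated and 5 control plots (made-up numbers).
yields_treated = [29.1, 30.4, 28.7, 31.2, 30.0]
yields_control = [27.5, 28.9, 27.0, 29.3, 28.1]

def mean_diff(t, c):
    return sum(t) / len(t) - sum(c) / len(c)

observed = mean_diff(yields_treated, yields_control)
pooled = yields_treated + yields_control
rng = random.Random(0)
DRAWS = 10000
extreme = 0
for _ in range(DRAWS):
    rng.shuffle(pooled)  # re-randomize which 5 plots count as "treated"
    if abs(mean_diff(pooled[:5], pooled[5:])) >= abs(observed):
        extreme += 1
print(f"Randomization p-value: {extreme / DRAWS:.3f}")
```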

Transparency-Inducing Institutions and Legitimacy

By Kevin M. Esterling (Political Science, UC Riverside). Whenever I discuss the idea of hypothesis preregistration with colleagues in political science and in psychology, the reactions I get typically range from resistance to outright hostility. These colleagues obviously understand the limitations of research founded on false positives and data overfitting. They are even…

The Need for Pre-Analysis: First Things First

By Richard Sedlmayr (Philanthropic Advisor). When we picture a desperate student running endless tests on his dataset until some feeble point finally meets statistical reporting conventions, we are quick to dismiss the results. But the underlying issue is ubiquitous: it is hard to analyze data without getting caught in hypothesis drift,…

Freedom! Pre-Analysis Plans and Complex Analysis

By Gabriel Lenz (UC Berkeley). Like many researchers, I worry constantly about whether findings are true or merely the result of a process variously called data mining, fishing, capitalizing on chance, or p-hacking. Since academics face extraordinary incentives to produce novel results, many suspect that “torturing the data until it speaks” is…

Transparency and Pre-Analysis Plans: Lessons from Public Health

By David Laitin (Political Science, Stanford). My claim in this blog entry is that political science will remain principally an observation-based discipline and that our core principles of establishing findings as significant should consequently be based upon best practices in observational research. This is not to deny that there is an expanding…

Targeted Learning from Data: Valid Statistical Inference Using Data Adaptive Methods

By Maya Petersen, Alan Hubbard, and Mark van der Laan (Public Health, UC Berkeley). Statistics provide a powerful tool for learning about the world, in part because they allow us to quantify uncertainty and control how often we falsely reject null hypotheses. Pre-specified study designs, including analysis plans, ensure that we understand…
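
One simple device in this spirit, sample splitting, fits in a few lines (an illustration of the general idea only, not the authors’ targeted learning estimator): search for the most promising of twenty candidate predictors on one half of the data, then test only that pre-chosen predictor on the held-out half, so the final p-value is untouched by the search.

```python
import math
import random

rng = random.Random(0)
N, K = 200, 20
X = [[rng.gauss(0, 1) for _ in range(K)] for _ in range(N)]  # pure noise
y = [rng.gauss(0, 1) for _ in range(N)]                      # every null is true

def corr(a, b):
    """Pearson correlation of two equal-length lists."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    da = math.sqrt(sum((u - ma) ** 2 for u in a))
    db = math.sqrt(sum((v - mb) ** 2 for v in b))
    return num / (da * db)

half = N // 2
# exploration half: pick the predictor most correlated with the outcome
best = max(range(K),
           key=lambda j: abs(corr([X[i][j] for i in range(half)], y[:half])))
# confirmation half: test only that single pre-chosen predictor
r = corr([X[i][best] for i in range(half, N)], y[half:])
z = math.atanh(r) * math.sqrt(half - 3)  # Fisher z-transform
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"Held-out p-value for the selected predictor: {p:.2f}")
```

Because the confirmation half never saw the search, this p-value is uniformly distributed under the null, no matter how aggressive the exploration was.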

Monkey Business

By Macartan Humphreys (Political Science, Columbia & EGAP). I am sold on the idea of research registration. Two things convinced me. First, I have been teaching courses in which each week we try to replicate prominent results produced by political scientists and economists working on the political economy of development. I advise…

Bayes’ Rule and the Paradox of Pre-Registration of RCTs

By Donald P. Green (Political Science, Columbia). Not long ago, I attended a talk at which the presenter described the results of a large, well-crafted experiment. His results indicated that the average treatment effect was close to zero, with a small standard error. Later in the talk, however, the speaker revealed that…
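
Green’s setup invites a quick Bayes’-rule calculation. In this toy version (my illustrative numbers, not his), a researcher starts at even odds between “the effect is 0.10” and “the effect is zero,” then observes an estimate of 0.01 with a standard error of 0.02:

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sd):
    """Density of N(mu, sd^2) at x."""
    return exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * sqrt(2 * pi))

estimate, se = 0.01, 0.02   # a precisely estimated near-zero effect
prior_works = 0.5           # even prior odds that the effect is 0.10
like_works = normal_pdf(estimate, 0.10, se)  # likelihood if it works
like_null = normal_pdf(estimate, 0.00, se)   # likelihood if it does not
posterior_works = prior_works * like_works / (
    prior_works * like_works + (1 - prior_works) * like_null
)
print(f"Posterior probability the treatment works: {posterior_works:.1e}")
```

With these numbers the posterior falls from 50% to a tiny fraction of a percent: a precise null is highly informative, which is part of what makes the design and registration of such experiments so consequential.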

An Open Discussion on Promoting Transparency in Social Science Research

By Edward Miguel (Economics, UC Berkeley). This CEGA Blog Forum builds on a seminal research meeting held at the University of California, Berkeley on December 7, 2012. The goal was to bring together a select interdisciplinary group of scholars – from biostatistics, economics, political science and psychology – with a shared interest…