Three Transparency Working Papers You Need to Read

Garret Christensen, BITSS Project Scientist


Several great working papers on transparency and replication in economics have been released in the last few months. Two of them, both about pre-analysis plans, are intended for a symposium in the Journal of Economic Perspectives, which I am very much looking forward to. The first, by Muriel Niederle and Lucas Coffman, doesn’t pull any punches with its title: “Pre-Analysis Plans are not the Solution, Replication Might Be.” Niederle and Coffman claim that PAPs don’t decrease the rate of false positives enough to be worth the effort, and that replication may be a better way to get at the truth. Part of their concern stems from the assumptions underlying PAPs: “[t]hat one published paper is the result of one pre-registered hypothesis, and that one pre-registered hypothesis corresponds to one experimental protocol. Neither can be guaranteed.” They’re also not crazy about design-based publications (or “registered reports”). They instead offer a proposal to get replication to take off, calling for the establishment of a Journal of Replication Studies and for researchers to start citing replications, both positive and negative, whenever they cite an original work. They claim that if these changes were made, researchers would begin to expect to see replications, and the value of writing and publishing them would therefore increase.

Another working paper on PAPs in economics, titled simply “Pre-Analysis Plans in Economics,” was released recently by Ben Olken. Olken gives a lot of useful background on the origins of PAPs and discusses in detail what should go into them. A reference I found particularly informative is “E9 Statistical Principles for Clinical Trials,” the FDA’s official guidance for trials, especially section V on Data Analysis Considerations. Many of the transparency practices we’re trying to adopt in economics and the social sciences come from medicine, so it’s nice to see the original source. He weighs the benefits (increasing confidence in results, making full use of statistical power, and improving relationships with partners such as governments or corporations that may have vested interests in the outcomes of trials) against the costs: the complexity and challenge of writing all possible papers in advance, the tendency of PAPs to push toward simple, less interesting papers with less nuance, and the reduced ability to learn ex post from your data. He cites Brodeur et al. to argue that the problem of false positives isn’t that large, and that, with the exception of trials involving parties with vested interests, the costs outweigh the benefits.
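To make the statistical-power point concrete, here is a rough sketch of my own (not from Olken’s paper; the effect size and standard error are entirely hypothetical): a single pre-registered primary outcome can be tested at the conventional 5% level, while the same comparison buried among ten exploratory outcomes would need something like a Bonferroni correction, which costs power.

    # Illustrative sketch only (not from Olken's paper); the effect size and
    # standard error below are hypothetical. Compares the power of a two-sided
    # z-test for one pre-specified outcome with the same test Bonferroni-
    # corrected for ten exploratory outcomes.
    from scipy.stats import norm

    def power_two_sided(effect, se, alpha):
        """Approximate power of a two-sided z-test."""
        z_crit = norm.ppf(1 - alpha / 2)
        delta = effect / se
        return norm.cdf(delta - z_crit) + norm.cdf(-delta - z_crit)

    effect, se = 0.20, 0.08  # hypothetical treatment effect and standard error
    print(power_two_sided(effect, se, alpha=0.05))       # single pre-specified test: ~0.71
    print(power_two_sided(effect, se, alpha=0.05 / 10))  # corrected for 10 outcomes: ~0.38

In this toy setup, pre-specifying the single primary outcome keeps power at roughly 70 percent, while correcting for ten post-hoc comparisons drops it below 40 percent.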


Call for Papers: Working Group in African Political Economy (WGAPE)

BITSS is co-sponsoring the 4th Working Group in African Political Economy (WGAPE) Annual Meeting, taking place May 29-30 at the Watson Institute for International and Public Affairs, Brown University.


WGAPE brings together faculty and advanced graduate students in Economics and Political Science who combine field research experience in Africa with training in political economy methods. Paper submissions must reflect WGAPE’s broad research agenda on core issues in the political economy of African development; submissions of works-in-progress are encouraged.

In addition, this year’s call for papers invites submissions of Pre-Analysis Plans (PAPs). One session at the meeting will be dedicated to the presentation and discussion of a PAP.

  • Please find the full call for papers here.
  • Papers must be uploaded here by 11:59pm PT on April 19th.
  • Successful applicants will be notified by May 1st and will be invited to attend the full symposium.
  • WGAPE will cover the cost of economy travel, accommodation and dining (capped).

For further information, please contact Elisa Cascardi (CEGA) at wgapeworkshop@gmail.com.

BITSS is hiring!

BITSS is seeking a Senior Program Manager and a Program Coordinator. Applications for both positions will be reviewed on a rolling basis until May 10 for employment beginning in June/July 2015.


The Senior Program Manager will be responsible for the overall management, leadership, and oversight of the initiative. The ideal candidate should have:

  • MA/MS in Economics, Political Science, Management Sciences, or a related discipline. PhDs are also encouraged to apply.
  • 5-10 years professional experience in an academic, non-profit, or social enterprise setting.
  • Demonstrated interest in and knowledge of empirical social science research, evidence-based policy, and research transparency.

The Program Coordinator will be responsible for the development and design of training and communication content for BITSS, including online courses, social media strategies, research dissemination plans, and outreach events. The ideal candidate should have:

  • BA/BS in Design, Communication, Computer Science, or Journalism; or BA/BS in Economics, Political Science, or related discipline, with a solid grasp of design and communication.
  • 1-3 years professional experience managing design, communication, or online content development.
  • Proven track record developing websites and online content, including proficiency with markup languages and platforms such as Weebly or WordPress.

Full job descriptions as well as application instructions can be found on the CEGA job page.

Registered Reports to the Rescue?

After writing an article for The Upshot, Brendan Nyhan (Assistant Professor at Dartmouth) was interviewed by The Washington Post.


The original Upshot article advocates for a new publishing structure called Registered Reports (RRs):

A research publishing format in which protocols and analysis plans are peer reviewed and registered prior to data collection, then published regardless of the outcome.

In the following interview with the Washington Post, Nyhan explains in greater detail why RRs are more effective than other tools at preventing publication bias and data mining. He begins by explaining the limitations of preregistration.

As I argued in a white paper, […] it is still too easy for publication bias to creep in to decisions by authors to submit papers to journals as well as evaluations by reviewers and editors after results are known. We’ve seen this problem with clinical trials, where selective and inaccurate reporting persists even though preregistration is mandatory.


TIER Faculty Fellowships 2015-16

Richard Ball, Associate Professor of Economics, and Norm Medeiros, Associate Librarian, of Haverford College, are co-principal investigators of Project TIER. They are seeking the first class of TIER Fellows to promote and extend teaching of transparent and reproducible empirical research methods.


Project TIER (Teaching Integrity in Empirical Research) is an initiative that promotes training in open and transparent methods of quantitative research in the undergraduate and graduate curricula across all the social sciences.

The Project anticipates awarding three or four TIER Faculty Fellowships for the 2015-16 academic year. Fellows will collaborate with TIER leadership and work independently to develop and disseminate transparent research methods that are suitable for adoption by students writing theses, dissertations or other supervised papers, or that can be incorporated into classes in which students conduct quantitative research.

The period of the Fellowships will be from June 1, 2015 through June 30, 2016. Each Fellow will receive a stipend of $5,000.

Applications are due April 19, 2015. Early inquiries and expressions of interest are encouraged. To learn more and apply, visit http://www.haverford.edu/TIER/opportunities/fellowships_2015-16.php.

The End of p-values?

Psychology Professors David Trafimow and Michael Marks of New Mexico State University discuss the implications of banning p-values from appearing in published articles.


To combat the practice of p-hacking, the editors of Basic and Applied Social Psychology (BASP) will no longer publish p-values in articles submitted to the journal. This unprecedented move by the journal’s editorial board signals that publishing norms may be changing faster than previously believed, but it also raises some questions. In a recent article published by Routledge, BASP editors David Trafimow and Michael Marks take up three key questions associated with banning the null hypothesis significance testing procedure (NHSTP).

Question 1: Will manuscripts with p-values be desk rejected automatically?

Answer 1: No […] But prior to publication, authors will have to remove all vestiges of the NHSTP (p-values, t-values, F-values, statements about ‘‘significant’’ differences or lack thereof, and so on).

Question 2: What about other types of inferential statistics such as confidence intervals or Bayesian methods?

Answer 2: Analogous to how the NHSTP fails to provide the probability of the null hypothesis, […] confidence intervals do not provide a strong case for concluding that the population parameter of interest is likely to be within the stated interval. Therefore, confidence intervals also are banned from BASP.
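For readers unfamiliar with the term, the p-hacking the ban is meant to discourage can be illustrated with a short simulation (my own sketch, not from the BASP editorial; all parameters are made up): if a researcher tests ten independent outcomes with no true effects and reports whichever one comes out “significant,” the chance of a false positive is far above the nominal 5 percent.

    # Minimal simulation (not from the BASP editorial); all numbers are
    # illustrative. With 10 outcomes and no true effects, reporting any
    # p < .05 yields a false positive in roughly 40% of studies.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(0)
    n_sims, n_outcomes, n_per_group = 2000, 10, 30
    studies_with_false_positive = 0

    for _ in range(n_sims):
        p_values = []
        for _ in range(n_outcomes):
            treatment = rng.normal(size=n_per_group)  # no true treatment effect
            control = rng.normal(size=n_per_group)
            p_values.append(ttest_ind(treatment, control).pvalue)
        if min(p_values) < 0.05:  # report the "best" outcome
            studies_with_false_positive += 1

    print(studies_with_false_positive / n_sims)  # roughly 1 - 0.95**10 ≈ 0.40

Nothing in this sketch depends on the particular test used; the inflation comes purely from selecting the smallest of many p-values.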


How NOT to Interpret p-values

Your dose of BITSS humor, via xkcd.


Source: xkcd.com

