In a recent interview on The Signal, a blog hosted by the Library of Congress, Richard Ball (Professor of Economics at Haverford College and presenter at the 2014 BITSS Summer Institute) and Norm Medeiros (Associate Librarian at Haverford College) discussed Project TIER (Teaching Integrity in Empirical Research) and their experience teaching students how to document their empirical analysis.
What is Project TIER?
For close to a decade, we have been teaching our students how to assemble comprehensive documentation of the data management and analysis they do in the course of writing an original empirical research paper. Project TIER is an effort to reach out to instructors of undergraduate and graduate statistical methods classes in all the social sciences to share with them lessons we have learned from this experience.
What is the TIER documentation protocol?
We gradually developed detailed instructions describing all the components that should be included in the documentation and how they should be formatted and organized. We now refer to these instructions as the TIER documentation protocol. The protocol specifies a set of electronic files (including data, computer code and supporting information) that would be sufficient to allow an independent researcher to reproduce, easily and exactly, all the statistical results reported in the paper.
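As an illustration, a replication package of this kind might be organized as follows. This folder layout and the file names are hypothetical, offered only to make the idea concrete; they are not the official TIER specification:

```
replication-package/
├── README.txt            # overview: software used, steps to reproduce
├── original-data/        # raw data exactly as obtained, never edited
│   └── survey_2013.csv
├── metadata/             # data sources, codebooks, variable definitions
├── command-files/        # scripts that build the analysis data and results
│   ├── processing.do     # cleans original data, saves analysis data
│   └── analysis.do       # generates every table and figure in the paper
└── analysis-data/        # processed data produced by the scripts
```

The key property is that an independent researcher, starting from the original data and running the command files in order, can regenerate every reported result without guesswork.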
Keynote speaker at the upcoming BITSS annual meeting John Ioannidis (Professor of Health Research and Policy at the Stanford School of Medicine, and Co-Director of the Meta-Research Innovation Center) speaks at Google about efforts to improve research design standards and reproducibility in science. Ioannidis is the author of the highly influential 2005 paper Why Most Published Research Findings Are False, the most downloaded technical paper from the open-access library PLOS.
In a recent article on the Monkey Cage, professors Mike Findley, Nathan Jensen, Edmund Malesky and Tom Pepinsky discuss publication bias, the “file drawer problem” and how a special issue of the journal Comparative Political Studies will help address these problems.
[S]cholars may think strategically about what editors will want […] this means that “boring” findings, or findings that fail to support an author’s preferred hypotheses, are unlikely to be published — the so-called “file drawer problem.” More perniciously, it can incentivize scholars to hide known problems in their research or even encourage outright fraud, as evinced by the recent cases of psychologist Diederik Stapel and acoustician Peter Chen.
To address these problems, the authors of the article have worked with the journal Comparative Political Studies to release a special issue in which:
[A]uthors will submit manuscripts with all mention of the results eliminated […] Other authors will submit manuscripts with full descriptions of research projects that have yet to be executed […] In both cases, reviewers and editors must judge manuscripts solely on the coherence of their theories, the quality of their design, the appropriateness of their empirical methods, and the importance of their research question.
BITSS has expanded its online media presence with a new Twitter account. Keep up to date with us and the world of research transparency by following @ucbitss.
Papers or long abstracts for the Call for Papers on Research Transparency must be submitted by Friday, October 10th (11:59pm PST) through CEGA’s Submission Platform.
Topics for papers include, but are not limited to: pre-registration and the use of pre-analysis plans; disclosure and transparent reporting; replicability and reproducibility; data sharing; and methods for detecting and reducing publication bias or data mining. For more information, please find the original call for papers here.
Since releasing the call for papers, BITSS has received many exciting submissions and has confirmed its keynote speakers, who include John Ioannidis of the Stanford School of Medicine and Victoria Stodden of the University of Illinois, among others. To register, visit the registration page.
The U.S. Government’s Millennium Challenge Corporation (MCC) wants to hear your new and innovative ideas on how to maximize the use of data that MCC finances for its independent evaluations.
Keynote speakers at this year’s BITSS Research Transparency Forum, Jennifer Sturdy and Jack Molyneaux of MCC’s Department of Policy and Evaluation, along with Kathy Farley and Kristin Penn of the Department of Compact Operations, outlined the details of the challenge in a recent post on the MCC Poverty Reduction Blog.
Why issue the challenge?
The release of this data is intended to facilitate broader use, above and beyond the scope of the independent evaluations that produced it. Since the challenge was announced at the end of August, one question posed to MCC has been: what type of additional learning is the agency interested in?
Who can accept the challenge?
MCC has just announced its first Open Data Challenge: a call to action for any master’s and PhD students working in economics, public policy, international development, or other related fields who are interested in exploring how to use publicly available MCC-financed primary data for policy-relevant analysis.
Closely echoing the mission of BITSS, Nyhan identifies the potential of research transparency to improve the rigor, and ultimately the benefits, of federally funded scientific research, writing:
The problem is that the research conducted using federal funds is driven — and distorted — by the academic publishing model. The intense competition for space in top journals creates strong pressures for novel, statistically significant effects. As a result, studies that do not turn out as planned or find no evidence of effects claimed in previous research often go unpublished, even though their findings can be important and informative.