BITSS is pleased to announce its new Advisory Board! Members include leading academics John Ioannidis (Stanford University, School of Medicine), Matthew Rabin (Harvard University, Economics), Bobbie Spellman (University of Virginia, School of Law), and Arthur Lupia (University of Michigan, Political Science). The Board will complement the work of the Executive Committee by providing strategic guidance for BITSS’ continued development, access to a larger network of researchers, and an enhanced capacity to identify and disseminate best practices in research transparency.
John Ioannidis is a Professor of Medicine and of Health Research and Policy at Stanford University and co-directs the Meta-Research Innovation Center at Stanford (METRICS), dedicated to studying and improving the rigor of academic research. Widely known for his article “Why Most Published Research Findings Are False,” Ioannidis is a pioneer in the field of research transparency and was featured in this year’s Research Transparency Forum. See his talk here.
Matthew Rabin is a Professor of Behavioral Economics at Harvard University and a former Professor at UC Berkeley whose research uses insights from psychology to more accurately model human behavior. As an active BITSS supporter, Rabin will provide valuable insight on how to effectively increase transparency in theoretical and non-experimental research.
Does trial registration make an impact on publication bias? Knowing the answer could earn you a cash prize!
Macartan Humphreys (Columbia, Political Science) and collaborators Albert Fang and Grant Gordon are doing research on how publication (and publication bias) changed after the introduction of registration in clinical trials. They also want you to guess what the changes were. The entrant with the closest guess will win a $200 cash prize. Click here to read more and enter a guess.
Enthusiastic supporters of research transparency are often keen on advocating for the registration of trial experiments. But in the social sciences the practice remains fairly rare, and its impact on publication bias is largely unknown. Fortunately, social scientists can learn from their peers in the medical sciences, who have been required to register their clinical trials since 2005. Humphreys et al. will examine whether the share of published p-values falling just below 0.01 and 0.05 changed before and after 2005 in published medical trials. Their results will provide valuable insight into whether registration should be a high priority on the transparency agenda.
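The comparison they describe can be sketched with a toy example: count the share of reported p-values that fall just beneath a significance threshold in each period, then compare the two shares. The function, the window width, and the data below are all hypothetical illustrations, not the authors' actual method or data.

```python
def share_just_below(p_values, threshold, window=0.005):
    """Fraction of p-values landing in (threshold - window, threshold).

    A pile-up of results just under a threshold like 0.05 is one common
    symptom of publication bias or specification searching.
    """
    hits = [p for p in p_values if threshold - window < p < threshold]
    return len(hits) / len(p_values)

# Made-up p-values from trials published before and after the 2005
# registration requirement (illustration only).
pre_2005 = [0.012, 0.048, 0.049, 0.003, 0.047, 0.21, 0.046, 0.09]
post_2005 = [0.012, 0.048, 0.15, 0.003, 0.33, 0.21, 0.06, 0.09]

pre_share = share_just_below(pre_2005, 0.05)    # 4 of 8 just below 0.05
post_share = share_just_below(post_2005, 0.05)  # 1 of 8 just below 0.05

# A drop in this share after 2005 would be consistent with registration
# reducing publication bias.
print(pre_share, post_share)  # → 0.5 0.125
```

In practice the study would also need to account for changes in the number and mix of published trials over time; the point here is only the shape of the comparison.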
BITSS Project Scientist Garret Christensen will be participating in a discussion with the Mozilla Science Lab on reproducibility in research tomorrow at 11 am ET. The call is open to the public. For those interested in joining, more information can be found here.
In case you missed the 2014 BITSS Research Transparency Forum, you can watch the presentations of all five speakers featured on the BITSS YouTube Channel and embedded in our Annual Meeting page. The YouTube channel also includes videos with interviews from the BITSS Board on the importance of research transparency.
By Garret Christensen (BITSS)
BITSS just got back from the ASSA conference, the major annual gathering of economists. The conference largely serves to help new PhD economists find jobs, but there are of course sessions of research presentations, a media presence, and sometimes big names like the Chair of the Federal Reserve in attendance. BITSS faculty Ted Miguel presided over a session on research transparency, featuring presentations by Eva Vivalt (NYU), Brian Nosek (UVA), and Richard Ball (Haverford College).
Vivalt presented part of her job market paper, which shows that, at least in development economics, randomized trials seem to result in less publication bias and/or specification-searching than other types of evaluations.
Nosek’s presentation covered a broad range of transparency topics, from his perspective as a psychologist. His discussant, economist Justin Wolfers, concurred completely and focused on how Nosek’s lessons could apply to economics.
As an economist myself, I thought a few of his points were interesting:
- The Quarterly Journal of Economics should really have a data-sharing requirement.
- Economists don’t do enough meta-analysis. (Ashenfelter et al.’s paper on estimates of the returns to education is a great example of the work we could and should be doing.)
- Somewhat tongue-in-cheek (I think), Wolfers discussed the fool/cheat paradox: when researchers are caught with a mistake in their work, they can admit either to having made an honest mistake or to having cheated. Most choose the “fool” option, even though there’s not much one can do to change one’s own intelligence, whereas an admitted cheat could more easily make a case for having mended their ways. So why does nobody cop to having cheated?
If you’re at the ASSA meetings in Boston this weekend, and you are interested in learning more about research transparency, then please stop by booth 127 in the exhibition hall to speak with BITSS and Center for Open Science representatives. Or you can attend our session Monday morning at 10:15am: “Promoting New Norms for Transparency and Integrity in Economic Research.”
This January 5th, 10:15am at the American Economic Association Annual Meeting in Boston, MA (Sheraton Hotel, Commonwealth Room).
Session: Promoting New Norms for Transparency and Integrity in Economic Research
Presiding: Edward Miguel (UC Berkeley)
- Brian Nosek (University of Virginia): “Scientific Utopia: Improving Openness and Reproducibility in Scientific Research”
- Richard Ball (Haverford College): “Replicability of Empirical Research: Classroom Instruction and Professional Practice”
- Eva Vivalt (New York University): “Bias and Research Method: Evidence from 600 Studies”
Discussants:
- Aprajit Mahajan (UC Berkeley)
- Justin Wolfers (University of Michigan)
- Kate Casey (Stanford University)
More info here. Plus don’t miss the BITSS/COS Exhibition Booth at the John B. Hynes Convention Center (Level 2, Exhibition Hall D).