Can’t attend our Annual Meeting? Not to worry, our Public Conference (Thu, 1:30 PM – 5:00 PM Pacific Time) will be livestreamed. For those joining us virtually, questions for the Q&A panel session starting at 4:30 PM can be submitted via Twitter using #bitss2014. To all other participants attending the Public Conference, the Collaborative Training, or the Research Seminar in person: we look forward to seeing you there! More info about the Research Transparency Forum is available on the event page.
In a recent interview appearing in Discover Magazine, Brian Nosek, Co-founder of the Center for Open Science and speaker at the upcoming BITSS Annual Meeting, discusses the credibility crisis in psychology.
According to the article, psychology has lost much of its credibility after a series of published papers were revealed to be fraudulent and many other study results were found to be irreproducible.
Fortunately, “psychologists, spurred by a growing crisis of faith, are tackling it [the credibility crisis] head-on. Psychologist Brian Nosek at the University of Virginia is at the forefront of the fight.” Below are excerpts from Nosek’s interview with Discover Magazine discussing what he and others are doing to increase the rigor of research.
What are you doing about the crisis?
BN: In 2011, colleagues and I launched the Reproducibility Project, in which a team of about 200 scientists are carrying out experiments that were published in three psychology journals in 2008. We want to see how many can reproduce the original result, and what factors affect reproducibility. That will tell us if the problem of false-positive results in the psychology journals is big, small or non-existent…
[W]e built the Open Science Framework (OSF), a web application where collaborating researchers can put all their data and research materials so anyone can easily see them. We also offer incentives in the form of “badges” for good practices, like making raw data available. So the more open you are, the more opportunities you have for building your reputation.
By Garret Christensen (BITSS)
What are the tools you use to make your research more transparent and reproducible? A lot of my time at BITSS has been spent working on a manual of best practices, which has required me to familiarize myself with computing tools and resources that make transparent work easier. I’ll be sharing a draft of the manual at the BITSS Annual Meeting, but for now here are a few of the resources I’ve found most useful. If you’d like to learn more about these tools, there are plenty of helpful resources on their respective websites; for a hands-on learning experience, you can sign up for a collaborative training (December 11, 9:00 AM – 12:00 PM) that BITSS is organizing with the D-Lab.
Originally posted on the Open Science Collaboration by Denny Borsboom
This train won’t stop anytime soon.
That’s what I kept thinking during the two-day sessions in Charlottesville, where a diverse array of scientific stakeholders worked hard to reach agreement on new journal standards for open and transparent scientific reporting. The aspired standards are intended to specify practices for authors, reviewers, and editors to follow in order to achieve higher levels of openness than currently exist. The leading idea is that a journal, funding agency, or professional organization could take these standards off the shelf and adopt them in its policy. So that when, say, The Journal for Previously Hard To Get Data moves to a more open data practice, it doesn’t have to puzzle over how to implement this, but can instead just copy the data-sharing guideline out of the new standards and post it on its website.
The organizers of the sessions, which were presided over by Brian Nosek of the Center for Open Science, had approached methodologists, funding agencies, journal editors, and representatives of professional organizations to gather a broad set of perspectives on what open science means and how it should be institutionalized. As a result, the meeting felt almost like a political summit. It included high officials from professional organizations like the American Psychological Association (APA) and the Association for Psychological Science (APS), programme directors from the National Institutes of Health (NIH) and the National Science Foundation (NSF), editors of a wide variety of psychological, political, economic, and general science journals (including Science and Nature), and a loose collection of open science enthusiasts and methodologists (that would be me).
Dalson Britto Figueiredo Filho, Adjunct Professor of Political Science at the Federal University of Pernambuco in Recife, Brazil, who attended the BITSS Summer Institute in June 2014, recently published a paper on the importance of replications in Revista Política Hoje.
“The BITSS experience really changed my mind on how to do good science”, said Figueiredo Filho. “Now I am working to diffuse both replication and transparency as default procedures of scientific inquiry among Brazilian undergraduate and graduate students. I am very thankful to the 2014 BITSS workshop for a unique opportunity to become a better scientist.”
The paper, written in Portuguese, is available here. Below is an abstract in English.
With the 2014 Research Transparency Forum around the corner (Dec. 11-12), we are excited to announce the papers to be presented during Friday’s Research Seminar.
After carefully reviewing over 30 competitive submissions, BITSS has selected six paper presentations:
- Neil Malhotra (Stanford University): “Publication Bias in the Social Sciences: Unlocking the File Drawer”
- Uri Simonsohn (University of Pennsylvania): “False-positive Economics”
- Maya Petersen (UC Berkeley): “Data-adaptive Pre-specification for Experimental and Observational data”
- Arthur Lupia (University of Michigan): “Data Access, Research Transparency, and the Political Science Editors’ Joint Statement”
- Jan Höffler (University of Göttingen): “ReplicationWiki: A Tool to Assemble Information on Replications and Replicability of Published Research”
- Garret Christensen (BITSS): “A Manual of Best Practices for Transparent Research”