Reflections on Two Years Promoting Transparency in Research

By Guillaume Kroll (CEGA)

Two years ago, in December 2012, a handful of researchers convened in Berkeley to discuss emerging strategies to increase openness and transparency in social science research. The group’s concerns followed a number of high-profile cases of scientific misconduct and unethical practices, particularly in psychology (1,2). As researchers started to question the legitimacy of the “publish or perish” structure governing academia, many decided to replicate influential findings in their field to differentiate the rigorous from the untrustworthy… only to find that a large majority of studies were unreproducible.

This observation triggered an unprecedented number of bottom-up innovations to restore the credibility of scientific evidence across social science disciplines, including the use of study registries, pre-analysis plans, data sharing, and result-blind peer review. The 2012 meeting resulted in the creation of BITSS, with the goal of fostering the adoption of more transparent research practices among the scientific community.

Today, BITSS has more than 150 affiliated researchers and partner institutions committed to improving the standards of rigor and integrity across the social sciences. Last week’s third annual BITSS meeting was a good opportunity to reflect on the progress achieved.


Tomorrow! BITSS Research Transparency Forum to Be Livestreamed

Can’t attend our Annual Meeting? Not to worry, our Public Conference (Thu, 1.30 PM – 5.00 PM Pacific Time) will be livestreamed. For those joining us virtually, questions for the Q&A panel session starting at 4:30 PM can be submitted via Twitter using #bitss2014. To all other participants attending the Public Conference, the Collaborative Training, or the Research Seminar in person: we look forward to seeing you there! More info about the Research Transparency Forum is available on the event page.

Psychology’s Credibility Crisis

In a recent interview appearing in Discover Magazine, Brian Nosek, Co-founder of the Center for Open Science and speaker at the upcoming BITSS Annual Meeting, discusses the credibility crisis in psychology. 

According to the article, psychology has lost much of its credibility after a series of published papers were revealed to be fraudulent and many other study results were found to be irreproducible.

Fortunately, “psychologists, spurred by a growing crisis of faith, are tackling it [the credibility crisis] head-on. Psychologist Brian Nosek at the University of Virginia is at the forefront of the fight.” Below are excerpts from Nosek’s interview with Discover Magazine discussing what he and others are doing to increase the rigor of research.

What are you doing about the crisis?

BN: In 2011, colleagues and I launched the Reproducibility Project, in which a team of about 200 scientists are carrying out experiments that were published in three psychology journals in 2008. We want to see how many can reproduce the original result, and what factors affect reproducibility. That will tell us if the problem of false-positive results in the psychology journals is big, small or non-existent…

[W]e built the Open Science Framework (OSF), a web application where collaborating researchers can put all their data and research materials so anyone can easily see them. We also create incentives by offering “badges” for good practices, like making raw data available. So the more open you are, the more opportunities you have for building your reputation.


Tools for Research Transparency: a Preview of Upcoming BITSS Training

By Garret Christensen (BITSS)

What are the tools you use to make your research more transparent and reproducible? A lot of my time at BITSS has been spent working on a manual of best practices, and that has required me to familiarize myself with computing tools and resources that make transparent work easier. I’ll be sharing a draft of the manual at the BITSS Annual Meeting, but for now here are a few of the resources I’ve found most useful. If you’d like to learn more about these tools, there are a ton of helpful resources on the respective websites, or for a hands-on learning experience you can sign up for a collaborative training (December 11, 9.00 AM – 12.00 PM) BITSS is organizing with the D-Lab.
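To give a flavor of what these practices look like in code, here is a minimal, hypothetical sketch (not taken from the manual) of two habits that make an analysis reproducible: fixing random seeds and scripting the entire pipeline in one file, so that anyone can re-run it from scratch and get identical numbers. The data and "treatment effect" here are simulated purely for illustration.

```python
# reproduce.py -- a minimal, hypothetical reproducibility sketch.
# Everything lives in one script, so running "python reproduce.py"
# regenerates the full analysis from scratch.
import random
import statistics

SEED = 20141211  # fixing the seed makes "random" results repeatable


def run_analysis(seed=SEED):
    """Simulate a toy treatment-effect estimate, deterministically."""
    rng = random.Random(seed)  # local RNG: no hidden global state
    # Simulated outcomes for a control and a treated group.
    control = [rng.gauss(0.0, 1.0) for _ in range(100)]
    treated = [rng.gauss(0.5, 1.0) for _ in range(100)]
    # Difference in means is the (toy) estimated effect.
    effect = statistics.mean(treated) - statistics.mean(control)
    return round(effect, 4)


if __name__ == "__main__":
    # Anyone re-running this script gets the same number out.
    print("estimated effect:", run_analysis())
```

The point is not the particular estimator but the workflow: because the seed is fixed and every step is scripted, a second researcher can verify the result exactly rather than taking the reported number on faith.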


Scientific Irreproducibility and the Prospects of Meta-Research

A recent article in The Economist features John Ioannidis’ Meta-Research Innovation Center (METRICS), whose work to advance the credibility of research will be presented next week at the BITSS Annual Meeting.

“Why most published research findings are false” is not, as the title of an academic paper, likely to win friends in the ivory tower. But it has certainly influenced people (including journalists at The Economist). The paper it introduced was published in 2005 by John Ioannidis, an epidemiologist who was then at the University of Ioannina, in Greece, and is now at Stanford. It exposed the ways, most notably the overinterpreting of statistical significance in studies with small sample sizes, that scientific findings can end up being irreproducible—or, as a layman might put it, wrong […] Dr Ioannidis has been waging war on sloppy science ever since, helping to develop a discipline called meta-research (ie, research about research).

METRICS’ mission is to “identify and minimize persistent threats to medical-research quality.” These include irreproducibility of research findings (the inability of external researchers to reproduce someone else’s work, most often because research data is not shared or data manipulations are not correctly detailed), funding inefficiencies (supporting flawed research), and publication bias (not all studies that are conducted get published, and those that do tend to show significant results, leaving a skewed impression of the evidence).


Facilitating Radical Change in Publication Standards: Overview of COS Meeting Part II

Originally posted on the Open Science Collaboration by Denny Borsboom

This train won’t stop anytime soon.

That’s what I kept thinking during the two-day sessions in Charlottesville, where a diverse array of scientific stakeholders worked hard to reach agreement on new journal standards for open and transparent scientific reporting. The proposed standards are intended to specify practices for authors, reviewers, and editors to follow in order to achieve higher levels of openness than currently exist. The leading idea is that a journal, funding agency, or professional organization could take these standards off the shelf and adopt them in their policies. So that when, say, The Journal for Previously Hard To Get Data starts to turn to a more open data practice, they don’t have to puzzle over how to implement this, but may instead just copy the data-sharing guideline out of the new standards and post it on their website.

The organizers of the sessions, which were presided over by Brian Nosek of the Center for Open Science, had approached methodologists, funding agencies, journal editors, and representatives of professional organizations to achieve a broad set of perspectives on what open science means and how it should be institutionalized. As a result, the meeting felt almost like a political summit. It included high officials from professional organizations like the American Psychological Association (APA) and the Association for Psychological Science (APS), programme directors from the National Institutes of Health (NIH) and the National Science Foundation (NSF), editors of a wide variety of psychological, political, economic, and general science journals (including Science and Nature), and a loose collection of open science enthusiasts and methodologists (that would be me).


Former BITSS Institute Participant Advocates for Replication in Brazil

Dalson Britto Figueiredo Filho, Adjunct Professor of Political Science at the Federal University of Pernambuco in Recife, Brazil, who attended the BITSS Summer Institute in June 2014, recently published a paper on the importance of replications in Revista Política Hoje.

“The BITSS experience really changed my mind on how to do good science”, said Figueiredo Filho. “Now I am working to diffuse both replication and transparency as default procedures of scientific inquiry among Brazilian undergraduate and graduate students. I am very thankful to the 2014 BITSS workshop for a unique opportunity to become a better scientist.”

The paper, written in Portuguese, is available here. Below is an abstract in English.


All the latest on research transparency

Here you can find information about the Berkeley Initiative for Transparency in the Social Sciences (BITSS), read and comment on opinion blog posts, learn about our annual meeting, find useful tools and resources, and contribute to the discussion by adding your voice.



