New Study Sheds Light on File Drawer Problem

A study recently published in Science provides striking insight into publication bias in the social sciences:

Stanford political economist Neil Malhotra and two of his graduate students examined every study since 2002 that was funded by a competitive grants program called TESS (Time-sharing Experiments for the Social Sciences). TESS allows scientists to order up Internet-based surveys of a representative sample of U.S. adults to test a particular hypothesis [...] Malhotra’s team tracked down working papers from most of the experiments that weren’t published, and for the rest asked grantees what had happened to their results.

What did they find?

There is a strong relationship between the results of a study and whether it was published, a pattern indicative of publication bias [...] While around half of the total studies in [the] sample were published, only 20% of those with null results appeared in print. In contrast, roughly 60% of studies with strong results and 50% of those with mixed results were published [...] However, what is perhaps most striking is not that so few null results are published, but that so many of them are never even written up (65%).
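
To make the file-drawer arithmetic concrete, here is a minimal Python sketch, purely illustrative and not from the study itself, that applies the quoted rates to a hypothetical cohort of 100 null-result studies. The cohort size is an assumption; the 65% and 20% figures are those reported above.

```python
# Illustrative only: apply the quoted rates to a hypothetical
# cohort of 100 null-result studies (the cohort size is assumed).
null_studies = 100

never_written_up = round(null_studies * 0.65)  # 65% are never even written up
published = round(null_studies * 0.20)         # 20% appear in print
in_file_drawer = null_studies - never_written_up - published  # written up, unpublished

print(f"Of {null_studies} null-result studies:")
print(f"  never written up:           {never_written_up}")  # 65
print(f"  published:                  {published}")         # 20
print(f"  written up but unpublished: {in_file_drawer}")    # 15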

eLife, the Center for Open Science, and Science Exchange partner to assess the reproducibility of cancer biology research

eLife will publish the results of the Reproducibility Project: Cancer Biology, an effort led by the Center for Open Science and Science Exchange.

First announced in October 2013, with $1.3 million in funding from the Laura and John Arnold Foundation, the Reproducibility Project: Cancer Biology aims to replicate key experimental findings from 50 high-profile preclinical cancer biology studies published between 2010 and 2012. The articles were selected on the basis of their very high citation rates and other measures of online attention.

The Reproducibility Project: Cancer Biology comes at a time when concerns have been raised about the reproducibility of biomedical research and the National Institutes of Health is calling for renewed efforts to enhance reproducibility. The project will provide a substantial and unprecedented dataset on the reproducibility of a large body of high-profile cancer biology articles.

“We need an objective way to evaluate reproducibility,” said Randy Schekman, Editor-in-Chief of eLife and professor at the University of California, Berkeley. “This project is a valuable opportunity to generate a high-quality dataset to address questions about reproducibility constructively and rigorously.”

Call for Pre-analysis Plans of Observational Studies

Observational Studies is a peer-reviewed journal that publishes papers on all aspects of observational studies. Researchers from all fields that make use of observational studies are encouraged to submit papers.

Observational Studies encourages the submission of study protocols (pre-analysis plans) for observational studies. A study protocol, written before the outcomes that will form the basis of the study’s conclusions are examined, should describe the design, exclusion criteria, primary and secondary outcomes, and proposed analyses. Following a publicly available study protocol makes a study more convincing and transparent.

For any questions about the journal or submissions, please contact Dylan Small (Wharton, UPenn) at dsmall [at] wharton [dot] upenn [dot] edu.

Job Opportunity in Data Curation/Publication

Innovations for Poverty Action (IPA) seeks a Research Analyst to join the Data Analysis/Data Publication team. This team leads an innovative and exciting new part of IPA’s effort to promote high-quality research: publicly releasing research data from social science experiments for re-use and replication.

The position also involves helping to develop a data repository for experiments in the social sciences (more information here), as well as producing guidelines for code and data management. IPA is collaborating with the Institute of Social and Policy Studies at Yale University on this initiative.

The position may be of particular interest to those with a background in statistics or quantitative analysis who are also interested in advancing research transparency in the social sciences. Please see the full job description here.

Call for Papers on Research Transparency

BITSS will be holding its 3rd annual conference at UC Berkeley on December 11-12, 2014. The goal of the meeting is to bring together leaders from academia, scholarly publishing, and policy to strengthen the standards of openness and integrity across social science disciplines.

This Call for Papers focuses on work that develops new tools and strategies to increase the transparency and reproducibility of research. A committee of reviewers will select a limited number of papers to be presented and discussed. Topics for papers include, but are not limited to:

  • Pre-registration and the use of pre-analysis plans;
  • Disclosure and transparent reporting;
  • Replicability and reproducibility;
  • Data sharing;
  • Methods for detecting and reducing publication bias or data mining.

Papers or long abstracts must be submitted by Friday, October 10th (midnight Pacific time) through CEGA’s Submission Platform. Travel funds may be provided for presenters. Eligible submissions include completed papers or works in progress.

The 2014 BITSS Conference is organized by the Center for Effective Global Action and co-sponsored by the Alfred P. Sloan Foundation and the Laura and John Arnold Foundation.

“Research misconduct accounts for a small percentage of total funding”: Study

Originally posted on Retraction Watch:

How much money does scientific fraud waste?

That’s an important question, with an answer that may help determine how much attention some people pay to research misconduct. But it’s one that hasn’t been rigorously addressed.

Seeking some clarity, Andrew Stern, Arturo Casadevall, Grant Steen, and Ferric Fang looked at cases in which the Office of Research Integrity had determined there was misconduct in particular papers. Their study was published today in eLife.


Data Access and Research Transparency Panel @ APSA 2014

Join the BITSS co-sponsored panel Implementing Data Access and Research Transparency: Multiple Challenges, Multiple Perspectives at the upcoming meeting of the American Political Science Association (August 27-31, 2014 — Washington, DC).

Chairs:
  • Colin Elman, Syracuse University
  • Arthur Lupia, University of Michigan, Ann Arbor

