Job Opportunity in Data Curation/Publication

Innovations for Poverty Action (IPA) seeks a Research Analyst to join the Data Analysis/Data Publication team. This team is leading an innovative and exciting new part of IPA’s effort to promote high-quality research: publicly releasing research data from social science experiments for re-use and replication.

The position also involves helping to develop a data repository for experiments in the social sciences (more information here), as well as producing guidelines for code and data management. IPA is collaborating with the Institute of Social and Policy Studies at Yale University on this initiative.

This position may be of particular interest to those with a background in statistics or quantitative analysis who are also interested in advancing research transparency within the social sciences. Please see the full job description here.

Call for Papers on Research Transparency

BITSS will be holding its 3rd annual conference at UC Berkeley on December 11-12, 2014. The goal of the meeting is to bring together leaders from academia, scholarly publishing, and policy to strengthen the standards of openness and integrity across social science disciplines.

This Call for Papers focuses on work that develops new tools and strategies to increase the transparency and reproducibility of research. A committee of reviewers will select a limited number of papers to be presented and discussed. Topics for papers include, but are not limited to:

  • Pre-registration and the use of pre-analysis plans;
  • Disclosure and transparent reporting;
  • Replicability and reproducibility;
  • Data sharing;
  • Methods for detecting and reducing publication bias or data mining.

Papers or long abstracts must be submitted by Friday, October 10th (midnight Pacific time) through CEGA’s Submission Platform. Eligible submissions include completed papers or works in progress. Travel funds may be provided for presenters.

The 2014 BITSS Conference is organized by the Center for Effective Global Action and co-sponsored by the Alfred P. Sloan Foundation and the Laura and John Arnold Foundation.

“Research misconduct accounts for a small percentage of total funding”: Study

Originally posted on Retraction Watch:

How much money does scientific fraud waste?

That’s an important question, with an answer that may help determine how much attention some people pay to research misconduct. But it’s one that hasn’t been rigorously addressed.

Seeking some clarity, Andrew Stern, Arturo Casadevall, Grant Steen, and Ferric Fang looked at cases in which the Office of Research Integrity had determined there was misconduct in particular papers. In their study, published today in eLife:


Data Access and Research Transparency Panel @ APSA 2014

Join the BITSS co-sponsored panel Implementing Data Access and Research Transparency: Multiple Challenges, Multiple Perspectives at the upcoming meeting of the American Political Science Association (August 27-31, 2014 — Washington, DC).

Chairs:
  • Colin Elman, Syracuse University
  • Arthur Lupia, University of Michigan, Ann Arbor

(more…)

Data Science Meets Social Science (Video)

The video from a recent BITSS roundtable entitled “Data Science Meets Social Science” is now available online. Organized in partnership with the UC Berkeley D-Lab, the event brought together leading social scientists and Silicon Valley professionals to discuss pathways of collaboration between the two fields and their increasing impact on society in the age of open science.

Panelists:

(more…)

Your Question for the Day — What Is “Peer Review”?

Originally posted on The Scholarly Kitchen:

She Blinded Me with Science (Photo credit: Wikipedia)

Last week, Slate published an ill-advised hatchet job by an education columnist specializing in the humanities. The topic? Peer review. While the particulars of the Slate article inspired me to add a sharply critical comment, and I was only one among dozens who found fault with the article, there is a fundamental question worth considering in all this:

What is “peer review”?

Peer review certainly isn’t one thing. Arguing as if it were is a fundamental error made by the Slate columnist, and often by others. Peer review is constantly evolving, and it is difficult to define precisely because of its many variations. All this makes general approbation or condemnation of peer review difficult to take without the requisite grain of salt.

We often make an “availability error,” generalizing to all forms of peer review based on the…


Significance Chasing in Research Practice

A new paper by Jennifer Ware and Marcus Munafò (University of Bristol, UK)


Background and Aims
The low reproducibility of findings within the scientific literature is a growing concern. This may be due to many findings being false positives, which, in turn, can misdirect research effort and waste money.

Methods
We review factors that may contribute to poor study reproducibility and an excess of ‘significant’ findings within the published literature. Specifically, we consider the influence of current incentive structures and the impact of these on research practices.

Results
(more…)

All the latest on research transparency

Here you can find information about the Berkeley Initiative for Transparency in the Social Sciences (BITSS), read and comment on opinion blog posts, learn about our annual meeting, find useful tools and resources, and contribute to the discussion by adding your voice.
