Research Transparency & Open Knowledge: Lessons from #OKFest14

By Guillaume Kroll (CEGA)


Over a thousand scientists, activists, and civil society representatives from over 60 countries gathered in Berlin last week for the 2014 Open Knowledge Festival (OKFest14). The Festival is the flagship event of the Open Knowledge Foundation, an international nonprofit promoting open tools, data, and information for the positive transformation of society.

Welcome Ceremony at OKFest14 © Guillaume Kroll

It’s a well-oiled event, full of energy and creativity, bringing together an eclectic group that ranges from street artists to EU commissioners to data scientists, all building a community around the ideals of openness and transparency.

Below is an overview of tools, resources, and advice gathered at the Festival that pertain to social science research:


Science Establishes New Statistics Review Board

The journal Science is adding statistical checks to its peer-review process in an effort to strengthen confidence in published study findings.


From the July 4th edition of Science:

[...] Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis, to provide better oversight of the interpretation of observational data. Members of the SBoRE will receive manuscripts that have been identified [...] as needing additional scrutiny of the data analysis or statistical treatment. The SBoRE member assesses what the issue is that requires screening and suggests experts from the statistics community to provide it.

So why is Science taking this additional step? Readers must have confidence in the conclusions published in our journal. We want to continue to take reasonable measures to verify the accuracy of those results. We believe that establishing the SBoRE will help avoid honest mistakes and raise the standards for data analysis, particularly when sophisticated approaches are needed.

[...] I have been amazed at how many scientists have never considered that their data might be presented with bias. There are fundamental truths that may be missed when bias is unintentionally overlooked, or worse yet, when data are “massaged.” Especially as we enter an era of “big data,” we should raise the bar ever higher in scrutinizing the analyses that take us from observations to understanding.


How to Manipulate Peer Review and Get Your Paper Published

Another scandal of peer-review abuse should prompt academic journals to reconsider their publication requirements.


This one comes from the Journal of Vibration and Control (JVC), a highly technical outlet in the field of acoustics, which just retracted 60 papers at once.

The mass retraction followed the revelation of a “peer review ring” in which one or more researchers fabricated identities to review their own papers and get them published.

It is not the first time such a scandal has happened, but the number of papers involved and the apparent ease with which the shady researchers operated should set off alarm bells.

JVC is part of SAGE Publications, a leading international publisher of scholarly and educational products, many of them in the social sciences.

Considering the significant impact that social science studies can have on the design of social and economic policies, which affect all of us, isn’t it time for the academic community to realign its professional incentives with scholarly values?

Peer Review of Social Science Research in Global Health

A new working paper by Victoria Fan, Rachel Silverman, David Roodman, and William Savedoff at the Center for Global Development.


Abstract

In recent years, the interdisciplinary nature of global health has blurred the lines between medicine and social science. As medical journals publish non-experimental research articles on social policies or macro-level interventions, controversies have arisen when social scientists have criticized the rigor and quality of medical journal articles, raising general questions about the frequency and characteristics of methodological problems and the prevalence and severity of research bias and error.

Published correspondence letters can be used to identify common areas of dispute within interdisciplinary global health research and seek strategies to address them. To some extent, these letters can be seen as a “crowd-sourced” (but editor-gated) approach to public peer review of published articles, from which some characteristics of bias and error can be gleaned.

In December 2012, we used the online version of The Lancet to systematically identify relevant correspondence in each issue published between 2008 and 2012. We summarize and categorize common areas of dispute raised in these letters.


Replication in Economics Database

Independently replicating results is pivotal to scientific progress because it makes research findings more reliable. In econometric research, however, publishing replication studies is not yet common practice.


Replication Wiki

This wiki, developed by researchers at the University of Göttingen (Germany), compiles replications of empirical studies in economics. It is a great resource for teaching replication to students. There are also opportunities for publication, as replications of existing studies can be published as replication working papers of the University of Göttingen’s Center for Statistics.

Join the Project

You can join the project by entering a new replication or indicating empirical studies that could be replicated. For the latter, you can vote on which studies should be replicated first. You can also improve the articles in the wiki and leave comments. Under current events, you can post news and announce upcoming events. The community portal lists open tasks and suggestions for the wiki.

Added Value


10 Things You Need to Know About…

Check out this new EGAP series:

Research Transparency, Data Access, and Data Citation: A Call to Action for Scholarly Publications

This collaborative statement calls upon the scholarly publishing community to take leadership in advancing knowledge through research transparency, data access, and data citation.

Please consider adding your name to the endorsements page and encourage others to do the same.

This Call to Action was produced at the “Data Citation and Research Transparency Standards for the Social Sciences” meeting convened by the Inter-university Consortium for Political and Social Research (ICPSR) on June 13-14, 2013 in Ann Arbor, MI, with support from the Alfred P. Sloan Foundation.

The document has already attracted much attention. It was discussed at a recent meeting of the Consortium of Social Science Associations (COSSA), where participants agreed to hold additional discussions on the issue. It is also on the agenda of an upcoming American Political Science Association (APSA) meeting on common policies for political science journals.

All the latest on research transparency

Here you can find information about the Berkeley Initiative for Transparency in the Social Sciences (BITSS), read and comment on opinion blog posts, learn about our annual meeting, find useful tools and resources, and contribute to the discussion by adding your voice.
