Guest Post by Liz Allen (ScienceOpen)
For the 3rd annual conference of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), ScienceOpen, the new Open Access (OA) research and publishing network, would like prospective and registered attendees to consider the role that Post-Publication Peer Review (PPPR) can play in increasing the transparency of research.
When we launched earlier this year, we interviewed Advisory Board Member Peter Suber. One of the original founders of the Open Access movement, Peter is currently director of the Harvard Office for Scholarly Communication and the Harvard Open Access Project. His latest book, “Open Access” (MIT Press, 2012), is an important starting point for anyone new to the topic. We asked Peter various questions including “How important is it that OA penetrates research disciplines beyond science?” Here’s what he said:
“It is very important in my opinion. I have been arguing since 2004 that OA brings the same benefits in every field, even if some fields present more obstacles or fewer opportunities. For example, the natural sciences are better funded than the humanities, which means they have more money to pay for OA. In particular, there is more public funding for the sciences than the humanities, which means that the compelling taxpayer argument for OA gets more traction in the sciences than the humanities. In addition, books are at least as important as journal articles for humanities scholars, if not more important, and OA for books, while growing quickly, is objectively harder than OA for journal articles. The good news is that OA in the humanities is growing – not faster than OA in the sciences, but faster than in the past. More humanities scholars understand the benefits and opportunities for OA, and are answering the objections and misunderstandings raised against it”.
A close partner of BITSS, the Center for Open Science (COS) has launched a free consulting service to anyone seeking help with “statistical and methodological questions related to reproducible practices, research design, data analysis, and data management.”
The Center is dedicated to increasing the “openness, integrity, and reproducibility of scientific research” and is looking to advance its mission through a more hands-on approach. Those with methodological questions can email email@example.com for free assistance from computer and data scientists trained in reproducibility and advanced research methods. If a question is too complicated to be answered via email, researchers can schedule a Google Hangout with a COS consultant to have their questions answered in real time. Visit the COS Google Calendar for availability.
The Center also offers online and on-site workshops for those seeking to gain a greater understanding of open research topics and tools. For more information on the details of COS’s services visit their Statistical & Methodological Consulting Services page.
BITSS is pleased to announce its 3rd annual meeting (December 11-12 – Berkeley, CA).
This year’s research transparency meeting will be the first open to the public and is anticipated to be the largest BITSS event to date. The event will serve both to update the academic community on the growing movement for greater openness in research and to provide a forum for discussing a variety of transparency-related issues, such as changing journal practices, novel evidence of publication bias, and burgeoning related initiatives.
The gathering will cater to a range of guests, from seasoned transparency experts to new supporters of the transparency movement. As such, the event will consist of a variety of activities, including a collaborative training session geared towards disseminating new tools, a public conference with presentations from transparency leaders, and a research seminar showcasing the latest developments in the world of research transparency.
Confirmed speakers include Edward Miguel (CEGA’s Faculty Director), John Ioannidis from the Stanford School of Medicine, Victoria Stodden from the University of Illinois, and Brian Nosek from the Center for Open Science.
The White House’s Office of Science and Technology Policy (OSTP) has released a request for information on improving the reproducibility of federally funded scientific research.
Given recent evidence of the irreproducibility of a surprising number of published scientific findings, how can the Federal Government leverage its role as a significant funder of scientific research to most effectively address the problem?
A similar request for comments posted by the OSTP on open-access research resulted in policies mandating that federally funded research be made publicly accessible.
To submit comments email firstname.lastname@example.org by September 23rd.
Following a groundswell of interest in replications in political science, first noted in survey results posted on the Monkey Cage blog, political scientists Seth Werfel (Stanford University) and Nicole Janz (Cambridge University), together with research consultant Stephanie Wykstra, launched the Political Science Replication Initiative, a new repository for uploading study replications.
Increasingly, methodologically minded political scientists have recognized the need to reexamine empirical studies. The survey results clearly indicated a growing trend towards assigning replications to graduate students in research methods classes. Political scientists hope that a growing number of publicly available replications will help verify the validity of original work and act as a safeguard against human error in research.
A new study recently published in Science provides striking insights into publication bias in the social sciences:
Stanford political economist Neil Malhotra and two of his graduate students examined every study since 2002 that was funded by a competitive grants program called TESS (Time-sharing Experiments for the Social Sciences). TESS allows scientists to order up Internet-based surveys of a representative sample of U.S. adults to test a particular hypothesis [...] Malhotra’s team tracked down working papers from most of the experiments that weren’t published, and for the rest asked grantees what had happened to their results.
What did they find?
There is a strong relationship between the results of a study and whether it was published, a pattern indicative of publication bias [...] While around half of the total studies in [the] sample were published, only 20% of those with null results appeared in print. In contrast, roughly 60% of studies with strong results and 50% of those with mixed results were published [...] However, what is perhaps most striking is not that so few null results are published, but that so many of them are never even written up (65%).
eLife, the Center for Open Science, and Science Exchange partner to assess the reproducibility of cancer biology research
eLife will be the publisher for the results of the Reproducibility Project: Cancer Biology, an effort led by the Center for Open Science and Science Exchange.
First announced in October 2013, with $1.3 million in funding from the Laura and John Arnold Foundation, the Reproducibility Project: Cancer Biology aims to replicate key experimental findings in 50 high-profile pre-clinical cancer biology studies published between 2010 and 2012. The articles were selected on the basis of their very high citation rates and other measures of online attention.
The Reproducibility Project: Cancer Biology comes at a time when concerns have been raised about the reproducibility of biomedical research, and the National Institutes of Health is calling for renewed efforts to enhance reproducibility. The project will provide a substantial and unprecedented dataset on the reproducibility of a large body of high-profile cancer biology articles.
“We need an objective way to evaluate reproducibility,” said Randy Schekman, Editor-in-chief of eLife and professor at the University of California at Berkeley. “This project is a valuable opportunity to generate a high-quality dataset to address questions about reproducibility constructively and rigorously.”