Welcome to the BITSS Blog

Reminder: Call for Papers Deadline is October 10th

Papers or long abstracts for the Call for Papers on Research Transparency must be submitted by Friday, October 10th (11:59pm PST) through CEGA’s Submission Platform. Topics for papers include, but are not limited to: pre-registration and the use of pre-analysis plans; disclosure and transparent reporting; replicability and reproducibility; data sharing; and methods for detecting and reducing…

MCC’s First Open Data Challenge

The U.S. Government’s Millennium Challenge Corporation (MCC) wants to hear your new and innovative ideas on how to maximize the use of data that MCC finances for its independent evaluations. Keynote speakers at this year’s BITSS Research Transparency Forum, Jennifer Sturdy and Jack Molyneaux at MCC’s Department of Policy and Evaluation, and Kathy Farley…

To Get More Out of Science, Show the Rejected Research

In a recent opinion piece for The Upshot, the New York Times’s data-driven news site, Brendan Nyhan, an assistant professor of government at Dartmouth College, comments on a host of transparency-related issues. Closely echoing the mission of BITSS, Nyhan identifies the potential of research transparency to improve the rigor and ultimately the benefits…

Africa’s Data Revolution – Amanda Glassman

Interview originally posted on the Global Poverty Wonkcast:   Is the revolution upon us? When it comes to data, the development world seems to be saying yes, Yes, YES! To look beyond the hype, I invited Amanda Glassman, a CGD senior fellow and director of our global health policy program, to join me…

Can Post-Publication Peer-Review Increase Research Transparency?

Guest Post by Liz Allen (ScienceOpen) For the 3rd annual conference of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), ScienceOpen, the new Open Access (OA) research + publishing network, would like prospective and registered attendees to consider the role that Post-Publication Peer Review (PPPR) can play in increasing the transparency…

COS Now Offering Free Consulting Services

A close partner of BITSS, the Center for Open Science (COS) has launched a free consulting service to anyone seeking help with “statistical and methodological questions related to reproducible practices, research design, data analysis, and data management.” The Center is dedicated to increasing the “openness, integrity, and reproducibility of scientific research” and…

Announcing The 2014 Research Transparency Forum

BITSS is pleased to announce its 3rd annual meeting (December 11-12 – Berkeley, CA). This year’s research transparency meeting will be the first to be open to the public and is anticipated to be the largest BITSS event to date. The event will update the academic community on the growing movement for greater…

White House Calls for Comments on Reproducible Research

The White House’s Office of Science and Technology Policy (OSTP) has released a request for information on improving the reproducibility of federally funded scientific research. Given recent evidence of the irreproducibility of a surprising number of published scientific findings, how can the Federal Government leverage its role as a significant funder of scientific research…

Political Scientists Launch New Replication Initiative

Following a groundswell of interest in replication in the political sciences, first noted in survey results posted on the Monkey Cage Blog, political scientists Seth Werfel (Stanford University) and Nicole Janz (Cambridge University), and research consultant Stephanie Wykstra launched the Political Science Replication Initiative, a new repository for uploading study replications. Increasingly, methodological political scientists have recognized…

New Study Sheds Light on File Drawer Problem

A new study recently published in Science provides striking insights into publication bias in the social sciences: Stanford political economist Neil Malhotra and two of his graduate students examined every study since 2002 that was funded by a competitive grants program called TESS (Time-sharing Experiments for the Social Sciences). TESS allows scientists…

Call for Pre-analysis Plans of Observational Studies

Observational Studies is a peer-reviewed journal that publishes papers on all aspects of observational studies. Researchers from all fields that make use of observational studies are encouraged to submit papers. Observational Studies encourages submission of study protocols (pre-analysis plans) for observational studies. Before examining the outcomes that will form the basis for…

Job Opportunity in Data Curation/Publication

Innovations for Poverty Action (IPA) seeks a Research Analyst to join the Data Analysis/Data Publication team. This team is leading an innovative and exciting new part of IPA’s effort to promote high quality research: releasing research data from social science experiments publicly, for re-use and replication. The position also involves helping to develop a…

Call for Papers on Research Transparency

BITSS will be holding its 3rd annual conference at UC Berkeley on December 11-12, 2014. The goal of the meeting is to bring together leaders from academia, scholarly publishing, and policy to strengthen the standards of openness and integrity across social science disciplines. This Call for Papers focuses on work that elaborates new tools and strategies to increase the transparency and reproducibility of research. A committee of…

Data Access and Research Transparency Panel @ APSA 2014

Join the BITSS co-sponsored panel Implementing Data Access and Research Transparency: Multiple Challenges, Multiple Perspectives at the upcoming meeting of the American Political Science Association (August 27-31, 2014 — Washington, DC). Chairs: Colin Elman, Syracuse University Arthur Lupia, University of Michigan, Ann Arbor

Data Science Meets Social Science (Video)

The video from a recent BITSS roundtable entitled “Data Science Meets Social Science” is now available online. Organized in partnership with the UC Berkeley D-Lab, the event brought together leading social scientists and Silicon Valley professionals to discuss pathways of collaboration between the two different fields, and their increasing impact on society in the…

Significance Chasing in Research Practice

A new paper by Jennifer Ware and Marcus Munafò (University of Bristol, UK). Background and Aims: The low reproducibility of findings within the scientific literature is a growing concern. This may be due to many findings being false positives which, in turn, can misdirect research effort and waste money. Methods: We review factors that…

Privacy, Big Data, and the Public Good

Videos and presentations from the book launch of “Privacy, Big Data and the Public Good” (Lane, J., Stodden, V., Bender, S. & Nissenbaum, H. (Eds)) are now available online. Hosted by the NYU Center for Urban Science on July 16, the event included several panels with the book’s editors and a number of the authors. Overview of the book: Massive amounts of new data about people,…

Research Transparency & Open Knowledge: Lessons from #OKFest14

By Guillaume Kroll (CEGA) Over a thousand scientists, activists, and civil society representatives from over 60 countries gathered in Berlin last week for the 2014 Open Knowledge Festival (OKFest14). The Festival is the flagship event of the Open Knowledge Foundation, an international nonprofit promoting open tools, data, and information for the positive transformation of society. It’s a…

Science Establishes New Statistics Review Board

The journal Science is adding an additional step of statistical checks to its peer-review process in an effort to strengthen confidence in published study findings. From the July 4th edition of Science: […] Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis,…

How to Manipulate Peer Review and Get Your Paper Published

Another scandal of peer-review abuse should urge academic journals to reconsider their publication requirements. This one comes from the Journal of Vibration and Control (JVC), a highly technical outlet in the field of acoustics, which just retracted 60 papers at once. The mass retraction followed the revelation of a “peer review ring” in…

Peer Review of Social Science Research in Global Health

A new working paper by Victoria Fan, Rachel Silverman, David Roodman, and William Savedoff at the Center for Global Development. Abstract In recent years, the interdisciplinary nature of global health has blurred the lines between medicine and social science. As medical journals publish non-experimental research articles on social policies or macro-level interventions, controversies…

Replication in Economics Database

For scientific progress, it is pivotal to review research findings by independently replicating results, thus making the findings more reliable. However, in econometric research, it is not yet common practice to publish replication findings. Replication Wiki This wiki, developed by researchers at the University of Göttingen (Germany), compiles replications of empirical studies in economics.…

10 Things You Need to Know About…

Check out this new EGAP series: 10 Things You Need to Know About Causal Effects 10 Things You Need to Know About Randomization 10 Things You Need to Know About Statistical Power 10 Strategies to Figure Out if X Caused Y
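As a rough illustration of the statistical-power idea the EGAP guides cover, here is a minimal sketch of a power calculation for a two-arm randomized experiment. The function name and the normal-approximation shortcut are ours, not EGAP’s, and a real design would account for the t distribution, clustering, and covariates:

```python
import math

def power_two_sample(effect_size, n_per_arm, alpha=0.05):
    """Approximate power of a two-sided, two-sample test for a
    standardized effect size (Cohen's d) with n subjects per arm,
    using the normal approximation."""
    # Standard error of the difference in standardized units
    se = math.sqrt(2.0 / n_per_arm)
    # Two-sided critical value at alpha = 0.05
    z_alpha = 1.959963984540054
    # Standard normal CDF via the error function
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    z = effect_size / se
    return (1 - phi(z_alpha - z)) + phi(-z_alpha - z)

# A "medium" effect (d = 0.5) with 64 subjects per arm
# yields roughly 80% power, the conventional target.
print(round(power_two_sample(0.5, 64), 2))
```

Doubling the sample size raises power but with diminishing returns, which is why pre-analysis plans typically justify the sample size against a minimum detectable effect rather than the other way around.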

Research Transparency, Data Access, and Data Citation: A Call to Action for Scholarly Publications

This collaborative statement calls upon the scholarly publishing community to take leadership in advancing knowledge through research transparency, data access, and data citation. Please consider adding your name to the endorsements page and encourage others to do the same. This Call to Action was produced at the “Data Citation and Research Transparency Standards for the Social Sciences”…

Summer Institute Material Now Available

All the material from our summer institute in transparency practices for empirical research is now accessible on our training page. This weeklong workshop provides an overview of the latest trends in the shift towards increased transparency, combining presentations on conceptual issues in current research practices with hands-on training on emerging tools and approaches…

Future Steps for Research in Dishonesty

The Economic Science Association, an experimental economics professional organization, recently organized a workshop about dishonesty research at the University of Copenhagen, Denmark. The panel was composed of Mike Norton (Harvard Business School), Johannes Abeler (Oxford University), Marco Piovesan (University of Copenhagen), and Roberto Weber (University of Zurich). The discussion focused on future directions for research in…

The Controversy of Preregistration in Social Research

Guest post by Jamie Monogan (University of Georgia) A conversation is emerging in the social sciences over the merits of study registration and whether it should be the next step we take in raising research transparency. The notion of study registration is that, prior to observing outcome data, a researcher can publicly…

Call for Papers: Special Issue of Comparative Political Studies on Research Transparency

From Michael Findley (University of Texas), Nathan Jensen (Washington University), Edmund Malesky (Duke), and Thomas Pepinsky (Cornell) We invite proposals for a special issue of Comparative Political Studies (CPS) on research transparency in the social sciences. Proposals for original research papers using quantitative or qualitative approaches, and collecting quantitative or qualitative data…

Scientific Misconduct in the Middle East

On plagiarism and fraud in the Middle Eastern research community (by Ranya Stamboliyska): Gallons of digital ink have been spilt discussing depressing laundry lists of misconduct cases in the West (and more recently, in China). There is, however, very little on unethical behaviour in the Arab world, despite the large number of Middle Eastern students…

Tonight! Data Science Meets Social Science

Tonight in Berkeley! June 5, 2014, 6:00pm — 7:30pm. David Brower Center (Kinzie Room), 2150 Allston Way. All across the social sciences we can see a convergence around the ideals of openness and reproducibility. Over the past few years, the injection of ways of thinking and working from scientific computing into social science research has helped…

BITSS Summer Institute – Summary of the First Day

BITSS is currently holding its first summer institute in transparency practices for empirical research. The meeting is taking place in Berkeley, CA with 30+ graduate students and junior faculty in attendance. Ted Miguel (Economics, UC Berkeley), one of the founding members of BITSS, started with an overview of conceptual issues in…

What can be done to prevent the proliferation of errors in academic publications?

Every now and again a paper is published on the number of errors made in academic articles. These papers document the frequency of conceptual errors, factual errors, errors in abstracts, errors in quotations, and errors in reference lists. James Hartley reports that the data are alarming, but suggests a possible way of…

Replicate it! A Proposal to Foster Knowledge Accumulation

Thad Dunning and Susan D. Hyde in the Washington Post: Like many social scientists, we take it almost as an article of faith that scientific methods will advance our knowledge about how the world works. The growing use by social scientists of strong research designs — for example, randomized controlled experiments or…

The Ethics of Using and Sharing Data for Development Programming (5/22 @ NYC)

The Responsible Data Forum is a new effort to map the ethical, legal, privacy and security challenges surrounding the increased use and sharing of data in development programming. The Forum aims to explore the ways in which these challenges are experienced in project design and implementation, as well as when project data is…

Flawed Research On Your Plate

You might want to reconsider paying extra for those fish oil supplements. A new study suggests most of the research literature on the cardiovascular benefits of omega-3 fatty acids is flawed. In the early 1970s, two Danish researchers began investigating the diet of Greenland’s Inuit populations, which were believed to live longer than their Caucasian counterparts. The study…

Tomorrow — West Coast Experiments Conference @ Claremont Graduate University

Tomorrow at Claremont Graduate University (Burkle Building, Room 16). One workshop to focus on research transparency and integrity: Session II (12:45 to 3:15, 45 minutes for the papers, one hour for the training session) Neil Malhotra (Stanford), “Publication Bias in Political Science: Using TESS Experiments to Unlock the File Drawer” Thad Dunning (Berkeley),…

The Reformation: Can Social Scientists Save Themselves?

From Jerry Adler in the Pacific Standard—on the credibility crisis in social science research, publication bias, data manipulation, and non-replicability. Featuring BITSS aficionados Brian Nosek, Joe Simmons, Uri Simonsohn and Leif Nelson. Something unprecedented has occurred in the last couple of decades in the social sciences. Overlaid on the usual academic incentives of…

Open Data Workshops (London, UK)

The Open Data Institute is organizing multiple workshops (May 12-16, July 21-25 | London, UK) to introduce the technical, commercial and legal aspects of open data. The courses aim to highlight key opportunities for working with open data and how these can be exploited across government, business and society. The workshops are designed to enable…

Reproducibility Session @ AAAS Forum on Science and Technology Policy

A core value of science is the ability to reproduce the findings of others in order to check for methodological rigor, errors, plausible interpretations, and/or misconduct. However, for a variety of reasons, from lack of time or resources to little professional recognition or credit for doing so, science may be failing to…

New book: Implementing Reproducible Research

New book from Victoria Stodden, Friedrich Leisch, and Roger D. Peng: “Implementing Reproducible Research”. In many of today’s research fields, including biomedicine, computational tools are increasingly being used so that the results can be reproduced. Researchers are now encouraged to incorporate software, data, and code in their academic papers so that others can…