Former BITSS Institute Participant Advocates for Replication in Brazil

Dalson Britto Figueiredo Filho, Adjunct Professor of Political Science at the Federal University of Pernambuco in Recife, Brazil, who attended the BITSS Summer Institute in June 2014, recently published a paper on the importance of replications in Revista Política Hoje.

“The BITSS experience really changed my mind on how to do good science,” said Figueiredo Filho. “Now I am working to diffuse both replication and transparency as default procedures of scientific inquiry among Brazilian undergraduate and graduate students. I am very thankful to the 2014 BITSS workshop for a unique opportunity to become a better scientist.”

The paper, written in Portuguese, is available here. Below is an abstract in English.


Paper Presentations for Annual Meeting Confirmed!

With the 2014 Research Transparency Forum around the corner (Dec. 11-12), we are excited to announce the papers to be presented during Friday’s Research Seminar.

After carefully reviewing over 30 competitive submissions, BITSS has selected 6 paper presentations:

  • Neil Malhotra (Stanford University): “Publication Bias in the Social Sciences: Unlocking the File Drawer”
  • Uri Simonsohn (University of Pennsylvania): “False-positive Economics”
  • Maya Petersen (UC Berkeley): “Data-adaptive Pre-specification for Experimental and Observational data”
  • Arthur Lupia (University of Michigan): “Data Access, Research Transparency, and the Political Science Editors’ Joint Statement”
  • Jan Höffler (University of Göttingen): “ReplicationWiki: A Tool to Assemble Information on Replications and Replicability of Published Research”
  • Garret Christensen (BITSS): “A Manual of Best Practices for Transparent Research”

If you have not yet registered, you can find more information about the event here and register here.

Creating Standards for Reproducible Research: Overview of COS Meeting

By Garret Christensen (BITSS)


Representatives from BITSS (CEGA Faculty Director Ted Miguel, CEGA Executive Director Temina Madon, and BITSS Assistant Project Scientist Garret Christensen–that’s me) spent Monday and Tuesday of this week at a very interesting workshop at the Center for Open Science aimed at creating standards for promoting reproducible research in the social-behavioral sciences. Perhaps the workshop could have used a catchier name or acronym for wider awareness, but we seemed to accomplish a great deal. Representatives from across disciplines (economics, political science, psychology, sociology, medicine), from funders (NIH, NSF, Laura and John Arnold Foundation, Sloan Foundation), publishers (Science/AAAS, APA, Nature Publishing Group), editors (American Political Science Review, Psychological Science, Perspectives on Psychological Science, Science), data archivists (ICPSR), and researchers from over 40 leading institutions (UC Berkeley, MIT, University of Michigan, University of British Columbia, UVA, UPenn, Northwestern, among many others) came together to push forward on specific action items researchers and publishers can take to promote transparent and reproducible research.

The work was divided into five subcommittees:

1) Reporting standards in research design

2) Reporting standards in analysis

3) Replications

4) Pre-Registration/Registered Reports

5) Sharing data, code, and materials


What to Do If You Are Accused of P-Hacking

In a recent post on Data Colada, University of Pennsylvania Professor Uri Simonsohn discusses what to do in the event that you (a researcher) are accused of having altered your data to increase statistical significance.


Simonsohn states:

It has become more common to publicly speculate, upon noticing a paper with unusual analyses, that a reported finding was obtained via p-hacking.

For example, a Slate.com post by Andrew Gelman suspected p-hacking in a paper that collected data on 10 colors of clothing but analyzed red and pink as a single color (see the authors’ response to the accusation), and a statistics blog suspected p-hacking after noticing that a paper studying the number of hurricane deaths relied on the somewhat unusual negative-binomial regression.

Instinctively, Simonsohn says, a researcher may react to accusations of p-hacking by attempting to justify the specifics of his or her research design, but if that justification is ex post, the explanation will not be good enough. In fact:

P-hacked findings are by definition justifiable. Unjustifiable research practices involve incompetence or fraud, not p-hacking.
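To see why the false-positive rate inflates even when every individual analysis looks defensible, consider a toy simulation (not from Simonsohn’s post; the two-outcome setup is an invented illustration): under a true null, reporting the better of two equally reasonable outcome measures roughly doubles the nominal 5% error rate.

```python
import math
import random

random.seed(1)

def p_value(sample):
    # Two-sided z-test of mean 0, known sd 1.
    n = len(sample)
    z = sum(sample) / math.sqrt(n)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def experiment(hacked, n=30):
    y1 = [random.gauss(0, 1) for _ in range(n)]  # a truly null outcome
    if not hacked:
        return p_value(y1)
    # "p-hacked": also measure a second, equally null outcome and report
    # whichever analysis gives the smaller p-value.
    y2 = [random.gauss(0, 1) for _ in range(n)]
    return min(p_value(y1), p_value(y2))

trials = 20000
honest = sum(experiment(False) < 0.05 for _ in range(trials)) / trials
hacked = sum(experiment(True) < 0.05 for _ in range(trials)) / trials
print(f"honest false-positive rate: {honest:.3f}")  # ≈ 0.05
print(f"hacked false-positive rate: {hacked:.3f}")  # ≈ 0.10
```

Either outcome alone is a perfectly justifiable choice, which is exactly Simonsohn’s point: the problem is the undisclosed flexibility, not any single analysis.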


First Swedish Graduate Student Training in Transparency in the Social Sciences

Guest Post by Anja Tolonen (University of Gothenburg, Sweden)


(PHOTO CREDIT: www.gu.se)

Seventeen excited graduate students in Economics met at the University of Gothenburg on a Monday in September to initiate an ongoing discussion about transparency practices in Economics. The students came from all over the world: from Kenya, Romania, Hong Kong, Australia, and, of course, Sweden. The initiative itself came from across an ocean: Berkeley, California. The students had different interests within Economics: many of us focus on Environmental or Development Economics, but there were also Financial Economists and Macroeconomists present.

The teaching material, mostly drawn from the first BITSS Summer Institute in June 2014, quickly prompted many questions. “Is it feasible to pre-register analysis on survey data?”, “Are graduate students more at risk of p-hacking than their senior peers?”, “Are some problems intrinsic to the publishing industry?”, and “Does this really relate to my field?” several students asked. Some students think yes:


Scientific consensus has gotten a bad reputation—and it doesn’t deserve it

In a recent post, Ars Technica senior science editor John Timmer defends the importance of consensus.


Timmer opens with the following quote from author Michael Crichton:

Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results.

Timmer responds by pointing out:

Reproducible results are absolutely relevant. What Crichton is missing is how we decide that those results are significant and how one investigator goes about convincing everyone that he or she happens to be right. This comes down to what the scientific community as a whole accepts as evidence.

The 10 Things Every Grad Student Should Do

In a recent post on the Data Pub blog, Carly Strasser provides a useful transparency guide for newcomers to the world of empirical research. Below is an adapted version of that post. 


1. Learn to code in some language. Any language.

Strasser begins her list by urging students to learn a programming language. As the limitations of statistical packages such as Stata, SAS, and SPSS become increasingly apparent, empirical social scientists are beginning to learn languages such as MATLAB, R, and Python. Strasser comments:

Growing amounts and diversity of data, more interdisciplinary collaborators, and increasing complexity of analyses mean that no longer can black-box models, software, and applications be used in research.

Start learning to code now so you are not behind the curve later!

2. Stop using Excel. Or at least stop ONLY using Excel.

In Excel, data can be modified without leaving a trace. This makes it harder to document the changes made to a dataset and prevents researchers who rely on Excel alone from producing fully replicable research. Read “Potentially Problematic Excel Features” to learn more about the pitfalls of Excel.
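To make the contrast concrete, here is a minimal Python sketch of the scripted alternative: every cleaning step lives in code, so the change is documented and anyone rerunning the script gets exactly the same dataset. The data and the -9 missing-value code are invented for illustration.

```python
import csv
import io

# Stand-in for a raw data file; -9 is a (made-up) missing-data code.
raw = io.StringIO("id,income\n1,52000\n2,-9\n3,61000\n")

rows = list(csv.DictReader(raw))

# Documented cleaning step: recode the -9 sentinel as missing.
# In a spreadsheet this edit would be invisible; here it is on the record.
for row in rows:
    row["income"] = None if row["income"] == "-9" else int(row["income"])

print(rows)
```

Because the transformation is a script rather than a hand edit, it can be versioned, reviewed, and rerun on updated raw data, which is the core of what makes an analysis replicable.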


All the latest on research transparency

Here you can find information about the Berkeley Initiative for Transparency in the Social Sciences (BITSS), read and comment on opinion blog posts, learn about our annual meeting, find useful tools and resources, and contribute to the discussion by adding your voice.