Registered Reports to the Rescue?

After writing an article for The Upshot, Brendan Nyhan (Assistant Professor at Dartmouth) was interviewed by The Washington Post.


The original Upshot article advocates for a new publishing structure called Registered Reports (RRs):

A research publishing format in which protocols and analysis plans are peer reviewed and registered prior to data collection, then published regardless of the outcome.

In the following interview with The Washington Post, Nyhan explains in greater detail why RRs are more effective than other tools at preventing publication bias and data mining. He begins by explaining the limitations of preregistration.

As I argued in a white paper, […] it is still too easy for publication bias to creep in to decisions by authors to submit papers to journals as well as evaluations by reviewers and editors after results are known. We’ve seen this problem with clinical trials, where selective and inaccurate reporting persists even though preregistration is mandatory.

(more…)

TIER Faculty Fellowships 2015-16

Richard Ball, Associate Professor of Economics, and Norm Medeiros, Associate Librarian, of Haverford College, are co-principal investigators of Project TIER. They are seeking the first class of TIER Fellows, who will promote and extend the teaching of transparent and reproducible empirical research methods.


Project TIER (Teaching Integrity in Empirical Research) is an initiative that promotes training in open and transparent methods of quantitative research in the undergraduate and graduate curricula across all the social sciences.

The Project anticipates awarding three or four TIER Faculty Fellowships for the 2015-16 academic year. Fellows will collaborate with TIER leadership and work independently to develop and disseminate transparent research methods that are suitable for adoption by students writing theses, dissertations or other supervised papers, or that can be incorporated into classes in which students conduct quantitative research.

The period of the Fellowships will be from June 1, 2015 through June 30, 2016. Each Fellow will receive a stipend of $5,000.

Applications are due April 19, 2015. Early inquiries and expressions of interest are encouraged. To learn more and apply, visit http://www.haverford.edu/TIER/opportunities/fellowships_2015-16.php.

The End of p-values?

Psychology Professors David Trafimow and Michael Marks of New Mexico State University discuss the implications of banning p-values from published articles.


To combat the practice of p-hacking, the editors of Basic and Applied Social Psychology (BASP) will no longer allow p-values in articles published in the journal. This unprecedented move by the journal’s editorial board signals that publishing norms may be changing faster than previously believed, but it also raises certain issues. In a recent article published by Routledge, BASP editors David Trafimow and Michael Marks address three key questions associated with banning the null hypothesis significance testing procedure (NHSTP).
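The arithmetic behind the p-hacking worry is straightforward: run enough independent tests at the conventional 0.05 threshold and a "significant" result becomes likely even when every null hypothesis is true. A minimal sketch (the test counts here are illustrative, not drawn from the article):

```python
# Probability of at least one false positive when running k independent
# tests of true null hypotheses, each at significance level alpha.
ALPHA = 0.05

for k in (1, 5, 20):
    p_any = 1 - (1 - ALPHA) ** k
    print(f"{k:2d} tests -> P(at least one false positive) = {p_any:.2f}")
```

With 20 independent tests, the chance of at least one spurious "significant" finding is about 64 percent, which is the dynamic the BASP editors are reacting to.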

Question 1: Will manuscripts with p-values be desk rejected automatically?

Answer 1: No […] But prior to publication, authors will have to remove all vestiges of the NHSTP (p-values, t-values, F-values, statements about ‘‘significant’’ differences or lack thereof, and so on).

Question 2: What about other types of inferential statistics such as confidence intervals or Bayesian methods?

Answer 2: Analogous to how the NHSTP fails to provide the probability of the null hypothesis, […] confidence intervals do not provide a strong case for concluding that the population parameter of interest is likely to be within the stated interval. Therefore, confidence intervals also are banned from BASP.
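The editors' objection rests on the frequentist coverage property: about 95% of intervals constructed this way would contain the true parameter over many repeated samples, but that is a claim about the procedure, not about any single published interval. A quick simulation sketch of coverage, using an illustrative normal population (the parameters are mine, not from the article):

```python
import random
import statistics

random.seed(1)

TRUE_MEAN, SD = 10.0, 2.0    # illustrative population parameters
N, Z, N_SIMS = 50, 1.96, 2000

covered = 0
for _ in range(N_SIMS):
    sample = [random.gauss(TRUE_MEAN, SD) for _ in range(N)]
    mean = statistics.mean(sample)
    half_width = Z * statistics.stdev(sample) / N ** 0.5
    # Does this particular interval happen to cover the true mean?
    if mean - half_width <= TRUE_MEAN <= mean + half_width:
        covered += 1

coverage = covered / N_SIMS
print(f"Empirical coverage of nominal 95% intervals: {coverage:.3f}")
```

Any one interval either covers the true mean or it does not; the 95% figure describes the long-run behavior of the construction, which is the distinction the editors argue readers routinely miss.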

(more…)

How NOT to Interpret p-values

Your dose of BITSS humor, via xkcd.


Source: xkcd.com


Now Accepting Applications for Summer Institute

BITSS is pleased to announce that it is now accepting applications for its 2015 Summer Institute.


This year’s workshop, entitled “Transparency and Reproducibility Methods for Social Science Research,” will be held in Berkeley, June 10-12. The intensive course will give participants a thorough overview of best practices for open, reproducible research, keeping them at the forefront of these emerging scientific practices.

The 2014 Cohort of the BITSS Institute


Topics covered include:

  • Ethics in Experimental Research
  • False-positives, P-hacking, P-curve, Power Analysis
  • Data Management & Statistical Analysis in R
  • Theory and Implementation of Pre-analysis Plans
  • Approaches to the Replication of Research
  • Meta-analyses: New Tools & Techniques
  • Next Steps in Changing Scientific Research Practices

(more…)

Research Transparency Meeting with CGD

By Garret Christensen (BITSS)


Though BITSS hopes to increase research transparency across the social sciences, several of us, myself included, have backgrounds in development economics. So we were happy to take part in a meeting last week at the Center for Global Development (CGD) in Washington, DC. In addition to BITSS and CGD, representatives from the International Initiative for Impact Evaluation (3ie), the Inter-American Development Bank, InterAction, Innovations for Poverty Action (IPA), the Millennium Challenge Corporation (MCC), the World Bank research group, the United States Agency for International Development (USAID), and the US Treasury were present.

I was impressed by how much agreement there was and how interested these large, sometimes slow-moving organizations seemed to be, but I should probably temper my enthusiasm a bit: the people in the room were not randomly selected from their respective agencies, and even if they had been, we may still be far from actual policy changes and wider adoption. Regardless, we had a fruitful discussion about some of the roadblocks on the way to increased transparency.

Here are a few of the themes we discussed, mostly obstacles to increased transparency:

(more…)

The Disturbing Influence of Flawed Research on Your Living Habits

Last year, we featured a story on our blog about the so-called cardiovascular benefits of fish oil, a claim largely based on a seminal study that had more to do with hearsay than with actual science. Having come for your diet, flawed research is now coming for your exercise routine.


A Danish study published in the Journal of the American College of Cardiology recently made headlines for suggesting that too much jogging could shorten life expectancy. In a recent New York Times post, economist Justin Wolfers carefully analyzes the study and provides a brilliant response discrediting this overconfident claim:

The researchers asked Danish runners about the speed, frequency and duration of their workouts, categorizing 878 of them as light, moderate or strenuous joggers. Ten years later, the researchers checked government records to see how many of them had died […] Happily, only 17 had. While this was good news for the surviving runners, it was bad news for the researchers, because 17 was clearly too few deaths to discern whether the risk of death was related to running intensity.

Nonetheless, the study claimed that too much jogging was associated with a higher mortality rate […] The evidentiary basis for this claim is weak. It is based on 40 people who were categorized as “strenuous joggers” — among whom only two died. That’s right: The conclusions that received so much attention were based on a grand total of two deaths among strenuous joggers. As Alex Hutchinson of Runner’s World wrote, “Thank goodness a third person didn’t die, or public health authorities would be banning jogging.”

Because the sample size was so small, this difference is not statistically significant. You may have heard the related phrase “absence of evidence does not equal evidence of absence,” and it is particularly relevant here […] In fact, the main thing the study shows is that small samples yield unreliable estimates that cannot be reliably discerned from the effects of chance.
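Wolfers’s small-sample point can be made concrete with a short simulation: suppose the 40 “strenuous joggers” faced exactly the same underlying mortality risk as the full sample (17 deaths out of 878, about 1.9%). How often would chance alone still produce two or more deaths among them? (The simulation setup is my own sketch, not from the study or Wolfers’s piece.)

```python
import random

random.seed(0)

TRUE_RATE = 17 / 878   # overall mortality in the full sample (~1.9%)
N_STRENUOUS = 40       # size of the "strenuous jogger" subgroup
N_SIMS = 10_000

# Simulate many hypothetical subgroups of 40 joggers who all share the
# overall mortality rate, counting how often 2+ deaths occur by chance.
two_or_more = 0
for _ in range(N_SIMS):
    deaths = sum(random.random() < TRUE_RATE for _ in range(N_STRENUOUS))
    if deaths >= 2:
        two_or_more += 1

share = two_or_more / N_SIMS
print(f"P(2+ deaths by chance alone) ~ {share:.2f}")
```

Under these assumptions, two deaths among 40 joggers shows up by chance alone roughly one time in five, which is why the observed difference carries so little evidential weight.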

Wolfers goes on to highlight other weaknesses in the Danish study. This latest case of an unreliable research finding receiving wide media coverage raises a couple of important points that are central to our work at BITSS:

(more…)

