Transparency Practices for Empirical Social Science Research
June 2-6, 2014
University of California, Berkeley
This workshop has now passed. Check back regularly for more training opportunities.
In recent years, an inspiring number of bottom-up innovations across the social science disciplines have sought to advance the reliability, reproducibility, and validity of empirical social science research, realigning scholarly incentives with scholarly values. Examples include systematic disclosure of methods and results, study registration and pre-analysis plans, and open data and materials. Meanwhile, multiple organizations have been developing tools that make it easier to archive and share research designs, plans, and data.
This workshop seeks to inform participants about the latest trends in the shift toward increased transparency, providing an overview of the tools and techniques that are available and appropriate for social science research. The curriculum is designed for anyone interested in learning more about best practices for empirical research in economics, political science, psychology, or any other social science discipline.
Teaching Material & Resources
- Emerging Issues in the Practice of Empirical Social Science (Ted Miguel, UC Berkeley, Economics): Presentation | Promoting Transparency in Social Science Research
- Ethics in Social Science Research (Scott Desposato, UC San Diego, Political Science): Presentation | Ethics and Clinical Research | Research Exceptionalism | Obituary John Charles Cutler
- False-Positives, p-Hacking, Statistical Power, and Evidential Value (Leif Nelson, UC Berkeley, Psychology): Presentation | False-Positive Psychology | P-curve
- Reporting Standards for Social Science Experiments (Kevin Esterling, UC Riverside, Political Science): Esterling Presentation | Recommended Reporting Standards for Experiments | The Quest for Unbiased Research – CONSORT Reporting Guidelines
- Pre-Registration & Transparent Reporting: Perspectives from Biomedical Research (Maya Petersen, UC Berkeley, Biostatistics): Presentation | A Randomized Trial to Evaluate the Effectiveness of Antiretroviral Therapy | Efficacy of a Single-Session HIV Prevention Intervention for Black Women | Making Prospective Registration of Observational Research a Reality | The ClinicalTrials.gov Results Database — Update and Key Issues | The Registration of Observational Studies — When Metaphors Go Bad
- Comparing and Consolidating Empirical Findings (Solomon Hsiang, UC Berkeley, Economics): Presentation | Distributed Meta-Analysis System | Quantifying the Influence of Climate on Human Conflict
- Pre-specification across Research Projects (Thad Dunning, UC Berkeley, Political Science): Presentation | EGAP Regranting Initiative RFP
- Theory and Implementation of Pre-analysis Plans (Kate Casey, Stanford University, Economics): Presentation | Reshaping Institutions – Evidence on Aid Impacts Using a Pre-Analysis Plan | GoBifo Appendix
- Dataverse: Research Transparency through Data Sharing (Merce Crosas, Harvard University, Data Science): Presentation | The Dataverse Network | A Data Sharing Story | Replication, Replication | Ten Simple Rules for the Care and Feeding of Scientific Data | The Evolution of Data Citation | Toward A Common Framework for Statistical Analysis and Development
- A Student-Oriented Protocol for Research Documentation and Replicability (Richard Ball, Haverford College, Economics): TIER Protocol Information | Project TIER Website | Teaching Integrity in Empirical Research | Check the Numbers – The Case for Due Diligence in Policy Formation | Replication in Empirical Economics
- Reproducible and Collaborative Statistical Data Science (Philip Stark, UC Berkeley, Statistics): Presentation | An Invitation to Reproducible Computational Research | New Truths That Only One Can See | Reproducibility (Science Editorial) | Reproducibility PI Manifesto | Reproducible Research for Scientific Computing – Tools and Strategies for Changing the Culture | Resolving Irreproducibility in Empirical and Computational Research | Setting the Default to Reproducible | The Past, Present and Future of Scholarly Publishing | Challenges in Irreproducible Research | Reproducibility Project: Psychology
- Tools and Resources for Data Curation (Staff from California Digital Library): Presentation | UC Curation Center | Open your minds and share your results | Ten Simple Rules for the Care and Feeding of Scientific Data
- Computing for Data-intensive Social Science (Staff from UC Berkeley D-Lab): Presentation | dexy demo
- Using the Open Science Framework (Staff from the Center for Open Science): Presentation
A group of 32 graduate students and junior faculty attended the BITSS 2014 Summer Institute, representing 13 US academic institutions, six overseas institutions, and four research non-profits.