By Carson Christiano (CEGA)
You may wonder why a network of development researchers is taking the lead on a transparency initiative. The answer lies in the profound and omnipresent power of failure.
Most would agree that risk-taking is essential to innovation, whether we’re talking about creating a simple hand-washing station or a state-of-the-art suspension bridge. At the same time, we tend to highlight our successes while downplaying ambiguous research results, misguided technologies, and projects that fail to achieve their desired impact. We fear humiliation and the curtailment of donor interest. Yet open discussion about what doesn’t work, in addition to what works, is critical to our eventual success as innovators. We at the Center for Effective Global Action (CEGA), like so many others working towards social change, believe strongly that there should be “no silent failures” in development.
In Silicon Valley, failure is regulated by the market. Venture capitalists don’t invest in technologies that consumers won’t buy. In the social impact space, particularly in developing countries where consumer demand is difficult to quantify, donors and governments rely on loose, assumption-laden predictions of return on investment. Because millions of people in poor countries around the world stand to gain (and potentially lose) from large-scale social and economic development programs, CEGA and our research partners maintain a steadfast commitment to research transparency as a moral imperative.
That being said, it is infinitely easier to commit to the concept of research transparency than to actually engage in it. We all know that writing a pre-analysis plan takes precious time and resources; study registration holds us accountable for the results of our research, which may not turn out as we expect. How, then, can we change behavior around research transparency, and encourage researchers to accept (and admit) failure?
Perhaps it’s not failure itself, but public failure, that we fear. We weren’t always this way. In a 2010 TED Talk, Tom Wujec shared the results of a “Marshmallow Challenge” that he had run more than 70 times with participants of varying ages and occupations. In the challenge, teams of four are given 18 minutes to build the tallest freestanding structure they can out of 20 sticks of spaghetti, one yard of tape, and one marshmallow. Lo and behold, kindergartners built structures more than twice as tall as those of recent business school graduates. The little kids, unburdened by self-consciousness, engaged in a highly productive cycle of trying, failing, and trying again, while the business school grads spent most of their 18 minutes devising a “perfect plan,” leaving no time to adjust when that plan fell short of its mark (which it often did).
Significant progress has already been made towards mainstreaming transparency and public failure. Computer scientists are increasingly making their software open source; technologists and social enterprises are participating in “FAILfaires,” which proudly showcase widgets and programs that failed to meet their targets, in the hope of averting future failures. Engineers Without Borders Canada publishes an annual “Failure Report” with the goal of creating space for experimentation and learning. IDEO.org’s HCD Connect platform allows human-centered designers around the world to share their stories on the web and learn from one another’s mistakes. The Global Partnership on Output-Based Aid (GPOBA) challenges the way donors and international organizations approach grant-making by making support contingent on documentation of successful service and infrastructure delivery.
Failure is slowly but surely making its way into the academic world as well. Ashok Gadgil’s analysis of a failed lighting efficiency program in India, “Stalled on the Road to Market,” is an early example of a project failure published in a top journal, Energy Policy. By the end of 2011, the results of Gadgil’s analysis were saving 100 million poor customers in 42 developing countries over $5 billion per year in lighting costs. A recent study on the efficacy of micro-nutrient powders (MNP) in reducing anemia in Bangladesh, Nepal, and Kenya documents evidence of partial effectiveness while highlighting low adoption rates among target populations and offering possible explanations for this failure.
CEGA is excited to incubate the Berkeley Initiative for Transparency in the Social Sciences (BITSS) and to host this blog forum, as we are deeply committed to (and well-versed in) this flavor of interdisciplinary dialogue. CEGA’s research affiliates hail from over ten distinct disciplines; our research programs span numerous regions and sectors; and we are closely linked with other networks contemplating research transparency, including Innovations for Poverty Action (IPA), the Abdul Latif Jameel Poverty Action Lab (J-PAL), the World Bank, and EGAP (Experiments in Governance and Politics). We believe this interconnectedness affords us a unique ability to synthesize diverse opinions while swiftly translating good ideas into action.
BITSS represents a coming of age for CEGA, in that it gives us an opportunity to influence how research is practiced, not merely to churn out results. We look forward to drawing on the expertise of many others in the process.