Governing Development Failure

In the last few years there has been a proliferation of new “little development devices” and practices in places where we might least expect them: at the World Bank and in national development agencies usually associated with the kinds of large-scale infrastructure mega-projects that these institutions pioneered after World War II. Yet the current emphasis on “little” development devices cannot be understood as a straightforward reaction to earlier forms of development policy that used “big” development devices. Rather, if we want to understand the current fascination with little development devices, we need to look at a different moment in international development institutions’ history: the many prominent failures in development assistance that marked the 1990s, such as the AIDS epidemic, the Asian financial crisis, and the “lost decade” of development in sub-Saharan Africa.

If we cannot understand the emergence of these new devices without paying attention to the recent failures of development policy, does that mean that they signal the failure of international development as we’ve known it? Yes and no: yes because many of them have been developed as innovative responses to the failures of development assistance, and no because they are nonetheless still very much development devices aimed at many of the same objectives that have held sway since the mid-twentieth century, including economic growth and poverty reduction.

In fact, although policy failures are central to this story, the part they play is a surprisingly creative one. These failures were profound enough to provoke a crisis of development expertise, leading development practitioners to question their very metrics of success and failure. Over time, these practitioners sought to re-establish the grounds for their authority, reconceiving the object of development—poverty—by forging new metrics of aid success, by developing new techniques for its measurement, and by adopting new devices amenable to this kind of measurement.

Rather than the failure of development, what precipitated the proliferation of these new micro-devices was thus the transformation of development governance through its engagement with and problematization of failure, as well as its growing preoccupation with the ever-present possibility of future failures.

Responding to Past Failures

Beginning in the 1990s, there was a lot of talk about the failure of development policies. Some external critics focused on the persistence of extreme poverty in sub-Saharan Africa, while others pointed to the AIDS crisis in Africa, or the sudden increase in poverty in Asia after the 1997–1998 financial crisis. All of these crises had occurred on the watch of the major development organizations in spite of (or, as many critics suggested, because of) their efforts.

Inspired by these crises, both external critics and many of those working in the policy development and evaluation units at the World Bank and the International Monetary Fund (IMF) began to point to various policy failures (Collier 1997; Killick 1997). Staff in the Policy Development and Review Department at the IMF, for example, noted that the ever-increasing number of conditions that aid packages imposed on poor countries had no positive effect on compliance and was significantly reducing borrower governments’ “ownership” of the reforms (Boughton 2003). Meanwhile, assessments by the World Bank’s Operations Evaluation Department (OED) were pointing to dramatically declining success rates—from 80% to 85% in the 1980s to less than 65% in the 1990s (OED 1994), figures that were of great concern to World Bank president James Wolfensohn.

One of the underlying targets of these criticisms was the policy framework known as the “Washington Consensus,” a broadly neoliberal approach to development that put growth at its core and saw the market as the best way of achieving development goals (e.g., Stiglitz 1998:1). Yet, even as the World Bank dedicated its 1997 flagship World Development Report to the “rediscovery” of the state after two decades of denigrating or denying its role, the report was also very careful to distinguish the World Bank’s present strategy from earlier state-led approaches to development, arguing for the need to “take the burden off the state by involving citizens and communities in the delivery of core collective goods” (World Bank 1997:3).  Treating both state- and market-dominated approaches as failures, the World Bank has pursued a middle way between the two, forging new and dynamic assemblages of public and private actors, claims, and practices to simultaneously pursue public goals and private interests (Best 2014a).

Contested Failures

Of course, policy failures occur all the time. Sometimes they are perceived as failures, and sometimes they are ignored. Yet occasionally they become what I call “contested failures”: failures important enough to produce widespread debates about the meaning of success and failure and the metrics through which we evaluate them (Best 2014b). The concept of contested failure is connected to what Andrew Barry calls “knowledge controversies,” in which the metrics that are usually taken for granted become, for a time, politicized (Barry 2012).

These are interesting moments when we confront them in our everyday lives. Many of those of us who teach for a living, for example, have confronted a set of exams that fall so far below our expectations that they force us to re-evaluate our conceptions of success and failure (and, at least in my case, to change the assignment altogether). Such contested failures are fascinating moments in politics because the question of what counts as success is both highly technical—involving questions of evaluation and calculation—and normative—raising the question of what we value enough to define as success.

The “aid effectiveness” debate that emerged in the 1990s and early 2000s was a classic example of this kind of contested failure, as its participants responded by problematizing and ultimately rethinking what makes aid succeed or fail. This widespread debate, which included practitioners, nongovernmental organizations (NGOs), academics, and politicians, raised important questions about why aid did not seem to be working, and ultimately produced some rather different definitions of what counts as successful development (World Bank 1998).

New Definitions of Success

The new definitions of success that began to take hold from the late 1990s onward were somewhat paradoxical.

On the one hand, the conception of success that began to emerge was far bigger and messier than it had been in the past. In the place of narrowly economic definitions of effectiveness, agencies now sought to pursue a much broader and longer-term set of objectives, recognizing that economic development is inextricably linked to political, social, and cultural dynamics that are often particular to a given country or region. For example, development staff hoped to achieve a much greater level of “country ownership” over the policies that they believed needed to be pursued, seeking to encourage domestic engagement by various stakeholders. Their goal was to build political support for ambitious, longer-term institutional reforms, whether through (at least somewhat) participatory consultations or community-driven development.

On the other hand, the metrics for measuring success became increasingly narrow, particularly as the enthusiasm for results- and outcomes-based evaluation began to grow in the 2000s. These new metrics sought to respond to (and reduce) the ambiguities produced by the expanded conception of development objectives by making them more readily quantifiable. If development policymakers and aid ministers were no longer able to point to a school or dam to show where the dollars had gone, at least (the theory went) they could point to a measurable result that affirmed a direct line of causality between policy, output, and longer-term outcome.

Not surprisingly, one of the effects of this drive to make aid outcomes measurable has been to create incentives for pursuing policies that are easier to measure. For example, the “cash on delivery” approach, developed in 2006 by the U.S.-based think tank Center for Global Development, promises to pay a set amount for each “unit” of an agreed result. One pilot project developed by the British Department for International Development (DFID) in Ethiopia pays the government £50 for each student who sits a particular exam, and £100 for each one who passes it. This kind of fixation on measurable results creates a proliferation of policies aimed at getting students in exam seats and bed-nets on beds while driving policymakers away from the kind of complex, messy conceptions of development success that the aid effectiveness debate had revealed to be so important.

New Micro-Devices: Poverty, Cash Transfers, and Microcredit

Many of the devices and practices that emerged in the years since the aid effectiveness debate reflect this hybrid character. Although the large-scale, macro-level ambitions of market-led development and poverty reduction remain at the heart of these policies, they are now increasingly pursued through more cautious, smaller-scale, micro-level techniques. This does not just mean that these interventions address the same targets at a smaller scale. Rather, the embrace of these new techniques of intervention corresponds to a new ontology of the object of development.

One area in which we can clearly see this combination of macro-ambitions and micro-techniques is in efforts to reduce poverty. Part of what development researchers and practitioners found so unsettling about the Asian financial crisis and AIDS crisis was how these events pushed huge numbers of people back into poverty, undoing decades of progress. Led by the World Bank’s Social Protection Unit shortly after its creation in 1996, a number of aid agencies began to move away from static conceptions of poverty that generally assumed that once an individual or family moved out of poverty, they would stay that way (World Bank 2001a). These policy failures forced aid practitioners to rethink poverty on an ontological level, seeing it as a dynamic process rather than a static state (Best 2013). Staff working on social protection at the World Bank sought to redefine poverty as social risk and vulnerability, and to devise a range of more flexible devices in response. This approach to poverty reduction ultimately became a core part of the influential 2000–2001 World Development Report Attacking Poverty, and has been adopted by a number of other organizations, including DFID and the Organisation for Economic Co-operation and Development (OECD; World Bank 2001b).

The logic of the social risk approach is straightforward: in a volatile and unpredictable world where political, economic, climate, and health crises are always possible, poverty-reduction policy needs to help individuals and communities become better risk managers, capable of preparing for and responding to external shocks. Because some risks are covariant (affecting a large community or even the entire national population), traditional forms of insurance may not be effective because they were designed to respond to idiosyncratic risks (such as a single individual’s health difficulties, or a house fire). The state therefore becomes an important part of the solution, but only as one actor among many, resolving problems of market failure, supporting and combining with private sector initiatives, and enabling individuals to become more active in managing their own risks.

Some of the most popular devices for managing poor people’s vulnerability to poverty, including conditional cash transfers (CCTs) and microcredit initiatives, clearly reflect this hybrid public-private, micro-level focus. CCTs are state-provided funds targeted toward very poor populations, particularly women, generally on the condition that they keep their children in school and bring them in for regular health check-ups. The funds are supposed to help poor people respond to immediate shocks, whereas the conditions are aimed at increasing the resilience of future generations and improving their chances of becoming better risk managers.

Microcredit initiatives, which provide very small loans to people who would not qualify for conventional credit, started out as state and NGO-funded programs but have become increasingly market (and profit) driven in recent years. Their objective is to provide poor individuals with the kind of financial credit that they need to actively take “good” economic risks (such as investing in education or an entrepreneurial activity), in the belief that this will allow them to become more active and autonomous participants in the market economy.

As the title of the World Bank’s first Social Protection Strategy made clear, although this approach works at the micro level, it continues to have macro-level development ambitions, even as it reconceives them in more dynamic terms: seeking to transform social protection efforts from “safety net to springboard” (World Bank 2001a). The Social Protection Unit’s current website builds on this idea:

In a world filled with risk and potential, social protection systems help individuals and families, especially the poor and vulnerable, cope with crises and shocks, find jobs, improve productivity, invest in the health and education of their children, and protect the aging population (World Bank 2017).

Evaluation

For the many experts and officials at international development agencies seeking to re-establish their authority in the wake of the failures of the 1990s, these new development devices are attractive in part because of their promise of calculability. Many CCT programs have been explicitly designed to collect evidence about their effectiveness, and their growing popularity among development agencies is linked to the promise of demonstrating measurable results. Faced with inconclusive evidence about whether it was the cash or the conditions in CCTs that had positive effects on school enrollment, a growing number of CCT programs have been designed as randomized experiments that test the effectiveness of conditional and unconditional payments (Baird et al. 2010).

In the case of microcredit, calculability plays a very different but nonetheless crucial role: the development of increasingly sophisticated techniques for evaluating and pricing credit risk among the very poor has made it possible for large financial firms to become involved, not only expanding microcredit but also building a new financial industry around the packaging and resale of these loans to foreign investors (Langevin 2017). These firms have managed in some cases to securitize large portfolios of microloans (rather like the subprime mortgages at the heart of the last global financial crisis), translating the often very high interest rates charged to poor borrowers into global flows of investor value (Aitken 2013).

Governing Failure

Although these various new development devices hold the promise of measurable results, we should not overestimate their technical proficiency; they continue to face the problem of failure even as they seek to respond to it. In fact, many of these new development initiatives have failed to meet at least some of their main objectives. The evidence on conditional cash transfers, though plentiful, is mixed: they do seem to have positive short-term effects on educational enrollment in particular, but their longer-term effects are difficult to demonstrate, and it is not clear yet whether the conditions themselves make any difference. There have also been some highly publicized failures in microcredit, including a rash of suicides by individuals crushed by microfinance debts in Andhra Pradesh, India, that have reinforced a broader questioning of its capacity to alleviate poverty.

More fundamentally, the tension that I identify at the outset of this article—between a growing recognition of the messiness of development success and a persistent desire to tame and often deny that complexity by simplifying forms of measurement and evaluation—remains itself a nagging source of failure. Many of the development practitioners I have spoken to are well aware that it is nearly impossible to make tidy causal links between a given policy action and a complex series of longer-term outcomes, particularly where there are multiple other aid actors and external dynamics in play. Yet, because they are forced to play the game of measurable results, they have begun to design their policies so that they are as easy to measure as possible, distorting development objectives to make them appear calculable (Natsios 2010).

This emergent micro approach to development assistance remains a paradoxical one: cultivating public goals by mobilizing private interests, pursuing more complex objectives while trying to translate them into simpler metrics, and ultimately courting repeated failure to give the veneer of success.

Jacqueline Best is Professor in the School of Political Studies at the University of Ottawa, where she works on the political, cultural, and social underpinnings of the global economy. Her most recent research examines the concept and role of economic exceptionalism in times of crisis.

References

Aitken, R. 2013. “The Financialization of Micro-Credit.” Development and Change 44(3):473–499.

Baird, S., C. McIntosh, and B. Özler. 2010. “Cash or Condition? Evidence from a Randomized Cash Transfer Program.” World Bank Policy Research Working Paper 5259.

Barry, A. 2012. “Political Situations: Knowledge Controversies in Transnational Governance.” Critical Policy Studies 6(3):324–336.

Best, J. 2013. “Redefining Poverty as Risk and Vulnerability: Shifting Strategies of Liberal Economic Governance.” Third World Quarterly 34(2):109–129.

———. 2014a. “The ‘Demand’ Side of Good Governance: The Return of the Public in World Bank Policy.” In The Return of the Public in Global Governance, edited by J. Best and A. Gheciu, pp. 97–119. Cambridge, UK: Cambridge University Press.

———. 2014b. Governing Failure: Provisional Expertise and the Transformation of Global Development Finance. Cambridge, UK: Cambridge University Press.

Boughton, J. M. 2003. “Who’s in Charge? Ownership and Conditionality in IMF-Supported Programs.” IMF Working Paper WP/03/191.

Collier, P. 1997. “The Failure of Conditionality.” In Perspectives on Aid and Development, edited by C. Gwin and J. Nelson, pp. 51–77. Washington, DC: Overseas Development Council.

Killick, T. 1997. “Principals, Agents and the Failings of Conditionality.” Journal of International Development 9(4):483–494.

Langevin, M. 2017. “L’agencement entre la haute finance et l’univers du développement: des conséquences multiples pour la formation des marchés (micro)financiers.” Canadian Journal of Development Studies.

Natsios, A. 2010. The Clash of the Counter-Bureaucracy and Development. Washington, DC: Center for Global Development.

Operations Evaluation Department, World Bank (OED). 1994. Annual Review of Evaluation Results 1993. Washington, DC: Operations Evaluation Department, World Bank.

Stiglitz, J. 1998. Towards a New Paradigm for Development: Strategies, Policies, and Processes. Prebisch Lecture. Geneva, Switzerland: United Nations Conference on Trade and Development.

World Bank. 1997. World Development Report 1997: The State in a Changing World. Washington, DC: World Bank.

———. 1998. Assessing Aid: What Works, What Doesn’t, and Why. New York: Oxford University Press.

———. 2001a. Social Protection Sector Strategy: From Safety Net to Springboard. Washington, DC: World Bank.

———. 2001b. World Development Report 2000/01: Attacking Poverty. Washington, DC: World Bank.

———. 2017. “Social Protection: Overview.” Available at  link.

Image Credit: Consumer pays for her purchase with 1000 rupiah at a market in Lenek Village, East Lombok District, Indonesia. Asian Development Bank.

“Ich kann nicht”: Hearing Racialized Language in Josh Inocéncio’s Purple Eyes (Ojos Violetas)

In Spring 2017, I brought Houston-based playwright/performer Josh Inocéncio to my campus—the University of Houston—to perform his solo show Purple Eyes (for more on the event, see “Campus Organizing, or How I Use Theatre to Resist”). Purple Eyes is what Inocéncio calls an “ancestral auto/biographical” performance piece which explores his upbringing as a closeted gay Chicano living in the midst of the cultural heritage of machismo. Following a legacy of solo performance storytelling aesthetics seen in John Leguizamo’s Freak and Luis Alfaro’s Downtown, Inocéncio plays with memory to understand how the United States and Mexico have influenced his family and his own identity formation. Moreover, Purple Eyes explores the intersections of queerness and Chican@ identity alongside the legacy of machismo in his family (For more on the play, see “Queering Machismo from Michoacán to Montrose”).

Still from Purple Eyes (Ojos Violetas), with permission from Josh Inocencio who retains copyright.

During my Intro to LGBT Studies course following the performance, students discussed issues of representation and how many of them had never seen a queer Latin@/x play or performance, with some of them having never seen a live play. Many students picked up on how Purple Eyes foregrounds the intersections of race, ethnicity, gender, and sexuality. While these discussions were indeed fruitful, what struck me most was how both classes harped on Inocéncio’s use of different linguistic registers. Put simply, what stayed with them was how the performance sounded. My students obsessed over the Spanish in the play, leading me to question why this group of students at a Hispanic-Serving Institution in a city that is over 40% Latin@ had so much trouble whenever Inocéncio spoke Spanish, that is, with the sounds of Latinidad.

In what follows, I discuss how my students heard Purple Eyes. While the play is predominantly in English, Inocéncio often code-switches into Spanish and German to more accurately embody particular family members. This blog adds to previous research by Dolores Inés Casillas, Sara V. Hinojos, Marci R. McMahon, Liana Silva, and Jennifer Stoever on the relationship between the Spanish language and non-Spanish-speaking Americans. Indeed, my students racialized the Spanish in Purple Eyes while completely disregarding the German in the play. Why?

Drawing from sociology, racialization is the process of imposing racial identities onto a social practice or group that might not have identified in such a way. Typically, the dominant group racializes the marginalized group; i.e., Latin@s in the U.S. become racialized by the mainstream. Even so, Latin@s are not a race, but an ethnic group. Yet, I argue that non-Latin@ Americans view Latin@s through a lens of race which often becomes a sonic one, in which language becomes one of the most overt identity markers. In terms of Spanish, while many races and ethnicities speak the language, in the United States it is often viewed as a way to mark Spanish-speaking Latin@s as Other. In this way, language plays a fundamental role in shaping mainstream ideas about race. According to Dolores Inés Casillas, “For unfamiliar ears, the sounds of Spanish, the mariachi ensemble, and/or accented karaoke all work together to signal brownness, working-class,” and as Jennifer Stoever argues, the sounds of Latinidad indicate “illegality” in the U.S.

Drawing from the intersections of race, language, and racism, the relatively new academic field Raciolinguistics has emerged as a means to explain how people use language to shape their identity (For more, see Raciolinguistics: How Language Shapes Our Ideas About Race). Branching off from Raciolinguistics, I am most interested in exploring how the mainstream hears languages and racializes what they are hearing. The result is that Spanish is seen as Other, meaning that monolingual U.S. listeners hear Spanish-speakers as inherently different and a threat to a mainstream United States cultural and, more importantly, national identity.

Still from Purple Eyes (Ojos Violetas), with permission from Josh Inocencio who retains copyright.

Reflecting Inocéncio’s cultural multiplicity, Purple Eyes features English, Spanish, and German strategically used at different moments in the play to reflect the temporality, positionality, and relationship to language of each character that Inocéncio inhabits. While the chapter on his father is entirely in English, the final chapter focusing on Josh himself opens with a monologue in Spanish in which the performer narrates the events of the FIFA World Cup before finally announcing to the crowd that the epilogue is Inocéncio’s story of young love and heartbreak on his journey of queer discovery. This moment features the longest extended use of Spanish in the play. The remaining Spanish is sprinkled in as Josh code-switches between the two languages for added cultural specificity.

While some of my Spanish-speaking students appreciated hearing a play that reflected their linguistic identities, monolingual English speakers in my class claimed that the Spanish confused them and made it difficult for them to follow certain parts of the play. After several students echoed these thoughts, a student from Mexico who is not yet fully fluent in English told the others that her experience was the exact opposite. She had trouble following some of the parts in English since she is still learning the language. I then pivoted the conversation to discuss how my English-dominant students approached the play with the assumption that English is the norm and that a performance on a university campus should reflect this. Case in point: several told me that the show should have been subtitled.

But what was most telling was the following exchange. After several expressed confusion over the Spanish, one particularly woke student from Nigeria raised her hand and said: “I haven’t heard anyone say anything about the German in the play and not being able to follow the play during the German part.” She then noted how, in the United States, Spanish is racialized whereas German is not. In fact, most of the students did not even recall German in the play. Admittedly, the play features far more Spanish than German, but the scene in which Inocéncio speaks German occurs while dramatizing his Austrian grandmother’s abortion. As Inocéncio (as Oma) frantically repeated “Ich kann nicht” (I can’t), my students had no trouble; to use some Millennial vernacular, it was with Spanish that they “couldn’t even.” Arguably, this is the most intense scene in the performance and one that my students wanted to discuss. That the majority of them understood this scene without fully registering the German, coupled with their confusion over lines spoken in Spanish, speaks to how race and ethnicity shape the way languages are heard in the United States: German is viewed as familiar and accessible, whereas Spanish is immediately heard as foreign, i.e., undesirable, not welcome here.

As the Latin@ population continues to grow and the Spanish language becomes an increasingly present reality in U.S. everyday life, audiences must consider possibilities not grounded in an English-only narrative. My experiences with Purple Eyes are not unique. I have witnessed and heard many stories about audiences at mainstream theatre companies who have struggled whenever a play included Spanish. While I don’t claim to have the answers to address this across the nation, as an educator, I question what tools I can give my students to help prepare them for sonic experiences outside of their comfort zone and, specifically, how they become aware of subconscious racialization practices. What will they hear? And, more importantly, how will they react?

Featured Image: Still from Purple Eyes (Ojos Violetas), with permission from Josh Inocencio who retains copyright.

Trevor Boffone is a Houston-based scholar, educator, writer, dramaturg, producer, and the founder of the 50 Playwrights Project. He is a member of the National Steering Committee for the Latinx Theatre Commons and the Café Onda Editorial Board. Trevor has a Ph.D. in Latin@ Theatre and Literature from the Department of Hispanic Studies at the University of Houston where he holds a Graduate Certificate in Women’s, Gender, & Sexuality Studies. He holds an MA in Hispanic Studies from Villanova University and a BA in Spanish from Loyola University New Orleans. Trevor researches the intersections of race, ethnicity, gender, sexuality, and community in Chican@ and Latin@ theater and performance. His first book project, Eastside Latinidad: Josefina López, Community, and Social Change in Los Angeles, examines the textual and performative strategies of contemporary Latin@ theatermakers based in Boyle Heights that use performance as a tool to expand notions of Latinidad and (re)build a community that reflects this diverse and fluid identity. He is co-editing (with Teresa Marrero and Chantal Rodriguez) an anthology of Latinx plays from the Los Angeles Theatre Center’s Encuentro 2014 (under contract with Northwestern University Press).

  REWIND!…If you liked this post, you may also dig:

“Don’t Be Self-Conchas”: Listening to Mexican Styled Phonetics in Popular Culture*–Sara V. Hinojos and Dolores Inés Casillas

Deaf Latin@ Performance: Listening with the Third Ear–Trevor Boffone

Moonlight’s Orchestral Manoeuvers: A duet by Shakira Holt and Christopher Chien

If La Llorona Was a Punk Rocker: Detonguing The Off-Key Caos and Screams of Alice Bag–Marlen Rios-Hernández

Registration Open for NECS Post-Conference: Open Media Studies

The process of scholarly communication is changing dramatically. Digitization of archives, online research methods and tools, and new ways to disseminate research results are developing fast. During the past four annual NECS (European Network for Cinema and Media Studies) conferences, we have held two-hour workshops to discuss the implementation of open access, organized by, among others, editors of the open access journals VIEW: Journal of European Television History and Culture and NECSUS: European Journal of Media Studies. Both journals were founded in 2012 with an NWO grant.

In 2018 we will expand on this experience by organizing a one-day workshop immediately following the annual NECS conference, which this year will be held in Amsterdam on 27–29 June and is organized by the University of Amsterdam (UvA), Utrecht University (UU), and the Free University of Amsterdam (VU). Our post-conference workshop will take place on Saturday 30 June 2018 at the Netherlands Institute for Sound and Vision, Hilversum, The Netherlands.

Developments in open research are following one another at a rapid pace. In media studies, these developments appear to be moving a bit faster than in other humanities disciplines, as the field already has a longer tradition of online sharing and uses a range of media for scholarly communication (blogs, videos, audiovisual essays, etc.) alongside the traditional peer-reviewed journal article or monograph. With this one-day workshop we aim to explore the concept of ‘open’ in media studies by sharing best practices and to investigate what media scholars need to make the entire scholarly communication process (research, analysis, writing, review, publishing, etc.) more transparent.

We will do so by bringing together a group of at most 25 researchers in media studies for a series of workshops devoted to four themes: 1) research and analysis, 2) writing and publishing, 3) peer review, and 4) public engagement. The day will open with a keynote by Prof. Dr. Malte Hagener (Philipps-Universität Marburg), one of the co-founding editors of NECSUS and founder of the recently launched project MediaRep, a subject repository for media studies.

The main goals for the day are to create awareness among researchers, offer solutions to concrete issues, and explore new open access and open science initiatives in relation to media studies. Outcomes of the workshop will be published on the NECS website as well as on the Open Access in Media Studies website.

Registration is free. However, there is a maximum of 25 participants. Workshops will be hands-on and active participation is encouraged. Interested?

For the preliminary program and registration, please follow this link.

Hope to see you in Amsterdam/Hilversum!

Organising team: Jeroen Sondervan (Utrecht University), Jeffrey Pooley (Muhlenberg College, US), Jaap Kooijman (University of Amsterdam), Erwin Verbruggen (Netherlands Institute for Sound and Vision).

New Learned Society Network: ScholarlyHub

On October 21st the ScholarlyHub initiative launched its website, presenting its mission and its ideas for developing a new social academic open access network for sharing papers and other scholarly literature. The project is in its incubator stage and needs crowdfunding to further develop these plans. ScholarlyHub wants to directly compete with academic social networking platforms like Academia.edu and ResearchGate. The big difference between these commercial networking behemoths and the ScholarlyHub initiative is the scholarly-led, bottom-up approach of the latter. A remarkable group of academics from different disciplines has gathered to take the first steps towards a non-profit framework with options to share papers, collaborate with other researchers, and enhance public engagement, using social networking tools.

From the project website:

“ScholarlyHub will be a non-profit framework, where members pay a small annual fee (directly or through an existing learned society, network, project or institution) and create personal, thematic, project-based, associational or institutional profiles and populate them with scholarly and educational materials as they see fit. These are stored in a searchable, real open-access archive, and are directly viewable and downloadable from the portal by anyone (that is, not only members), without having to register or volunteer personal data.”

In order to make this happen, money is needed to build an infrastructure. No venture capital, but actual support from actual researchers. On November 29th ScholarlyHub launched a crowdfunding campaign hoping to raise €500,000 for developing the first version of the platform.

It will be very interesting to see how this initiative will evolve in the next few months, because in the last few years criticism has grown about the commercialization of the aforementioned platforms Academia.edu and ResearchGate. For these enterprises, the user is the product, and that obviously leads to important (ethical) questions about power, ownership, reuse, archiving policies, and so on.[1] These are all practices that ScholarlyHub explicitly rejects.

As can be found on their website: “Growing threats to open science have made it more crucial than before to develop a sustainable, not-for-profit environment. One that allows you to publish, share, and access quality work without financial constraints.”

But some have already asked how this platform will relate to, for example, Humanities Commons, which pursues similar goals and was launched last year.[2] Another example is the Open Science Framework, which offers an open repository for papers and data. A very interesting and much-needed discussion will take place in the coming months about whether and how these non-profit platforms should co-exist.

In any case, it will be a much healthier situation if, in addition to the existing commercial academic social networks, non-profit equivalents enter this market.

Notes:

[1] Further reading: Pooley, J. (2017). Scholarly Communication Shouldn’t Just be Open but Non-Profit Too: http://blogs.lse.ac.uk/impactofsocialsciences/2017/08/15/scholarly-communications-shouldnt-just-be-open-but-non-profit-too/

[2] https://www.edsurge.com/news/2017-11-16-researchers-ask-does-academia-need-another-alternative-to-for-profit-scholarly-platforms / ScholarlyHub Response: https://www.scholarlyhub.org/feed/2017/11/12/launch-weeks-ffaqs

Sounding Out! Podcast #64: Standing Rock, Protest, Sound and Power (Part 2)

CLICK HERE TO DOWNLOAD: Standing Rock, Protest, Sound and Power (Part 2)

SUBSCRIBE TO THE SERIES VIA ITUNES

ADD OUR PODCASTS TO YOUR STITCHER FAVORITES PLAYLIST

Part Two of a special series on Standing Rock, Protest, Sound and Power. The guest for today’s podcast is Tracy Rector. Tracy is a Choctaw/Seminole filmmaker, curator, community organizer, and Executive Director and Co-founder of Longhouse Media. In 2017 Indigenous grassroots leaders called upon allies across the United States and around the world to peacefully march in support of the Standing Rock Sioux Tribe. They asked allies to simply exist, resist, and rise in solidarity with Indigenous peoples and their rights–rights which protect mother earth for all future generations. In this podcast we talk about Tracy’s thoughts and observations as a filmmaker who was present at Standing Rock. We discuss the election of a new administration, increasing threats to native land, and police violence in today’s podcast.

In Part One, our host Marcella Ernest spoke with Dr. Nancy Marie Mithlo, a Native American art historian and Associate Professor of Art History and American Indian studies. They discussed how Nancy experiences the sonic elements of Native activism as a trained anthropologist. In Part Two, Tracy’s experience playing with sound and visuals as a documentarian brings a different perspective to understanding Native activism.

Marcella Ernest is a Native American (Ojibwe) interdisciplinary video artist and scholar. Her work combines electronic media and sound design with film and photography in a variety of formats, using multi-media installations that incorporate large-scale projections and experimental film aesthetics. Currently living in California, Marcella is completing an interdisciplinary Ph.D. in American Studies at the University of New Mexico. Drawing upon a Critical Indigenous Studies framework to explore how “Indianness” and Indigeneity are represented in studies of American and Indigenous visual and popular culture, her primary research is an engagement with contemporary Native art to understand how members of colonized groups use a re-mix of experimental video and sound design as a means for cultural and political expressions of resistance.

Featured image used with permission by Tracy Rector.

REWIND! . . . If you liked this post, you may also dig:

Sounding Out! Podcast #60: Standing Rock, Protest, Sound, and Power (Part 1) — Marcella Ernest

Sounding Out! Podcast #51: Creating New Worlds From Old Sounds – Marcella Ernest

Sounding Out! Podcast #58: The Meaning of Silence – Marcella Ernest

Rational Sin

In the last 20 years, global health experts have recognized the importance and encouraged the adoption of sin taxes in the fight against non-communicable diseases (NCDs) in the Global South. At the level of discourse, this is illustrated by the vast global health literature on NCDs published from the late 1990s onwards: reports and action plans issued by international organizations like the World Health Organization (WHO) and the World Bank, editorials and scientific papers in medical journals like The Lancet, and policy documents and pamphlets prepared by aid agencies, health charities, and private philanthropies. Most of these documents start by reminding readers that NCDs—chronic diseases such as cancer and diabetes associated with behavioral risk factors like smoking, drinking, and unhealthy diets—are now responsible for most of the burden of death and disability across the Global South. They then identify excise taxes levied on tobacco, alcohol, and sugar as the most effective strategy to address this burden of death and disability.

WHO poster for the 2014 World No Tobacco Day advocating for taxes on tobacco products as a strategy to lower the associated burden of death and disease.

This literature explains how—given that price is correlated with demand for tobacco, alcohol, and sugar—increasing taxes on these products will markedly reduce rates of smoking, drinking, and unhealthy eating and thereby the incidence of chronic diseases associated with these behaviors. It also stresses how sin taxes not only improve the health of nations, but also strengthen their finances. Indeed, as many of the experts cited in this literature make clear, increased taxation rates largely compensate for the decrease in tobacco, alcohol, and sugar consumption, thus allowing national governments to amass larger tax revenues that can be earmarked to finance national health systems and achieve universal health coverage. Last but not least, this literature also extols the fact that, as indirect taxes, sin taxes are relatively easy to set up and administer for governments. At the level of practice, the growing importance of sin taxes within global health can be illustrated by the mounting number of countries in the Global South—from Chile, Mexico, and South Africa to Thailand, India, and the Philippines—that have introduced taxation schemes for tobacco, alcohol, and/or sugar to combat the NCD epidemic. Many of these national schemas have been supported by international efforts such as the Bloomberg Initiative, a US$1 billion project to reduce tobacco use in developing countries led by the Bloomberg and Gates foundations, in which sin taxes play a central role.
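
To make the arithmetic behind this claim concrete, consider a stylized illustration (the figures are my own assumptions for exposition, not numbers taken from this literature; an own-price elasticity of demand of roughly −0.4 is of the order often cited for cigarettes). Suppose a pack sells for $5, of which $2 is excise tax, and the excise is raised to $3 and passed on fully to consumers:

\[
\%\Delta P = \frac{6 - 5}{5} = 20\%, \qquad \%\Delta Q \approx \varepsilon \cdot \%\Delta P = (-0.4)(20\%) = -8\%,
\]
\[
R_{\text{new}} = t_{\text{new}} \, Q_{\text{new}} = 3 \times 0.92\,Q_{0} = 2.76\,Q_{0} \; > \; R_{\text{old}} = 2\,Q_{0}.
\]

Because demand for these products is relatively inelastic (the magnitude of the elasticity is well below 1), consumption falls while excise revenue rises, which is the combination of lower consumption and larger tax revenues that this literature advertises.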

In many ways, sin taxes are typical of the micro-technologies that have proliferated in the fields of development and humanitarian aid in the past two decades, what Stephen Collier, Peter Redfield, and their colleagues have called “little development devices” and “humanitarian goods” (Collier et al., 2017; Cross, 2013; Redfield, 2012). Indeed, like many of these micro-devices, sin taxes are meant to improve people’s quality of life, are eminently portable, and, as I discuss below, operate at the micro level, targeting individuals’ aspirations, preferences, and calculations rather than any larger macroeconomic aggregate. In this essay I shed some light on the complex genealogies of these micro-technologies by unpacking some of the political theories, scientific concepts, and ethical norms that make up sin taxes. I suggest that sin taxes are built around a particular subject—the rational actor seeking to maximize their welfare in line with their own preferences—whose origins can be traced back to the Chicago School’s microeconomic tradition and its concern with rational choice theory. In doing so, I draw on Madeleine Akrich’s (1992) concept of “de-scription” and her claim that one can find inscribed in a technical device many of the assumptions, aspirations, and values of those who designed it. In my de-scription of sin taxes I examine the work of a small network of economists led by University of Chicago professor Gary Becker and two of his collaborators, Mike Grossman and Frank Chaloupka, that was instrumental in transforming sin taxes into an accepted global health strategy. In particular, I focus on this network’s research on tobacco taxation, which was the first type of sin tax to gain acceptance in the global health field and later served as a model for excises on alcohol and sugar. I begin by showing how this research grew out of Chicago’s microeconomic tradition and Becker’s work in particular before examining how it radically transformed international tobacco control and the model of the smoker that underpins it. I conclude by reflecting on what this story can teach us about the wider history of the recent proliferation of micro-technologies in the fields of development and humanitarian aid.

Tax revenue stamp from South Africa. From Andrey Vasiunin’s online collection.

The Chicago microeconomic tradition was articulated by George Stigler, Gary Becker, and other members of the Chicago School from the 1950s onwards. As historian Steven Medema (2011:153) has carefully documented, for the earlier generations of Chicago economists, from Frank Knight to Milton Friedman, economics was the study of the “social organization of economic activity” and, in particular, “markets as coordinating devices.” This changed after the 1960s following the arrival of Stigler and especially Becker at the University of Chicago. For this new generation, economics was redefined as the study of “human behavior” and, specifically, “rational individual choices” under “conditions of scarcity” (Medema 2011:161–162). By redefining their object of study in this way, the new generation of economists at Chicago profoundly altered their discipline (Foucault 2008). First, they made it possible to analyze how individual decisions had implications at the macro level, thus extending economic analysis within its own domain. Second, they encouraged economists to espouse an expansionist agenda and apply their methods to traditionally non-economic domains. As Medema (2011:172) has also shown, the reason for the shift of focus from social organization and markets to individual behavior and choice lies in the marked influence that rational choice theory had on many of the new generation of Chicago economists. Indeed, this “new science of choice,” articulated during the Cold War around the notion of the “rational actor,” was a “catalyst for change” in the American social sciences, where it introduced a fresh focus on and new techniques to analyze the role of individuals and their decisions in the making of complex social phenomena (Amadae 2003:5–8).

Gary Becker’s work has been central to Chicago’s microeconomic tradition (Medema 2011). Becker established the idea that economics was about the study of human behavior and choice. A disciplinary imperialist, he also believed that economics should not be limited to behaviors usually studied by economists but expanded to behaviors traditionally analyzed by other social scientists such as sociologists and anthropologists. As Becker explained, economics was about “problems of choice,” whether that was “the choice of a car, a marriage mate [or] a religion” (cited in Medema 2011:161). These beliefs strongly influenced the sort of questions (Why do individuals decide to invest in education? Why do they elect to marry and have children? Why do they choose to engage in criminal activity?) that he sought to address in his own research. The way in which Becker approached and analyzed human behavior was informed by rational choice theory. Specifically, he suggested that choices made by individuals should always be considered rational, even when they are criminal or antisocial. By rational, Becker (1992:38) meant that these choices are made by individuals who seek to “maximize welfare as they conceive it.” He believed that when doing so, individuals take into account their own “values and preferences” and anticipate as best they can “the uncertain consequences of their actions” (Becker 1992:38). He also supposed that their choices are “constrained by income, time, imperfect memory, calculating capacities and other limited resources” and shaped by “the available opportunities in the economy and elsewhere” (Becker 1992:38). For Becker, the task of the economist was to develop and empirically test mathematical models that identified and organized these different variables in a way that explained and predicted the type of behavior being analyzed.

Not until the 1980s and 1990s did economists systematically apply the tools and concepts of Chicago microeconomics to the study of smoking (Reubi 2013, 2016). Two interrelated bodies of work were critical in that respect. The first encompassed the studies on the demand for tobacco products carried out by Mike Grossman together with his former student Frank Chaloupka and others (e.g., Chaloupka and Grossman 1996; Lewit et al. 1981). Grossman was key in popularizing the use of Chicago microeconomics to analyze health-related behaviors, both in his own research and as director of the National Bureau of Economic Research’s (NBER) Health Economics Program. For his PhD carried out under Becker’s supervision, Grossman constructed a model of the “demand for good health” where health was a form of “human capital” that everyone possessed and could choose to invest in and increase (Grossman 1972:xiv–xv). Given his interest in health at a time when smoking had become a major public health issue in North America and Europe, it is unsurprising that Grossman subsequently chose to work on the demand for cigarettes together with Chaloupka and other colleagues. This research first established that price was a key factor in the demand for cigarettes. The research also showed that price was a particularly powerful motivator for young adults and individuals of low socioeconomic status, who have less income and are more resistant to public information campaigns on the dangers of smoking. The second body of work encompassed the studies on addiction conducted by Becker in collaboration with Grossman, Chaloupka, and a few others (e.g., Becker and Murphy 1988; Chaloupka 1990). Building on insights from rational choice theory, Becker and his collaborators claimed that contrary to popular belief, “addictions are rational in the sense of involving forward-looking maximization with stable preferences” (Becker and Murphy 1988:675). Using cigarettes and alcohol as their case study, they also built and tested a behavioral model that predicted the demand for addictive substances was greater among individuals who had “low incomes,” were “more present-oriented” and/or had experienced “unhappy” and “stressful events” (Becker and Murphy, 1988:694; Chaloupka 1990:737).
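
To give a sense of the formal machinery at stake, the rational addiction framework can be rendered schematically as follows (this is a rough sketch of the kind of model Becker and Murphy work with, not their exact notation, and it omits the lifetime budget constraint). An individual chooses consumption of the addictive good, c(t), and of other goods, y(t), to maximize discounted lifetime utility, where past consumption accumulates into an “addictive stock” S(t):

\[
\max_{c(t),\, y(t)} \int_{0}^{\infty} e^{-\sigma t}\, u\big(y(t),\, c(t),\, S(t)\big)\, dt
\qquad \text{subject to} \qquad \dot{S}(t) = c(t) - \delta S(t),
\]

where \(\sigma\) is the rate of time preference and \(\delta\) the rate at which the addictive stock depreciates. Addiction enters as reinforcement, roughly the requirement that a larger stock raise the marginal utility of current consumption (\(\partial^{2} u / \partial c\, \partial S > 0\)), so that forward-looking individuals can still rationally choose to keep smoking. In such a setup, more present-oriented (high \(\sigma\)) and lower-income individuals consume more of the addictive good, which resonates with Grossman’s finding, noted above, that price matters most for the young and the poor.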

Up to this point, two very different intellectual traditions dominated the field of international tobacco control. The first, which stemmed from the field of health education, was built on the notion of knowledge or information (Berridge 2007, chapter 2; Reubi and Berridge 2016). Public health experts working within this tradition assumed that people smoked because they did not know that tobacco was harmful to their health. Following that assumption, experts believed that their main task was to ensure people were informed about the dangers of smoking. This meant educating people about these dangers through warning labels on cigarette packages, school education programs, and, most important, public information campaigns, which were deemed to be the most powerful anti-smoking measure at the time. This also meant shielding people from the tobacco industry’s marketing and public relations efforts through advertising bans and advocacy tactics to monitor and counter the industry. The second tradition, which grew out of developments in psychology and psychopharmacology, was centered on the notion of addiction (Berridge 2007, chapter 9; Brandt 2004). For public health experts and psychologists who came from this tradition, the reason people smoked, or continued to smoke, was their addiction to nicotine, the psychoactive substance in tobacco. Specifically, they contended that nicotine could, by acting on the brain via complex biomolecular pathways, control the behavior of smokers and compel them to continue smoking. For these experts, the main task was to treat this addiction, which they viewed as a pathology, by using smoking cessation techniques such as behavioral and nicotine replacement therapies.

Cover of the International Union against Tuberculosis and Lung Disease’s Factsheet on Tobacco Taxation, with the caption “Young people are most likely to quit when prices rise.”

The work on smoking carried out by Becker and his colleagues posed a direct challenge to these two intellectual traditions, leading to a rupture in and a partial reconfiguration of the field of international tobacco control in the late 1990s. To start, the work of Becker and his colleagues radically altered the view public health experts held on taxation (Reubi 2013). Until then, these experts largely ignored sin taxes as an anti-smoking measure for many reasons, ranging from ignorance about how taxation worked to discomfort about sin taxes’ regressive nature. The network of economists led by Becker helped change this perception, progressively bringing public health experts to see taxation (rather than public information campaigns) as the most potent strategy in the fight against tobacco. Grossman’s work in particular, which showed that price (rather than knowledge) was key in curbing tobacco use in groups where prevalence rates had remained stubbornly high (like the young and the poor), was critical in that respect. Furthermore, the work of Becker and his colleagues also helped establish a new model of the smoker in public health thought. Inscribed in the taxation schemes now multiplying across the tobacco control field, this model was centered on the idea of individual choice rather than the notions of knowledge and addiction associated with health education and psychology, respectively. In this new model, people smoked because they made a rational choice to do so in the sense of a welfare-maximizing calculus based on their preferences and existing circumstances. Although knowledge and addiction retained a place within this model, they were only two factors among many others such as price, education, and pleasure that could influence an individual’s decision to smoke. Moreover, it was up to that individual to determine the importance of these two factors when they weighed their options. As Chaloupka and other leading public health experts and economists argued in an influential World Bank (1999:3) report on tobacco control:

Consumers are usually the best judges of how to spend their money…. [They make] rational and informed choices after weighing the costs and benefits of [their actions]…. Smokers clearly perceive benefits from smoking, such as pleasure and the avoidance of withdrawal, and weigh these against the private costs of their choice. Defined this way, the perceived benefits outweigh the perceived costs, otherwise smokers would not pay to smoke.

To recapitulate, I showed here how a global health device like sin taxes grew out of Chicago’s microeconomic tradition and, in particular, Becker’s project to redefine economics as a function of individual choice and expand it to non-economic domains. Moreover, I outlined how sin taxes were later decoupled from Becker’s project and redeployed as a key strategy in public health efforts to fight the smoking epidemic in the Global South. This redeployment, I also showed, was accompanied by the introduction of a new model of the smoker—the rational, welfare-maximizing individual—within the international tobacco control field. To conclude, I want to reflect on how this story relates to wider historical accounts about the proliferation of micro-technologies within international development and humanitarian aid. In their writings, Collier, Redfield, and others caution against the familiar and well-rehearsed explanation that this proliferation is the result of a shift from welfare states and the social to markets and the individual (e.g., Collier 2011; Cross, 2013; Redfield 2012). Instead, they suggest that the multiplication of these micro-devices is associated with a rupture in development thought from a macroeconomic concern with large, national physical infrastructure projects to a microeconomic focus on the investments in human capital (Collier et al. 2017; see also Reubi 2016). The story of sin taxes outlined here strongly resonates with this broad historical tableau sketched by Collier and others. To begin with, sin taxes emerge from the reconfiguration of Chicago economics from a macroeconomic discipline concerned with markets as coordinating devices to a microeconomic tradition focused on rational individual behavior. It is worth emphasizing that, in the context of this reconfiguration, markets and individual choices stand in contrast to each other. Indeed, this might come as a surprise to some readers for whom markets and individual choice are necessarily—almost naturally—associated. Furthermore, it is critical to realize that the shift from mass public information campaigns to sin taxes that marked the field of international tobacco control in the late 1990s was not a shift from the social to the individual, but rather a change in the concept of the individual. It was a move away from an individual for whom knowledge always and automatically triggered certain actions to an individual who could decide not to act on knowledge and prioritize other elements such as money and pleasure instead. Last, the strong emphasis placed on individual choice in both Becker’s attempts to reform economic thought and global health efforts to curb smoking should not be interpreted as the death of the social. Indeed, in echo of Collier’s (2011) work on the post-Soviet social, the notion of the social or society has remained important for both projects, albeit in different forms. Thus, for Becker (1997:150), sin taxes are “social taxes” that can protect American “society” from the “social harms” associated with rational addictive behaviors, whereas for global health experts, sin taxes are public health “interventions” that can shield developing “societies” from the health and “socio-economic toll” of “21st-century lifestyles” (WHO 2010:vii, 37).

David Reubi is a Wellcome Trust Fellow in the Department of Global Health & Social Medicine, King’s College London, where he is currently working on a manuscript about the biopolitics of the African smoking epidemic.

Acknowledgements

This essay draws on research funded through a Wellcome Trust Society and Ethics Fellowship. The essay also benefited from Stephen Collier and Peter Redfield’s thoughtful comments.

References

Akrich, Madeleine. 1992. “The De-Scription of Technical Objects.” In Shaping Technology/Building Society, edited by W. E. Bijker and J. Law, pp. 205–224. Cambridge, MA: MIT Press.

Amadae, S. M. 2003. Rationalizing Capitalist Democracy. Chicago, IL: University of Chicago Press.

Becker, Gary. 1992. “The Economic Way of Looking at Life.” In Nobel Lectures, Economics 1991–1995, edited by T. Persson, pp. 38–58. Singapore: World Scientific.

———. 1997. The Economics of Life. New York: McGraw-Hill.

Becker, Gary, and Kevin Murphy. 1988. “A Theory of Rational Addiction.” Journal of Political Economy 96(4):675–700.

Berridge, Virginia. 2007. Marketing Health. Oxford, UK: Oxford University Press.

Brandt, Allan. 2004. “From Nicotine to Nicotrol.” In Altering American Consciousness, edited by S. W. Tracy and C. J. Acker, pp. 383–402. Boston, MA: University of Massachusetts Press.

Chaloupka, Frank. 1990. “Rational Addictive Behavior and Cigarette Smoking.” Journal of Political Economy 99(4):722–742.

Chaloupka, Frank, and Mike Grossman. 1996. Price, Tobacco Control Policies and Youth Smoking. New York: NBER.

Collier, Stephen. 2011. Post-Soviet Social. Princeton, NJ: Princeton University Press.

Collier, Stephen, Peter Redfield, Jamie Cross, and Alice Street. 2017. “Little Development Devices/Humanitarian Goods.” Limn 9.

Cross, Jamie. 2013. “The 100th Object.” Journal of Material Culture 18(4):367–387.

Foucault, Michel. 2008. The Birth of Biopolitics. Basingstoke, UK: Palgrave Macmillan.

Grossman, Mike. 1972. The Demand for Health. New York: NBER.

Lewit, Eugene, Douglas Coate, and Mike Grossman. 1981. “The Effects of Government Regulation on Teenage Smoking.” Journal of Law and Economics 24:545–569.

Medema, Steven. 2011. “Chicago Price Theory and Chicago Law and Economics.” In Building Chicago Economics, edited by R. Van Horn, P. Mirowski, and T. Stapleford, pp. 151–178. Cambridge, UK: Cambridge University Press.

Redfield, Peter. 2012. “Bioexpectations.” Public Culture 24(1):157–184.

Reubi, David. 2013. “Health Economists, Tobacco Control and International Development.” BioSocieties 8:205–228.

———. 2016. “Of Neoliberalism and Global Health.” Critical Public Health 26(5):481–486.

Reubi, David, and Virginia Berridge. 2016. “The Internationalisation of Tobacco Control, 1950–2010.” Medical History 60(4):453–472.

World Health Organization (WHO). 2010. Global Status Report on Noncommunicable Diseases. Geneva, Switzerland: WHO.

World Bank. 1999. Curbing the Epidemic. Washington, DC: World Bank.

Image Credits: Featured image of Gary Becker, from The University of Chicago

The Participatory Development Toolkit

The Participatory Development Toolkit is a “small briefcase (26 x 33 x 10 cm) containing 221 activity cards, 65 pictures, 11 charts, 1 guidebook”; it is “covered in brown patterned cloth, with leather handle and leather snap closure.” It is decorated with drawings of women, abstract patterns, huts, trees, animals: drawings, the kit’s guide explains, “by the Warli tribe, who live in the Sahadri mountains in Maharashtra state north of Bombay” and who are “known for their mythic vision of Mother Earth, their traditional agricultural methods, and their lack of caste differentiation” (Narayan-Parker and Srinivasan 1994).

The Participatory Development Toolkit, created by Deepa Narayan, Lyra Srinivasan and others, funded by the World Bank and the United Nations Development Program, produced in India by Whisper Design of New Delhi, coordinated by Sunita Chakravarty of the Regional Water and Sanitation Group in New Delhi in 1994. This copy owned by the Getty Research Library, Los Angeles, CA.

The Participatory Development Toolkit was created in 1994 primarily by Lyra Srinivasan and Deepa Narayan, two development professionals who at the time worked for the United Nations Development Program (UNDP) and the World Bank. Unsnapped and opened, it reveals a set of 25 folders and a booklet: “Each individual envelope is coded with a number and a title on its flap.” The lid folds back to allow the kit to form a stand, and “every fifth envelope has a color-coded tab. To gain access to the materials in each set of envelopes, pull the tab and the envelopes will extend toward you” (Narayan-Parker and Srinivasan 1994). The Participatory Development Toolkit arrived at the zenith of the rage for “participatory” development. That enthusiasm lasted from the early 1970s, when the United Nations created a “Popular Participation Program” (Pearse and Stiefel 1979), to the 1980s spread of “participatory action research” (Reason 2008), to the prominence in the 1990s of the “participatory rural appraisal” (Chambers 1994), to the 2000/2001 World Development Report, which incorporated “Participatory Poverty Assessments” from around the world (Green 2014; World Bank 2001). Alongside the World Bank Participation Sourcebook (World Bank 1996) and a range of other handbooks and sourcebooks and kits, the Participatory Development Toolkit stands out for being an actual kit: a briefcase containing folders that reveal a range of activities, cards, photographs, game pieces, puppets (“flexi-flans”), and, especially, sets of images.

Activity #3, Cards a and b; Flexi-flans, Activity #8, sheets 1 and 2 (Narayan-Parker and Srinivasan 1994).

One can sense in the Participatory Development Toolkit an enthusiasm for inclusiveness, respect, curiosity, and a close-to-the-community style of development; these games, cards, and images are designed to draw people into discussing problems and situations that immediately affect them, to elicit stories and images of the future they would prefer to have, and to debate the solutions to the problems they experience. There are countless versions of a sort of now-and-later game: pictures of unsanitary, impoverished, violent nows, followed by cleaner, wealthier, more humane laters.

It’s not clear how often the kit (itself) was used. The games and images and techniques it contains show up in different settings across decades of attempts to install participatory development in various times and places. Many of the 25 different folders contain activities from Lyra Srinivasan’s SARAR[1] methodology, one of dozens of different packaged methods for engaging people in participation and collective uplift. Others are cribbed directly from the social psychology of Kurt Lewin, who himself inspired a generation of “participatory” research, especially in management (Alden 2012; Lezaun and Calvillo 2014).

Pump Repair Issues. Activity #19 in (Narayan-Parker and Srinivasan 1994, pp. 44-45).

But the simple fact that the kit exists at all is worth dwelling upon. Why was a “toolkit” necessary for an activity called “participatory development” in the 1990s? Who were the tool users, and what might they have done with it? Is the toolkit a device for enticing participation, for improving it, or for something else? What imagination drove its form and function, and can we learn anything from it about today’s attempts to build little development devices, or design humanitarian goods? Can we think of the Participatory Development Toolkit as a precursor to our contemporary attempts to transform development through apps, platforms, algorithms or infrastructures?

The Problem of a Participatory Development Toolkit

At the heart of this kit is a conundrum. The toolkit seeks to “scale up” and spread globally something conceived of as essentially “context specific.” Participatory development, in most of its different guises, has always resisted the idea of a uniform, universal, top-down, one-size-fits-all development. Along with many other critiques of such dreams, participatory development proposes that proper development success should depend on attending to the very specific needs of particular people. Each community, village, neighborhood, council, or agricultural extension district is its own special place, with its own special needs that cannot be simply treated just like the next. Rather, development should involve the residents in diagnosing problems and planning solutions.

A toolkit is a device for decontextualizing: it is filled with tools that can be used in multiple different contexts, tools that are standardized and hardened into a semi-universal state. But the tools are not automatic; a toolkit implies the existence of a skilled tool user as well. A toolkit sits somewhere between an imagination of a context-specific, autonomous, and self-guided development without any facilitation on the one hand; and on the other, the large-scale, universal, automatic spread of one-size-fits-all solutions everywhere. The Participatory Development Toolkit itself reflects exquisite awareness of this problem. The authors take pains to mount warnings at every turn: the kit does not stand alone; the images and games should not be used without adapting them; the kit should not be used to extract information (rather than incite participation); the user of the kit should be prepared to give up control of the kit; the kit, indeed, is not essential (see, for example, Narayan-Parker and Srinivasan 1994:1–5; Srinivasan 1990:12–13).

Activity #3, Charts 2, (Narayan-Parker and Srinivasan 1994).

In between universalism and hyper-specificity sits the kit: mediating by taking what works at a local level, attempting to quasi-formalize it, and inserting it into a briefcase so that it can be carried to the next site to repeat its context-specific success.

“Scalability” of this sort is also at the heart of our contemporary enthusiasm for apps, platforms, and quasi-algorithmic solutions to the problems of development. The large-scale “big development” projects of mid-century, where scale often meant simply “large,” used “economies of scale” to attain a certain economization or efficiency as a project grew larger; conversely, the “small-is-beautiful” technology solutions of the 1970s counseled a return to the local, the situated, and the appropriate. But contemporary scalability sees in the small a mere instance of the large: a solution at the small scale (e.g., a LifeStraw for dirty water; Redfield 2016) can be “scaled up” and distributed globally. It is small and large at the same time. Some kinds of “tools” are scalable in this sense (software and algorithms preeminent among them), and others, perhaps, are not (dams and bush pumps).

The Participatory Development Toolkit tries to accomplish something similar: it takes a program for participation developed in response to specific cases, generalizes it, and spreads it to other sites and cases. It is “quasi-algorithmic” in the sense that it involves a set of steps in a sort of recipe, but it also relies on the existence of both a skilled tool user (the facilitator of participation, usually a development professional of some kind) and a defined group of participants (women, members of a village, a congress of delegates, extension workers, etc.). Such collectives are called into being just at the moment when the kit is in use. This process produces an experience called “participation.”

To put this contemporary problem in perspective, it is important to emphasize that there have over the decades been plenty of examples of “experiments with participation” not only in development, but also in art, in science and technology policy, in urban planning, or in the workplace (Kelty 2017; Lezaun et al. 2016). It is worth turning to the history of participatory development to understand better what these past experiments sought to achieve.

The Participation That Was

Participatory development has failed at least once already. This is perhaps not obvious to a generation of development workers or scholars discovering participation for the first time in the 2010s. In the 1970s, both small, alternative groups (such as Budd Hall and the Participatory Development Network) and large organizations such as the United Nations Popular Participation Program embraced an earlier version of participatory development with enthusiasm. And as it spread and succeeded from the 1970s to the 1990s, it came in for its own critique: by the year 2001, participatory development was being called “a new tyranny” (Cooke and Kothari 2001). The book bearing that subtitle suggested that many things had gone wrong with participation: that it had been bureaucratically routinized; that recipients were gaming the system to become “professional participants”; that it rested on a myth of community or village structure inadequate to most places, and to the realities of globalization; and so on. Perhaps most important, it wasn’t clear that participatory development alleviated poverty any better than non-participatory development had.

There was also a clear sense, captured best in Francis Cleaver’s critique, that true participation had been betrayed by toolkits in general:

“Participation” in development activities has been translated into a managerial exercise based on “toolboxes” of procedures and techniques. It has been turned away from its radical roots: we now talk of problem solving through participation rather than problematization, critical engagement and class (Cooke and Kothari 2001:53).

Unserialized Posters; “Fourteen pictures showing various human situations and interactions.” Activity #7 in (Narayan-Parker and Srinivasan 1994, pp. 20-21)

What were these radical roots, and how did they grow into a Participatory Development Toolkit? There are multiple interesting origin points for the Participatory Development Toolkit. The “radical” that Cleaver is no doubt thinking of is the work of Paulo Freire and, more generally, of “participatory action research” from the early 1970s onward (Freire 2014; Reason and Bradbury 2001). The idea that toolkit makers might try to roll up Paulo Freire and tuck him inside a kit is perhaps surprising, but actually quite obvious if one reads his work carefully. Freire’s ideas of “conscientização” dictated not just a participatory engagement with the impoverished subject, but in particular the use of imagery, games, and specific forms of contextualization. The instructions for using these images in the Participatory Development Toolkit parallel Freire’s own discussion of them in Pedagogy of the Oppressed (2014): they must be “non-directive” (i.e., not “sectarian”) and they must rely entirely on the perceptions (and “perceptions of previous perceptions”) of the “wretched of the earth” themselves. Many of the activities of the kit are directed toward instilling first an understanding of this “non-directive” form of analytical work, to be followed only later by substantive discussion of pumps, latrines, disease, and so on. Once inside the kit, however, Freire’s radical, Marxist pedagogy runs the risk of appearing lightweight and inauthentic, transformed into an exercise in “project management” ripe for critique.

Stuffed inside the kit alongside Freire is Robert Chambers, the development scholar and practitioner most often associated with the rise of participatory development in the 1980s. Chambers started life as a colonial administrator in Kenya, and it was only late in the 1980s that he began to embrace participation as a technique (Cornwall and Scoones 2011). He came to it not as Freire did, as a liberation of the wretched of the earth, but primarily as a question of ascetic practice, which is to say it was less about the participation of the impoverished villager than a form of work on the self for the development professional. Chambers was primarily concerned with “seeing reality” clearly in the hopes of transforming poverty, and he insisted that most of what development professionals did obscured reality: they engaged in “rural development tourism,” they suffered from “tarmac blindness” and “survey slavery” (Chambers 1983). They needed to be given the tools to see what was right in front of them, and to this end, Chambers advocated the flexible use of multiple different methods.

Activity #6, Diagrams 1–3, (Narayan-Parker and Srinivasan 1994)

To address this problem, Chambers pioneered a kind of “method of any method,” by which development workers could transform the simplest of techniques, like walking around and talking with people, into legitimate tools in a toolkit. Interviews, transect walks, pocket charts, ethnographic observation, and much more were lumped together and labeled “participatory rural appraisal.” The approach is clear in the Participatory Development Toolkit: there are 25 folders with different games and activities, each appropriate to a different challenge. There are also explicit directions, much like those that Chambers issued in everything he wrote, to “improvise” and adjust activities to the context and the site in question, to extend the kit and add to it, and, especially, to do so with the participation of those at the receiving end of development’s interventions.

Chambers’ approach implied that any such toolkit required a skilled tool user, and to become such a person, one had to work on oneself, develop new capacities, overcome blindness, see reality clearly, and so on. Only such transformed development workers would be able to effectively take this kit to the field to elicit the kind of participation promised by the likes of Freire (whom he recognized but did not claim as an inspiration). Despite the step-by-step nature of the toolkit (or any of the sourcebooks, scripts, or manuals promulgated as “participatory development”), the quasi-algorithm required a bit of human input: not just any human input, but that of self-reflective, awakened experts.

From the perspective of a later critic such as Cleaver, the toolkit is a proxy for the rigid, hierarchical, male engineer who sees a standard, technological solution to every problem. Participatory development—radical or not—is directly opposed to such powerful, unaccountable forms of decision-making. To the extent that the figure of the Engineer is the tool user, the toolkit is dangerous.

From Chambers’ perspective, however, the enlightened user of the toolkit can achieve a different outcome; tools are figured as neutral and emancipating when given to the right people by the right people, and the result would be the scalable development of both the professional agent and the impoverished subject of development.

Device, Toolkit, Algorithm

What is at stake in thinking of the Participatory Development Toolkit as a “quasi-algorithm”? What might be the difference between a briefcase of paper games and routines for eliciting participation, and a piece of software that tries to do something similar, but is implemented as a solar-powered, GPS-enabled, data-intensive smartphone app? Can we see this kit as a vantage point from which to evaluate the contemporary explosion of various devices for development, especially those that demand the input of users concerning local conditions while using standard forms and algorithmic procedures to scale up and travel?

One obvious thing to say about the Participatory Development Toolkit is that it does not contain tools or supplies of a conventional kind. There are no hammers, pliers, or wrenches; there are no Band-Aids, gauze, or Bactine as there would be in a first aid kit; it is not quite the “kit” pioneered by Médecins Sans Frontières capable of unfolding an emergency treatment center in a remote or decimated location (Redfield 2013:69ff). Instead, it contains scripts, games, and procedures designed to elicit experiences. When opened and set into operation, it tries to create a joyful occurrence: people are called to draw pictures, make maps, play a game, or discuss a problem related to their immediate life experience and surroundings. In this respect, its “devices” are similar to what Soneryd and Lezaun call “technologies of elicitation,” or what Caroline Lee refers to as “do-it-yourself” or “designer” democracy; they are procedures and practices of convoking individuals to elicit debate, deliberation, opinion, or decision-making (Lee 2014; Lezaun and Soneryd 2007).

The toolkit is not, however, immaterial as a result. The material properties of the Participatory Development Toolkit are important; it is meant to travel, it has a handle, and it carries both its theory and its practice in easily accessible compartments and a handy users’ manual. The toolkit is not a device itself, but more like a “platform”: a box full of different devices all dependent on a similar form of action and general theory of participation. These devices are not technologically sophisticated, but neither, really, are most apps or software programs. They may depend on a technologically sophisticated infrastructure (to exist), but at the end of the day they are simple programs: devices designed to achieve particular results.

What is the relation between the participating humans and the toolkit? In the toolkit, the games and images and scripts call on people to interact in specific ways. The development agents, along with those they interact with (villagers, women, engineers, farmers, politicians), are given rules, or shown images, or follow loose scripts for “non-directive” interaction with each other. The goal, or outcome, is to either diagnose a problem or propose solutions to it. It does not solve a problem diagnosed elsewhere, higher up or far away, without the involvement of people, but presumes instead that the diagnosis of the problem itself has yet to happen, or that the proposed solutions must come from the context-specific encounter itself.

This is the origin of its power: it promises a highly context-dependent exploration of problems specific to those who meet and engage in the production of these experiences. This is why it enrolls people into its project. The conundrum comes from the fact that the devices for eliciting such experiences are (perhaps unwillingly) universalized in the toolkit, made to travel. Whereas an individual development consultant might bring a set of techniques and procedures with her to a variety of (necessarily limited) places, the Participatory Development Toolkit implicitly suggests that through replication, many more people can carry these procedures to many more places.

What’s more, it is not merely a toolkit-as-commodity being replicated; it is also a toolkit funded by and branded with the insignia of the World Bank and the UNDP. These institutions make participation more or less bureaucratic, and authorize these techniques as forms of practice. It is not clear that the Participatory Development Toolkit was required in any way, but along with manuals such as the World Bank Participation Sourcebook (World Bank 1996), the techniques and procedures were incorporated into the standardized practices of development. One can find the same games and scripts in the Sourcebook that appear in the Participatory Development Toolkit.

The institutional standardization of participation is what provokes suspicion of the toolkit itself, as in Francis Cleaver’s critique above; rather than a highly contextualized participatory engagement, it suggests instead a bureaucratically standardized set of forms and practices, riven from context. Soneryd makes a similar point in discussing more recent “technologies of participation”: this standardization is no accident, since many actors in these organizations actively seek to “imitate and replicate” forms of participation that have worked elsewhere (Soneryd 2016:149).

Such institutional embedding (to use the new institutionalist language) is not dissimilar to the kind of infrastructural “network effects” (to use the engineering/economic language) of internet-based apps and platforms that similarly circulate plans, techniques, and procedures in the interest of producing an experience. As the kit succeeds, it draws more people into a particular form of participation, and produces professionals and networks of practice that draw on these tools as exemplary forms of participation. Both aim at scaling up and circulating the local without losing (the character of that) local specificity. But such tools are inevitably subject to both technological and institutional mimicry, standardization, and control, whether that be an audit culture of measuring results or an advertising-dependent system of revenue generation.

The Participatory Development Toolkit represents a stage in this evolution. It is “quasi-algorithmic” but not fully routine in the sense that it does not operate automatically, in the absence of context, judgment, or serendipity. Nor is it “computational” in any sense. Rather, a development agent takes the place of the networked computer: he or she runs the program (as a neutral agent: a CPU, as it were) and records the data into memory. The users of the algorithm are the participants: villagers, women, extension agents, etc. They give their data and ideas to the machine in the hopes that it will spit out a solution and perhaps some money.

The term “algorithm” used to mean a set of rules, not unlike a recipe, or the rules of a game. In this respect, the operator is like a player or a chef: some are good and some are bad. Robert Chambers’ desire to see development agents remake themselves as agents of participation relies on such a notion: you can have the best recipes in the world, and still produce a bad meal.

Open Ended Snakes and Ladders; Activity #24 in (Narayan-Parker and Srinivasan 1994, pp. 54–55).

Lately, however, the “algorithm” has come to mean something more than just a set of steps. Rather, it is a kind of living system that depends both on computational processing of recipe-like rules, and on the constant input of many participants: participants who feed it regularly, not just use it. The Facebook timeline, to take only the most storied case, depends both on a large set of rules of searching, sorting, and comparing possible content, and on an always-changing database of what people who are connected to other people view, like, linger upon, or swipe past. This is not the same thing as a simple set of rules that depend on expert execution; rather, it seems to enable a certain fantasy of—and provoke a certain desire for—participating in an enormous, amorphous, yet nevertheless intimate collective that represents itself to itself constantly.

In its ideal version, this happens completely without human control or intervention, making the local into a universal. In reality, such “automation” reproduces the good and the bad of the local (as Facebook, Twitter, and others are discovering in the case of the 2016 U.S. election), and a reversion to the former meaning of algorithm becomes more appealing again.

Seen from this perspective, the Participatory Development Toolkit is an interesting moment in the development of devices for development. It is perhaps more like the algorithm-as-recipe in its quaint leather-bound form, but perhaps it also betrays a desire for the newer algorithm-as-system in which, all over the world, people are enabled to participate constantly in the diagnosis and solution of their own problems. Or maybe it should be seen in light of the success of the contemporary demand for constant, unreflective participation of the sort promoted by social media. Perhaps it reveals a now nearly forgotten desire for scaling up something difficult to scale up: the reflexive practitioner whose “algorithm” is human judgment, memory, and discernment, and not an automatic, machine-learning, artificial intelligence. Perhaps it reveals a present danger of an endless participation without deliberation, whereas the analog briefcase could still, at least, contain a trace of the reflexive practitioner, the Marxist pedagogue, or the evangelical development aesthete.

Christopher Kelty is professor at the University of California, Los Angeles.

References

Alden, Jenna. 2012. “Bottom-Up Management: Participative Philosophy and Humanistic Psychology in American Organizational Culture, 1930-1970” [PhD dissertation]. Columbia University Academic Commons. New York, NY: Columbia University.

Chambers, Robert. 1983. Rural Development: Putting the Last First. World Development Series. Newark, NJ: Prentice Hall.

———. 1994. “The Origins and Practice of Participatory Rural Appraisal.” World Development 22(7):953–969.

Cooke, Bill, and Uma Kothari. 2001. Participation: The New Tyranny? London, UK: Zed Books.

Cornwall, Andrea, and Ian Scoones. 2011. Revolutionizing Development: Reflections on the Work of Robert Chambers. Oxon; New York: Earthscan.

Freire, Paulo. 2014. Pedagogy of the Oppressed: 30th Anniversary Edition. Originally published 1970. London, UK: Bloomsbury Publishing.

Green, Maia. 2014. The Development State: Aid, Culture & Civil Society in Tanzania. Suffolk, U.K.: James Currey.

Kelty, Christopher M. 2017. “Too Much Democracy in All the Wrong Places: Toward a Grammar of Participation.” Current Anthropology 58(S15):S77–S90.

Lee, Caroline. 2014. Do-It-Yourself Democracy: The Rise of the Public Engagement Industry. New York, NY: Oxford University Press.

Lezaun, Javier, and Nerea Calvillo. 2014. “In the Political Laboratory: Kurt Lewin’s Atmospheres.” Journal of Cultural Economy 7(4):434–457.

Lezaun, Javier, Noortje Marres, and Manuel Tironi. 2016. “Experiments in Participation.” In Handbook of Science and Technology Studies, edited by Ulrike Felt, Rayvon Fouché, Clark A. Miller, and Laurel Smith-Doerr, pp. 195–222. Cambridge, MA: MIT Press.

Lezaun, Javier, and Linda Soneryd. 2007. “Consulting Citizens: Technologies of Elicitation and the Mobility of Publics.” Public Understanding of Science 16(3):279–297.

Narayan-Parker, Deepa, and Lyra Srinivasan. 1994. Participatory Development Tool Kit: Materials to Facilitate Community Empowerment. Compiled by Deepa Narayan-Parker and Lyra Srinivasan. 221 activity cards, 65 pictures, 11 charts, 1 guidebook in briefcase; 26 × 33 × 10 cm. Washington, DC: World Bank.

Pearse, Andrew, and Matthias Stiefel. 1979. “Inquiry into Participation: A Research Approach.” Technical report UNRISD/79/C.14 GE.79-2103. Geneva, Switzerland: United Nations Research Institute for Social Development.

Reason, Peter. 2008. The SAGE Handbook of Action Research: Participative Inquiry and Practice. Los Angeles, CA, and London, UK: SAGE.

Reason, Peter, and Hilary Bradbury. 2001. Handbook of Action Research: Participative Inquiry and Practice. London, UK: Sage.

Redfield, Peter. 2013. Life in Crisis: The Ethical Journey of Doctors Without Borders. Berkeley, CA: University of California Press.

———. 2016. “Fluid Technologies: The Bush Pump, the LifeStraw®, and Microworlds of Humanitarian Design.” Social Studies of Science 46(2):159–183.

Sawyer, Ron. 2011. “SARAR: A Methodology by Lyra Srinivasan.” Culture Unplugged. Available at online/play/12780/SARAR–a-methodology-by-Lyra-Srinivasan.

Soneryd, Linda. 2016. “Technologies of Participation and the Making of Technologized Futures.” In Remaking Participation: Science, Environment and Emergent Publics, edited by Jason Chilvers and Matthew Kearnes, pp. 144–161. London, UK, and New York, NY: Routledge.

Srinivasan, Lyra. 1990. Tools for Community Participation: A Manual for Training Trainers in Participatory Techniques. New York, NY: PROWWESS/United Nations Development Program.

World Bank. 1996. The World Bank Participation Sourcebook. Washington, DC: World Bank.

World Bank. 2001. World Development Report 2000/2001: Attacking Poverty. New York, NY: World Bank.


Notes

[1] SARAR is an acronym for “Self esteem, Associative strength, Resourcefulness, Action Planning, Responsibility.” For more on SARAR, see Sawyer’s documentary (2011).

The Firesign Theatre’s Wax Poetics: Overdub, Dissonance, and Narrative in the Age of Nixon


The Firesign Theatre are the only group that can claim among their devoted fans both Thom Yorke and John Ashbery; who have an album in the National Recording Registry at the Library of Congress and also coined a phrase now used as a slogan by freeform giant WFMU; and whose albums were widely distributed by tape among U.S. soldiers in Vietnam, and then sampled by the most selective classic hip hop DJs, from Steinski and DJ Premier to J Dilla and Madlib.

Formed in 1966, they began their career improvising on Los Angeles’s Pacifica station KPFK, and went on to work in numerous media formats over their four-decade career. They are best known for a series of nine albums made for Columbia Records, records that remain unparalleled for their density, complexity, and sonic range. Realizing in an astonishing way the implications of the long playing record and the multi-track recording studio, the Firesign Theatre’s Columbia albums offer unusually fertile ground for bringing techniques of literary analysis to bear upon the fields of sound and media studies (and vice versa). This is a strategy that aims to reveal the forms of political consciousness that crafted the records, as well as the politics of the once-common listening practices binding together the disparate audiences I have just named. It is no accident that the associative and referential politics of the sample in “golden age” hip hop would have recognized a similar politics of reference and association in Firesign Theatre’s sound work, in particular in the group’s pioneering use of language, time, and space.


The Firesign Theatre (wall of cables): John Rose, Image courtesy of author

The Firesign Theatre is typically understood as a comedy act from the era of “head music” — elaborate album-oriented sounds that solicited concerted, often collective and repeated, listening typically under the influence of drugs. But it may be better to understand their work as attempting to devise a future for literary writing that would be unbound from the printed page and engaged with the emergent recording technologies of the day. In this way, they may have crafted a practice more radical, but less recognizable, than that of poets —such as Allen Ginsberg or David Antin, both of whose work Firesign read on the air — who were also experimenting with writing on tape during these years (see Michael Davidson’s Ghostlier Demarcations: Modern Poetry and the Material Word, in particular 196-224). Because their work circulated almost exclusively on vinyl (secondarily on tape), it encouraged a kind of reading (in the strictest sense) with the ears; the fact that their work was distributed through the networks of popular music may also have implications for the way we understand past communities of music listeners as well.

The period of Firesign’s contract (1967-1975) with the world’s largest record company parallels exactly the recording industry’s relocation from New York to Los Angeles, the development of multitrack studios which made the overdub the dominant technique for recording pop music, and the rise of the LP as a medium in its own right, a format that rewarded, and in Firesign’s case required, repeated listening. These were all factors the Firesign Theatre uniquely exploited. Giving attention to the musicality of the group’s work, Jacob Smith has shown (in an excellent short discussion in Spoken Word: Postwar American Phonograph Cultures that is to date the only academic study of Firesign) how the group’s attention to the expansion of television, and in particular the new practice of channel-surfing, provided both a thematic and a formal focus for the group’s work: “Firesign […] uses channel surfing as the sonic equivalent of parallel editing, a kind of horizontal or melodic layering in which different themes are woven in and out of prominence until they finally merge. Firesign also adds vertical layers to the narrative in a manner analogous to musical harmony or multiple planes of cinematic superimposition” (181). But more remains to be said not only about the effect of the Firesign Theatre’s work, but about its carefully wrought semantics, in particular the way the “horizontal” and “vertical” layers that Smith identifies were used as ways of revealing the mutually implicated regimes of politics, culture, and media in the Vietnam era — at the very moment when the explosion of those media was otherwise working to disassociate those fields.

The group’s third album, Don’t Crush That Dwarf, Hand Me the Pliers is typically understood as their first extended meditation on the cultural phenomenology of television. Throughout the record, though there is much else going on, two pastiches of 1950s genre movies (High School Madness and a war film called Parallel Hell!) stream intermittently, as if through a single channel-surfing television set. The films coincide in two superimposed courtroom scenes that include all the principal characters from both films. By interpenetrating the school and the war, the record names without naming the killing of four students at Kent State and two students at Jackson State University, two events that occurred eleven days apart in May 1970 while the group was writing and recording in Los Angeles. Until this point rationalized by the framing fiction of a principal character watching both films on television, the interpenetration of the narratives is resolvable within the album’s diegesis—the master plot that accounts for and rationalizes every discrete gesture and event—only as a representation of that character’s having fallen asleep and dreaming the films together, a narrative sleight of hand that would testify to the group’s comprehension of literary modernism and the avant-garde.

The question of what may “cause” the interpenetration of the films is of interest, but the Firesign Theatre did not always require justification to elicit the most outrageous representational shifts of space (as well as of medium and persona). What is of more interest is the way rationalized space — the space implied by the “audioposition” of classic radio drama, as theorized by Neil Verma in Theater of the Mind— could be de-emphasized or even abandoned in favor of what might instead be called analytic space, an aural fiction in which the institutions of war and school can be understood as simultaneous and coterminous, and which more broadly represents the political corruptions of the Nixon administration by means of formal and generic corruption that is the hallmark of the Firesign Theatre’s approach to media (35-38).

While the techniques that produce this analytic soundscape bear some resemblance to what Verma terms the “kaleidosonic style” pioneered by radio producer Norman Corwin in the 1940s — in which the listener is moved “from place to place, experiencing shallow scenes as if from a series of fixed apertures” — even this very brief sketch indicates how radically the Firesign Theatre explored, deepened, and multiplied Corwin’s techniques in order to stage a more politically diagnostic and implicative mode of cultural interpretation. Firesign’s spaces, which are often of great depth, are rarely traversed arbitrarily; they are more typically experienced either in a relatively seamless flow (perspective and location shifting by means of an associative, critical or analytical, logic that the listener may discover), or are instead subsumed within regimes of media (a radio broadcast within a feature film which is broadcast on a television that is being watched by the primary character on the record album to which you are listening). According to either strategy the medium may be understood to be the message, but that message is one whose horizon is as critical as it is aesthetic.


Firesign Theatre (pickup truck): John Rose, Image courtesy of author

The creation of what I am terming an analytic space was directly abetted by the technological advancement of recording studios, which underwent a period of profound transformation during the years of Firesign’s Columbia contract, a period that spanned from the year of the Beatles’ Sgt. Pepper’s Lonely Hearts Club Band (arguably the world’s first concept album, recorded on four tracks) to that of Pink Floyd’s Wish You Were Here (arguably that band’s fourth concept album, recorded on 24 tracks). Pop music had for years availed itself of the possibilities of recording vocals and solos separately, or doubly, but the dominant convention was for such recordings to support the imagined conceit of a song being performed live. As studios’ technological advances increased the possibilities for multitracking, overdubbing, and mixing, pop recordings such as Sgt. Pepper and the Beach Boys’ Pet Sounds (1966) became more self-evidently untethered from the event of a live performance, actual or simulated. In the place of the long-dominant conceit of a recording’s indexical relation to a particular moment in time, pop music after the late 1960s came increasingly to define and inhabit new conceptions of space, and especially time. Thus, when in 1970 Robert Christgau asserted that the Firesign Theatre “uses the recording studio at least as brilliantly as any rock group” (and awarded a very rare A+), he was remarking on the degree to which distortions and experiments with time and space were if anything more radically available to narrative forms than they were to music.

The overdub made possible much more than the simple multiplication and manipulation of aural elements; it also added depth and richness to the soundfield. New possibilities of mixing, layering, and editing also revealed that the narrative representation of time, as well as the spatial elements I’ve just described, could be substantially reworked and given thematic meaning. In one knowing example, on 1969’s How Can You Be in Two Places at Once When You’re Not Anywhere at All, an accident with a time machine results in the duplication of each of the narrative’s major characters, who then fight or drink with each other.

This crisis of the unities is only averted when a pastiche of Franklin Delano Roosevelt interrupts the record’s fictional broadcast, announcing the bombing of Pearl Harbor, and his decision to surrender to Japan. On a record released the year the United States began secret bombing in Cambodia, it is not only the phenomenological, but also the social and political, implications of this kind of technologically mediated writing that are striking: the overdub enables the formal representation of “duplicity” itself, with the gesture of surrender ironically but pointedly offered as the resolution to the present crisis in Southeast Asia.

To take seriously the Firesign Theatre’s experiments with medium, sound, and language may be a way of reviving techniques of writing — as well as recording, and of listening — that have surprisingly eroded, even as technological advances (cheaper microphones, modeling software, and programs from Audacity and Garage Band to Pro Tools and Ableton Live) have taken the conditions of production out of the exclusive purview of the major recording studios. In two recent essays in RadioDoc Review called “The Arts of Amnesia: The Case for Audio Drama Part One” and “Part Two,” Verma has surveyed the recent proliferation of audio drama in the field of podcasting, and urged artists to explore more deeply the practices and traditions of the past, fearing that contemporary aversion to “radio drama” risks “fall[ing] into a determinism that misses cross-fertilization and common experiment” (Part Two, 4). Meanwhile, Chris Hoff and Sam Harnett’s live performances from their excellent World According to Sound podcast are newly instantiating a form of collective and immersive listening that bears a resemblance to the practices that were dominant among Firesign Theatre listeners in the 1960s and 70s; this fall they are hosting listening events for Firesign records in San Francisco.


The Firesign Theatre (mixing board): Bob & Robin Preston, Image courtesy of author

It is tempting to hope for a wider range of experimentation in the field of audio in the decade to come, one that either critically exploits or supersedes the hegemony of individualized listening emblematized by podcast apps and noise-cancelling headphones. But if the audio field instead remains governed by information-oriented podcasts, leavened by a subfield of relatively classical dramas like the very good first season of Homecoming, a return to the Firesign Theatre’s work can have methodological, historical, and theoretical value because it could help reveal how the experience of recorded sound had an altogether different political inflection in an earlier era. Thinking back to the remarkably heterogeneous set of Firesign Theatre fans with which I began, it is hard not to observe that the dominant era of the sample in hip hop is one in which it was not the Walkman but the jambox — with its politics of contesting a shared social space through collective listening — that was the primary apparatus of playback. However unwished-for, this determinist line of technological thinking would clarify the way media audiences are successively composed and decomposed, and show more clearly how, to use Nick Couldry’s words in “Liveness, ‘Reality,’ and the Mediated Habitus from Television to the Mobile Phone,” “the ‘habitus’ of contemporary societies is being transformed by mediation itself” (358).

Featured Image: The Firesign Theatre (ice cream baggage claim): John Rose, courtesy of author.

Jeremy Braddock is Associate Professor of English at Cornell University, where he specializes in the production and reception of modernist literature, media, and culture from the 1910s through the long twentieth century. His scholarship has examined the collective and institutional forms of twentieth-century authorship that are obscured by the romanticized figure of the individual artist. His book Collecting as Modernist Practice — a study of anthologies, archives, and private art collections — won the 2013 Modernist Studies Association book prize. Recent publications include a short essay considering the literary education of Robert Christgau and Greil Marcus and an essay on the Harlem reception of James Joyce’s Ulysses. He is currently working on a book on the Firesign Theatre.

REWIND! . . .If you liked this post, you may also dig:


“Radio’s ‘Oblong Blur’: Notes on the Corwinesque”–Neil Verma

The New Wave: On Radio Arts in the UK–Magz Hall

This is Your Body on the Velvet Underground–Jacob Smith

Demanding Mobile Health

In 2013, MOS@N, an experimental mobile health (mHealth) network providing medical monitoring and follow-up of pregnant women, was launched in the health district of Nouna in rural Burkina Faso. MOS@N is implemented by the Centre de Recherche en Santé de Nouna (CRSN), a national health research center. It is funded by Canada’s International Development Research Centre (IDRC) and supported by the Ministry of Health. MOS@N operates in an area where maternal mortality remains a major public health challenge, and where the rates of antenatal care consultation (ANC) attendance and of assisted delivery are relatively low. It aims to pilot the use of mobile devices to improve the use of health care services by pregnant women. MOS@N sends voice medical appointment reminders and health advice to “godmothers,” community relays selected as part of the project to follow up with pregnant women in their respective villages. To do so, godmothers were provided with a mobile phone and a bicycle to facilitate their movement within the village as they travel to the local primary health care center (PHC). The cell phone has prerecorded health education messages for godmothers to play when convening maternal health awareness sessions. Equipped with phones and data connectivity, godmothers can reach remote populations to provide them with health advice and information.

MOS@N also includes an electronic health record system that runs on computers installed for that purpose at local PHCs. Since none of the local PHCs are connected to the electricity grid, they were also provided solar panels to keep the computers running. Health workers—nurses and midwives—at PHCs are in charge of entering patient data into the system, which then automatically generates the reminders sent to the godmothers’ phones. In 26 villages, served by five different PHCs, MOS@N brings together pregnant women, godmothers, rural PHCs, health workers, technicians, public health researchers, server rooms, an automatic callback system, bicycles, computers, portable solar panels, batteries, cell phones and refill cards, not to mention husbands, dirt roads, bicycle repair stations, heavy rains, and village authorities, in an experimental network.


Fig. 1. The entrance of the CRSN. Photo by author.

The number of mHealth projects and systems implemented in low- and middle-income countries has doubled in the past five years. Driven by the leadership of the World Health Organization (WHO), global health organizations, researchers, and donors increasingly expect data connectivity to strengthen health systems, reduce costs for access to health care, and thus contribute to health equity. Connectivity promises to bring new bodies and populations into sight, alleviating suffering and saving lives; any obstacle to the flow of information is increasingly framed as a cause of suffering and loss of life.

In Burkina Faso too, mHealth networks are multiplying. Most initiatives are aimed at making the national health system more data driven, with a strong emphasis on maternal and child health. In the wake of the Ebola virus epidemic in West Africa, initiatives aimed at digitizing public health surveillance and outbreak response management have also been on the rise. Organizations involved in the funding, design, and deployment of mHealth in Burkina Faso include the Centre Muraz, Terre des Hommes, WHO, Bill & Melinda Gates Foundation, UNICEF, and Grand Challenges Canada. Notable projects include the Integrated eDiagnostic Approach (IeDA), which uses mHealth devices to provide diagnostic support to health care workers and to collect data made available to public health decision makers. Another example is the integration of the WHO-sponsored Maternal Death Surveillance and Response (MDSR) into the national disease surveillance system, requiring health workers to immediately report cases of maternal death via mobile phones. The Burkina Faso Ministry of Health has been supportive of these developments, and it has recently adopted a nationwide strategy to integrate digital technologies into the national health system.

MOS@N was designed in response to a call for proposals by the IDRC to attract projects that would contribute to building evidence of the impact of digital technologies on health systems. There were three specific expectations:
 

  • Access. Information and communication technologies (ICTs), the call suggests, have the potential to make health systems more equitable through better access to health care and information. IDRC takes up a popular theme within the mHealth literature: mobile devices are expected to strengthen equity by reducing disparities related to cost, distance, and inadequate health infrastructure (Mehl and Labrique 2014:184). They are expected to enable relatively transparent, seamless communication, thus facilitating the provision of health services to previously underserved populations.

  • Operational knowledge. IDRC-supported projects should contribute to bridging the gap between research and implementation. Exemplifying the rising popularity of operational research (or operations research) in global health, projects were expected to generate evidence for decision making by studying the process of implementation itself rather than focusing only on health outcomes. IDRC’s call thus insisted that selected projects should examine how ICTs were being integrated into resource-constrained settings by paying attention to the local usage and adoption of mobile technology. Although IDRC’s call was premised on the notion that connectivity should improve access to health care, it aimed to find out what “works” and what does not in various contexts.

  • Replicability. IDRC’s call directly responded to the lack of evidence in the literature about the scaling of mHealth initiatives into health systems. The vast majority of mHealth interventions are indeed only pilot projects, and remain so. Proposals selected by IDRC were to pay particular attention to the potential for scalability or replicability. As I suggest, MOS@N indeed raises the problem of scale: How can its implementation process be replicated so that connectivity produces similar effects in different settings?

 

Enclosures and Expenditures

MOS@N is trying to facilitate the wireless mobility of data. However, soon after it was launched, it became evident that individuals, devices, data, and information assembled in MOS@N do not move easily. Their circulation is severely hindered, if not altogether immobilized. Obstacles are many and include poor geographical access to PHCs, considering that women often live between 5 and 10 kilometers from the nearest centers; the rainy season, when roads become impassable—sandy, clayey, if not literally flooded; a livelihood depending on women working in the fields, away from home and thus from solar panel chargers, and often from cell phone signals, too; devices that were not as portable as expected, with godmothers carrying their phones in their hands, an obstacle that some have overcome by crafting neck pouches; gender dynamics, with some husbands trying to keep godmothers away from pregnant women, or pregnant women away from the PHC; broken things, including phones, solar panels, bicycles, and computers; unreliable network connectivity; unintelligible voice messages; difficulties using mobile phones and computers; health workers lacking time to enter health data into the computer, compromising its circulation; women who won’t discuss their health status with godmothers or health workers; and conflicts between health workers, godmothers, and pregnant women. A godmother summarizes some of these obstacles:

The other day, when I was accompanying a pregnant woman, we started walking but we did not reach the Bagala PHC on time so she delivered on the road. I called the nurse at the Sikoro PHC [to which the godmother would normally take women of her village, except during the rainy season] to let her know that one of my women had just delivered on the road. I then brought her to Bagala but the nurse refused to see her. She kept asking why she delivered at home. I explained that we really were on the road to come here, and that the umbilical cord was not cut yet and her clothes were soiled with blood. But she reprimanded us, so we left. I gave my phone to her husband who then called the Sikoro nurse to let her know. In the end we brought her back home to cut the umbilical cord ourselves.

The mobility of data, which MOS@N aims to facilitate, in fact still entails the mobility of devices and bodies. And it entails significant expenditure. First, there is physical labor. MOS@N generates displacement, especially for godmothers. Although godmothers were provided with bicycles to facilitate their travels, their role has gradually evolved to include accompanying pregnant women to PHCs. As the story above shows, godmothers now walk and ride along with pregnant women. They also assist health workers during deliveries. This new role was improvised in response to the technical difficulties in generating automatic voice reminders. Indeed, as a result of many of the challenges listed above, godmothers generally do not receive the automatic reminders on time, if at all, as was initially planned. Therefore, they may spend hours on the road every day depending on where they live.


Fig. 2. Adverse Road Conditions. Photo by DAKISSAGA Judion.

Mobility in MOS@N also comes with material and energy expenditure. Batteries, cell phones, and portable solar panels are often recharged, disposed of, and replaced. Bicycles are repaired and replaced. Bandwidth is consumed. Project managers move across the district, not to mention donors, researchers, and other visitors traveling to Nouna. MOS@N also comes with a significant increase in workload for health workers, who have to enter patient data into the computer after each consultation, not to mention the tireless work of MOS@N’s field manager, logisticians, technicians, and supervisors. Improved access to health care and information in MOS@N has little to do with an easy circulation—of devices, godmothers, and messages—enabled by a stable, underlying network infrastructure. MOS@N foregrounds the corporeal and material demands of media mobility. Little devices apparently carry more than their own weight.

Media-Worlds

“Knowledge is like light. Weightless and intangible, it can easily travel the world, enlightening the lives of people everywhere” (World Bank 1999:1). It is with these words that a flagship World Development Report on knowledge for development began, before emphasizing that millions of children die because of their parents’ lack of access to knowledge. In the almost two decades since the report was published, mHealth devices have come, better than any other technology, to embody the medium promising such life-saving access to knowledge. As such, they display a strong capacity to enchant and mobilize affect (Harvey and Knox 2012; Larkin 2013). The affective power of mHealth devices is directly related to their technical qualities, including their compact size and portability. Although the hype surrounding mHealth has lessened in recent years—a situation acknowledged by IDRC’s insistence on the need to determine what “works” and what does not—the underlying vision of mobile devices as fluid, neutral conduits for the flow of information remains largely unchallenged (Duclos 2016).

In contrast to this understanding of media devices, ethnographic material on MOS@N points to a conception of media as messy, unpredictable, and transformational. In MOS@N, media devices not only carry symbols and meanings but actually shape connections and transform who and what is connected. This is partly due to the expenditures that come with failing data connectivity: godmothers still use their phones, but they now also walk with pregnant women to monitor their attendance at antenatal care (ANC) visits. Although we can only speculate about the effects of reliable automatic reminders, what is certain is that MOS@N alters individual and collective existence in Nouna in far-reaching ways.

A primary illustration of this lies in how godmothers are not merely connected to PHCs through mobile devices but in fact have come to think of themselves as intermediaries between PHCs and their communities: “We act as intermediaries between health workers and communities.” “The main effect of the project is that now villagers are not afraid of health workers anymore.” Or, in the words of a nurse speaking of godmothers:

Godmothers are extremely useful because here at the PHC, we do not know people in the community. Because they live in the villages, they have become our mouths and our ears with the population. …I’m a stranger here, but they know everyone. Who else could get them to participate in our activities? Now, all we have to do is call them [the godmothers].

In a sense, godmothers, not mobile phones, are MOS@N’s primary mediating devices. Godmothers, of course, are not passive conduits, and their work of mediation may have unforeseen effects. They spend considerable time with midwives and nurses, gaining practical knowledge and experience, but also experiencing conflicts and performing tasks not designed as part of MOS@N. Godmothers have assisted women who delivered on their way to the PHC, stayed several nights at the PHC, cleaned the PHC, and mobilized local women as part of mass vaccination campaigns. Some have lent their phones to their children to listen to music, sometimes never to see them back in working order. Others have forgotten to deliver messages, delivered them late, or delivered them to the wrong person. Overall, though, godmothers and their phones have been described as a reassuring presence.

MOS@N has also had a significant impact on the organization of community life in Nouna. Cell phone ownership, accompaniment, and health education sessions have brought godmothers considerable social recognition and changed the way they are perceived. Neighbors, family members, and children borrow their phones. Some are called “doctors” or are given small presents. In some cases, however, the role of godmother has come with emotional hardship. Husbands sometimes frown upon their ownership of mobile phones. Resentment from fellow villagers is also common: godmothers can, for instance, be accused of spreading rumors or of deception. The confidential nature of pregnancy, the age of godmothers (in some cases younger than the women they follow), jealousy over the choice of the godmother (and her stipend and equipment), or health complications may all contribute to tension, in one case even leading to the banishment of a godmother and her family from their village. In general, though, when speaking of MOS@N, godmothers express satisfaction, excitement, and deep pride. Being a godmother arguably comes with a new orientation to others and to the world.

The Futures of MOS@N

MOS@N was designed and deployed as a pilot project; at the time of writing, its funding had run out and the project had come to an end. As mentioned earlier, MOS@N had to improve access to health care and information to be considered successful. To a large extent, MOS@N has done just that. In addition to creating unexpected relationships, MOS@N has generated measurable public health outcomes, including significant improvements in antenatal attendance and assisted delivery rates in participating villages. MOS@N was also expected to generate data for global health donors, funders, and policymakers, particularly about how it achieved this outcome and how the process could be scaled. This is where success became harder to measure. How could MOS@N lead to something else? Can it be replicated and, if so, under what conditions?

As discussed earlier, the mobility of data in MOS@N came at a heavy cost, whether in terms of physical labor or of material and energy expenditure. MOS@N depends on a persistent and demanding care for the relations that constitute it as a media-world. Relatedly, MOS@N relied on a high degree of improvisation, or what could be considered an “experimental” ethos. It is important to emphasize that researchers at the CRSN were aware that MOS@N remained largely experimental. Although this was their first mHealth project, they knew, from having worked closely with local communities over the past decade, that the demands placed upon mHealth within these communities might differ from their own. MOS@N involved qualitative research aimed at exploring these demands, examining the project’s effects, and fine-tuning it along the way.

In other words, researchers at the CRSN were aware of what they did not know. And that is precisely the problem: now that they are looking for funding to scale the network, there is still much they do not know. They have gained implementation skills, but can they trust that “more of the same” will lead to future success when the “same” is itself contingent and unpredictable? To what extent does the experimental ethos guiding MOS@N’s implementation lend itself to formalization? What if the way MOS@N succeeded does not suggest easy replicability? After all, MOS@N deflates any expectation that mHealth networks can be extended in a parsimonious, predictable manner. CRSN researchers did pay close attention to the processes that breathe life into MOS@N and make it work, thus at least partially meeting IDRC’s expectation for operational knowledge. But rendering processes visible does not suddenly make them amenable to prediction. The future of MOS@N faces a conundrum critical to the deployment of little development devices in general: only out of fragile, messy connections do consistencies appear to emerge.

Acknowledgments

This paper is the result of a collaboration with the Centre de Recherche en Santé de Nouna (CRSN). I would like to thank Ali Sié, Maurice Yé, Gilles Bibeau, Hamidou Sanou, Moubassira Kagoné and Hélène Sawadogo for introducing me to the CRSN, and for their work, which is critical to this research. I also thank the editors of this collection for their generous comments.

Vincent Duclos is Assistant Professor at Drexel University, with a joint appointment in the Center for Science, Technology & Society and the Department of Global Studies & Modern Languages.

References

Duclos, V. 2016. The map and the territory: An ethnographic study of the low utilisation of a global eHealth network. Journal of Information Technology 31(4):334–346.

Harvey, P., and H. Knox. 2012. The Enchantments of Infrastructure. Mobilities 7(4):521–536.

Larkin, B. 2013. The Politics and Poetics of Infrastructure. Annual Review of Anthropology 42(1):327–343.

Mehl, G., and A. Labrique. 2014. Prioritizing integrated mHealth strategies for universal health coverage. Science 345(6202):1284–1287.

World Bank. 1999. World Development Report 1998/99: Knowledge for Development. New York, NY: Oxford University Press.

Image Credit: Panel from The White Ribbon Alliance for Safe Motherhood Burkina Faso, Section Kadiogo.

Iterate, Experiment, Prototype

Innovation by Design

In 2012, the United Kingdom’s Department for International Development (DFID) launched its Innovation Hub (i-Hub). It thereby followed a trend among development actors who see innovation, broadly defined as generating new ideas that lead to large-scale solutions, as crucial to increasing the impact of their interventions. In the words of Judith Rodin, former long-time president of the Rockefeller Foundation, “Innovation alone will not solve all of the problems facing humanity, but we certainly won’t solve many without it” (2016:6). In contrast to mainstream innovation, which pays no attention to existing inequalities and can thereby exacerbate them, academics and practitioners are increasingly using the term “inclusive innovation” to describe innovation practices that explicitly aim to improve the lives of marginalized groups. Having subsumed terms such as “pro-poor,” “below-the-radar,” “grassroots,” or “frugal innovation,” inclusive innovation refers to “the inclusion within some aspects of innovation of groups who are currently marginalized,” either through products and services developed specifically for them, their incorporation into the innovation process, or support for their own grassroots innovation efforts (Foster and Heeks 2013:335).

This innovation turn is often joined by the embrace of humanitarian design, which is the application of design methods to development ends or—as designers would say—to change what is into what ought to be. Humanitarian design has its historical roots in longstanding alternative design traditions such as universal, ecological, or feminist design. E. F. Schumacher’s Small is Beautiful (1973) and Victor Papanek’s Design for the Real World (1984) called for the use of socially and environmentally responsible technologies and design. Papanek himself designed a 9-cent radio made of a used tin can and powered by wax or animal dung burned underneath it, which was distributed by UNESCO in India and Indonesia. The tin-can radio can be seen as an early humanitarian device and the forerunner of the hand-crank or solar-powered radios now ubiquitous in many places in the Global South. More recently, when Melinda Gates and Paul Farmer were asked what they saw as the innovation that is changing most lives in the developing world, they answered, “human-centered design,” in reference to a particular brand of humanitarian design advocated by IDEO.org (Wired 2013).

IDEO.org is the nonprofit subsidiary of IDEO, a Silicon Valley–based international design consultancy that has been one of the most successful commercial entrants into the humanitarian design space. Its DesignKit website (www.designkit.org) provides popular online courses, field guides, and case studies. Through these free resources and articles in leading magazines such as the Stanford Social Innovation Review, IDEO has played an important role in legitimizing the participation of professional designers in the development enterprise. In that magazine, IDEO Chief Executive Officer Tim Brown and IDEO.org Executive Director Jocelyn Wyatt argue, “Time and again, [development] initiatives falter because they are not based on the client’s or customer’s needs and have never been prototyped to elicit feedback” (Brown and Wyatt 2010:31). By branding its own approach as human-centered design (HCD), IDEO.org makes explicit—and appropriates more visibly than other organizations in this space—the critical centrality of user perspectives in humanitarian design.

Part of the legitimation process of humanitarian design is redefining the development problem as a shortage of creative ideas, flawed system design, and the preconceived notions of development practitioners (Schwittay 2014). In addition, advocates of humanitarian design argue that the complexity and fast-paced nature of today’s development challenges call for innovative, creative, and integrative experts—designers, in short—who are best placed to tackle the problem of persistent poverty. Part of their approach is to redefine common constraints, such as poor people’s inability to pay for necessary services, as “creative springboards,” and poor people’s needs as (commercial) opportunities. This vision of development is based on a conceptualization of the poor as consumers and an individualization of infrastructural problems, both hallmarks of humanitarian goods.

Amplifying Development

In 2014, DFID’s i-Hub contracted IDEO.org to the tune of £10 million to develop and implement its flagship program, Amplify (www.amplify.org). Amplify is a crowdsourcing platform aimed at engaging nontraditional development actors such as designers and other creative entrepreneurs, diaspora communities, technologists, engineers, and the public at large. It also aims to establish stronger connections between these actors and potential users of the development solutions generated via the platform. By definition, these users are constituted as poor to fit DFID’s mandate. Reflecting the hyperbole that often surrounds the uptake of innovation and design in development, Amplify’s business plan marketed the program to DFID senior management as a “platform [that] could galvanise truly transformational and unprecedented innovation by attracting new sources of expertise” and through the use of collaboration (Amplify 2013). Indeed, Jonathan Wong, former head of i-Hub, reports that thanks to Amplify, HCD is diffusing across DFID and other UK government departments (IDEO.org 2016). This is also a process of making visible the informal and unseen practices already happening at the margins of many development organizations, of mainstreaming them, and of showing their potential contributions to DFID’s work.

The Amplify platform has been adapted from the original OpenIDEO platform, which was built to stimulate “open innovation” and uses proprietary software and a Creative Commons license. Calling it an online platform rather than a website highlights such crowdsourcing devices as socio-technical spaces that enable diverse and dispersed groups of people to collaborate on joint projects. Amplify itself consists of eight challenges, with topics ranging from women’s safety in urban areas to improved refugee education to youth empowerment in East Africa and enhanced opportunities for people with disabilities. Each challenge goes through a four-month online process, at the end of which a handful of winners attend a design bootcamp, usually held in Nairobi or Kampala, and receive upwards of £100,000 in DFID funding and IDEO design support to implement their ideas. To date, seven challenges have been completed, which allows us to examine if and how the program’s use of open collaboration and humanitarian design has been changing DFID’s modus operandi. Our analysis is based on more than two years of qualitative research, encompassing in-person and Skype interviews with three DFID managers, five IDEO managers, and 15 participants, predominantly finalists of the first four challenges. We also conducted detailed numerical and discourse analyses of three challenges through online research and examined secondary materials such as policy papers, a business plan, blog posts, YouTube talks, and online meet-ups.

Fig. 2. Participants in the 2016 Amplify Bootcamp in Kampala, Uganda.

Most obviously, the Amplify application process differs significantly from DFID’s traditional Requests for Proposals. The latter ask for precisely defined project descriptions and timelines, budgets, and objectives, all presented in development jargon that from the outset narrows the pool of applicants to those able to comply with these requirements. Instead, Amplify’s more flexible and open-ended process emphasizes learning and iteration. Most participants—theoretically, anybody with an internet connection can set up a short profile and join the platform, although there are clearly structural constraints to participation—post preliminary ideas on the Web 2.0-style website. A number of free-form text boxes let participants answer open-ended prompts such as “Explain your idea,” “Who benefits?” and “How is your idea unique?” In addition, dropdown menus capture more precise information about the participants themselves: for example, years of experience in the country for which the idea is proposed, expertise in the sector, and the size of their operating budget. Although written text, which has to be in English, dominates the submissions, participants are also encouraged to embrace more visual language by posting photos and short videos. All shortlisted participants have to provide a user-experience map charting how potential users of their idea would participate in the proposed project. Initial submissions can easily be changed using a simple editing function in response to questions and comments from other participants and IDEO.org managers, which are displayed in a comment section.

Fig. 3. User experience map created by Food for Education, a winner of the fourth challenge.

This online process is structured according to the HCD sequence of Ideas–Feedback–Improvement, which asks that participants show how comments and other feedback are shaping the evolution of their ideas. Although this structure imposes its own logic, it is miles away from the conventional log frame (logical framework) used by many development organizations. Whereas log frames demand that information be presented in boxes, organized by technical terms such as inputs, outputs, and outcomes, the design process operates through freeform, flexible thinking and writing to produce more open-ended submissions. Several participants we interviewed welcomed this move toward a more realistic process for formulating and implementing development projects, which one participant described as “learning rather than proofing.” Such learning also takes place at the level of the program, as Amplify itself has been conceived as one large prototype in which continuous adjustments are made from one challenge to the next.

Amplify’s aim has always been to fund small organizations that DFID, which channels most of its aid through large international nongovernmental organizations (NGOs) or consultancies, is not usually able to support. An examination of the 30 winners of the first five challenges shows that just over half are nonprofit/NGO-type organizations, eight are social enterprises, three are professionals, and another three are design groups. A group of New York University (NYU) design students won the first challenge with a project developed in collaboration with a Nepalese NGO they had met on the website; this was celebrated in a newspaper article as exactly what Amplify was about (Leach 2015). However, an IDEO manager complained to us that the program should instead be supporting Kenyan design students and connecting them to Kenyan social enterprises. He was objecting not so much to the fact that funds went to the United States rather than to target countries (most of the money was channelled via NYU to the Nepalese organization) as to the fact that the U.S. students did not know much about what was going on in Nepal. It is through such internal debates that small, community-based organizations have emerged as the “sweet spot” targeted by Amplify, showing how inclusive innovation and open collaborative practices can generate their own politics of exclusion. In the fourth challenge, a Kenya-based design group called KDI did win, with a project to work with Nairobi slum dwellers to redesign open spaces to prevent flooding.

Fig. 4. Page from KDI’s pilot concept plan submitted to IDEO.

Small Experiments

i-Hub’s first head has described Amplify as “less like Encyclopaedia Britannica and more like Wikipedia” (Wong 2016:125). For him, development knowhow is no longer created in traditional centers of power, which instead “curate” knowledge production in multiple locations around the globe. However, the above description of what is new about the program also reveals some of its continuities with more conventional development regimes. Most important, Amplify expertise remains firmly situated in the Global North, with IDEO.org designers in New York and San Francisco, DFID managers in London, and unnamed subject experts who ultimately chose the winning ideas. This replication of authoritative development geographies can also be seen in the location of the finalists. Although all of the winners of the first five challenges are based in the Global South, 24 of 30 have a connection to the Global North, with 19 to the United States. Close to 50% of all winning projects are located in Kenya, despite the fact that 27 of DFID’s priority countries are eligible for Amplify support. The sixth challenge focused on four countries in East Africa only, narrowing down Amplify’s professed diversity to long-established sites of UK development interventions.

Amplify is also subject to a tension at the heart of the design endeavor. There is an insistence that everyone is a designer because design is a fundamentally creative, human activity, which sits awkwardly with professional designers’ claims that they possess the right qualifications, skills, and methods to solve the world’s problems. Although everybody who logs onto the Amplify website has to participate in its designer-y process, it is IDEO.org employees who are the program’s experts. Such an appropriation of expertise resonates with condemnations of humanitarian design as “soft cultural imperialism” operating through neoliberal narratives about poverty and the use of techno-scientific market devices to solve it (Johnson 2011:463). However, one characteristic of humanitarian design is precisely the ever-closer entanglement of markets and morals (Redfield 2016). In addition, it is Silicon Valley’s techno-utopian and libertarian values that shape Amplify’s operations and thereby seep into broader international development efforts.

Accordingly, Amplify’s business plan celebrates its “start small, test, and fail early” mentality (Amplify 2013). Continuous experimentation is encouraged throughout the online process and especially among finalists. In our interviews, some participants recognized the challenges that such an experimental logic can present in the complex world of development, where vulnerable livelihoods leave little room for creative destruction. There is also a qualitative difference between a beta version of the latest geolocation app failing in tests with a consumer focus group and the discontinuation of a project that was providing important community services. In one case, Amplify managers urged a finalist, against the finalist’s own judgment that it would not be economically sustainable in the long run, to introduce a new employment skills project requiring substantial upfront investment in equipment. Having to abandon the project after the three-month period supported by a small Amplify grant resulted in confusion and disappointment among its users. The organization’s own frustration echoes critiques that humanitarian designers, in their search for marketable innovations, sometimes do not pay enough attention to financial sustainability and organizational cultures (Mulgan 2014).

Above all, Amplify operates as a micro-social generator, incubating interventions at a micro scale. It combines the qualities of a social networking platform (like Facebook or LinkedIn) with the features of an openly editable content platform that works through the small contributions of numerous individuals and groups and is overseen by a group of administrators (like Wikipedia). By design, it supports small organizations producing local solutions that improve the lives of individuals, their families, and their neighborhoods. On the one hand, this results from the nature of design. DFID managers recognized in our interviews that design can only tackle more technical problems, and Amplify was always aimed at particular locales rather than universal coverage. This is not to say that it does not have global ambitions, as scalability is something experts look for in winning ideas. Indeed, one finalist of the Refugee Education challenge, a Jordanian organization called We Love Reading, is aiming to build a global movement to get more children to read for pleasure. But it also remains firmly rooted in its local origins: it started in a neighborhood mosque in Amman, and works through one reading circle at a time.

Fig. 5. Image from presentation prepared by IDEO.org for We Love Reading.

On the other hand, micro is the scale of humanitarian devices, and similar to these devices, most winning ideas on Amplify advocate for solutions standing apart from state infrastructures or authorities. Working at an individualized and individualizing level, they deploy technical minimalism in the face of immediate needs. They also often create “micro-scale market opportunities” while presenting “small…approaches to social change” (Redfield 2016:179). How has this translated into the production of humanitarian goods?

Designing Humanitarian Devices

Among the winners were certainly devices that fit the inclusive innovation remit with its focus on newness: a clean birth kit for poor Indian mothers, a biogas-powered milk chiller for Tanzanian farmers, novel storage solutions for Ethiopian market vendors, a market-matching chatbot connecting poor Kenyan farmers with buyers, and a peer-to-peer SMS-based information network to alert Jakarta inhabitants to floods. But the great majority of winning projects have been quite conventional, from employment training schemes to microfinance to community health initiatives and educational programs. This reflects both Amplify managers’ self-professed scepticism towards technological silver bullets and the existing expertise within DFID more broadly, which ultimately limits what can be funded. Equally important, because of the organization’s value-for-money mentality stemming from its fiscal responsibility to UK taxpayers, initial blue-sky thinking has given way to safer ideas and organizations. Has this resulted in poor—rather than pro-poor—innovations, forgoing breakthroughs for incremental change? Although Amplify’s outcomes might be familiar, the process by which they have been achieved is certainly new to DFID, many of the winners, and the platform’s observers.

And what if we regarded design as a particular, “remedial” approach to changing situations whose status quo cannot be accepted, as suggested by Bruno Latour (2008)? Then, rather than being revolutionary, humanitarian design can be seen as careful in a double sense: on the one hand, it can only offer limited solutions to clearly circumscribed problems; on the other, it is infused with an ethics of care that accords well with the affective dimension of contemporary humanitarianism and the popular embrace of poverty alleviation causes. It can be seen as an approach to international development whose practitioners wonder whether they are asking the right questions where others have ready-made answers, who examine the assumptions that most development interventions take for granted, and who hold in view the messiness and complexity of any project of change, ultimately recommending that we proceed with caution. Likewise, for researchers of these approaches, rather than subscribing to the well-trodden critique of neoliberal market dominance, might an agnostic stance that explores their potential while acknowledging their limits be more productive?

Anke Schwittay is the Head of International Development at the University of Sussex. She is the author of New Media and International Development: Representation and Affect in Microfinance. Paul Braund is a Research Associate at the University of Sussex and works at the intersection of information systems, design, and international development.

References

Amplify. 2013. Business Plan. Available at link.

Brown, Tim, and Jocelyn Wyatt. 2010. “Design Thinking for Social Innovation.” Stanford Social Innovation Review 8(1):31–35.

Foster, Christopher, and Richard Heeks. 2013. “Conceptualising Inclusive Innovation: Modifying Systems of Innovation Frameworks to Understand Diffusion of New Technology to Low-income Consumers.” The European Journal of Development Research 25(3):333–355.

IDEO.org. 2016. Impact: A Design Perspective. Available at link.

Johnson, Cedric. 2011. “The Urban Precariat, Neoliberalization, and the Soft Power of Humanitarian Design.” Journal of Developing Societies 27(3–4):445–475.

Latour, Bruno. 2008. “A Cautious Prometheus? A Few Steps Towards a Philosophy of Design (with Special Attention to Peter Sloterdijk).” In Proceedings of the 2008 Annual International Conference of Design History, pp. 2–10. Boca Raton, FL: Universal Publishers.

Leach, Anna. 2015. “Collaboration Not Competition: Could This Be the Future of Development?” The Guardian, April 14. Available at link.

Mulgan, Geoff. 2014. “Design in Public and Social Innovation: What Works and What Could Work Better.” Available at link.

Papanek, Victor. 1984. Design for the Real World: Human Ecology and Social Change. Chicago, IL: Academy Chicago Publishers.

Redfield, Peter. 2016. “Fluid Technologies: The Bush Pump, the LifeStraw® and Microworlds of Humanitarian Design.” Social Studies of Science 46(2):159–183.

Rodin, Judith. 2016. “Foreword.” In Innovation for International Development: Navigating the Paths and Pitfalls, edited by Ben Ramalingam and Kirsten Bound, pp. 6–7. London, UK: NESTA.

Schumacher, Ernst Friedrich. 1973. Small is Beautiful: A Study of Economics as if People Mattered. US: Random House.

Schwittay, Anke. 2014. “Designing Development: Humanitarian Design in the Financial Inclusion Assemblage.” PoLAR: Political and Legal Anthropology Review 37(2):29–47.

Wired. 2013. Interview with Melinda Gates and Paul Farmer. Available at link.

Wong, Jonathan. 2016. “From Britannica to Wikipedia: How Traditional Development Players Are Catalysing Collaboration for Innovation.” In Innovation for International Development: Navigating the Paths and Pitfalls, edited by Ben Ramalingam and Kirsten Bound, pp. 120–130. London, UK: NESTA.

Featured Image: Image from IDEO.org 2016 Impact Report (IDEO.org 2016). “Design: We improve the lives of people in poor and vulnerable communities through the solutions we create.”