Dear Animal Advocates: Focus on Society, Not Individuals
by Zach Groff
How do we figure out what works in animal advocacy? Do we study individuals and how they react to specific arguments and campaigns, or do we look at larger-scale social evidence? To date, much of the effectiveness-focused work in animal activism has concentrated on the former. A new study by Mercy for Animals, however, suggests that the latter may be what we have to rely on: social change may simply be too hard to see at the individual level, and we may need to look to larger-scale evidence for clues.
I come from a background of answering individual-level questions rigorously: I'm currently in Tamale, Ghana, working on a study testing the effects of insurance, agricultural inputs, and video tutorials on insurance use on smallholder farmers' productivity. The study is a randomized controlled trial (RCT), widely considered the gold standard for measuring causal effects, in which some people from a larger sample are randomly selected to receive the program. Because those who get the program and the rest of the sample should be statistically similar, we can attribute any large differences in outcomes between the two groups to the effect of the program.
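The logic of an RCT can be sketched in a few lines of code. Everything here is illustrative rather than drawn from my study or MFA's: the outcome (hypothetical weekly servings of meat), the true effect size, and the noise level are all made-up numbers chosen only to show the mechanics.

```python
import random
from statistics import mean

def simulate_rct(n, effect, sd, seed=0):
    """Randomly assign half of an n-person sample to 'see the ad',
    then estimate the effect as the difference in group means."""
    rng = random.Random(seed)
    assign = [True] * (n // 2) + [False] * (n - n // 2)
    rng.shuffle(assign)  # random assignment is what makes the groups comparable
    treated, control = [], []
    for saw_ad in assign:
        outcome = rng.gauss(20.0, sd)  # hypothetical baseline: ~20 servings/week
        if saw_ad:
            outcome += effect          # hypothetical true effect of the ad
            treated.append(outcome)
        else:
            control.append(outcome)
    return mean(treated) - mean(control)
```

With a large enough sample, the difference in means hovers near the true effect; with only a few thousand people and a small true effect, sampling noise dominates the estimate, which is exactly the problem discussed below.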
In an important study, Mercy for Animals did this with Facebook ads. Using Facebook's ad interface, MFA and researchers were able to randomly show a sample of people either a video on animal cruelty or a "control" video about a tropical disease that should have no effect on meat consumption. The researchers were then able to retarget everyone in the sample with a survey on consumption of animal products, vegetarianism, and veganism, and compare those who had seen the animal cruelty video with those who had not.
The study found very few statistically significant effects. In MFA's words, "Our hope for the study was to get information useful in deciding whether we should increase, decrease, eliminate, or maintain the amount of funding we allocate to online ads that drive people to video of farmed animal cruelty. Unfortunately, we don’t feel the study shed useful light on that question."
It is extremely laudable that MFA released these data despite the study's failure to find support for a program MFA invests in heavily. The research itself required an enormous investment of time and money, and even in the rarefied world of academia, releasing null results is extremely rare. This speaks very highly of the team involved in this study.
MFA concludes from the study that promising areas for future research include additional studies of dietary change, whether online ads have adverse effects, and whether women in different age groups respond differently to animal cruelty videos.
This conclusion is likely mistaken, though. Instead, the study suggests we should shift our focus away from experimental studies of individual change and toward broader looks at social change. The proof is in the pudding: as the researchers note, the study did not have enough participants to detect any plausible effect on dietary change (it was powered only to detect a 10% or greater reduction). Scaling the study up to the necessary size would likely require over 1 million people, which would be quite costly. So the end result is that we lack the evidence to say anything, which is neither a particularly useful nor a surprising finding.
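To see why the required numbers get so large, consider a back-of-the-envelope power calculation using the standard normal-approximation formula for comparing two group means. All inputs below are hypothetical, mine rather than the study's: a baseline of roughly 20 weekly servings, a standard deviation of 15, and effect sizes of 10% versus 1%. They are meant only to show how the required sample explodes as the detectable effect shrinks.

```python
import math
from statistics import NormalDist

def n_per_arm(sigma, delta, alpha=0.05, power=0.8):
    """Participants needed per arm to detect a mean difference `delta`,
    given outcome standard deviation `sigma` (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_b = NormalDist().inv_cdf(power)          # desired statistical power
    return math.ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

# Hypothetical outcome: ~20 servings of meat per week, sd = 15.
# A 10% reduction (2 servings) needs only hundreds of people per arm:
print(n_per_arm(sigma=15, delta=2.0))
# A more realistic 1% reduction (0.2 servings) needs tens of thousands per arm:
print(n_per_arm(sigma=15, delta=0.2))
```

Under these made-up assumptions, a 1% effect already demands roughly 90,000 survey respondents per arm; and since only a fraction of people shown an ad can later be retargeted and surveyed, the number of people who must see the ads in the first place can easily climb past a million.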
Other studies of dietary change or responses to individual interventions are likely to face the same fate. The bottom line is that social influence is just too subtle a phenomenon to capture on this level. Sadly, there's no such thing as a historical experiment. But by looking at what history, sociology, economics, and other disciplines that study society at a large scale have to say on the subject, we can at least get a sense of what works. One thing does work, time and time again: a nonviolent protest movement.