
Calling Mogadishu: How Reminders of Anarchy Bias Survey Participation

Published online by Cambridge University Press:  29 August 2018

Elaine K. Denny
Affiliation:
Department of Political Science, University of California Merced, Merced, California, U.S.A. e-mail: edenny@ucmerced.edu
Jesse Driscoll
Affiliation:
University of California San Diego (GPS), La Jolla, California, U.S.A. e-mail: jdriscoll@ucsd.edu

Abstract

How does the fear of anarchy affect telephone survey behaviors? A survey experiment administered to a sample of Mogadishu residents—validated with a natural experiment—is used to assess this question. Randomly assigned reminders of anarchic violence had differential effects on survey participation depending on subjects’ background levels of security and welfare. Vulnerable subjects were more likely than non-vulnerable subjects to refuse to provide sensitive survey information after reminders of anarchy.

Research Article

Copyright © The Experimental Research Section of the American Political Science Association 2018

SUMMARY

How does acute fear of anarchic violence affect the survey behavior of subjects residing in active war zones? We answer this question through an examination of telephone survey behaviors for a sample residing in contemporary Mogadishu, the capital city of Somalia.

Mogadishu is, as of this writing, a theater of violent competition for political power. While conducting an apolitical panel telephone survey on citizen well-being, we embedded a survey experiment that varied whether respondents received a prime framing recent events in the city as violent anarchy, a prime framing them as state consolidation, or no prime at all. We then immediately asked our most sensitive survey question: whether respondents would be willing to disclose their clan. We measured non-response in order to make inferences about respondent fear. To increase our confidence in the validity of the finding, we leveraged a natural experiment—a major attack on Somalia’s Parliament—to examine whether real-world reminders of violent anarchy produced the same responses as the artificial survey prime. In both the survey experiment and the natural experiment, the threat of anarchy depressed survey participation disproportionately for the vulnerable.

THEORY: HOW INSECURITY AND FEAR ALTER SURVEY BEHAVIORS

Emotions filter incoming information and condition responses in ways that can alter survey behaviors. Anxiety and fear induce risk-averse behaviors (Lerner and Keltner 2001) that correlate with more conservative policy preferences with respect to security (Getmansky and Zeitzoff 2014; Lerner et al. 2003) and lower political engagement (Hassell and Settle 2017). Exposure to violence has many effects other than fear, and recollections of past violence can alter risk preferences long after the fact (Callen et al. 2014; Cameron and Shah 2015; Eckel et al. 2009; Voors et al. 2012). Emotionally induced stress—which can be experimentally manipulated—can disrupt the normal survey ritual.Footnote 1 Changes in behavior are most easily observed in the form of non-response when fear primes interact with respondents’ background level of well-being (Huddy et al. 2005). Chronically vulnerable populations are more susceptible to “fight or flight” fear-triggers than their secure counterparts (Levine 2015).

A contemporary Somali sample magnifies the salience of these concerns. Subjects living in war zones are strategic about whether to share information and with whom (Lyall et al. 2015). Revealing attitudes on sensitive matters, even over the phone, carries danger (Blair et al. 2013).Footnote 2 Because we were American social scientists taking a close interest in the opinions of Mogadishu residents, our motives were interrogated skeptically by both our Somali subjects and our research associates. To the extent that we were neutral observers, we could be accused of engaging in virtual poverty tourism. To the extent that we were something other than neutral observers, however, we were potential partisans. In this setting, where famine has been used as a weapon for decades, charity cannot be seen as politically neutral. Nor is data collection a pure public good: conducting the first representative survey in a quarter century, with assistance from the local government, could be viewed as a kind of census with the effect of locking in de facto property rights for the war’s winners.Footnote 3 Subjects weighed the decision to stay on the phone and answer our questions against the option of hanging up to remain invisible.

Our theoretical expectation was that acute fear, induced by reminders of the surrounding anarchic environment, would depress survey participation differentially, with the sharpest effects among the most vulnerable respondents. The low-welfare members of our sample were more likely to lack political protection, connections, and representation. They were, we reasoned, more likely to have an emotional response to reminders of political instability that would leave an imprint on the data. Willingness to share information would be shaped by the interaction between perceptions of background levels of threat within the society and one’s social position:

H1: Vulnerable residents of Mogadishu will be less likely to answer sensitive survey questions after reminders of anarchic violence than secure residents of Mogadishu.

DATA

Our sample is generated from a face-to-face, population-representative survey of Mogadishu residents conducted in 2012 (Driscoll and Lidow 2014). Of this initial sample, 252 respondents (39%) provided phone numbers for follow-up surveys. These numbers were called via Skype by fluent Somali-language enumerators from the San Diego Somali community in the spring of 2013 (Wave 2), the spring of 2014 (Wave 3), and finally the fall of 2014 (Wave 4). All of our telephone survey enumerators were male.Footnote 4 While our enumerators varied somewhat in their Somali accents and dialects, we could discern no statistically significant differences by enumerator on the outcomes of interest in this study.

Our sample is attractive because it contains high levels of variation in both acute fear and underlying vulnerability. In all waves, the survey instrument avoided explicitly political questions due to the charged situation on the ground and concerns that we could put vulnerable respondents at risk. Instead, the short 24-question survey asked about respondents’ current level of services and safety. Our enumerators could discern from vocal characteristics whether a respondent was male or female, but little additional information could be gleaned without respondent compliance.Footnote 5

Why is clan so sensitive in Mogadishu? Somali clans provide a sense of group identity, organize sub-national distributional politics, and govern a host of traditional intra-ethnic social interactions—from social welfare insurance to marriage markets to militia recruitment. Clans have been focal points for conflict in Somalia since at least the 1980s.Footnote 6 Clan-targeting pogroms have occurred every time the capital switched hands, leading to an exodus of many non-Hawiye civilians from the city. Abgal and Habr Gidr clans cluster together in homogeneous neighborhood enclaves where children and assets like electrical generators can be protected. Measured in terms of overall non-response, the most sensitive question in the 2014 survey was whether respondents would be willing to share clan names—a question that had not been asked face-to-face, or in the initial telephone follow-up survey, because it was deemed too sensitive.Footnote 7 Variation in willingness to answer this question motivated the experimental design that follows.

SURVEY EXPERIMENT

Our survey experiment randomly assigned respondents to one of two treatment conditions or to a control group. Randomization occurred prior to placing any calls, and the resulting groups are largely balanced on observables (See Table 1, columns 1 through 6). In our small sample, unfortunately, gender and displacement status are not distributed evenly across conditions despite randomization. A larger sample would have been desirable.

Table 1 Balance Table of Survey Respondents
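A balance check of the kind summarized in Table 1 amounts to comparing pre-treatment covariate means across the assigned conditions. The following is a minimal illustrative sketch rather than our replication code (which is available at the Dataverse link in the footnotes); the DataFrame schema and column names are assumptions.

```python
# Minimal balance-check sketch for Table 1. The schema is assumed:
# `condition` holds the assigned arm; covariates are coded 0/1.
import pandas as pd

COVARIATES = ["female", "displaced"]  # the two covariates noted above as imbalanced

def balance_table(df: pd.DataFrame) -> pd.DataFrame:
    """Mean of each pre-treatment covariate, by experimental condition."""
    return df.groupby("condition")[COVARIATES].mean().T
```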

Two treatments primed respondents to consider Somalia’s current government consolidation or persisting anarchy. Both were plausible descriptions of respondents’ political reality in 2014, and both were potential sources of fear. We stated: “In this survey, we are interested in how the existence of [a central Somali government/ongoing lawlessness] is affecting citizens’ quality of life.” A control group, consisting of approximately one-third of the survey participants, was given no prime at all.
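Because randomization occurred before any calls were placed, assignment can be generated directly from the list of phone numbers. A minimal sketch follows, assuming equal one-third allocation across the two primes and the control; the arm labels, seed, and function names are illustrative.

```python
# Illustrative pre-call randomization into two primes and a control.
# Equal allocation and all names here are assumptions for this sketch.
import random

ARMS = ["anarchy_prime", "government_prime", "control"]

def assign_conditions(phone_numbers, seed=2014):
    """Map each phone number to a condition before any calls are placed."""
    rng = random.Random(seed)  # a fixed seed makes the assignment reproducible
    return {number: rng.choice(ARMS) for number in phone_numbers}
```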

Directly after the experimental treatment, respondents were asked the question we expected to be our most sensitive: whether they were willing to provide their clan name. Item non-response on this question is the dependent variable—a behavioral indicator of subjects’ participation reticence. Our favored hypothesis is that this reticence was induced by a shift in perception of threat via the mechanism of fear, though we cannot observe this mechanism directly.

Since we theorized that the risks of revealing sensitive information would vary with underlying vulnerability rather than being equally acute across the sample, the statistical analysis examines interaction effects between the “fear primes” and respondent vulnerability. Our measure of vulnerability is whether a person reports being displaced in the past year.Footnote 8 Violence in the city has driven many from their homes. Internally displaced people (IDPs) live in camps inside Mogadishu or squat in abandoned buildings. Other people displaced by war and famine in the countryside have migrated to Mogadishu for security. Our question does not distinguish between individuals in IDP camps who had fled the rural famine and former urban residents who were relative losers in the ongoing turf war. In either case, the question is a valid proxy for relative social vulnerability. This question also had the advantage of being asked at the very beginning of the survey, so it had full compliance.Footnote 9

Table 2 presents the results of two logistic regressions in which refusal to name one’s clan is the dependent variable; both include enumerator fixed effects. Surprisingly, the evidence suggests the anarchy prime reduced reticence somewhat among the non-vulnerable. The interaction between displacement and the anarchy prime is large and statistically significant—evidence of a differential treatment effect. These results hold when we control for the government consolidation prime, which by itself elicits no significant changes in behavior (See Figure 1). The marginal effect of seeing the anarchy prime is a fourfold increase in the probability that a displaced person will refuse to answer the question “What is your clan?” relative to a primed non-displaced subject.

Table 2 Level of Reticence (Clan Non-Response) by Anarchy Prime and Displaced

Standard errors in parentheses

***p<0.01, **p<0.05, *p<0.1

Figure 1 Survey Experiment Results (Table 2 Effects).
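For readers who want the estimating equation in concrete form, a minimal sketch of the Table 2 specification follows, assuming a pandas DataFrame with illustrative 0/1 columns; the replication archive contains the actual estimation code.

```python
# Sketch of the Table 2 logit: clan non-response regressed on the anarchy
# prime, displacement, their interaction, the government prime, and
# enumerator fixed effects. All column names are illustrative assumptions.
import statsmodels.formula.api as smf

def fit_reticence_model(df):
    model = smf.logit(
        "refuse_clan ~ anarchy * displaced + government + C(enumerator)",
        data=df,
    )
    return model.fit()

# The coefficient on `anarchy:displaced` is the differential treatment effect;
# result.get_margeff() translates estimates onto the probability scale.
```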

In summary: two experimental treatments were tested against a control in the survey, and only one produced meaningful results. Reminders of lawlessness differentially changed survey behaviors; reminders of political consolidation did not. The survey experiment provides evidence that the threat of anarchy decreases participation in sensitive survey questions differentially for vulnerable respondents. We nevertheless remained troubled by concerns about external validity—e.g., the dubious claim that survey primes or vignettes about violence can substitute meaningfully for experiencing real-world violence.Footnote 10

A NATURAL EXPERIMENT

On 24 May 2014, the last day of Wave 3 of our telephone survey, Al Shabaab militants detonated a car bomb outside the Somali Parliament compound. A coordinated attack on government security forces followed. The gun battle lasted several hours. At least seven militants and ten members of the government forces were killed. Four lawmakers were among the many civilians injured. The public and political nature of the attack differentiated it from the background levels of violence that Mogadishu residents experience. Word of the attack spread quickly. We coincidentally placed the last fifth of our survey calls just hours after the attack (after 5:00 pm Somali time, morning in San Diego). Our Somali American enumerator team in San Diego had already heard of the bombing via their social networks by the time they began making survey calls that morning.

From the perspective of our survey sample (i.e., the full set of telephone respondents reached in the third wave), the “Parliament Attack” treatment has all of the relevant characteristics of a natural experiment. The order in which numbers were called (or called back) from our list was randomly assigned. The telephone network continued to function throughout the day. The percentage of respondents agreeing to participate in our survey was unexpectedly high compared with previous days (97% versus 74%), and subsequent investigation of balance across samples suggests that these “extra” respondents skewed toward the more vulnerable (Table 1).Footnote 11 Even with this underlying bias in the sample, analysis of the interaction between the treatment effect and respondents’ underlying level of vulnerability can quantify differential effects on the vulnerable and non-vulnerable subject pools.

Recall that our theoretical expectation was that the attack would elevate vulnerable residents’ perception of acute threat more than non-vulnerable residents’. Because the attack was a pre-survey treatment, a behavioral “fear response” was measured in three ways: (1) a count of the total questions that a respondent refused to answer; (2) a binary variable for whether the respondent refused to answer any of the three most sensitive questions on the survey (as measured by frequency of refusal—two involve clan and one involves a shift in security); and (3) the same outcome measured in the survey experiment: refusal to share a clan name (the question that respondents were most reluctant to answer). Table 3 considers the relationship between vulnerability (operationalized once again by displacement), “bomb day,” and our three outcome measures (See Figure 2). A Poisson model is used for the count variable in Column 1, while logit models are used for the binary variables in Columns 2 and 3. Non-vulnerable (non-displaced) subjects seem more willing to answer questions on bomb day, making the interaction between displacement and bomb day substantively large and signed according to expectations in all three models. The models are underpowered due to the small sample size, with p-values ranging between 0.06 and 0.11. The total number of item non-responses probably increased for Mogadishu’s most vulnerable respondents on the day of the Parliament attack.

Table 3 Level of Reticence, by Bomb Day and Displaced Status

Standard errors in parentheses

***p<0.01, **p<0.05, *p<0.1

Figure 2 “Natural Experiment” Results (Table 3 Effects).
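The Table 3 models can be sketched in the same way, again with illustrative column names for the three reticence outcomes and the bomb-day indicator.

```python
# Sketch of the Table 3 models. `n_refusals` is a count of item non-responses;
# the other outcomes, `bomb_day`, and `displaced` are 0/1 indicators.
# All column names are illustrative assumptions.
import statsmodels.formula.api as smf

def fit_bomb_day_models(df):
    count_fit = smf.poisson("n_refusals ~ bomb_day * displaced", data=df).fit()
    sensitive_fit = smf.logit(
        "refused_sensitive ~ bomb_day * displaced", data=df
    ).fit()
    clan_fit = smf.logit("refuse_clan ~ bomb_day * displaced", data=df).fit()
    # The `bomb_day:displaced` coefficient in each model captures the
    # differential effect of the attack on displaced respondents.
    return count_fit, sensitive_fit, clan_fit
```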

DISCUSSION

Both reminders of anarchy (the survey prime and the Parliament attack) seem to have made non-displaced respondents somewhat more willing to provide sensitive data at the same time as they made displaced respondents less willing to answer the same questions. Our findings serve as a reminder that very small changes in the survey environment—subtle shifts in the consent script, question wording, enumerator identity, or local security context—can alter survey participation. While this has long been appreciated by students of survey behavior, our analysis underscores that in war zones the effects on attrition and item non-response are acute for insecure populations, threatening missing-at-random (MAR) assumptions. In general, we draw the following four practical lessons from the study:

  • Understanding local context is vital, since “basic demographics” questions can be sensitive in war zones and should be asked near the end of the survey instrument (if they are asked at all). If Somali clan identities had been solicited at the beginning of a survey, all downstream data would have been contaminated.

  • University-sponsored scientific research is more likely to generate compliance than research funded by embassies (Corstange 2014).Footnote 12 Lengthy IRB-approved solicitation scripts irked some subjects (see online Appendix Section 5), but the thorough explanation of our identity and motives probably helped to build rapport and gave subjects opportunities to gauge our sincerity.

  • The proliferation of cheap cellular phones in places like Mogadishu makes long-distance surveys (like the one in this paper) possible at very low cost. When the research team’s only link to subjects is a telephone, however, contact is more fragile than in face-to-face surveys: it can be severed by subjects with the touch of a button. Given the low marginal cost of adding subjects, large initial samples should be recruited in expectation of attrition (for example, reaching 150 completed interviews under 40% expected attrition requires recruiting roughly 250 subjects at baseline). Over-sampling vulnerable populations is one strategy to avoid ending up with very small numbers of respondents after attrition.

  • If item non-response on sensitive questions is anticipated, and policy-relevant inferences depend on claims of sample representativeness, extra care should be taken to collect the kind of data that will aid imputation of missing values. Context is critical to determining which elements can be solicited, without raising suspicions from subjects, to test “MAR conditional on X” assumptions (with the goal of facilitating defensible, transparent, and reasonable reweighting to regain pre-attrition sample characteristics).
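To illustrate the reweighting idea in the final lesson: if item response is assumed MAR conditional on covariates X that every subject answered early in the survey, responders can be weighted by the inverse of their estimated response probability. The sketch below makes that assumption explicit; the covariate names are illustrative, not variables from our instrument.

```python
# Inverse-probability-weighting sketch for "MAR conditional on X."
# `answered` is 1 if the subject answered the sensitive item; the covariates
# stand in for questions asked early in the survey and answered by everyone.
import pandas as pd
import statsmodels.formula.api as smf

def ipw_weights(df: pd.DataFrame) -> pd.Series:
    # Model Pr(answer | X) using covariates observed for all subjects.
    fit = smf.logit("answered ~ displaced + female + services", data=df).fit()
    p_answer = fit.predict(df)
    # Responders are up-weighted by 1 / Pr(answer | X); non-responders get 0.
    w = df["answered"] / p_answer
    # Normalize so the weights sum to the number of responders.
    return w * (df["answered"].sum() / w.sum())
```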

CONCLUSION

Vulnerable populations in Mogadishu became less willing to provide sensitive information in a telephone survey when primed to think about ongoing lawlessness. A reminder of real-world anarchy provided an opportunity to probe the external validity of the survey experiment, and the same behavioral pattern emerged, consistent with the mechanism of fear. Relative winners and relative losers in Somalia’s ongoing war have different incentives to cooperate with scientific data collection. Speculations about the precise mechanism aside, our experimental methodology demonstrates that social conditions in conflict zones interact with the survey ritual in ways that can systematically bias the data collected.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit https://doi.org/10.1017/XPS.2018.20.

Footnotes

All data analyzed in this project were collected according to processes approved by the Human Research Protections Program at the University of California at San Diego (Projects #111743 and #131065). The initial 2012 face-to-face survey was conducted cooperatively with Nicholai Lidow and the assistance of enumerators recruited from SUHA Mogadishu. Support for this research was provided by the National Science Foundation (Award No. SES-1216070), as well as the University of California at San Diego’s Hellman Fellowship and the Program Design and Evaluation Lab. The data, code, and any additional materials required to replicate all analyses in this article are available at the Journal of Experimental Political Science Dataverse within the Harvard Dataverse Network, at: doi:10.7910/DVN/IS2KTU. Madelyn Driscoll at Microsoft facilitated permissions for the Skype calls to Mogadishu that began in 2013. The Somali-language survey enumerators who operated the phone bank were Abdulmalik Buul, Ibrahim Warsame, Ahmed Aden, Abdisalan Haji, Abdikarim Tukri, and Abdiaziz Hussien. Helpful comments on previous drafts were provided by Craig Macintosh, Nahomi Ichino, Chris Fariss, Claire Adida, Lauren Prather, Nico Ravanilla, William Reno, and David Laitin. We also received valuable advice from editors and anonymous reviewers at this journal (and others), co-panelists at the Peace Sciences Annual Meeting, the American Political Science Association Annual Meeting, and various UCSD working groups. The authors have no conflict of interest.

1 The seminal treatment of “don’t know” and “refuse-to-answer” survey behaviors is Berinsky (2004). For discussions of fear as a plausible mechanism to explain behavior in post-conflict surveys, see Driscoll and Hidalgo (2014). The literature review on enumerator effects in Bush and Prather (2018) is valuable.

2 A superb discussion of how cellular telephone communication networks change the relationship between civilians, insurgents, and counterinsurgent military forces can be found in Shapiro and Weidmann (2015). Their distinction (p. 252) between texts (which cannot be overheard) and voices (which can) is relevant to our study, since respondents were speaking into telephones. Many Somalis regularly admit to a belief that agents of both the Somali government and the US government, as well as agents of the Al Shabaab rebel group, regularly monitor electronic communications. How and whether any of this really mattered remained a source of debate within our research team, as discussed in online Appendix Section 5.

3 See online Appendix Section 2.

4 Benstead (2014a) found that survey respondents in Morocco answered more questions (fewer skips and “don’t knows”) when the interviewer was male (p. 377), which she elsewhere posits may be attributable to the “higher authority of males than females in patriarchal societies” (Benstead 2014b, 745), though she could not replicate the finding on a subsequent survey (Benstead 2014b, 757, fn17).

5 For example, none of our enumerators could reliably differentiate subjects by clan by voice characteristics, nor could a Bantu accent be discerned from a “regular” Somali accent.

6 Online Appendix Section 1 provides a succinct background summary. Readers seeking more background on the Somali case should begin with Woldemariam (2018), Chapter 6.

7 The second most sensitive question, measured by non-response, was also clan-related: we asked whether respondents belonged to the dominant clan in their area (without asking them to name the clan).

8 The question wording was “Are you currently displaced?”

9 We initially analyzed a model where vulnerability was operationalized as a “vulnerability index” of service provision and reported safety (discussed in online Appendix Section 3), though the sample size was smaller due to some non-response on questions that comprised the index.

10 See Driscoll and Maliniak (2016).

11 One speculative rationalization for the fact that vulnerable people were more likely to pick up the phone after the Parliament attack is that a higher threat environment and anxiety have been shown to correlate with greater information-seeking in the political arena (Marcus 2002; Merolla and Zechmeister 2009). See online Appendix Section 4 for additional discussion.

12 Corstange’s use of the verb “deter” throughout the write-up implies a theory of fear driving non-response, though the fear mechanism is not explicitly theorized.

REFERENCES

Benstead, Lindsay. 2014a. “Effects of Interviewer-Respondent Gender Interaction on Attitudes toward Women and Politics: Findings from Morocco.” International Journal of Public Opinion Research 26 (3): 369–83.
Benstead, Lindsay. 2014b. “Does Interviewer Religious Dress Affect Survey Responses? Evidence from Morocco.” Politics and Religion 7: 734–60.
Berinsky, Adam. 2004. Silent Voices: Public Opinion and Political Participation in America. Princeton, NJ: Princeton University Press.
Blair, Graeme, C. Christine Fair, Neil Malhotra, and Jacob N. Shapiro. 2013. “Poverty and Support for Militant Politics: Evidence from Pakistan.” American Journal of Political Science 57 (1): 30–48.
Bush, Sarah and Lauren Prather. 2018. “How Electronic Devices in Face-to-Face Interviews Change Survey Behavior: Evidence from a Developing Country.” Working Paper.
Callen, Michael, Mohammad Isaqzadeh, James D. Long, and Charles Sprenger. 2014. “Violence and Risk Preference: Experimental Evidence from Afghanistan.” The American Economic Review 104 (1): 123–48.
Cameron, Lisa and Manisha Shah. 2015. “Risk-Taking Behavior in the Wake of Natural Disasters.” Journal of Human Resources 50 (2): 484–515.
Corstange, Daniel. 2014. “Foreign-Sponsorship Effects in Developing-World Surveys: Evidence from a Field Experiment in Lebanon.” Public Opinion Quarterly 78 (2): 474–84.
Denny, Elaine and Jesse Driscoll. 2018. “Replication Data for: Calling Mogadishu: How Reminders of Anarchy Bias Survey Participation.” Harvard Dataverse (https://doi.org/10.7910/DVN/IS2KTU), accessed July 27, 2018.
Driscoll, Jesse and Daniel Hidalgo. 2014. “Intended and Unintended Consequences of Democracy Promotion Assistance to Georgia After the Rose Revolution.” Research and Politics 1 (1): 1–13.
Driscoll, Jesse and Nicholai Lidow. 2014. “Representative Surveys in Insecure Environments: A Case Study of Mogadishu, Somalia.” Journal of Survey Statistics and Methodology 2 (1): 78–95.
Driscoll, Jesse and Daniel Maliniak. 2016. “Did Georgian Voters Desire Military Escalation in 2008? Experiments and Observations.” The Journal of Politics 78 (1): 265–80.
Eckel, Catherine C., Mahmoud A. El-Gamal, and Rick K. Wilson. 2009. “Risk Loving after the Storm: A Bayesian-Network Study of Hurricane Katrina Evacuees.” Journal of Economic Behavior & Organization 69 (2): 110–24.
Getmansky, Anna and Thomas Zeitzoff. 2014. “Terrorism and Voting: The Effect of Rocket Threat on Voting in Israeli Elections.” American Political Science Review 108 (3): 588–604.
Hassell, Hans J. G. and Jaime E. Settle. 2017. “The Differential Effects of Stress on Voter Turnout.” Political Psychology 38 (3): 533–50.
Huddy, Leonie, Stanley Feldman, Charles Taber, and Gallya Lahav. 2005. “Threat, Anxiety, and Support of Antiterrorism Policies.” American Journal of Political Science 49 (3): 593–608.
Lerner, Jennifer S., Roxana M. Gonzalez, Deborah A. Small, and Baruch Fischhoff. 2003. “Effects of Fear and Anger on Perceived Risks of Terrorism: A National Field Experiment.” Psychological Science 14 (2): 144–50.
Lerner, Jennifer S. and Dacher Keltner. 2001. “Fear, Anger, and Risk.” Journal of Personality and Social Psychology 81 (1): 146–59.
Levine, Adam Seth. 2015. American Insecurity: Why Our Economic Fears Lead to Political Inaction. Princeton, NJ: Princeton University Press.
Lyall, Jason, Yuki Shiraito, and Kosuke Imai. 2015. “Coethnic Bias and Wartime Informing.” The Journal of Politics 77 (3): 833–48.
Marcus, George. 2002. The Sentimental Citizen: Emotion in Democratic Politics. University Park, PA: Penn State University Press.
Merolla, Jennifer L. and Elizabeth J. Zechmeister. 2009. Democracy at Risk: How Terrorist Threats Affect the Public. Chicago, IL: University of Chicago Press.
Shapiro, Jacob and Nils Weidmann. 2015. “Is the Phone Mightier Than the Sword? Cellphones and Insurgent Violence in Iraq.” International Organization 69 (2): 247–74.
Sniderman, Paul and Douglas Grob. 1996. “Innovations in Experimental Design in Attitude Surveys.” Annual Review of Sociology 22 (1): 377–99.
Voors, Maarten J., Eleonora E. M. Nillesen, Philip Verwimp, Erwin H. Bulte, Robert Lensink, and Daan P. Van Soest. 2012. “Violent Conflict and Behavior: A Field Experiment in Burundi.” The American Economic Review 102 (2): 941–64.
Woldemariam, Michael. 2018. Insurgent Fragmentation in the Horn of Africa: Rebellion and its Discontents. New York, NY: Cambridge University Press.