
Maximizing Benefits from Survey-Based Research

Published online by Cambridge University Press:  21 June 2018

Noam Lupu, Vanderbilt University
Elizabeth J. Zechmeister, Vanderbilt University
Symposium: Whose Research Is It? Notable Ways Political Scientists Impact the Communities We Study
Copyright © American Political Science Association 2018 

Despite the best of intentions, only a small proportion of social science research findings are translated into dialogue, policy, or other actions that improve human welfare (Sieber 1992). Many projects end up unpublished or with few citations (Samuels 2011), and only a rare subset of studies is acted on by policy makers to address critical global challenges. Yet, rather than draw a pessimistic conclusion, we urge scholars to adopt a broader perspective on the diverse ways that their research activities can benefit communities within and outside the academy. When we assess the value of social science research, we must consider all of the possible communities that could benefit from a particular study.

In one of our core areas of research—international public opinion studies—we count at least seven groups that stand to benefit from social science research activities: (1) respondents who agree to participate in surveys; (2) local teams hired to enumerate the questionnaire; (3) local survey-research organizations that participate in fielding the study; (4) local academic partners involved in designing and implementing the study; (5) local and international policy makers whose efforts may benefit from the findings; (6) students receiving hands-on research training in the course of the study; and (7) the mass public.

Ethical approaches to human-subjects research emphasize a cost–benefit calculus. That assessment often is interpreted narrowly, with attention to costs incurred by research subjects (e.g., time required to take a survey and risks of revealing personal information to a study enumerator) and anticipated benefits from advancing knowledge (e.g., see federal policy expressed in 45 CFR 46, Subpart A, 2009). With respect to the latter, researchers may be tempted to make “promises of benefit to society [that] are based on the vague hope that all research is bound to be helpful—somehow” (Sieber 1992, italics added). Whether out of optimism, impressive levels of self-confidence, or ego-boosting, many social scientists overestimate the potential for their conclusions to be helpful—somehow—to society at large.

We agree that attention to the merit of research findings is important, but we argue for a more expansive view of benefits.[1] We posit that through careful planning and thoughtful practices, and by adopting a broader perspective on the potential beneficiaries of research, social scientists can maximize benefits before, during, and after implementation of a research study. In our case, considering how a survey-research study can have positive effects on seven potential beneficiary communities broadens how we think about its merits and increases the likelihood that the opinion study results in a robust set of benefits.

Our experiences with Vanderbilt University’s Latin American Public Opinion Project (LAPOP) inform our perspective on how international survey projects can provide a platform for realizing broader impacts. LAPOP’s most important study is the AmericasBarometer, a biennial survey of the Western Hemisphere. LAPOP is at the forefront of international survey research, continually working to improve its methods for collecting and analyzing survey data.[2] Through efforts by the LAPOP team, computer-assisted personal interview (CAPI) approaches are used in all countries, with a sophisticated suite of programming and standardized protocols for quality control (Montalvo, Seligson, and Zechmeister, forthcoming). By employing state-of-the-art approaches, the project provides a natural platform for maximizing broader benefits, above and beyond the important low-tech ways to realize returns on field-based survey research.

BUILDING CAPACITY AND TRANSFERRING TECHNOLOGY

LAPOP research uses a host of tools developed to address the particular challenges of face-to-face survey studies and to minimize survey error. Technology allows us to include experiments and audio and/or visual instruments. However, we also use it for fieldwork oversight: geo-fences to automatically detect out-of-sample interviews; audio or image capture to ensure that questions are read correctly and by the project-certified interviewer; and add-on features to monitor the timing of interviews and to record contact attempts.
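The geo-fence check described above amounts to testing whether an interview’s GPS capture falls within a set distance of the sampled location. The sketch below illustrates the idea; the function names, coordinates, and 500-meter radius are hypothetical placeholders, not LAPOP’s actual implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two GPS points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def outside_geofence(interview_gps, segment_center, radius_km=0.5):
    """Flag an interview whose GPS capture falls outside the sampled segment."""
    lat, lon = interview_gps
    clat, clon = segment_center
    return haversine_km(lat, lon, clat, clon) > radius_km

# Example: an interview recorded several kilometers from the sampled
# segment center is flagged as out of sample for supervisor review.
print(outside_geofence((10.50, -66.90), (10.48, -66.95)))
```

In practice, a check like this would run automatically on the interviewer’s device or on the central server, triggering review rather than outright rejection, since GPS error near the fence boundary is common.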

In the course of implementing these tools in our studies, we deliberately endeavor to transfer these innovations. In each country in which we work, we conduct a minimum two-day training session with local team members. The training introduces them to our study protocols and also enhances their general knowledge of best practices and innovations in CAPI for face-to-face survey research. We encourage our partners to adopt the techniques we have developed, not only for fielding our surveys but also for use in their own future work. The benefit to the field team is a set of expanded capabilities and experiences, which they can then translate into procuring new projects. Indeed, we know that some of our field teams currently offer their clients some of the practices we developed and trained them to use. For example, our partners in Mexico used our methods to collect higher-quality data for a large-scale project focused on justice and the rule of law in that country. Both the fieldwork team and their clients—often local government agencies or research institutes—are better off because they use best practices and cutting-edge techniques, and policy decisions will be informed by more accurate data.

This type of capacity building is not merely altruistic; there also are returns on this investment for the researcher. By accompanying the team into the field during training, researchers gain a qualitative understanding of dynamics on the ground. Engaging directly with the field team also develops personal relationships that build trust and loyalty among those working on the project. Personal relationships open a channel for input by the field team on potential revisions to programming and protocols. In our experience, local teams trained by LAPOP in our programming have become active problem solvers. In one case, a local team found a way to simplify a section of code, creating an efficiency that benefited that project and that we then carried into future studies. Local teams also have offered critical ideas for minimizing risks to enumerators—for example, creatively camouflaging electronic devices in the field and carrying a blank paper questionnaire to show to local officials suspicious of enumerators.


In addition to training local teams, we include students in all aspects of our project activities. In the most recent round of the AmericasBarometer, we involved Vanderbilt graduate and undergraduate students in pretesting in the field, training interviewers, programming, and auditing. In each round of the AmericasBarometer, we involve students in report development. Although many work as paid research assistants, we also developed an Undergraduate Fellows Program, which combines instruction on survey-research methods with opportunities to gain hands-on experience in survey implementation, analysis, and report writing. These experiences are crucial in helping students—both undergraduate and graduate—to develop the types of research skills that they can transfer to future professional endeavors, whether inside the academy, in the private sector, or in policy-making bodies.

DISSEMINATING RESEARCH FINDINGS

Another set of benefits relates to amplifying awareness of and access to project data and key findings. After we complete a round of the AmericasBarometer project, we design a series of reports oriented toward an educated non-academic audience. These reports present a set of basic findings from the project within a policy-relevant framework. Each report attempts to identify policy implications that might be put into practice by those working in international development. However, we recognize our limitations in this regard; therefore, we regularly team up with practitioners in international development who provide feedback on or coauthor policy-oriented presentations of the data. All of these reports are made public on our website, and we use various means—presentations, social media, emails, and blog posts in news outlets—to draw attention to the documents and their findings.

Collaboration with non-academic partners provides unique opportunities to insert findings from our project into policy dialogue and reform. Our in-country dissemination of results from the 2014 AmericasBarometer national survey of Paraguay included briefings to high-level officials about corruption. Immediately after our presentations, the Paraguayan National Police announced that in response to our study, it would “take steps to try to eliminate the practice [of corruption] in their ranks.”[3]

In another instance in March 2015, LAPOP’s directors were invited by the US government and Guyanese officials to deliver a series of briefings on the 2014 Guyana AmericasBarometer study. To avoid partiality, we presented the same content to the president and his cabinet, the opposition parties, the public at large, and the local university community. To maximize broader impact, we identified a few particularly important and actionable issues gleaned from the survey data. One issue concerned low and declining trust in the police, which we diagnosed as linked to low perceptions of police responsiveness.

Across the set of presentations, there was ample discussion by local stakeholders about the extent to which this thesis (i.e., that low police responsiveness and low police trust were related and in need of official attention) rang true to their own experiences. The briefings generated more than a dozen news stories and policy dialogue that further elevated the salience of the issue in the country. Shortly after the in-country dissemination activities, the president announced a 15-point police-reform program that addressed police trust and responsiveness and that was explicitly connected to the LAPOP findings. As the president reportedly remarked, “[t]he reality is that LAPOP has published its findings and these have been widely disseminated and therefore must be addressed objectively” (Ganesh-Ally 2015).


In addition to developing reports and briefings on the project—and making these publicly available—we maintain a strict standard on data access for the AmericasBarometer project. We do not embargo datasets. Each round of the AmericasBarometer is released when all data are collected and processed. The raw datasets from the project can be downloaded from the LAPOP website and also are available from several data repositories. We provide these datasets not only in formats most widely used by US-based scholars (e.g., Stata) but also in those typically used by students and local researchers (e.g., SPSS). Recognizing that not all potential beneficiaries of the project have the capacity to analyze the project datasets using commercial statistical software, LAPOP also worked with a partner university in Central America to create and host a portal for interactive, point-and-click queries and analyses. Users not versed in statistical software can generate frequency tables, cross-tabs, and graphs directly from the user-friendly interface. This means that our data are available to academics interested in using them for further research as well as to other groups—including students, journalists, and policy makers—interested in using them to inform policy debates.
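Because the datasets are released in formats readable by standard open tools, users outside the commercial-software ecosystem can also work with them directly. The sketch below (in Python, with pandas) shows how one might compute a design-weighted frequency distribution of the kind the point-and-click portal produces; the variable and weight names are illustrative placeholders, not a guaranteed match to any actual AmericasBarometer file.

```python
import pandas as pd

def weighted_frequencies(df, variable, weight="weight1500"):
    """Weighted percentage distribution of a survey variable.

    Survey datasets typically ship with design weights; the weight
    name here is a hypothetical placeholder.
    """
    totals = df.groupby(variable)[weight].sum()
    return (100 * totals / totals.sum()).round(1)

# Tiny synthetic example standing in for a real dataset
# (a real file could be loaded with, e.g., pd.read_stata or pd.read_spss).
df = pd.DataFrame({
    "trust_police": ["low", "low", "high", "high"],
    "weight1500": [1.0, 0.5, 1.0, 1.5],
})
print(weighted_frequencies(df, "trust_police"))
```

The same pattern extends to cross-tabs (grouping by two variables) and is the kind of analysis that students, journalists, and policy makers can run without commercial statistical software.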

RECOMMENDATIONS FOR RESEARCHERS

We recognize that a large research institute such as LAPOP has more capacity to maximize broader impacts than an individual investigator. Yet, many of these efforts to increase the number of beneficiaries of our activities can be realized on a smaller scale in individual research projects. We conclude with a list of ways that social scientists can realize broader impact by maximizing benefits in the course of international survey research. Although these practices are written with field-based public opinion studies in mind, we believe that several are transferable to other projects. Our recommendations are as follows:

  • Collaborate closely with the fieldwork team to transfer an understanding of the project, methods, and protocols.

  • Train the fieldwork team not only in your survey but more broadly in best practices and novel approaches to research, explaining why you use certain protocols or designs. Be open to suggestions for making protocols more efficient and feasible for the field.

  • Pretest and review questionnaires with local experts to widen the scope of individuals who have input into the project.

  • Develop security plans, revise them with local input, and ensure that they are followed in the field to minimize risk to the fieldwork team.

  • Engage students as much as possible, at all levels and in all aspects of the project.

  • Develop (and seek funding for) a data-management plan before conducting fieldwork. Identify ways to disseminate the project data and findings.

  • Host a workshop to share an area of research expertise with local partners, stakeholders, and/or students. Alternatively, host a visiting researcher from a project country at your institution.

  • After completing the study, present your research findings at a local university or research institute. Build this dissemination trip into funding proposals in advance.

  • Write and disseminate a scientific policy-relevant brief about the project for a non-academic audience—ideally in both your home country and the field country.

  • Make the full dataset from your study and accompanying materials—not only an academic replication file—available to the public in multiple formats, including those widely used by local researchers.

Footnotes

1. See also the discussion in Sieber (1992). An alternative is to adjust the cost–benefit calculation by reducing the benefits by a predetermined amount to counter the propensity to overestimate the societal rewards anticipated to flow from one’s research (Desposato 2015). We agree that scholars should take this tendency seriously and correct for it. Our non-rival proposal is to simultaneously increase the paths by which a research project can deliver benefits to individuals and groups.

2. The project and many of the practices discussed here were initiated by LAPOP’s founder and senior adviser, Mitchell Seligson. For an overview of the cutting edge in survey research in the developing world, see Lupu and Michelitch (2018).

3. “Medidas Contra los Coimeros.” ABC Color, July 10, 2014. Available at www.abc.com.py/nacionales/policia-anuncia-medidas-contra-coimeros-1264705.html.

REFERENCES


45 CFR 46, Subpart A. 2009. “Public Welfare, Department of Health and Human Services: Protection of Human Subjects.” Basic HHS Policy for Protection of Human Subjects. Available at www.hhs.gov/ohrp/regulations-and-policy/regulations/45-cfr-46/index.html.
Desposato, Scott (ed.). 2015. Ethics and Experiments: Problems and Solutions for Social Scientists and Policy Professionals. New York: Routledge.
Ganesh-Ally, Rebecca. 2015. “President Reads the Riot Act! As He Opens Police Officers’ Conference.” Guyana Chronicle, March 12. Available at https://guyanachronicle.com/2015/03/12/president-reads-the-riot-act-as-he-opens-police-officers-conference.
Lupu, Noam, and Kristin Michelitch. 2018. “Advances in Survey Methods for the Developing World.” Annual Review of Political Science. Available at https://doi.org/10.1146/annurev-polisci-052115-021432.
Montalvo, Daniel, Mitchell A. Seligson, and Elizabeth J. Zechmeister. Forthcoming. “Data Collection in Cross-National and International Surveys: Latin America and the Caribbean.” In Advances in Comparative Survey Methods: Multicultural, Multinational and Multiregional Contexts, ed. Timothy P. Johnson, Beth-Ellen Pennell, Ineke Stoop, and Brita Dorer. New York: Wiley.
Samuels, David J. 2011. “The Modal Number of Citations to Political Science Articles Is Greater than Zero: Accounting for Citations in Articles and Books.” PS: Political Science & Politics 44 (4): 783–92.
Sieber, Joan E. 1992. “The Ethics and Politics of Sensitive Research.” In Researching Sensitive Topics, ed. Claire M. Renzetti and Raymond M. Lee, 14–26. Newbury Park, CA: Sage Publications.