
Polling the Pollsters: A Survey of Academic Survey Organizations

Published online by Cambridge University Press:  12 October 2016

Kenneth E. Fernandez, College of Southern Nevada
Jason A. Husser, Elon University
Mary G. Macdonald, Emory University

Abstract

Organizations conducting survey research have remained of vital importance to the social sciences. However, these organizations increasingly face new challenges and opportunities. Survey operations housed in universities and colleges may face special challenges. We present a poll of pollsters, an original survey of leaders of academic survey organizations in the United States. Results explore the various methods used by academic survey organizations and perceptions of challenges in today’s academic and research environments. Responses provide an overview of the career path of academic survey leaders and how those leaders understand the primary missions of their organizations. We conclude with a discussion relevant to social scientists interested in the dynamics of operating these important academic research centers.

Copyright © American Political Science Association 2016 

In reference to survey research, Henry Brady, the former director of the Survey Research Center at the University of California, claimed that “no other method for understanding politics is used more, and no other method has so consistently illuminated political science theories with political facts” (Brady 2000, 47). But who conducts this survey research? Quite often it is a survey organization affiliated with an academic institution. These Academic Survey Organizations (ASOs) have a long history of bridging political science and other social science disciplines with the public survey research industry.

The first of these organizations was created in 1935, when Paul Lazarsfeld established a research center at the University of Newark (now the Newark campus of Rutgers University). Partly because of inconsistent funding at Newark, Lazarsfeld took a position as director of the newly formed Office of Radio Research (ORR) at Princeton University. The ORR relocated to Columbia University in 1939 and was eventually renamed the Bureau of Applied Social Research, where survey research was conducted until 1977 (Barton 2001).

In the 1940s, two of the most prominent academic survey organizations were formed. The National Opinion Research Center (NORC) was founded in 1941 at the University of Denver (and moved to the University of Chicago in 1947). In 1946, the Survey Research Center (SRC) at the University of Michigan was established. Although the founding directors of these first ASOs were not political scientists, these early “academic survey practitioners sought to refine and enrich election polling and surveys” (Wright and Marsden 2010, 8). For example, Angus Campbell, co-founder of the SRC and a social psychologist, helped direct a national survey in 1948 that eventually became the American National Election Studies (ANES) in 1977.

By the 1960s, survey research had become a primary method of data collection in the social sciences. Groves (2011) describes the 1960s as the beginning of the era of expansion in the field of survey research. This is certainly true for ASOs. Wright and Marsden (2010, 10) note that “at least a score of … academic survey research organizations were established” during this time, including at the University of California (1958), the University of Wisconsin (1960), the University of Illinois (1964), and Temple University (1967).

Today, academic survey units range from large organizations with dozens of full-time staff and faculty to operations of just one or two staff members. ASOs can be found at small liberal arts colleges as well as major research universities. In 2008, more than 40 of these organizations, large and small, formed the Academic Association of Survey Research Organizations (AASRO). By 2015, AASRO had 66 member organizations and estimated that approximately 86 additional ASOs in the United States were eligible for membership.

ASOs take on a variety of tasks: collecting key information for federal agencies such as the Centers for Disease Control, assisting other units of their universities with data collection, assisting state and local governments with important planning decisions, and generating results consumed by local, state, and national media outlets. These organizations produce high-quality data that are disseminated to researchers around the world. Smith (2005) found that the General Social Survey (GSS), conducted by NORC, had been used in more than 7,500 scholarly publications since its inception in 1972. Furthermore, these organizations are critical in training future survey methodologists who then enter the government, commercial, and academic sectors. Smith (2005) estimated that a quarter of a million college students use the GSS in classes each year.

Although the demand for survey data continues to grow, the past two decades have presented a number of challenges to organizations conducting survey research, including declining response rates, increased skepticism of the validity and reliability of polling results, and an uncertain fiscal environment. These challenges exist whether the organization works inside or outside of academia. However, universities and colleges may face special challenges. Prewitt (1983) notes that survey organizations frequently face two competing goals: to conduct good research and to survive. These goals can sometimes conflict with one another, as Hunt (1980) notes in his observation of the high mortality rate of many research centers. Similarly, Moore (2014, 1) suggests that for academic survey organizations, “it is the worst of times, it is the best of times,” in that new technologies create new opportunities but declines in institutional support create new challenges. This article reviews the findings of a survey of academic survey organizations. The purpose of the survey was to (1) explore the various methods used by survey organizations; (2) understand how academic survey organizations perceive current challenges in today’s academic and research environments; and (3) understand the primary missions of these ASOs.


POLLING THE POLLSTERS

This study is based on an online survey of directors of ASOs. Footnote 1 The list of ASOs and their contact information was created using a variety of sources, including the Academic Association of Survey Research Organizations (AASRO) and the list of Academic and Not-for-Profit Survey Research Organizations (LANSRO). Footnote 2 After identifying a number of organizations that had closed, we were left with a list of 136 organizations that we believed might conduct survey research. Footnote 3 Only 115 organizations opened the email invitation to the survey (two of these immediately clicked the opt-out link). We used multiple email reminders and follow-up phone calls to boost the response rate. Ninety-four organizations started the survey and 80 completed the entire questionnaire. The 59% response rate is similar to that of other surveys targeting academic survey organizations (e.g., the surveys conducted by the AASRO). The survey was implemented between April 26, 2015, and June 1, 2015. Given that we attempted to reach all members of the population of ASO directors, this study is more akin to an enumeration or a canvass than a survey. True population parameters do not exist for the population of ASO directors. We are unable to determine precisely whether those who did not complete the instrument are substantially different from those who did.
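For readers who want to trace the arithmetic, the sketch below reproduces the reported completion figure under the simple assumption that all 136 listed organizations count as eligible and only fully completed questionnaires count as responses; the article does not specify a formal response-rate definition (e.g., an AAPOR rate), so this is illustrative rather than the exact calculation used.

```python
# A minimal sketch of the completion-rate arithmetic implied by the counts above
# (136 organizations in the frame, 94 starts, 80 full completes). This simply
# divides completes by the full frame; the formal rate definition used in the
# study is not stated, so treat this as illustrative.
invited = 136      # organizations believed to conduct survey research
started = 94       # organizations that began the questionnaire
completed = 80     # organizations that completed the entire questionnaire

print(f"Completion rate: {completed / invited:.0%}")                            # ~59%
print(f"Break-off rate among starters: {(started - completed) / started:.0%}")  # ~15%
```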

WHO LEADS ACADEMIC SURVEY ORGANIZATIONS?

Tables 1 and 2 present demographic information about the leaders of academic survey organizations. Nearly half (48%) of the directors who answered the survey had an academic background in political science. Sociology (10%) and psychology (9%) were the next most common disciplinary backgrounds. The remaining directors came from a wide variety of fields: social science (5%), business (4%), public administration/affairs (4%), survey methods/statistics (4%), criminal justice/criminology (3%), and education (3%), with 9% mentioning some other field.

Table 1 Field of Highest Degree

Table 2 Characteristics of Directors of Survey Organization

Most ASO leaders are male (58%) and almost all are white (90%). Notably, none of the organizations that responded to our survey had a responding director who identified as black or African American. Directors typically have 13–15 years of experience in the field of survey research. Turnover seems relatively low for director positions, given that most have worked between seven and nine years with their current ASO. Prewitt (1983, 142) notes that “it is not unusual to find survey organizations directed by individuals reluctant to be directing.” The results here do not generally support such a claim, but perhaps the reluctant directors (and new directors) were also reluctant to take part in the survey or more difficult to reach.

As we note later in this article, training and educating undergraduate and graduate students is often an important element of a survey research organization. Our survey found that nearly three-fourths of the directors teach courses at their institution. Approximately half (52%) were tenured or tenure-track faculty, 22% were non-tenure-track faculty, 14% were non-faculty administrators, 10% were university/college staff members, and 3% held some other type of position. Female directors were far less likely than male directors to hold tenure-track or tenured positions (31% vs. 67%). In contrast, 44% of female directors held staff or non-faculty administrator positions compared to 11% of male directors. Similarly, 83% of male directors taught semester-length courses compared to 56% of female directors.

WHAT ARE THE METHODOLOGICAL NORMS OF ASOS?

Conducting survey research is arguably among the most labor-intensive methodologies in all of the social sciences. In fact, Prewitt (1983) suggests that survey research is more labor intensive than most methods used in the natural sciences. It requires a variety of tasks, including recruiting and training quality interviewers; maintaining software and equipment; storing, processing, and analyzing survey data; and disseminating results to multiple, diverse audiences. Kennedy, Tarnai, and Wolf (2010, 589) suggest that “survey research mixes science, engineering, and art,” and managers of survey organizations frequently must draw on a range of different techniques to collect valid and reliable data. Many of the organizations that we surveyed have been conducting survey research for more than two decades (the median age of the ASOs was 23 years, with the youngest being only a year old and the oldest having been in operation for 91 years), and the directors who manage these organizations typically have more than a decade of experience in the field of survey research.

Over a half-century ago, Campbell (1953, 228) advised that managing a research organization requires “an adaptability to changes in external circumstances.” We find that ASOs remain adaptable to a rapidly changing environment. ASOs currently employ a variety of methods and techniques to conduct survey research projects ranging from small projects costing $2,000 to larger projects costing $500,000. Table 3 provides an overview of some of the methodological norms among ASOs.

Table 3 Survey Methodology

Target Population

Organizations were asked which geographic area or population they typically target; respondents could choose multiple populations. Ninety-four percent have conducted surveys of state populations, while 69% said they conduct surveys of one or more towns, cities, or counties. About a third (35%) conduct national surveys, and over half (58%) conduct surveys of students or other university populations. A few organizations (8%) mentioned targeting some other population (e.g., specific industries). Only one ASO (1.25%) conducted multinational survey research. Within these geographic areas and populations, ASOs typically sample adults within households (78%), but 43% also target registered voters and 34% use likely-voter models to identify eligible respondents. Footnote 4 Over half (54%) of these organizations said they ask “horse-race” questions about whom respondents plan to vote for. Prior studies have found evidence that polls greatly influence how the media cover elections, causing them to focus on who is winning rather than why (Rhee 1996; Rosenstiel 2005); however, one might also infer that the desire for increased media coverage may influence the choices survey organizations make. Later in the article we note that promoting the university’s brand and producing results for media organizations are seen as important goals of ASOs.

Survey Mode

Approximately 69% of ASOs said their primary mode of conducting surveys was by telephone, followed by 19% primarily using online/Internet surveys. Seven percent use mail surveys and only 1% use face-to-face as the primary method of conducting surveys. Five percent stated they use mixed or multi-mode methods. A follow-up question asked what other survey modes were used. Although only 19% of ASOs use online surveys as their primary mode of collecting data, 73% said online surveys were a method used at least occasionally by the organization. In terms of non-primary modes, 58% said they use mail surveys, 51% use face-to-face, and 35% said they conduct telephone surveys in addition to their primary mode.

Of the 75 ASOs that conduct telephone surveys, almost all (96%) use live callers to conduct their interviews. Only one organization (1.33%) said it uses an Interactive Voice Response (IVR) system, popularly known as robopolls. Two additional ASOs (2.66%) used both live callers to reach cell phones and IVR to reach landlines. Most (70%) conduct their telephone surveys on-site while 15% contract out to another calling center and 15% use a combination of on-site and contract data collection. In-house live caller operations are clearly the norm for ASOs.

Almost all telephone surveys (93%) conducted by ASOs use a dual-frame (landline and cell phone) sampling frame with a median sample makeup of 60% cell phone numbers and 40% landline. The median number of telephone surveys conducted by these ASOs was eight per year. ASOs reported a median sample size per survey of 750 respondents with a median survey length of 13 minutes. The median cost reported per survey was $15,000.
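Combining these medians gives a rough profile of the “typical” ASO telephone survey. The sketch below is a back-of-the-envelope illustration only: it assumes the 60/40 cell/landline split applies to completed interviews as well as sampled numbers, and the cost-per-complete figure is derived here rather than reported in the article.

```python
# A minimal sketch combining the median figures reported above. The cell/landline
# split of completes and the cost-per-complete number are derivations for
# illustration, not statistics from the article.
median_sample_size = 750   # completed interviews per survey (median)
cell_share = 0.60          # median dual-frame composition: 60% cell phone numbers
median_cost = 15_000       # median reported cost per survey, in dollars

# Assumes the 60/40 split of sampled numbers carries over to completed interviews.
cell_completes = round(median_sample_size * cell_share)      # ~450
landline_completes = median_sample_size - cell_completes     # ~300
cost_per_complete = median_cost / median_sample_size         # ~$20

print(f"Cell: {cell_completes}, landline: {landline_completes}, "
      f"cost per complete: ${cost_per_complete:.0f}")
```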

The Internet appears well positioned to grow as a mode for ASOs. Eighty percent (72 of 80) of organizations reported using online surveys to collect data. Samples for these online surveys come from a variety of sources. Over a third of organizations (39%) said they use Survey Sampling International’s (SSI) online panel; 24% use Qualtrics; Footnote 5 10% use Knowledge Networks; 6% use Amazon Mechanical Turk; 6% use YouGov/Polimetrix; and 1% use SurveyMonkey. Thirteen percent mentioned some other source for online samples (e.g., GSR, Marketing Systems Group, ASDE, STS Samples).

WHAT ARE THE GOALS OF ACADEMIC SURVEY ORGANIZATIONS?

Academic survey organizations often have multiple, overlapping, and sometimes conflicting goals and missions. ASOs are in the business of conducting rigorous social scientific research. However, this goal often requires attracting contract work, especially for the ever-increasing number of organizations that are required to be self-funded (Moore 2014). In addition, academic organizations work with faculty and students; an educational mission is consequently often explicitly or implicitly attached to the survey organization.

Organizations that “conduct research as a service at the specification of outside contractors” must balance the requirement of meeting a client’s needs and the desire to conduct research that addresses more traditional academic research goals, such as hypothesis testing (Stahler and Tash 1994). Furthermore, Campbell (1953) notes that research organizations embedded in a “highly centralized hierarchy” [like that of a university] might face additional constraints. Academic organizations frequently must meet a higher threshold regarding the protection of the well-being of human subjects (e.g., Institutional Review Boards).

Directors were given a list of goals that some organizations may have and were asked to state whether they considered each goal to be very important, important, somewhat important, or not important (see table 4). Interestingly, although all of the organizations are “academic” organizations, training and educating undergraduate or graduate students and generating data for peer-reviewed research were not the highest-rated goals. Instead, “promoting your university’s name, brand, or reputation” had the highest share of directors saying the goal was very important (52%). It was followed by a similar promotional goal, “producing results for state and local media organizations,” which 46% rated very important. Prewitt (1983) notes that “prestige” is a common goal for organizations because it facilitates the achievement of other goals.


Table 4 Importance of Academic Survey Organizational Goals

WHAT CHALLENGES DO ACADEMIC SURVEY ORGANIZATIONS SEE FOR THEIR PROFESSION?

As noted above, survey research can be an extremely labor-intensive method of collecting data; furthermore, it is a method facing a rapidly changing technological environment (e.g., cell phones, the Internet). These features create interesting challenges for those who direct academic survey organizations. Our survey instrument asked directors an open-ended question: “What do you think is the greatest challenge faced by public opinion survey research organizations in the United States today?” One of the most commonly mentioned challenges is the problem of declining response rates, especially with regard to telephone surveys. Related to declining response rates is the increasing cost of conducting rigorous survey research, combined with the reality of an uncertain funding environment. Another concern mentioned was the changing technological environment and the use of Internet surveys. In addition, directors voiced concern over the growth in the number of low-quality surveys, which compete for attention with the products produced by ASOs. Table 5 details responses about the challenges facing the academic survey research field.

Table 5 Challenges Faced by Survey Organizations

Declining Response Rates

Academic survey organizations generate survey data critical to the needs of scholars, government agencies, and for-profit and non-profit organizations. However, declining response rates are threatening the quality and reliability of survey information (Fulton 2016). By 2000, nearly half of households had caller ID (Dutwin et al. 2011), and with the near-ubiquitous use of cell phones and voicemail, respondents are much more willing to screen calls. The result has been a decline in response rates: a Pew study found a 9% response rate in 2012, a sharp decline from a 36% rate in 1997 (DeSilver and Keeter 2015). Nearly a third of our respondents mentioned declining response rates as the greatest challenge facing survey research. Related to the dilemma of declining response rates is the ability of ASOs to obtain a representative sample of a target population; 14% of directors mentioned sample quality or representativeness as the primary challenge their organization faced.
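A simple way to see why this decline matters operationally: holding the number of completed interviews fixed, the number of sampled telephone numbers that must be worked grows roughly in proportion to the inverse of the response rate. The sketch below applies the Pew figures cited above to the median ASO sample size of 750 completes reported earlier; it is purely illustrative and ignores eligibility screening and other design details.

```python
# A minimal sketch of why falling response rates drive up fieldwork effort.
# The 1997 and 2012 rates are the Pew figures cited in the text; the 750-complete
# target reuses the median ASO sample size reported earlier for illustration.
target_completes = 750

def numbers_needed(response_rate: float) -> int:
    # Rough count of sampled numbers required for the target number of completes.
    return round(target_completes / response_rate)

for year, rate in [(1997, 0.36), (2012, 0.09)]:
    print(f"{year}: ~{numbers_needed(rate):,} numbers at a {rate:.0%} response rate")
# 2012 requires roughly four times the 1997 effort for the same number of completes.
```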

Cost/Funding Issues

Declining response rates generally mean an increase in the cost of collecting the needed sample size (Peytchev 2013). Rising costs, combined with a severe economic downturn in 2009, placed many academic survey organizations in a precarious position because of widespread cuts in funding to public universities (Moore 2014). In addition, government agencies also saw cuts, which often translated into a reduction in the number of research contracts (Guterbock 2010). Gone are the days of ever-increasing government funding for public higher education, a trend that fueled growth in many academic survey centers (O’Rourke, Sudman, and Ryan 1996).

We started this article with a quote from Henry Brady, who directed the Survey Research Center at Berkeley from 1999 to 2009; that ASO was closed by the university in 2010. Our survey uncovered a number of closures of other ASOs, including operations at the University of California, Santa Barbara (2014); the University of North Texas (2014); Wake Forest University Footnote 6 ; SUNY Buffalo Footnote 7 ; Florida International University (2011); Ohio State University (2004); and Indiana University–Purdue University Indianapolis (2014). The chronology of these closures coincides with the increased practice of contracting out to private survey research firms (Schoenherr, Ellram, and Tate 2015). Not surprisingly, 16% of directors mentioned cost or funding issues as the greatest challenge facing survey organizations.

Changing Technologies

Losch (2013) notes the challenge survey organizations face in keeping pace with the constantly changing technological landscape. Directors of academic organizations may be slow to adopt new technologies, having been trained using older methods. Babbie (2015) notes academics’ earlier skepticism of moving from face-to-face surveys to telephone surveys. Similarly, academics resisted the use of focus groups for nearly 30 years after market researchers discovered their usefulness (Krueger and Casey 2009). Until recent years, most academics did not take Internet surveys very seriously. The New York Times’s collaboration with YouGov in 2014 was criticized as a reversal of its position that for a poll to be “worthy of publication” it “must be representative, that is, based on a random sample of respondents.” The president of the American Association for Public Opinion Research (AAPOR), the flagship organization in the field, wrote a letter warning of the implications of adopting such methods (Craighill and Clement 2014).

The Internet, of course, creates opportunities for new ways of reaching populations and getting responses from individuals, especially regarding visual information. However, academia has sometimes felt uneasy about changing technologies and methods. Guterbock (2013) recalls an AAPOR conference in the 1990s where Don Dillman, a professor of sociology at Washington State University and deputy director of the Social and Economic Sciences Research Center (SESRC), joked: “I’m here today to talk about the Internet, otherwise known as the Survey Methodologist Re-employment Act of 1998.” The fear was that the Internet would create numerous “Do It Yourself” (DIY) survey packages or tools that would compete with ASOs.

Guterbock (2013) suggests that such fears were somewhat exaggerated. Evidence suggests academic survey organizations have started to embrace and adapt to the Internet rather than seeing it only as a threat. Brown and Johnson (2011) found that more and more organizations are using the Internet to sample populations, at least in conjunction with other survey modes. As noted earlier, our survey found that 19% of ASOs use online surveys as their primary mode of data collection, while 73% use them at least occasionally. This suggests a growing acceptance of online surveys as a valid method of data collection; however, the survey also found that approximately 5% of directors mentioned new technologies or online surveys as the greatest threat to survey organizations.

HOW DO ACADEMIC SURVEY ORGANIZATIONS VIEW POLL AGGREGATORS?

In response to our question about challenges facing ASOs, 10% of directors mentioned the challenge of communicating survey results to the public; others mentioned the public’s growing skepticism of survey research. Although no one specifically mentioned polling aggregators such as FiveThirtyEight or the Huffington Post as challenges facing ASOs, we would argue that the popularity of these websites is related to the challenges of communicating results to an ever-skeptical public. Blumenthal (2014) noted a USA Today headline that read: “Election aftermath: How’d pollsters like Nate Silver do?” The headline treats Silver, an aggregator of polls, as a pollster. This highlights that not only does the public not always understand what survey organizations do, but journalists might also have misperceptions of survey research. Blumenthal also hints that individuals in the survey research profession might begin to resent poll aggregation because aggregators receive more attention from journalists than do the survey organizations that actually collected the polling data. As one member of the profession suggested, “it is much easier, cheaper, and mostly less risky to focus on aggregating and analyzing others’ polls” (p. 297).

We asked directors a series of questions about poll aggregators and found a diverse set of opinions (see table 6). A majority (55%) of directors agreed that poll aggregators increased interest in survey research among the public and the media, an opinion suggesting a win-win for both aggregators and ASOs. However, some 49% also believed that aggregators led to an increased emphasis on “horse-race” questions, which political scientists do not always regard as of great interest to serious scholars. Many directors (43%) agreed that such websites shift attention from individual surveys to the aggregator (which may explain why journalists might consider Nate Silver a pollster). Some (34%) also agreed that poll aggregators help give low-quality surveys legitimacy.

Table 6 ASO Leaders Opinion of Poll Aggregators

Blumenthal (2014, 298) seems to acknowledge this as well, observing that most of the “fuel for polling aggregation … comes from inexpensive automated polls whose producers face a different set of costs and incentives than traditional pollsters.” However, poll aggregators do score or weight the surveys they use in their aggregations. Therefore, such websites may actually help identify ASOs that conduct rigorous survey research. Nearly a quarter of directors seem to agree with this logic and believe poll aggregators enhance the perceived value of high-quality surveys. Furthermore, the success of poll aggregators in predicting elections may also increase the confidence the public has in surveys (23% of directors agreed with this statement, while 22% disagreed and 55% neither agreed nor disagreed).
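To make the weighting idea concrete, the sketch below shows one simple way an aggregator might combine polls, weighting each estimate by sample size times a quality score so that rigorous surveys count for more. This is an illustration of the general approach described here, not any aggregator’s actual model, and all poll values in it are invented.

```python
# A minimal sketch of the kind of scoring and weighting the text attributes to
# poll aggregators. Real aggregators use proprietary models (pollster ratings,
# house effects, recency decay); here each poll is weighted simply by sample
# size times an assumed 0-1 quality score. All values below are invented.
from dataclasses import dataclass

@dataclass
class Poll:
    estimate: float     # e.g., candidate support as a proportion
    sample_size: int
    quality: float      # assumed 0-1 methodological quality score

def aggregate(polls: list[Poll]) -> float:
    # Weighted average of poll estimates; higher-quality, larger surveys count more.
    weights = [p.sample_size * p.quality for p in polls]
    return sum(w * p.estimate for w, p in zip(weights, polls)) / sum(weights)

polls = [
    Poll(estimate=0.52, sample_size=750, quality=0.9),   # rigorous dual-frame survey
    Poll(estimate=0.47, sample_size=2000, quality=0.3),  # inexpensive automated poll
]
print(f"Weighted aggregate: {aggregate(polls):.3f}")
```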

SUMMARY

ASOs are more akin to research centers than to academic departments. As such, they are relatively new organizations within the long history of universities (Stahler and Tash 1994). Converse (1987) estimated that only about eight academic survey organizations existed in the United States in 1960 and 20 by 1970. By 1987, Sudman and Bradburn (1987) counted approximately 50 ASOs, and our study today generated a list of 136 academic survey organizations. Guterbock (2010) argues that in no other country are academic survey centers as numerous as in the United States.

Yet these survey organizations face numerous challenges. This study highlights several of them, including declining response rates, rising costs of obtaining representative samples, an uncertain fiscal environment, a changing technological landscape, and overlapping and competing missions and goals. But ample reasons remain for optimism about the future of ASOs. The demand for high-quality survey data is increasing alongside the complexity of the methodologies used to collect such data, and academic survey organizations are well positioned to meet this demand. Moore (2014) argues that these organizations are valuable research partners because they combine expertise in survey methodology with infrastructural capacity. Although ASOs are typically small compared to other research centers at universities, their small size provides a certain nimbleness, allowing flexibility, innovation, and experimentation (Guterbock 2010). Given the rapidly changing environment that public opinion scholars face today, the nimbleness and flexibility of ASOs are exactly what the fields of survey research and political science need.

SUPPLEMENTARY MATERIAL

To view supplementary material for this article, please visit http://dx.doi.org/10.1017/S104909651600144X. Footnote *

Footnotes

1. The complete questionnaire is available in Online Materials.

2. See Online Appendix A for a description of the sources used to generate the sampling frame of directors of ASOs.

3. See Online Appendix B for a full list of organizations in our sampling frame.

4. A majority of organizations (60%) conducted surveys in languages other than English. Among these organizations, all but one mentioned Spanish as another language used. Others mentioned French, Creole, Portuguese, Korean, and Vietnamese.

5. Qualtrics uses multiple panel providers, including Survey Sampling International’s (SSI) online panel.

6. Closed sometime around 2007-2008.

7. Closed sometime in the 1970s (Hunt 1980).

* The URL to access Supplementary Material for this article has been corrected since the original publication. An Erratum detailing this change was also published (DOI: 10.1017/S1049096516002481).


REFERENCES

Barton, Allen H. 2001. “Paul Lazarsfeld as Institutional Inventor.” International Journal of Public Opinion Research 13 (3): 245–69.
Blumenthal, Mark. 2014. “Polls, Forecasts, and Aggregators.” PS: Political Science & Politics 47 (2): 297–300.
Brady, Henry. 2000. “Contributions of Survey Research to Political Science.” PS: Political Science & Politics 33 (1): 47–57.
Brown, Ethan and Johnson, Timothy P. 2011. “Diffusion of Web Survey Methodology, 1995–2009.” Survey Research 42 (1): 1–3.
Campbell, Angus. 1953. “Administrating Research Organizations.” American Psychologist 8 (6): 225–30.
Converse, Jean. 1987. Survey Research in the United States: Roots and Emergence 1890–1960. Berkeley, CA: University of California Press.
Craighill, Peyton M. and Clement, Scott. 2014. “How the Polling Establishment Smacked Down the New York Times.” The Washington Post, August 4. https://www.washingtonpost.com/news/the-fix/wp/2014/08/04/how-the-polling-establishment-smacked-down-the-new-york-times/
DeSilver, Drew and Keeter, Scott. 2015. “The Challenge of Polling when Fewer People are Available to be Polled.” July 21. http://www.pewresearch.org/fact-tank/2015/07/21/the-challenges-of-polling-when-fewer-people-are-available-to-be-polled/ (July 4, 2016).
Dutwin, David, Herrmann, Melissa, Ben Porath, Eran, and Sherr, Susan. 2011. “A Collection of Caller ID Experiments.” Survey Practice 4 (4): 1–6.
Fulton, Brad R. 2016. “Organizations and Survey Research: Implementing Response Enhancing Strategies and Conducting Nonresponse Analyses.” Sociological Methods & Research. (Forthcoming; published online January 19, 2016.)
Groves, Robert M. 2011. “Three Eras of Survey Research.” Public Opinion Quarterly 75 (5): 861–71.
Guterbock, Thomas M. 2010. “The Future of the Academic Survey Research Organization.” Survey Research 41 (1): 1–5.
Guterbock, Thomas M. 2013. “Welcoming the DIY Survey Sector to Campus.” Survey Research 44 (1): 1–5.
Hunt, Raymond G. 1980. “The University Social Research Center: Its Role in the Knowledge-Making Process.” Science Communication 2 (1): 77–92.
Kennedy, John M., Tarnai, John, and Wolf, James G. 2010. “Managing Survey Research Projects.” In Handbook of Survey Research, eds. Marsden, P. V. and Wright, J. D., 575–90. Bingley, UK: Emerald Publishing.
Losch, Mary E. 2013. “Survey Frontiers: A Brief Romp through the Tech Terrain and Its Changing Landscape.” Survey Research 44 (3): 1–4.
Moore, Danna L. 2014. “Detecting Winds of Change: The Role of Academic Survey Research Organizations in Academe.” Survey Research 45 (2): 1–3.
O’Rourke, Diane, Sudman, Seymour, and Ryan, Marya. 1996. “The Growth of Academic and Not-for-Profit Survey Organizations.” Survey Research 27: 1–5.
Prewitt, Kenneth. 1983. “Management of Survey Organizations.” In Handbook of Survey Research, eds. Rossi, P. H., Wright, J. D., and Anderson, A. B., 123–44. New York: Academic Press.
Rhee, June Woong. 1996. “How Polls Drive Campaign Coverage: The Gallup/CNN/USA Today Tracking Poll and USA Today’s Coverage of the 1992 Presidential Campaign.” Political Communication 13 (2): 213–29.
Rosenstiel, Tom. 2005. “Political Polling and the New Media Culture: A Case of More Being Less.” Public Opinion Quarterly 69 (5): 698–715.
Smith, Tom. 2005. “General Social Survey.” In Polling America: An Encyclopedia of Public Opinion, Vol. 1: A–O, eds. Best, S. J. and Radcliff, B., 271–75. Westport, CT: Greenwood Publishing.
Stahler, Gerald J. and Tash, William R. 1994. “Centers and Institutes in the Research University: Issues, Problems, and Prospects.” The Journal of Higher Education 65 (5): 540–54.
Sudman, Seymour and Bradburn, Norman M. 1987. “The Organizational Growth of Public Opinion Research in the United States.” Public Opinion Quarterly 51: S67–S78.
Wright, James D. and Marsden, Peter V. 2010. “Survey Research and Social Science: History, Current Practice, and Future Prospects.” In Handbook of Survey Research, eds. Marsden, P. V. and Wright, J. D., 3–26. Bingley, UK: Emerald Publishing.

