
Why they shared: recovering early arguments for sharing social scientific data

Published online by Cambridge University Press:  15 March 2021

Emily Hauptmann*
Affiliation:
Western Michigan University E-mail: emily.hauptmann@wmich.edu

Argument

Most social scientists today think of data sharing as an ethical imperative essential to making social science more transparent, verifiable, and replicable. But what moved the architects of some of the U.S.’s first university-based social scientific research institutions, the University of Michigan’s Institute for Social Research (ISR), and its spin-off, the Inter-university Consortium for Political and Social Research (ICPSR), to share their data? Relying primarily on archived records, unpublished personal papers, and oral histories, I show that Angus Campbell, Warren Miller, Philip Converse, and others understood sharing data not as an ethical imperative intrinsic to social science but as a useful means to the diverse ends of financial stability, scholarly and institutional autonomy, and epistemological reproduction. I conclude that data sharing must be evaluated not only on the basis of the scientific ideals its supporters affirm, but also on the professional objectives it serves.

Type
Research Article
Copyright
© The Author(s) 2021. Published by Cambridge University Press

Few institutions built by and for social scientists have had the enduring success and disciplinary power of the Institute for Social Research (ISR) and the principal data-sharing entity that grew out of it, the Inter-university Consortium for Political and Social Research (ICPSR).Footnote 1 ISR began in 1949 as the organizational “umbrella” for the Survey Research Center (SRC) and the Research Center for Group Dynamics (RCGD).Footnote 2 By the end of the twentieth century, ISR had become a powerful, prosperous, and semi-autonomous entity overseeing five separate research centers; ISR currently calls itself one of “the world’s largest and oldest academic survey research organizations.”Footnote 3 Like its parent institution, ICPSR is also a large, vigorous, even dominant institution in U.S. social science today: hundreds of institutions pay thousands of dollars in annual membership fees for access to ICPSR data as well as to its summer training programs in quantitative methods for social scientists.Footnote 4

When ISR began, however, its work had little standing among academic social scientists; if anything, political scientists knew even less about it. From the late 1940s to the early 1960s, the organizations that made up ISR occupied a small, insecure space on the periphery of the University of Michigan. The oldest of them, the Survey Research Center (SRC), was a spin-off of a federal government research unit in the Department of Agriculture created during World War II. Shortly after the U.S. Congress cut its federal funding in 1946, some of its staff affiliated with the University of Michigan. This was an arm’s-length arrangement on the university’s part; it made no financial commitment to SRC and offered no secure academic appointments to its staff (Converse 1987, 340–349; Frantilla 1998, 16–21). As SRC and RCGD, its sibling institution at ISR, struggled to gain a more secure place in the university and its academic culture during their first decade at Michigan, they were sustained by income from significant federal government contracts (Converse 1987, 353–356; Frantilla 1998, 22–24 and 29–30). ICPR followed a similar path in the 1960s, securing substantial National Science Foundation (NSF) support soon after its debut. By the 1960s, when the volume of federal money flowing into ISR and ICPR grew, being on the margins of the University of Michigan’s core academic structures was no longer a vulnerability. It had become a source of strength.

When people with ties to ISR and ICPSR explain their longevity and power, they focus less on these institutions’ development on the margins of the academy than on the distinctive data sharing practices of their founders. In the mid-twentieth century, this story goes, most social scientists thought of their data as their own personal intellectual property and were therefore unwilling to share it with others. The social scientists at ISR, however, particularly those central to building what became ICPR, argued that data could be shared without disadvantaging those who had produced it – and, for the sake of advancing social science, it should be shared. Going a step beyond Merton’s (1973, 273–275) celebration of that element of the “scientific ethos” that impels scientists to share their discoveries and treat “scientific knowledge as common property,” people affiliated with ISR and ICPSR praised their founders for sharing not just their results but also the data that were used to arrive at those results. Consider the following three examples.

The first comes from a 1998 University of Michigan documentary film made to commemorate ISR’s fiftieth anniversary. In the portion of the film devoted to ICPSR, its then director, Richard Rockwell, underscores how novel the institution was in its early years. At a time when many social scientists thought sharing their data was like “giving away your capital,” Rockwell emphasizes that ICPR cultivated a new ethic. Not only could data be shared without “anything being given away”; actively sharing data might even increase its value (Rockwell 1998). Counterintuitive though this ethic initially seemed, Rockwell says social scientists gradually began to see that sharing their data would benefit not only the social scientific community but individual scholars as well.

The second example comes from long-time staff member Erik Austin’sFootnote 5 recent account of the early years of ICPR. According to Austin, the prevailing attitude towards data sharing in the mid-twentieth century was an unfavorable mix of hostility and incomprehension.

In the following passage, I quote Austin, who is himself quoting someone else: the bracketed insertions are my own, the rest are Austin’s words, and those in single quotation marks are his quotations of Warren Miller.

“Scientific data from all disciplines were jealously guarded by their developers/creators as private resources…. [Warren] Miller [one of the founders of ICPR] even likened [data sharing], in retrospect, to a violation of basic economic precepts: data were the scientist’s capital, and ‘they weren’t about to share their capital’.” Though many social scientists now readily accept “this data sharing motive as a prerequisite of the ‘scientific ethic’ of verification, replication, and validation,” this was not a widely held ethic when ICPR began. Then, “the concept of giving access to all interested scholars to one’s basic (micro) data was so foreign as to be considered ‘revolutionary’.” (Austin 2011)

Finally, Nancy Burns, Warren E. Miller Professor of Political Science and Chair of the University of Michigan’s Political Science Department, credits Miller’s “intellectual vision and organizational genius” with transforming the 1950s Michigan election studies first done by the SRC into the American National Election Studies (ANES). Today, the ANES is a massive, cumulative dataset that has been supported by NSF as a national resource since 1977.Footnote 6 Miller’s early work, Burns says, laid the groundwork for the widely used and financially secure “public good” that ANES is today (Burns 2006, 11). Though it took some time to realize on a larger scale, Burns argues that this vision was already fully formed at ISR in the early 1950s: “In a departure from standard practice at the time, the Michigan scholars shared their data with other scholars from the beginning” (ibid., 8). Burns traces this data-sharing ethic back to almost the origin of ISR, the grant proposal written by SRC social scientists for what became the 1952 election study (ibid., 8, note 25).Footnote 7

Retrospective accounts of the origins of powerful institutions favor simple, elegant narratives that smooth out the untidy snarl of uncertainties and multiple motives that so often attend the construction of new things. The accounts discussed above emphasize ICPR’s founders’ prescient sense of the scientific merits of data sharing, underscoring both its novelty in the 1950s and 1960s and its salutary effects on social science in the decades to come. But on my reading of archived records and personal papers as well as the recollections of several people centrally involved in the creation of ICPR, data sharing during these early years was never understood primarily as an ethical imperative. Instead, at ISR/SRC and then later at the early ICPR, data sharing was regarded as a way to defray or disperse the costs of producing data and possibly as a path to greater intellectual autonomy. It was also understood as one way to promote the analytic methods and theoretical outlook its producers believed should be central to social science. Data sharing began as something ISR and SRC researchers did to survive. Gradually, as their fortunes improved, they built it into ambitious bids for greater institutional autonomy and epistemological reproduction. And by the early 1960s, they won substantial grants from federal government sources to support their data-sharing efforts (Frantilla 1998, 51–52, 56). But neither at mid-century nor in retrospect did they say they advocated data sharing for the sake of ideals like Merton’s “communism” (1973) among scientists or the more recently affirmed standards of transparency, verification, and replication.Footnote 8 Clothing data sharing in social science’s noblest garb is the work of later generations.

As historically and sociologically suspect as the examples I cited above may be, they illustrate the power ICPSR now has to write its own history. Accounts that feature the heroic prescience of ICPSR’s early leaders, however, ultimately say more about its current power than about why it became powerful and how it and its forerunners’ early approaches to data sharing might have helped them become so. So why and to what ends did the founders of ISR and ICPSR share their data? Relying primarily on ISR and ICPSR’s archived records as well as the personal papers and oral histories of some of their central figures, I explore when and why the possibilities of sharing data were first discussed. I conclude that the earliest discussions of data sharing in the 1950s and early 1960s unfolded to address two basic problems: the economic viability of the academically marginal research institutes that made up ISR and the reproduction of the approaches to social science those affiliated with them favored.

Throughout this paper, I situate the early institutional lives of ISR and ICPR in the broader context of several developments affecting social scientific research in the U.S. during the Cold War. First, as both Joy Rohde (2013) and Sonja Amadae (2003) have shown, Cold War imperatives made it possible to do significant social scientific research on the margins of the academy. Like Rohde in her analysis of the loose affiliation between the U.S. Army’s Special Operations Research Office (SORO) and American University, and Amadae in her discussion of the academic influence of the RAND Corporation and its numerous alumni, I discuss how ISR’s early contributions to academic social science depended upon the work its staff did for government and corporate clients. Second, I note how uncertainties about who the patrons of postwar social scientific research would be ultimately boosted the academic power and credibility of ISR and especially the newly formed ICPR. As Thomas Gieryn (1999), Hunter Crowther-Heyck (2006), and Mark Solovey (2013) have all shown, social scientists had little access to NSF support before the 1960s; those who secured such support early were able to translate the often relatively large, long-term NSF grants into considerable disciplinary power. When ICPR secured its first large commitment from the NSF in the 1960s shortly after its founding, it became an important intermediary between social scientists and federal government funding.Footnote 9

Another relevant aspect of the story of ICPR – one that I only touch on towards the end of this paper – concerns how this institution and the approaches associated with it became so important to political science in the U.S. Initially, ICPR’s parent institutions were staffed by many more sociologists and social psychologists than political scientists; their relations with Michigan’s political science department were distant. But just before Miller officially launched ICPR, he also helped begin a graduate program in political behavior in cooperation with Michigan’s political science department.Footnote 10 ICPR and Michigan’s other social science research institutions would prove crucial to the success of that program and, eventually, to the study of political behavior in U.S. political science.

ICPR’s early, steady access to NSF patronage and its alliance with Michigan’s political science department each proved vital to its becoming so dominant an institution. Data sharing alone was not the key to ICPR’s survival; acquiring data sets and building a membership base of people trained to use them were more important in ICPR’s first decade. Indeed, during that time, ICPR had to contend with stiff competition from other institutions that also archived electoral and public opinion data.Footnote 11 In what follows, therefore, I argue that the data sharing at ISR/SRC that enabled the creation of ICPR was inspired less by the Mertonian ideal of the “communism” of science than by these not-quite-yet-academic researchers’ struggle to stay afloat financially and win academic credibility (Camic et al. 2011; Gieryn 1999). Attending to what the researchers at ISR/SRC said about the particular pressures and constraints they faced and how they contrived to ease and overcome them gives us a fuller sense of why they shared.

Data sharing in the pre-ICPR years

Though emphasizing the scientific merits of sharing data is relatively new, talking about data in economic terms is not. To compare data to economic entities like capital, treasuries, and public goods as the examples cited above do is not peculiar to retrospective accounts; it has a history as long as ISR’s. The persistent recurrence of these comparisons between data sharing and economic practices offers a window onto the early institutional lives of these two entities when their solvency and their futures were far from assured. Looking through that window, I believe, allows one a glimpse not only of the early institutional lives of both organizations but also of how new data sharing practices and their institutionalization under ICPSR transformed political science.

Because ISR stood so precariously on a narrow financial ledge in the 1940s and 50s, financial necessity led its members to discuss how much of the data they produced should be shared, and with whom, in economic terms (Converse 1987, 340–349). That is, these issues were understood explicitly as economic ones from the outset, parts of the larger problem of how ISR was to continue to support itself. ISR’s uncertain prospects also prompted its leaders to think about how to cultivate a steady and robust demand for what it produced; the earliest sustained data-sharing plans, I believe, were embedded in strategies to cultivate such demand. By the early 1960s, arrangements for sharing data were being crafted in tandem with arrangements for sharing the costs of producing and distributing it as well as training people to use it. Understanding what ICP(S)R and the ANES have meant for political science, therefore, means pushing past retrospective praises of the scientific merits of the data sharing they enabled to make visible the significant costs of the institutional structure that made it possible – and how these costs were and are borne by the discipline as a whole.Footnote 12

Data shared at the discretion of the funder

Financial necessity rather than a commitment to social scientific transparency drove the earliest instances of data sharing at ISR. The centers that made up ISR in the 40s and 50s did not have the means to do survey research on their own; nearly all the survey work they conducted was supported by either contracts or grants from outside the university. At the same time, ISR’s arrangement with the University of Michigan barred it from acting like a commercial survey research firm; the University stipulated that ISR not enter into commercial contracts in which the survey data it produced became the exclusive property of the contracting corporation. So, for example, if a center within ISR took on a contract from a particular car company or insurance firm, the contract would explicitly state that the data produced would become the property of a broader trade association, not the contracting corporation itself (ISR Records 1947, 1). Cases like these are the first important examples of data produced by ISR being shared; significantly, the beneficiaries of such sharing were large industrial or financial firms, not academics. These early sharing practices grew out of ISR and University of Michigan administrators’ desire to distinguish ISR from wholly commercial research firms as well as to protect the university’s non-profit status. Both because ISR could not itself pay for the research its members wished to do and because it was striving to find a niche in the academic rather than the commercial world, the data it produced was in many cases shared within the wider circle to which the funder belonged.

In these cases, what was shared and with whom was largely at the discretion of the funder, not ISR. This was also true of the projects funded by federal agencies. For example, in the case of a 1948 classified project funded by the newly created Air Force and CIA, researchers at ISR emphatically steered requests they received for reports and data back towards the sponsors – even when those requests came from other branches of the military (ISR Records 1948a).Footnote 13 Anyone who requested that ISR share its reports and data from this project received the same unambiguous reply: what was shared was not up to ISR, but to the sponsoring agency. Part of the reason ISR had no discretion over sharing data from this particular project was that it was classified. Many of ISR’s early projects, however, were not; for instance, there was nothing classified about the surveys ISR did for the Federal Reserve (Frantilla 1998, 29–30) or for a range of large firms. In the case of all of these projects, ISR’s position was still so similar to that of a commercial survey research firm that its funders simply assumed they had a proprietary right to the data and the reports. The stipulation that these be shared within a wider corporate circle was the first small move ISR (and the University) made to distinguish ISR’s work from what for-profit research firms did. Still, how data and results were shared was the responsibility and at the discretion of the funder.Footnote 14

Data shared at ISR/SRC’s discretion

Unlike with the corporate and federal contract projects discussed above, ISR/SRC did have more discretion in sharing data from its grant-funded projects. In these cases, the SRC researchers who shared their data with others did so for reasons much more complex than a desire to invite other social scientists to verify and build upon their work. Before the creation of ICPR, requests for data were directed to individual researchers at ISR/SRC; their responses varied widely, suggesting that in the 1950s there were as yet no well-established professional norms for sharing data. Still, there are a few discernible patterns in these disparate responses. First, SRC researchers were somewhat more likely to share their data with those in a position to reciprocate, such as those working at similar research centers (like Columbia’s Bureau of Applied Social Research [BASR] or Chicago’s National Opinion Research Center [NORC]) or officials from the Bureau of the Census; still, they sometimes said “no” even to fellow survey researchers. More consistently, SRC researchers denied a good number of requests from others who had no data of their own to offer or no access to computing facilities to analyze ISR data.Footnote 15 In retrospect, both Warren Miller and Philip Converse complained of how time-consuming vetting the numerous requests for data had been for them in the 1950s and early 60s.Footnote 16 This too suggests there was no center-wide policy for how to deal with such requests; the de facto policy seemed to have been that only those primarily responsible for a study should be allowed to decide when and with whom its data was shared. So, while several decades later both Miller and Converse could speak derisively of those miserly mid-century scholars wary of sharing their data, these were judgments of a more recent vintage. In the absence of well-developed norms about how and what to share, both were initially cautious about sharing their own data in their first years with SRC.Footnote 17

In practice, finding a balance between holding onto SRC data long enough to be the first to publish analyses of it and sharing that data with other scholars was not easy. Prior to publishing their own analyses, SRC researchers were willing to share their data with others but only after being assured that other researchers’ interests differed sufficiently from their own to obviate the risk of being scooped. And although SRC researchers did often talk about their data as a quasi-public good to be shared within the broader social scientific community, they also made clear that they should have the final say about how and with whom it was shared.

Towards the end of the 1950s, several developing but not fully realized projects complicated this balancing act still further: the planned publication of The American Voter (1960) as well as the plans to develop what became ICPR. An early 1959 letter from Miller to Angus Campbell, one of the American Voter authors as well as the then-director of the SRC, dwelt upon these difficulties. Miller informs Campbell that Avery Leiserson, a prominent political scientist at Vanderbilt, had written to ISR Director Rensis Likert to request all of SRC’s data for the elections of 1952, 1954, 1956, and 1958; Miller calls Leiserson’s letter “slightly strange” for its “presumption that there would be no problems involved in sending all these data to him” (Miller Papers 1959, 1).Footnote 18 Miller then lays out his plan for responding to Leiserson – a plan that included speaking with him over the phone to conduct more “detailed negotiation” about the terms on which SRC might grant Leiserson access to the data from 1954 on. For Miller, the red flag was Leiserson making clear he wanted access to these data in part to publish his own analyses of them. This Miller deems clearly out of bounds and assumes Campbell shares his view.

Nevertheless, Miller proposes that he respond to Leiserson with something far more nuanced than a “no” or a “how dare you?” Instead, Miller suggests that he tell Leiserson that he can have the data from 1954 “if we can be allowed to veto analyses that would encroach on the material scheduled for ‘The American Voter’.” Reserving the SRC’s right to veto any analyses Leiserson might undertake, Miller says, amounts to “protect[ing] our investment in comparative presidential and congressional voting behavior.” Acting on the imperative to “protect” the group’s “investment” in the data prior to the publication of The American Voter was fairly simple; it required careful scrutiny of what data was shared and what others planned to do with it.

Miller acknowledges, however, that it was more difficult to know how best to advance plans for ICPR in this case. On the one hand, he says he does not simply want to give Leiserson (and, by implication, other requestors) what he’s asked for since “our apparent willingness to ‘give away’ data destroys a minor part of the prize which we were thinking of offering for participation in the consortium” (Miller Papers 1959, 2).Footnote 19 Although Miller’s plans for how to structure ICPR were still at an early stage at the time of this letter, his remark suggests that the kind of sharing he envisioned among members of “the consortium” was far from “giv[ing] away” data with no expectation of return. But on the other hand, Miller muses that given how much his plans for ICPR hinged upon people and institutions outside ISR joining it, it might be smart for ISR to do all it can to “enhanc[e] the popular image of us as a generous and cooperative institution.” Doing so, Miller says, might be particularly prudent since he reports hearing some grumbles from political scientists outside Michigan about ISR’s “non-participation in the Roper Collection at Williams.”Footnote 20 “This makes me think it even more necessary that we avoid creating any impression of a dog-in-the-mangerFootnote 21 attitude on our part.” And if that meant “giv[ing] away” data to a professionally prominent colleague who asked for it, so be it.

Miller’s response in this case was wary, carefully calibrated not to undermine any of the competing objectives he and his colleagues were trying to realize. It indicates, I believe, that at this time, no ethic that made data sharing a scientific imperative had yet emerged. Sometimes ISR/SRC researchers did share the data that it was up to them to share. But they did so cautiously and for a wide array of reasons in which the ethical imperative to share data played a minor role. Institution building and increasing their professional influence were more important.

Sharing data means sharing more than data

In the years just before and after the founding of ICPR, its architects discussed several different kinds of sharing regimes. All of these were more ambitious than simply sharing data. Some focused on how survey researchers from across the country could share one principal survey research facility; others tried to imagine how researchers from different parts of the country could share either the work of designing surveys or the question-space on those surveys. These ambitions were at least partially realized by ICPR. Even in its earliest years, ICPR was designed to do more than just share data – its members also shared the costs of cleaning, standardizing, preserving, and distributing that data as well as the costs of training people to make use of it.Footnote 22 For its advocates, such extensive sharing of (more than) data was not only financially prudent; it also promised positive transformations in social science.

Several plans actively promoted by Campbell and others proposed that survey researchers share not only the data from completed projects but what might be called the means or the factors of data production. One proposed sharing regime aimed to create and maintain a single central survey research facility in the U.S. whose resources, staff and technical support researchers across the country could share. To explain this vision, Campbell and others said this facility would offer something similar to what the Mount Palomar Observatory in southern California provided to astronomers: the opportunity to do research with equipment far too expensive for individual universities to provide. Such an approach to sharing the means of data production strongly appealed to many survey researchers; it promised them far more control over their work than doing contract research allowed.Footnote 23 This envisioned sharing regime, therefore, was a bid for greater intellectual autonomy, not more readily verifiable or replicable social science.

Campbell and others also advocated that survey researchers share the work of designing and contributing questions to one large “omnibus” survey administered annually to a sample of several thousand persons.Footnote 24 That researchers might share the work of designing a survey followed easily from the envisioned “Mount Palomar” national survey research facility; if there were such a facility, the researchers making use of it could do collaborative projects as well as individual ones. And sharing the question-space on one substantial survey, Campbell and others argued, would be much less costly and time-consuming than an array of individual researchers conducting surveys on their own (Campbell Papers 1957a, 6; Faris proposal, appended to Campbell Papers 1956a, 1). Even if they were not based at one national survey research center, contributors to such an omnibus survey could share some of the factors of data production: the interviewers, coders, and data cleaners who put surveys into motion and made their results widely usable. For those who advocated it, however, this sort of sharing had more than efficiency or cost-effectiveness to recommend it. They also suggested that, by bringing social scientists from different disciplines into close intellectual contact, such an omnibus survey might make possible the “general knowledge” to which many mid-twentieth century social scientists aspired (Campbell Papers 1957a, 6–7; Faris proposal, appended to Campbell Papers 1956a, 1).Footnote 25 But in the late 1950s, Campbell and others still had to speculate about how social science might be improved by a regularly conducted national survey and the data it would yield. The intellectual autonomy of being able to design and control a regularly conducted survey was itself still an aspiration for mid-twentieth century survey researchers. Before they could become concerned with the ethical imperative to share data, they would have to establish more control over its production than they had in the 1950s.

Exciting as these ideas for sharing regimes seem to have been to many social scientists in the late 1950s and early 60s, the NSF’s Henry Riecken explicitly rejected the Mount Palomar parallel during a 1960 conference devoted to planning the future of survey research. “The analogy with government provided facilities in the physical sciences – e.g., in astronomy – which social scientists bring up from time to time, is not a good one,” Riecken commented. “It should be remembered … that the United States government owns the telescope outright, and that a survey research center is not primarily bricks and mortar, but peoples’ salaries” (Campbell Papers 1960d, 7). With the NSF representative’s dismissal, this particular idea for sharing the means of data production reached a dead end.Footnote 26

But the other idea – sharing the design, space, and supporting infrastructure of one large omnibus survey – endured. Although not vigorously pursued until perhaps a decade later, this approach to sharing the means (or, in this case, the factors) of data production became central to the ANES and the General Social Survey (GSS). These well-known annual surveys have been large, collaborative efforts for some time; the work of developing and overseeing their content and use is done by teams of researchers (Miller 1988, 244; Burns 2006). The sharing regimes on which both of these formidable data-producing projects are based may be traced back to these 1950s discussions of the various ways social scientists could share more than data.Footnote 27

When Warren Miller recalled the circumstances leading up to the founding of ICPR, he underscored that the “Consortium,” as it was called in its early years, was primarily an arrangement for sharing the costs of not only the data but also the training programs and technical expertise that ISR offered to social scientists across and outside the U.S. By working out ways for those who used ISR resources to “underwrite” or “subvent” their maintenance and production, Miller said that what members got in exchange was “access.” In response to interviewer Erik Austin’s comment that “it must have seemed strange at the time” to ask institutions to pay to belong to ICPR, Miller responded that it did not seem so to him. Miller explained that he came to see that “knowledge is power, and that information is a commodity” and that when the latter is in short supply, “it ought to command a price” (Miller 1997, 4 [8]).

But to persuade other institutions that they ought to contribute to ICPR by sharing the cost of its operation, Miller also had to persuade them that they would be getting something valuable in return.Footnote 28 In two separate oral history interviews conducted nearly a decade apart, Miller likened what turned out to be his multi-year persuasive mission to the life of Willy Loman, Arthur Miller’s harried travelling salesman.Footnote 29 Calling himself “the Willy Loman of political science,” Miller recalled that his travels to universities across the U.S. in the 1950s had enabled him to build a broad network of contacts out of which grew the initial group of institutions that made up the consortium. As a “salesman,” Miller had given lectures with a “missionary-like, if not messianic message” about the “tremendous power” of survey research – the pitch being that the resources ICPR would soon be offering for a price were immensely valuable and therefore worth the significant annual fee (Miller 1997, 4 [6–7]).

In an earlier oral history interview for the American Political Science Association’s Oral History project, Miller (1988) discusses how sharing data was part of his idea for ICPR – but hardly its sole purpose. Instead, Miller recalls that he wanted to “combin[e]” data sharing with training people “to do an ‘Operation Bootstrap’ and transform – at least for the study of American politics – the way in which political scientists did research” (ibid., 241).Footnote 30 Miller acknowledged, however, that other institutions would have balked at paying for such a project if it had been presented to them in these terms. The task then was to “come up with the right organizational format and the right scheme of governance … that one might inveigle some small subset of research universities into a commitment to provide support for this new creation” (ibid.). His Loman-esque time “on the road selling empirical political science,” Miller surmised, helped lay the personal foundation for buy-ins from others that made ICPR (and later the ANES) financially possible (ibid., 243). The new organization’s commitment to share data with members who requested it was significant – and clearly mattered to many who joined ICPR early. But as his comments suggest, Miller did not “invent” ICPR solely for the purpose of making data sharing easier. Nor did he do so principally to disperse the costs of data production and sharing among institutions beyond Michigan. ICPR’s grander purpose was to establish a new institutional center in political science that promised like-minded researchers a critical mass in the discipline far beyond what even the largest, most well-endowed individual department might be able to provide. Though it was not exactly the Mount Palomar for the social sciences that some had envisioned in the 1950s, the Consortium did enable its members who shared basic commitments to behavioral social science to share the costs of producing and distributing data as well as the costs of producing its future analysts. With substantial, consistent backing from the NSF, ICPR realized the earlier ambition for survey researchers to share the means of data – and knowledge – production.Footnote 31

Cultivating stronger demand for data

In an oral history interview, Philip Converse recalled how constructing ICPR was not an entirely “altruistic” undertaking, but also a matter of “self-protection,” designed to free researchers at ISR from the time-consuming work of responding to “people … beating down our door” for “ad hoc access to our data” (Converse 1997, 8 [12]). The correspondence in Campbell, Converse and Miller’s papers in the decade before the founding of ICPR does indeed include requests for data, including the one from Leiserson discussed above. Such requests were sometimes directed to specific researchers – and, during this period at least, were often answered by researchers themselves rather than their staff. Not only the requests but also the responses were “ad hoc,” determined on a case-by-case basis in which the researcher would weigh the merits of the request, the prestige of the requestor and how his response would be received by the broader social scientific community. Responding to a request like Leiserson’s was probably more time-consuming than most, given the difficulty of balancing Leiserson’s prominence as a political scientist against the large amount of data he had rather imperiously requested. But without a routinized approach for handling such requests, even writing a simple “no” to, say, a person with little capacity to understand what he was requesting still fell to the researchers or study directors themselves.Footnote 32 It is not surprising that Converse remembered these tasks as an annoying drain on his time and saw the creation of ICPR as a good practical way to be relieved of them.

Still, that meeting the demand for their data took more time than Converse and others would have liked says little about the level or intensity of the demand itself. Notably, letters and memoranda written in the years just prior to the founding of ICPR express concerns about the weak demand for data. In these records from the late 1950s and early 1960s, a number of would-be ICPR supporters worried that demand for data from ISR and others was not widespread or high enough to sustain the planned consortium – and that its architects therefore needed to work on increasing demand for what ICPR proposed to offer.

For example, in a 1956 exchange of correspondence about the viability of the omnibus national sample survey discussed above, Pendleton Herring, then president of the SSRC, pointedly asks Campbell to “estimate the demand there might be” to contribute questions to such a survey (Campbell Papers 1956a). Campbell initially responded cautiously, suggesting that he and Herring make “quiet inquiries” about the level of demand for such an undertaking among their colleagues (Campbell Papers 1956b). A few months later, Campbell sent Herring an enthusiastic memorandum outlining his vision for how to carry out such a general social survey. The kind of sharing Campbell envisioned went beyond merely sharing data to encompass a sharing, as I termed it above, of the means of data production. Having researchers from across the social sciences contribute questions to the survey, Campbell argued, would ultimately enhance the value of the data to individual disciplinary communities (Campbell Papers 1957a, 7).Footnote 33 All these prospects notwithstanding, Campbell concedes that the question about the demand for such a facility is a difficult one. To answer it, he gamely suggests that “this is a situation in which supply creates demand” even though this would be neither quick nor effortless: “It would very likely take some period of time to educate [many scholars who might make use of such a facility] to the possibilities the facility would make available to them.” By the time he wrote this memo, Campbell was clearly excited by the prospect of a well-funded research facility to oversee broadly based social scientific surveys. He also acknowledged, however, that there was not at present strong, reliable demand for such a facility and its products. Demand could be created – but, in Campbell’s judgment, in 1957 it did not yet exist.

The question of demand was still unsettled over three years later at an SSRC-sponsored conference on the capacities of existing survey research facilities. Memos drafted by several participants prior to the meeting suggest that attendees hoped to persuade the NSF to provide substantial regular support for survey research rather than just for particular projects (Campbell Papers 1960b, 1960c). Since the purpose of the meeting seems to have been “pitching” ideas to a prospective funder, it is no surprise that participants spoke optimistically about the demand for survey research. The most optimistic such comments come from Ithiel de Sola Pool, who characterized the level of such demand as “monumental,” adding, “Literally one-third of the profession, assuming no scarcity, would be clamoring at the door.” But, as the rapporteur for the conference noted, David Truman thought Pool’s projections excessive and spoke up to dampen them (Campbell Papers 1960d). Whose assessments proved more accurate is not the issue; it seems more significant that some disagreement about the level of demand for survey research was expressed in such a meeting at all. This suggests that the doubts Campbell acknowledged about the level of demand for survey research data three years earlier lingered, even among its strongest supporters.

Similar concerns were voiced the following year in a memo from William Riker and Charles Sellers to a long list of recipients that included both Campbell and Miller.Footnote 34 To this memo, Riker and Sellers append their proposal for a huge nation-wide effort to collect U.S. electoral data from the nineteenth century to the present with the aim of making cleaned, standardized versions of it available to social scientists and historians. Like the authors of the proposals discussed in the section above, Riker and Sellers advocated centralizing the oversight of data collection and distribution; they proposed this be done under the auspices of an SSRC committee. Centralized oversight, they argued, would not only make the collection more efficient and complete, but would also lay the best foundation for interdisciplinary work towards “a new general theory of electoral behavior and of the American electoral system” (Campbell Papers 1961b).Footnote 35

Their enthusiasm for this enormous project notwithstanding, Riker and Sellers are still clearly aware of the problem of demand. To address it, they recommend that the proposed SSRC committee ought to begin its work by drumming up interest in the project; “encourag[ing] the use of the data” should be seen as “logically antecedent” to the great investment of time and money collecting and cleaning the data would require (Campbell Papers 1961a, 3). In this instance as well, the architects of this project stress that its success depends on cultivating substantial demand among social scientists and historians for such data. Even when a later version of this project became the first ICPR venture supported by the NSF in 1963, the concern with sustaining a solid level of demand for these data remained (ICPSR Records [no date]).Footnote 36

Parts of this “demand problem” were straightforwardly economic. Survey researchers wanted to convince the SSRC, the NSF, and other funders to invest some of their resources into big, long-term projects. In turn, funders understandably wanted not only estimates of how much these projects would cost in the short term but also a sense of how good the return on their investments might be.Footnote 37 But on another level, the persistence of this problem well into the 1960s indicates that approaches to social science that relied on analyses of large, shared datasets were far from as securely institutionalized as their advocates would have liked. To address this issue more fully, survey researchers sought to gain a foothold in established departments’ programs for graduate education.

Graduate education

Several accounts of the formation of ICPR highlight the importance of two SSRC-funded summer workshops held in Ann Arbor in 1954 and 1958. The graduate students and young faculty who attended these workshops were given access to SRC data and also trained to analyze it. As Miller recalled in an oral history interview, the “notion of manipulating data for analytic purposes” central to these workshops “was a very, very new notion” (Miller 1997, 3 [5]). Miller’s comment underscores that in the 1950s, doing secondary analyses on data gathered by others was a novelty among social scientists.Footnote 38 Those who led these workshops, therefore, had to share more than their data – in addition to being taught new analytic approaches, students also needed access to ISR’s computing and technical support resources, since most did not have such resources at their home institutions. Even at the time, some social scientists not affiliated with ISR/SRC recognized that it was becoming something like “a specialized and important graduate school” (Rockefeller Foundation Archives 1957).Footnote 39

Supporters of the still novel idea of secondary analysis anticipated that it might have several distinguishing virtues. First, as Campbell argued in an initial pitch for Rockefeller support, offering young political scientists training at SRC might help broaden the disciplinary base of survey research beyond social psychology to include political science (Rockefeller Foundation Archives 1955b, 1). Second, the experience of the 1950s summer workshops encouraged the hope that political scientists might now be able to produce interesting research more quickly, given that secondary analyses allowed them to bypass the time-consuming burdens of collecting their own data (Rockefeller Foundation Archives 1955a, 13; 1955b, 1–3). Third, some early promoters of this approach imagined that just as SRC studies were conducted by teams of researchers, so a program of graduate education in political behavior would also favor clusters of doctoral dissertations “around a major common theme … result[ing] in a greater cumulative contribution to knowledge than is now the case with individual dissertations” (Campbell Papers 1959b).Footnote 40 Such clusters of doctoral dissertations would be the result of “group projects” – projects that almost certainly would share data and much more.

Promising as they seemed, the workshops and the ideas they sparked still fell far short of a full graduate curriculum; on its own, ISR/SRC faced too many financial and institutional constraints to be able to offer graduate degrees. Hence, to be able to design the training of the next generations of survey researchers, those affiliated with ISR/SRC sought to make space for this training in the graduate programs of the departments in which they held tenure. The curricular changes Miller and a sympathetic political science colleague, Samuel Eldersveld, proposed to Michigan’s graduate program in political science in early 1959 gave their still new approach a more secure institutional home. On the one hand, Miller and Eldersveld argued that their proposal to develop a program in the study of political behavior for doctoral students in political science at Michigan “should strengthen the departmental claims to national eminence” (Campbell Papers 1959a). But what they presented as an unambiguous benefit to the Political Science department’s graduate program was clearly meant to benefit ISR researchers as well. The process of obtaining tenure-track appointments in the Political Science department for people who worked primarily at ISR/SRC (Miller, Stokes, and Converse, for example) had been far from smooth (Jackson and Saxonhouse 2014, 16–18; Converse 1987, 348). In this context, the new program in political behavior, approved by the department in May of 1959 and begun in the fall of 1960, at least offered the prospect of attracting graduate students to Michigan who would work primarily with faculty connected to ISR. On this level alone, the program promised a crucial benefit to ISR researchers: cohorts of political behavior graduate students for whose educational expenses ISR would not be (solely) responsible.

But where the money to recruit and support graduate students in this new program would come from was still an open question. One possible source of support for this new program that ISR hoped to tap was a grant from the Ford Foundation to the University of Michigan to encourage the development of the behavioral sciences. Though these prospects looked bright initially, Campbell felt Michigan’s Graduate College failed to make the most of Ford’s substantial but short-lived commitment in this area (Campbell Papers 1960a; 1957b). By 1960, it was clear that the new political behavior program would have to look elsewhere for funds to support its new graduate students.

Shortly thereafter, the program secured more stable support from the federal government via the National Defense Education Act (NDEA) (Miller Papers 1961). In an oral history interview, Miller recounted that the educational component of ICPR was made possible by funds from NDEA: “with a little fudging of what we were doing, the mathematics group funded by NDEA decided to improve the quantitative skills of political science” (Miller 1997, 4 [7]). These funds proved crucial to enabling Miller and others to offer potential graduate students multi-year support to join the Political Science department’s new political behavior program as NDEA fellows. This alliance with a traditional department, made possible by significant federal funds, probably did as much during the 1960s to ensure the long-term survival of SRC and ICPR as did their data sharing during that decade. And the NSF support that began in the 1960s and grew substantially in the next decade mattered even more.

Conclusion

ISR/SRC and ICPR devised approaches to sharing data that helped them survive and grow. The success of the data-sharing regimes now associated with ICPSR initially hinged upon its architects’ skill in gauging how much the infrastructure of sharing data would cost and persuading others to help pay for it. In addition to the successful construction of its now formidable financial base of more than $18 million per annum,Footnote 41 this data-sharing institution also owes its existence to the cultivation of broad networks of like-minded scholars beyond Michigan as well as to the development of the political behavior program in Michigan’s Political Science Department. Those who promoted social scientific data sharing in the mid-twentieth century did so for a wide array of reasons, the survival of their research organizations and the reproduction of their approach to social science being primary among them.

As I have tried to show from the documentary evidence and oral histories I cite, the architects of what became ICPSR did not advocate data sharing as an ethical imperative nor to help social scientists produce more transparent, verifiable, or replicable research. Instead, they advocated data sharing when it advanced pragmatic strategies to sustain and promote the kind of social science they were trying to do. Sharing was sometimes a way to interest others in analyzing their data or learning to do so – and therefore (at least potentially) also a way to interest them in sharing the cost of producing and maintaining such data in the future. The persistently economic language Campbell, Converse, Miller, and others used underscores the pragmatic imperatives that fueled the earliest data-sharing plans.

I do not mean to suggest that there was something sordid about the degree to which the people who created these data-sharing plans thought about how much they would cost and how to get others outside of ISR or the University of Michigan to pay for them. Anyone serious about maintaining and expanding a new institution on ISR’s still far from rock-solid foundation would have had to address these issues. In the mid-twentieth century, ISR/SRC was one of the largest of the few producers of social scientific data outside the U.S. government. But researchers there still relied heavily on contracts and one-time grants that gave them less control over their work than they would have liked. Moreover, it was unclear in the 1950s and early 60s how and where new generations of survey researchers would be trained and how the new approaches to social science developed at ISR would be reproduced. When ISR/SRC researchers shared their data in the 50s and 60s, they did so with an eye to overcoming these difficulties. For Campbell, Miller and Converse, sharing data was less an ethical imperative than it was instrumentally valuable to securing their intellectual and professional goals.

Social scientists shared data for different reasons in the mid-twentieth century than they do today. But even though it is now a well-established practice, data sharing is still linked to broader intellectual and professional goals. And it continues to be costly in a number of different ways, both for those who participate in it and for those who do not. The architects of ICPR understood that sharing data meant sharing more than data. If one were to carry this story forward into the last few decades, it seems evident that what the contributions of the now extensive membership base of ICPSR reproduce is not only a data-sharing regime but also a particular set of approaches to collecting and analyzing that data. In its first decade, some projected that ICPR’s summer training program would become superfluous once more universities hired faculty capable of teaching courses in advanced quantitative analytic methods (ICPSR Records 1965b). But rather than shrinking in size and importance, ICPSR’s now enormous summer program coexists with methods courses taught at many institutions by generations of social scientists who are themselves ICPSR alums.Footnote 42 In 1957, David Truman remarked that the summer workshops organized by ISR made it an important de facto graduate school (Rockefeller Foundation Archives 1957); ICPSR’s summer training program inherited this role and has vastly expanded it. This is just one way in which ICPSR’s data-sharing initiatives promote sharing or coalescing around much more than just data.

As I hope the cases I have discussed in the body of this paper illustrate, these 1950s and 60s data-sharing plans sought not only financial and institutional stability. They also expressed strong commitments to particular approaches to social science. As Miller put it, sharing data was part of his “Operation Bootstrap” to “transform … the way in which political scientists did research” (Miller 1988, 241). Just as the original “Operation Bootstrap,” a U.S. government-designed initiative, sought to transform agrarian Puerto Rico into an industrial economy (Rodriguez 1999), so Miller hoped ICPR, by making data accessible and analyzable in new ways, would transform the mode of research production in political science. It did.

Acknowledgments

I gratefully acknowledge the University of Michigan’s Bentley Historical Library for travel and research support, the Michigan Political Science Department for hosting me as a Visiting Scholar in the spring semester of 2015 and Erik Austin, Hank Heitowit, and Bill Zimmerman of ISR and ICPSR for speaking with me. Earlier versions of this paper were presented at the University of Michigan’s spring 2016 Political Theory Workshop and at the International Political Science Association’s 2016 World Congress in Poznań, Poland. I thank the participants in both, especially (in Ann Arbor) Lisa Disch, Ben Peterson, Arlene Saxonhouse, and Mariah Zeisberg, and (in Poznań) Erkki Berndtson and Thibaud Boncourt. I also thank Erik Freye, Dawid Tatarczyk, and Herb Weisberg for their help and comments.

A long-time professor of political science, Emily Hauptmann writes about the history of U.S. social and political science in the twentieth century. Her recent work in this area focuses on the influence of private foundation funding on academic political science and has appeared in the American Political Science Review, International History Review, the Journal of the History of the Behavioral Sciences and PS: Political Science and Politics. She is also completing a book manuscript, tentatively titled Private Foundations, Public Universities and the Transformation of Postwar Political Science.

Footnotes

1 ISR was founded at the University of Michigan in 1949 and ICPR in 1962. ISR was designed to be an “umbrella” type organization for a range of social research centers (Frantilla 1998, 7–10). ICPR was called the Inter-University Consortium for Political Research (ICPR) at its founding. Its name changed in 1975 to the Inter-University Consortium for Political and Social Research (ICPSR). When I refer to its first thirteen years, I use its earlier name; in referring to accounts of its more recent history, I use its current one.

2 Frantilla (1998, 22–24) offers a short account of the origins of the RCGD at MIT and its subsequent affiliation with the University of Michigan at the beginning of 1948. I discuss the origins of the SRC in greater detail below.

3 For ISR’s own account of its history and significance, see http://home.isr.umich.edu/about/history/. The cited phrase appears on this page. Accessed September 10, 2017.

4 According to the latest figures provided on ICPSR’s website, it had 779 institutional members in 2019. It charges academic and other institutions membership fees on a sliding scale, ranging from nearly $20,000 per year for the largest research-intensive doctoral universities to $2,400 for colleges that grant only baccalaureates. Its latest reported revenues (for the fiscal year 2016–17) were $18.8 million. For an overview of the history of ICPSR’s membership and revenues, see https://www.icpsr.umich.edu/icpsrweb/content/membership/history/index.html. A current schedule of fees is available at https://www.icpsr.umich.edu/icpsrweb/content/membership/join.html. Last accessed September 21, 2019.

5 Austin joined ICPR in 1966 to assist with the project to make all U.S. election data computer readable. During and after his 41 years at ICPSR, Austin has also recorded the history of this institution and ISR. He was the principal interviewer for the oral histories done to mark ISR’s fiftieth anniversary in 1998. The 2011 document from which I cite appears on the portion of ICPSR’s website devoted to its history; it is not paginated.

6 The ANES is, in effect, a spin-off of a spin-off: as ICPR was created out of ISR/SRC and later became independent, the ANES is now independent of both ISR and ICPR and overseen by its own governing board.

7 Burns implies that this data-sharing ethic was already developing at SRC in the early 1950s – before Miller came to play a central role there. Miller arrived at Michigan in the early 1950s while still finishing his Ph.D. at Syracuse. For my discussion of the political and professional circumstances surrounding the Carnegie Corporation’s support for SRC’s 1952 election study, see Hauptmann (2016).

8 There is an ongoing debate among U.S. political scientists about the merit of these values as well as about which approaches to data sharing are best suited to fulfill them. King (1995a, 1995b) offers early but still widely cited defenses of verification and replication as well as an outline of the kinds of data sharing that support them. See also the 18-piece symposium devoted to verification and replication in the September 1995 issue of PS: Political Science and Politics. More recently, a symposium entitled “Openness in Political Science: Data Access and Research Transparency” (DA-RT) appeared in the January 2014 issue of PS: Political Science and Politics. The APSA website provides links to the DA-RT policy and some articles related to it. See http://www.politicalsciencenow.com/replication-and-data-research-transparency/. Last accessed August 23, 2016.

9 The first substantial NSF grant to ICPR was for a project to collect and make machine-readable all county-level U.S. electoral data from the early nineteenth century to the present (see Austin 2011). According to ICPR’s first annual report, the NSF was by far ICPR’s most important patron, with its initial grant of $95,000 for 18 months (ICPR Annual Report 1962–1963, 6, 10). Crowther-Heyck (2006, 421, 428, 436) notes that ISR was particularly successful at securing the patronage of both private foundations and federal agencies.

10 Miller had the strongest disciplinary ties to political science of any of the early SRC research staff, most of whom had degrees in sociology or social psychology. Miller’s Ph.D. from Syracuse was in Social Science; his M.S., however, was in political science.

11 The Human Relations Area Files (HRAF), one of the oldest nominally academic data centers devoted to cultural and anthropological data, received substantial support from all branches of the U.S. military and the CIA during the 1950s (Ford 1970, 13–15). I suspect, therefore, that how it shared the data it amassed was not entirely up to its academic directors. ICPR’s most significant U.S. competitor in its first decade was the Roper Center, based at Williams College in Williamstown, Massachusetts, throughout the 1960s. ICPR and the Roper Center each devised its own approach to acquiring and then sharing the data under its control among its paying members. By comparison, many academics found federal government entities like the Census Bureau slow and uncooperative in making their data available to researchers (Kraus 2011; Bisco 1967, 56, 69–70). For an overview of U.S. and European social science data archives during the first five years of ICPR’s life, see Bisco (1966). See also my assessment of ICPR’s position relative to its 1960s competitors (Hauptmann 2017).

12 See the overview of ICPSR’s current fee structure provided in note 4 above.

13 The folder in which this item appears includes several requests from the early 1950s from the Armed Forces Staff College, the Air Materiel Command, and the Central Air Documents Office; all are directed to Campbell. Campbell’s standard response is to refer the requestor to the representative of the contracting agency, John F. Stearns of the Air Research Unit, Aeronautics Division, Library of Congress. The Air Force/CIA office sponsoring this project seems to have been initially housed at the Library of Congress; the Library’s name was consistently used to refer to the project though it clearly had military objectives. One such objective was to interview Russians living abroad to help construct “a list of 30 Russian cities which ought or ought not to be bombed” (ISR Records 1948a, 1).

14 Firms sometimes bought analyses of portions of larger studies – as when GM offered to purchase SRC analyses of consumers’ car-buying habits. Although it’s not clear whether this particular contract was fulfilled, ISR/SRC did agree that GM could “withhold certain data for a period of one year” – because they had paid for it (ISR Records 1948b, 1).

15 There are a number of examples of Miller responding directly to people who requested data from the election studies in the correspondence folders in his papers. These examples show that sharing data with anyone who asked for it was not Miller’s default response. Sometimes he denied requests made by people who lacked the training and computing facilities to make sense of what they asked for (several examples appear in Miller Papers, Box 4, Folder: Correspondence August to December 1958) or from those whose analyses were too close to what he and his colleagues were working on (Miller Papers 1957a, 1957b). Other times, Miller fulfilled insistent requests for data from fellow survey researchers at Columbia’s BASR, but in ways his correspondents clearly saw as partial and unsatisfactory (Miller Papers 1957c, 1957d).

16 See for instance Converse’s comment that though the creation of ICPR was partly “altruistic” to help people, it was also a move of “self-protection,” since “people were beating down our door” for “ad hoc access to our data,” making it difficult “to get our own work done” (Converse 1997, page 8 of my notes on taped interview; page 12 of typed transcript). Misspellings and omissions mar the typed transcripts of these interviews. I therefore cite both the page numbers of the notes I took when I watched the recorded interviews as well as the page number of the typed transcript. In subsequent notes, the page number of the typed transcript appears second and in square brackets.

17 Since data was recorded on IBM punch cards in the 1950s and early 1960s, sharing it was also practically burdensome for those who had created it. Potential sharers had to consider the costs of duplicating and mailing the cards. And since some cards were often destroyed during their trips through the computer’s counter-sorter, it was common for two identical decks to be sent. I thank Herb Weisberg for this insight.

18 All subsequent quotations in the next few paragraphs are from the same page of this letter.

19 All quotations in this paragraph appear on p. 2. ICPR initially charged its member institutions $2,500 annually (ICPR 1962–3 Annual Report, Memorandum of Organization, p. 1).

20 The Roper Center (then in Williamstown, MA) and Chicago’s NORC were ICPR’s chief data archive competitors in its early years. The Roper Center was the oldest and most widely recognized of these in the 1950s; therefore, those who built ICPR understood that what they were trying to do would be explicitly compared to it. See Hauptmann (2017) for a detailed discussion of this competition.

21 The dog’s conduct in the fable to which Miller alludes is the antithesis of sharing – spitefully lying in a manger filled with hay, it deprives other animals of food by preventing them from having what is of no value to it.

22 During its first years of operation, ICPR’s most significant expenses were staff salaries, data preparation and processing, fulfilling members’ requests for data and research assistance, and summer training programs. Membership fees and NSF grants were its most significant sources of income. For example, see the budgets provided in ICPR Annual Reports for 1962–1963, 1965–66. All annual reports may be viewed at https://www.icpsr.umich.edu/icpsrweb/content/about/annual-reports/.

23 Guy Orcutt, an economist interested in developing survey data for economic forecasting, also deployed the Mount Palomar image to argue for more autonomy for survey research. He assessed the 1950s situation bluntly: “It is still a question of who is to be master and who is to be servant. At present only a handful of researchers have any real say about the collection of data, and they are forced to worry excessively about the wishes of paying clients” (Orcutt 1957, 9).

24 One such proposal came from Bob Faris; Pendleton Herring, President of the SSRC, appended Faris’s two-page proposal to a letter he sent to Campbell (see Campbell Papers 1956a). Campbell also advocates an “omnibus” survey in his response (see Campbell Papers 1956b).

25 For a compelling account of the appeal of general theory and general knowledge in mid-twentieth century social science, see Isaac 2012.

26 This sort of image has persisted among people associated with ISR. A volume devoted to the Survey Research Center, edited in 2004, is entitled Telescope on Society. Facing its title page, it features an odd image of survey researchers from around the world functioning as a “social Hubble” (a reference to the space telescope). In a brief discussion of the analogy, surveys themselves are compared to a telescope: “The survey can provide empirical images of the populations of organizations, communities, societies, and even the world. Like a telescope, its focus can be broadened or narrowed to allow examinations of supra- and subpopulations of individuals and organizations” (House et al. 2004, 3).

27 These discussions, of course, involved people from many organizations outside of ISR. The GSS, for example, was founded and continues to be maintained by NORC at the University of Chicago. NORC representatives attended the survey research center planning conference referred to above. See http://www.norc.org/Research/Projects/Pages/general-social-survey.aspx.

28 See, for example, the 1971 “pitch letter” to Union College on the portion of ICPSR’s website devoted to its history. On page 3, this unsigned letter acknowledges the significant annual cost of institutional membership – $2,300 initially, projected to rise to $2,900 in two years. But these costs, the letter writer argues, are well worth it: “No other investment can create more possibilities for student and faculty research than access to the data files and the training programs of the ICPR,” given the dearth of “public and foundation support” for individual research projects. Accessible at https://www.icpsr.umich.edu/files/ICPSR/fifty/Union.pdf.

29 The allusion is to the central character in Arthur Miller’s 1949 play, Death of a Salesman.

30 “Operation Bootstrap” was the name of a substantial postwar U.S. project to transform the agrarian economy of Puerto Rico into an industrial one (Rodriguez 1999). It is not clear from the context in which the comment appears whether Miller is alluding to this project specifically. Thanks to Erik Freye for this point.

31 According to ICPR Annual Reports from the 1960s, the organization received over $1.4 million in NSF grants from 1962 through 1969 for repository development, the summer training program, computing equipment, and other projects. Though ICPR also received grants from the Ford Foundation, private corporations, and the University of Michigan, the NSF was by far its biggest and most consistent grantor. ICPR Annual Reports for each year from 1962 to 1969 are available at https://www.icpsr.umich.edu/icpsrweb/content/about/history/histdocs.html.

32 See the material pertinent to this issue in note 15 above.

33 All subsequent quotations from this memo appear on the same page.

34 Riker, a political scientist, and Sellers, a historian, drafted this proposal when both were Fellows at the Center for Advanced Study in the Behavioral Sciences in 1961.

35 Appended to the document cited is a three-page document titled “A Proposal for an SSRC Committee on Electoral Behavior,” 5.10.61. All quotations in the text above are from this three-page proposal.

36 There is some evidence that the concern over how much these data would be used lasted well into the 1960s (see, for example, the concerns expressed in ICPSR Records [1965a, 4]).

37 I discuss how staff at the Carnegie Corporation weighed similar issues in deciding whether and how to support the SRC’s 1952 election study (see Hauptmann 2016, 182–184).

38 Skepticism about and resistance to secondary analysis of data collected by others were formidable obstacles for its mid-twentieth century advocates (see, for example, Lucci and Rokkan 1957; Hauptmann 2017; and Scheuch 2003). For a fascinating discussion of the mid-twentieth century effort to store and share data collected by anthropologists, see Lemov 2015.

39 The recipient of this letter, Kenneth Thompson, the Director of the Rockefeller Foundation’s Social Science program, had solicited a large number of assessments of the SRC’s work shortly after Rockefeller granted it a substantial sum to support the 1956 election study.

40 Although no authors’ names appear on the document cited, internal references suggest that it was written by Samuel Eldersveld and Warren Miller, and possibly also Donald Stokes.

41 The figure of over $18 million per annum comes from the latest annual revenues reported on ICPSR’s website, for the year 2016–2017. https://www.icpsr.umich.edu/icpsrweb/content/about/history/index.html.

42 According to ICPSR’s website, its annual summer training program in Ann Arbor had more than 1,000 participants in 2012 and offered more than 70 courses. It also supports workshops offered at other universities in the U.S. and Canada. See https://www.icpsr.umich.edu/icpsrweb/content/about/history/sumprog.html.

References

Bentley Historical Library, University of Michigan (Ann Arbor, Michigan)

Campbell, Angus. Personal Papers.

1956a. Pendleton Herring to Campbell, 10.5.56. Folder: SSRC Correspondence 1950–59, Box 8.

1956b. Campbell to Pendleton Herring, 10.22.56. Folder: SSRC Correspondence 1950–59, Box 8.

1957a. Campbell, “A General Purpose National Sample for Behavioral Scientists,” March 1957. Folder: SSRC Correspondence 1950–59, Box 8.

1957b. Bernard Berelson to Donald Marquis, 7.12.57. Folder: Michigan, University of, Ford Foundation Grants 1957–58, Box 6.

1959a. Samuel Eldersveld and Warren Miller to Department of Political Science, Univ. of Michigan, 5.12.59. Folder: Michigan, University of, Ford Foundation Grants, 1959–60, Box 6.

1959b. “The Objectives and Needs of the Graduate Degree Program in Political Behavior,” October 1959. Folder: Michigan, University of, Ford Foundation Grants, 1959–60, Box 6.

1960a. Campbell to Freeman D. Miller, Assoc. Dean, Rackham School of Graduate Studies, 3.9.60. Folder: Michigan, University of, Ford Foundation Grants, 1959–60, Box 6.

1960b. Charles H. Berry, memo. November 1960. Folder: SSRC Correspondence, 1960–68, Box 8.

1960c. Peter H. Rossi, memo. 11.15.1960. Folder: SSRC Correspondence, 1960–68, Box 8.

1960d. Charles H. Berry, “Conference on Survey Research Facilities,” 12.16.60. Folder: SSRC Correspondence, 1960–68, Box 8.

1961a. William Riker and Charles Sellers. “A Proposal for an SSRC Committee on Electoral Behavior,” 5.10.61. Folder: SSRC Correspondence, 1960–68, Box 8.

1961b. William Riker and Charles Sellers to Lee Benson, Angus Campbell, Samuel P. Hays, V.O. Key, Richard McCormick, Duncan MacRae, Jr., Warren Miller, Richard Scammon, 5.12.61. Folder: SSRC Correspondence, 1960–68, Box 8.

Converse, Philip. 1997. Oral history interview. (Erik Austin, interviewer). August 19. VHS tape and typed transcript. Institute for Social Research (ISR) Oral History project.

Institute for Social Research (ISR) Records.

1947. ISR/SRC Executive Committee Minutes, 7.30.47, Folder: ISR/SRC Exec. Committee Nov. 1946 – Dec. 1948, Box 26.

1948a. Angus Campbell to Burton Fisher, 8.17.48. Folder: Project # 36, Library of Congress, Box 41 (SRC Projects).

1948b. ISR/SRC Exec. Committee Minutes 9.28.48. Folder: ISR/SRC Exec. Committee Nov. 1946 – Dec. 1948, Box 26.

Inter-University Consortium for Political and Social Research (ICPSR) Records.

No date. Index to Grant Proposals. Folder: Grants and Project Proposals: Grant Proposals Index, 1963–2002, Box 16.

1965a. Warren Miller and Lee Benson to Pendleton Herring, SSRC, May 1965. Folder: SSRC Committee on Archives for Quantitative Social Science Data, 1965, Box 3.

1965b. “Policy Statement on Research Training for Advanced Graduate Students and Faculty Members Participating in ICPR Activities,” November 1965. Folder: Educational Activities, Summer Training Program, General 1965-1992, Box 20.

Miller, Warren E. Personal Papers.

1957a. Miller to Joseph Greenblum, 6.11.57. Folder: Correspondence March to August 1957, Box 4.

1957b. Miller to George Belknap, 7.25.57. Folder: Correspondence March to August 1957, Box 4.

1957c. Miller to Richard Christie, 8.2.57. Folder: Correspondence March to August 1957, Box 4.

1957d. Miller to Bill Glaser, 8.16.57. Folder: Correspondence March to August 1957, Box 4.

1959. Miller to Angus Campbell, 1.14.59. Folder: Correspondence, Topical: Angus Campbell 1958–59, Box 3.

1961. Miller to Samuel Eldersveld, 3.8.61, Folder: Correspondence: Eldersveld, Samuel J., 1960–61, Box 3.

Miller, Warren. 1997. Oral history interview. (Erik Austin, interviewer). July 20. VHS tape and typed transcript. Institute for Social Research (ISR) Oral History project.

Rockwell, Richard. 1998. Interview appearance in “In the Public Interest: 50 Years of Social Research at the Institute for Social Research,” University of Michigan documentary film.

Rockefeller Archive Center (Sleepy Hollow, New York)

1955a. Survey Research Center, University of Michigan. “A Proposal for a Study of the Psychological Sources of Political Behavior,” 6.29.55. Folder 4993: University of Michigan - Survey Research Center - Voting Behavior (1955–1957), Box 583, Record Group 1.2, Series 200S, Rockefeller Foundation Archives.

1955b. Angus Campbell to Leland DeVinney, 9.29.55. Folder 4993: University of Michigan - Survey Research Center - Voting Behavior (1955–1957), Box 583, Record Group 1.2, Series 200S, Rockefeller Foundation Archives.

1957. David Truman to Kenneth W. Thompson, 9.9.57. Folder 4993: University of Michigan - Survey Research Center - Voting Behavior (1955-1957), Box 583, Record Group 1.2, Series 200S, Rockefeller Foundation Archives.

Amadae, S.M. (Sonja). 2003. Rationalizing Capitalist Democracy: The Cold War Origins of Rational Choice Liberalism. Chicago: University of Chicago Press.
Austin, Erik. 2011. “ICPSR: The Founding and Early Years.” Accessible at https://www.icpsr.umich.edu/web/pages/about/history/early-years.html (accessed November 14, 2020).
Bisco, Ralph L. 1966. “Social Science Data Archives: A Review of Developments.” American Political Science Review 60:93–109.
Bisco, Ralph L. 1967. “Social Science Data Archives: Progress and Prospects.” Social Science Information 6 (1):39–74.
Burns, Nancy. 2006. “The Michigan, then National, then American National Election Studies.” Accessible at https://cps.isr.umich.edu/wp-content/uploads/2020/03/ANES_history.pdf (accessed November 14, 2020).
Camic, Charles, Gross, Neil, and Lamont, Michèle, eds. 2011. Social Knowledge in the Making. Chicago: University of Chicago Press.
Converse, Jean. 1987. Survey Research in the United States: Roots and Emergence, 1890-1960. Berkeley: University of California Press.
Crowther-Heyck, Hunter. 2006. “Patrons of the Revolution: Ideals and Institutions in Postwar Behavioral Science.” Isis 97:420–446.
Ford, Clellan S. 1970. Human Relations Area Files, 1949-1969: A Twenty-Year Report. New Haven: Human Relations Area Files, Inc. Accessible at https://hraf.yale.edu/wp-content/uploads/2014/11/HRAF-1949-1969.pdf (accessed November 14, 2020).
Frantilla, Anne. 1998. Social Science in the Public Interest: A Fiftieth-Year History of the Institute for Social Research. Ann Arbor: Bentley Historical Library, University of Michigan.
Gieryn, Thomas. 1999. Cultural Boundaries of Science: Credibility on the Line. Chicago: University of Chicago Press.
Hauptmann, Emily. 2016. “‘Propagandists for the Behavioral Sciences’: The Overlooked Partnership between the Carnegie Corporation and SSRC in the Mid-Twentieth Century.” The Journal of the History of the Behavioral Sciences 52 (2):167–187.
Hauptmann, Emily. 2017. “Competition and Coordination in the 1960s U.S. Data Center Boom.” Paper presented at the International Political Science Association Conference, “Political Science in the Digital Age.” Hannover, Germany (December). Unpublished manuscript. Available from the author by request.
House, James S. et al., eds. 2004. A Telescope on Society: Survey Research and Social Science at the University of Michigan and Beyond. Ann Arbor: University of Michigan Press.
ICPR Annual Reports, 1962-1969. Accessible at https://www.icpsr.umich.edu/web/pages/about/annual-reports/index.html (accessed November 14, 2020).
Isaac, Joel. 2012. “Epistemic Design: Theory and Data in Harvard’s Department of Social Relations.” In Cold War Social Science: Knowledge Production, Liberal Democracy, and Human Nature, edited by Solovey, Mark and Cravens, Hamilton, 79–95. New York: Palgrave MacMillan.
Jackson, John E. and Saxonhouse, Arlene W. 2014. “Not Your Great-Grandfather’s Department.” Unpublished manuscript. Available from the authors upon request.
King, Gary. 1995a. “Replication, Replication.” PS: Political Science and Politics 28 (3):444–452.
King, Gary. 1995b. “A Revised Proposal, Proposal.” PS: Political Science and Politics 28 (3):494–499.
Kraus, Rebecca S. 2011. “Statistical Déjà Vu: The National Data Center Proposal of 1965 and its Descendants.” Paper presented at the Joint Statistical Meetings, Miami Beach, FL, August 1. Accessible at https://www.census.gov/history/pdf/kraus-natdatacenter.pdf (accessed November 15, 2020).
Lemov, Rebecca. 2015. Database of Dreams: The Lost Quest to Catalog Humanity. New Haven: Yale University Press.
Lucci, York, and Rokkan, Stein, with Meyerhoff, Eric. 1957. A Library Center of Survey Research Data: A Report of an Inquiry and a Proposal. New York: Columbia University School of Library Service.
Merton, Robert K. 1973. “The Normative Structure of Science.” In The Sociology of Science: Theoretical and Empirical Investigations, edited and with an introduction by Storer, Norman W., 267–280. Chicago: University of Chicago Press.
Miller, Warren. 1988. Oral history interview (Heinz Eulau, interviewer). In Political Science in America: Oral Histories of a Discipline, edited by Baer, Michael, Jewell, Malcolm, and Sigelman, Lee, 231–247. Lexington: University of Kentucky Press.
Orcutt, Guy. 1957. “The Importance of Sample Survey Statistics for Economic Research.” Michigan Business Review 9 (1):5–9.
Rodriguez, Ilia. 1999. “Journalism, Development, and the Reworking of Modernity: News Reporting and the Construction of Local Narratives of Modernization in Puerto Rico during Operation Bootstrap (1947-1963).” Ph.D. diss., University of Minnesota.
Rohde, Joy. 2013. Armed with Expertise: The Militarization of American Social Research During the Cold War. Ithaca: Cornell University Press.
Scheuch, Erwin K. 2003. “History and Visions in the Development of Data Services for the Social Sciences.” International Social Science Journal 177:385–399.
Solovey, Mark. 2013. Shaky Foundations: The Politics-Patronage-Social Science Nexus in Cold War America. New Brunswick: Rutgers University Press.