It is widely recognized that statistical, programming, data science, and analytic skills give students a strategic advantage on the job market and in their future careers. The American Statistical Association predicted that by 2021, 70% of business leaders would prefer candidates with data science skills.Footnote 1 This demand is expected to keep growing, with projections of 28% growth over a five-year period.Footnote 2 Jobs that require coding skills pay, on average, $22,000 more per year than those that do not, and almost half of all jobs paying more than $58,000 require at least some level of coding.Footnote 3 In acknowledgment of this trend, higher education has added degrees, programs, and courses in data science and emphasized a greater need for data literacy among students.Footnote 4 Students also recognize the utility of these skills because they now are often required for jobs beyond the business and physical science worlds, including education, healthcare, nonprofits, and even the wine industry.Footnote 5
Yet, for our discipline, teaching undergraduate political methodology courses remains a demanding task; faculty must instruct students to be writers, readers, and creators of political knowledge.Footnote 6 Although these courses have existed for decades, the growing emphasis on data analysis and data literacy places a greater demand on research methods courses to deliver these skills. To further complicate the situation, there is little common discussion among political scientists regarding the use of data software, skills, and approaches in these courses. As a discipline, although we recognize the growing need for greater data literacy among undergraduate students, we appear to have placed little emphasis on this trend in our own pedagogical discussions.
The criticism that research methodology lacks a strong pedagogical foundation is not new. Previous studies recognize a dearth of discussion on the subject, despite acknowledging that methods courses often are some of the most academically challenging for students (Howard and Brady 2015). For instance, Wagner, Garner, and Kawulich (2011, 75) stated that research methods instructors “appear not to cover pedagogical questions at all.” This does not mean that discussions of pedagogy in social science and political science methodology are nonexistent (cf. Hubbell 1994; Nind, Kilburn, and Luff 2015; Payne and Williams 2011). However, the fact remains that there is little guidance for undergraduate political methodology instructors. As more students recognize the benefits of these courses, a deeper understanding of best practices in pedagogical approaches is urgently needed.
This article fills this void by addressing two questions: (1) How is statistical software incorporated into undergraduate political methodology courses? and (2) How are these courses structured and taught? We answer these questions through two distinct approaches. First, a team of research assistants collected and coded course information from 93 syllabi of undergraduate political methodology courses from 78 separate institutions (some institutions contributed two or more syllabi), drawn from a cross section of institutions according to their rank in the US News & World Report (Brown, Bryant, and Philips 2021). Of the 93 syllabi, 25 were from the South, 24 from the North, 9 from the East, and 35 from the West. Second, we relied on an anonymous survey conducted in January 2021—advertised on both the PolMeth email listserv and the “Political Scientists” private group page on Facebook—of instructors who have taught a political methodology course at the undergraduate level. We received more than 140 responses from a mix of R1 (53% of respondents), R2 (11%), and R3 institutions (9%), as well as liberal arts colleges (26%) and community colleges (0.7%). Using these two original data sources, we provide a broad overview of the “state of the discipline” regarding not only how statistical software is incorporated into political methodology courses but also how it is presented to students. We conclude with advice for instructors of these courses.
SOFTWARE
Do instructors use software in political methodology courses and, if so, which types? We begin our analysis by asking whether methods courses are required for a political science major. There is clear evidence that political methodology classes are valued by departments and institutions; more than 73% of survey respondents and 59% of coded syllabi indicated that the course is a major requirement. Our syllabi coding also suggested interesting variation depending on institution type and location. Figure 1 reveals that regional universities are far more likely to require a methods course, followed by liberal arts schools and national universities. Additionally, 62% of private institutions required a methods course, whereas public institutions lagged slightly behind at only 57%. Finally, institutions in the middle of the country are more likely to require these courses (64%) than their coastal counterparts (54%).
Figure 1 Course Requirements by Institution Type
Note: The figure shows whether the undergraduate methods course is listed as required in the syllabus, by type of institution. Syllabi coding was done by the authors.
Table 1 summarizes in more detail the various types of software used. Our syllabi coding suggested that more than 80% of courses use at least some statistical software. R and/or RStudio are by far the most commonly used, with 29% of syllabi and 60% of survey respondents indicating that they are used in their courses. One explanation for R’s popularity may be cost; a substantial number of survey respondents cited the fact that R is free as one of its main advantages, along with the active online community that provides help when needed. Some respondents using Excel also noted that it often is free to students through their university. The next most popular are Excel, Stata, and SPSS (according to the survey)—or SPSS, Stata, and Excel (according to the syllabi)—with usage rates ranging between 9% and 30% of classrooms. In contrast, Python is seldom used by instructors in political methodology courses.
Table 1 Software Patterns
Our survey also asked respondents which software they think is used most often in political methodology courses. As shown in table 1, instructors were fairly accurate, although they appear to overestimate the use of SPSS and underestimate the use of R and Excel relative to the survey results. We also asked respondents whether they would recommend their chosen software to a colleague; a substantial majority would recommend Excel, Python, R, and Stata, and about 60% would recommend the use of SPSS. One common sentiment expressed by survey respondents to justify their choice of software is its potential usefulness not only in the classroom and academia but also in industry. This sentiment was most common among R users, although other respondents argued the same for Stata, SPSS, and Excel.
Despite its popularity, one of the often-heard disadvantages of R (or Python) is the steep learning curve. Survey respondents called it “impenetrable,” “intimidating,” and “extremely daunting.” Yet, all software—except perhaps Excel—is likely alien to anyone who is not familiar with the interface. As one respondent colorfully stated, “My students hate Stata…but I think they would hate any statistical software program I assign.” In our experience, devoting the first few weeks of a course to orienting students, including downloading and installing the software and demonstrating basic functionality (e.g., setting a working directory and opening and saving files), can help them become comfortable with the interface. Textbooks with embedded or included code also can be helpful to students.
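As a concrete illustration, the following is a minimal sketch of the kind of first-week orientation script we have in mind, written in R (the software most commonly reported in our data); the folder path and the file name turnout.csv are hypothetical placeholders for whatever materials a course actually uses.

```r
# First-week orientation sketch in R; "~/polisci_methods" and "turnout.csv"
# are hypothetical placeholders for the course folder and dataset.

getwd()                          # show the current working directory
setwd("~/polisci_methods")       # point R at the course folder

turnout <- read.csv("turnout.csv")   # open a comma-separated data file
head(turnout)                        # inspect the first six rows
summary(turnout)                     # basic descriptive statistics

write.csv(turnout, "turnout_copy.csv", row.names = FALSE)   # save a copy as .csv
save(turnout, file = "turnout.RData")                       # or save as an R object
```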
We also found that almost a third of survey respondents used multiple software packages in a single course. As one instructor remarked, this strategy “allows students to use whatever they have around, guided by the same scientific knowledge.” Given that students already may know one type of statistical software, this approach may be more flexible and better reflect real-world practice; as researchers, we certainly do not limit ourselves to a single software package. Disadvantages include the need for the instructor to be competent in every package used and the difficulty of allocating class time to demonstrate the same operation in multiple programs. Relatedly, it is clear that familiarity is highly desirable. As one respondent remarked regarding their software of choice, “It’s the one I’m most familiar with, so it’s easy for me to problem solve when they get stuck.” Switching to another statistical program is likely challenging for instructors, given the time and effort involved in learning a new system.
STRUCTURE AND TOPICS
How are undergraduate methods courses structured and taught? To address this question, we examined the topics covered, course level and size, and types of assessment used by instructors. As shown in table 2, topics on research design, statistics, and data science are included in a majority of courses. However, substantial within-category variation often exists. For instance, according to our survey analysis, 94% of courses discuss descriptive statistics whereas only 18% cover analysis of variance (i.e., ANOVA/ANCOVA) techniques. Almost 60% of methods courses discuss how to manage datasets and 88% discuss graphics, yet only 29% explicitly include programming. Our syllabi analysis yielded even lower percentages in various categories, including general hypothesis testing, probability theory, and regression. However, the majority of syllabi cover elements of both research design and causal inference and more than half included discussions of epistemology.
Table 2 Included Topics
Note: Not all categories were covered in both the survey and syllabi codings (marked as NA).
Our survey findings regarding course level and size are shown in figure 2. A plurality of courses are intermediate undergraduate courses, although they also are taught at other levels. Courses also differed substantially in class size, ranging from a seminar-style eight-person course to a 700-person course (with a median of 29). Larger enrollments tend to occur at the beginning undergraduate level, with a median of 37 students versus 20 for advanced undergraduate courses.
Figure 2 Course Levels
Note: The figure shows the percentage of survey respondents reporting whether their course is listed as a beginning, intermediate, advanced, or other undergraduate course.
Finally, we examined the various types of assessment used by instructors in these methods courses. Table 3 shows that the majority include individual homework, labs or problem sets, and some form of exam. Research papers also are a common type of assessment. Less common are group assignments or problem sets and presentations. Individual homework and labs certainly are useful; however, our experience suggests that group assignments—we prefer groups of three—are a good way for students to learn, often by helping one another. Weekly or biweekly group assignments that build on the previous week’s work expand students’ knowledge base while reinforcing familiarity with the software. For instance, one week might involve bivariate visualizations (e.g., scatterplots) and the next week’s assignment might require adding an additional variable (e.g., scatterplots with colors, shapes, or sizes representing the additional dimension); a brief sketch of this progression follows table 3. We also believe that presentations are an excellent way for students to practice organizing and articulating their data findings, although these types of opportunities are likely to be limited for larger class sizes.
Table 3 Assessment Types
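To make the progression concrete, the following is a minimal sketch in R using the ggplot2 package; the data frame elections and the variables gdp_per_capita, turnout, and region are hypothetical and stand in for whatever dataset a course adopts.

```r
# Sketch of a two-week group-assignment progression; the data frame
# "elections" and its variables are hypothetical placeholders.
library(ggplot2)

# Week 1: bivariate scatterplot of two continuous variables
ggplot(elections, aes(x = gdp_per_capita, y = turnout)) +
  geom_point()

# Week 2: add a third variable by mapping it to color (shape or size work similarly)
ggplot(elections, aes(x = gdp_per_capita, y = turnout, color = region)) +
  geom_point()
```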
CONCLUSION: SUCCESS WITH SOFTWARE
Just as it has greatly changed political science, statistical software has altered the classroom, particularly for methodology courses. We think that this is an improvement, despite the fact that incorporating these tools into courses is a challenging undertaking. The results of our analyses indicate that there is wide variation in the structure of these courses and the software that they use. We would be remiss not to mention the substantial amount of variation—in terms of institutional support and resources—for these courses. Whereas our survey indicated that many respondents had access to teaching assistants and lab sessions, the majority of syllabi we analyzed did not have those same resources. Given the high demands placed on faculty in teaching these courses, further investigation of this difference is warranted.
Drawing from our experience as well as that of our colleagues, we offer some overarching pieces of advice and best practices regarding statistical software. First, there is a tension between learning a statistical program and learning statistics; they are not the same. Some students relish learning the ins and outs of programming; others abhor it. Still others do not want to be learning statistics in the first place—we often hear that they “didn’t sign up for a math class!”—which makes the enterprise especially fraught. Being clear about the goals and structure of the class early in the semester is tremendously helpful in this regard. Instructors also should be aware that different software packages and languages are best learned in different ways; some require higher upfront costs than others. When the instructor is open about the advantages and disadvantages of a program or package and why it was chosen, both reluctance and frustration are dampened. We often have pursued the strategy of simply typing “jobs in R” into the search bar on the first day of class to assure students that it is worth investing in these marketable skills.
Second, whereas some students can read a mathematical expression and know exactly what it means, they are probably the exception in a required political science class. Some students rely on simple, plain language that leads them step-by-step through the process. Others are more visually oriented and learn commands, graphical techniques, and estimation more quickly through worked examples. Given this wide variation in how students learn, it is important to provide different ways of discovering how to use the software. Likewise, it is beneficial to have multiple resources available. Packages and languages that are well supported on the internet and actively maintained also are helpful. Instructors should be explicit that one of the skills students need to develop in the course is the ability to find their own answers; those answers do not all have to come from the same textbook or website. Showing students how to search for help online can be an asset for them and also ease the burden on instructors.
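For courses that use R, a few of the built-in help tools we might point students toward are sketched below; ggplot2 is named purely as an example of an installed package, not a course requirement.

```r
# Built-in ways for students to find their own answers in R
?mean                               # open the help page for a known function
help.search("standard deviation")   # search installed documentation by topic
example(lm)                         # run the worked examples from a help page
vignette(package = "ggplot2")       # list the longer tutorials a package ships with
```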
Third, we are advocates of flipped or semi-flipped—that is, active rather than passive—teaching modalities. Experimental evidence suggests that they promote learning—although at the cost of reduced student satisfaction (Deslauriers et al. 2019). This approach requires condensing lectures into fewer course days (or moving some fully online as prerecorded lectures). However, devoting course time for individuals or groups to work on assignments using the software defuses frustration with small coding issues that instructors can resolve easily. Alternatively—especially in smaller classes—research projects involving the entire class may be the best way to engage with students (Winn 1995).
Fourth, students’ ability to learn the software depends on observing (“the code is not working!”), generating a hypothesis (“probably just a typo”), and testing that hypothesis (“change the spelling and rerun the command”). This is an oversimplified example of how students learn and practice the scientific method as they work with software. Although it is not the same as “taking a spoonful of sugar with the medicine,” if done correctly—for example, by making learning goals explicit—students will find that they sharpen their problem-solving skills.
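A hypothetical R exchange illustrating this observe-hypothesize-test cycle might look like the following, where turnout stands in for any loaded data frame and the error message is the one base R produces for an unknown function.

```r
# Observe: the command throws an error ("turnout" is a hypothetical data frame)
summry(turnout)
#> Error in summry(turnout) : could not find function "summry"

# Hypothesize: probably just a typo in the function name
# Test: correct the spelling and rerun the command
summary(turnout)   # now returns the expected descriptive statistics
```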
Despite the heterogeneity in undergraduate methods courses, a central prevailing theme is the growing demand for them. Therefore, political science must lean into supporting instructors as they confront the challenges and complexities that accompany teaching political methodology courses. Greater pedagogical dialogue, the sharing of best practices, and the creation of a repository of online resources are among the many steps toward fostering a more comprehensive and supportive approach to these integral courses.
ACKNOWLEDGMENTS
We thank Jeff Harden, Justin Esarey, and our team of excellent research assistants: Samuel Beck, Yucong Li, Julie Mcnees, Frida Muedsam, and Hannah Strassburger. We also acknowledge financial support from the University of Colorado department of political science STUDIO undergraduate research lab.
DATA AVAILABILITY STATEMENT
Research documentation and data that support the findings of this study are openly available at the PS: Political Science & Politics Dataverse at https://doi.org/10.7910/DVN/FENBA2.