
Partial Online Instruction and Gender-based Differences in Learning: A Quasi-Experimental Study of American Government

Published online by Cambridge University Press:  25 April 2006

Bruce M. Wilson, University of Central Florida
Philip H. Pollock, University of Central Florida
Kerstin Hamann, University of Central Florida

Authors are listed in reverse alphabetical order. We gratefully acknowledge financial support for both the redesign and the evaluation of the course from the Pew Learning and Technology Program, Center for Academic Transformation, Pew Grant Program in Course Redesign.

THE TEACHER
© 2006 The American Political Science Association

Online Teaching in College Classes

College teachers are increasingly using instructional technology to supplement or substitute for face-to-face instruction. The incentives and arguments for doing so are many, including facilitation of higher education for non-traditional students and changing student demographics, skill building to improve student preparation for workplaces that are likely to use computer technology, space restrictions in universities with growing enrollment, the opportunities that Internet classrooms provide for organizing and monitoring student work and assignments, and so on. However, little is known about how online instruction affects the learner. Online instruction is, in many ways, fundamentally different from face-to-face instruction (see, for example, Lee 2003; McCormack and Jones 1998; Palloff and Pratt 1999). For instance, instructors are unlikely to simply post their lecture notes online in the hope that the students will read, take notes, memorize, and retain the information, which would be the closest equivalent to a traditional lecture classroom.1

Obviously, most lecture courses will also contain some discussion, question and answer time, etc. When referring to “lecture class,” we refer to the lecture portion of face-to-face classes.

Consequently, it cannot be assumed that different instructional modes (such as lectures or online instruction) necessarily have the same learner outcomes. This lack of knowledge concerning the effects of online instruction on learner outcomes also extends to the question of gender equality.

Many studies have confirmed the existence of a gender gap in face-to-face classrooms, which is manifested in different ways. For example, male students tend to dominate classroom discussions (see, for example, Diller et al. 1996; Drew and Work 1998). Drew and Work (1998, 1) cite studies showing that the way female college students are treated in the classroom often leads to them “feeling unable to participate fully in the learning process.”

In contrast, few analyses have assessed the consequences of the online environment for various aspects of gender equality. The existing literature presents mixed results: some studies report a replication of gender differences, with males dominating and performing better, while others find evidence for a shrinking gender gap (see Kramarae 2001, 39–41; Herring 1993; 1994; Savicki et al. 1996; Blum 1999; Hum 2002; Wolfe 1999; 2000; Gunn et al. 2003). It is frequently assumed that the somewhat more anonymous nature of the online environment has a democratizing and equalizing effect that extends to gender, but some studies have also shown that established gender-specific patterns of behavior transfer to the online context (see Kramarae 2001, 39–41). Barrett and Lally (1999) analyze discussion postings in a postgraduate setting and conclude that online discussion “may reproduce gender differences in a learning community” (48). Pollock, Hamann, and Wilson (2005), on the contrary, find that online discussion groups in political science courses do not replicate the patterns of male dominance often found in traditional classroom discussions, and that overall men and women display remarkably small differences in discussion behavior, especially when the gender composition of the groups is relatively balanced.
Herschel (1994, 219) examines gender differences in brainstorming, or idea generation, in the online environment and finds a remarkable degree of similarity between men and women “in access and voice in the critical initial phase of a decision-making process.” Despite the existing research on gender in online classes, though, we know little about whether student enrollment and performance in courses that are partially or entirely online differ for male and female students.

To further our understanding of the effects of online teaching modules on women's academic performance at college, we designed a reduced seat-time (RST, taught half online and half in a traditional classroom) section for the American National Government class, an introductory course that fulfills a General Education Program (GEP) requirement. We evaluate course selection, enrollment patterns, self-perceived computer literacy and political attentiveness, and objective learner outcomes for both the RST sections and a traditional face-to-face section to detect the effect of gender. The results of this quasi-experimental design show that for this course, all students did better overall in the partial online environment than in the traditional classroom setting. This effect was particularly pronounced for women.

Study Design

This study is based on a comparison of students in two sections of the same course during the same semester. The test group is composed of students who enrolled in the RST section of American National Government. Students in this section attended lecture once a week and completed online assignments and learning modules in lieu of a second day of face-to-face instruction. We compared that group with a control group, which comprised the students who enrolled in a non-RST section of the same course. Students in the control group attended lecture twice a week. The same instructor taught both sections—the test group and the control group—and both groups were assigned the same textbook. Both sections had identical in-class midterm and final exams, but the students in the RST section also had to complete a series of online assignments for credit. Thus, students in the RST section read slightly more since some of the online modules required additional online readings, but in turn, they had less lecture material to draw on. They also engaged in online discussions and other online activities, such as short essays. The online portion of the class required the students to engage more in active learning behavior in order to obtain credit for the assignments. Opportunities for active learning in the non-RST section were provided through in-class discussions during class time; however, participation in those discussions was voluntary and was not linked to credit. There were no additional discussion sections outside of the regular class time.

Students had the choice to enroll in either format and instructors had no control over the registration process.2

University policy did not permit the random assignment of students to non-RST or RST sections of the course.

Students received information about the format of instruction through the course schedule, which listed the traditional, face-to-face section of the course as having two class meetings of one hour and fifteen minutes per week. Similarly, the schedule informed the students that the RST section met one day a week for one hour and fifteen minutes and specified that “For above course: Web enhanced; reduced class time. WWW access, browser & e-mail skills required.” The instructor was listed identically for both sections. The traditional, face-to-face section enrolled quickly and closed at capacity; in contrast, the RST section retained open seats even after registration closed. We were interested in assessing three areas: objective political knowledge and political attentiveness as the two general goals of the class, and computer literacy as a skill contributing to lifelong learning and future employability. The evaluation of political knowledge, political attentiveness, and computer literacy was conducted through identical surveys administered to students during the first and last weeks of the semester. Objective political knowledge was assessed through an 18-item questionnaire.3

One of the authors compiled the survey questions, which were reviewed by colleagues in the department. The questions were similar to standard questions used in testbanks for American government textbooks. The instructor had no input into creating the survey questions. The survey is available upon request.

To avoid the effects of potential instructor bias for either format, the instructor had no access to the survey questions prior to their distribution in class and the survey was distributed, administered, and analyzed by someone other than the instructor. Thus, biasing the survey results of either section through “teaching to the test” was not possible.

When Does Gender Matter?

We were interested in finding out what motivations and constraints shaped student selection, and we wanted to identify the differences in resources and experience that students may have brought with them to each format. These variations in student background help to establish benchmarks against which student outcomes may be assessed.

Initial enrollment data clearly pointed to at least one demographic difference: Females were less likely than males to enroll in the redesigned, RST format. In our study, 54% of the RST students were female, compared to 60% of the non-RST students. Thus, aggregate comparisons between formats may—or may not—be shaped by these different gender compositions. When does gender matter?

By way of summary, we have divided pre-survey selection variables into two general categories. The first category includes variables for which the differences between non-RST and RST sections were larger than the gender differences within each format. These variables appear to have shaped format selection independently of gender, and they help to sketch descriptive profiles of non-RST and RST students. The second category includes variables for which differences between non-RST and RST sections were smaller than the gender differences within format. These variables become especially important for assessing the relative effects of each format on learning outcomes for males and females. Table 1 presents information on the first class of variables.

Format Selection Differences Not Related to Gender

What do these comparisons tell us about the motivations and backgrounds of RST and non-RST students? Overall, students who enrolled in the traditional, face-to-face section appeared to have done so because it was their preferred choice; they felt that section availability had not been a problem. The reasons for their preference for this traditional format varied and ranged from pragmatic (expected an “easier grade”) to substantive (to “learn about American politics”). In addition, students in the traditional section were less likely to have freshman standing. They were also somewhat more likely to have had some previous experience with online classes. About one third of the students registered for the RST section, in contrast, felt that the section choice was constrained. By and large, these motivational and attitudinal differences suggest a more receptive and experienced student audience in the non-RST section than in the RST format. Other studies have revealed that despite initial differences between formats, students in the RST setting outperformed their non-RST counterparts on several objective and attitudinal measures of student outcomes (Pollock and Wilson 2002).

The present analysis takes aim at gender-dependent aspects of format selection and student performance. Table 2 presents evidence for the importance of gender-related differences in format selection. Measurements for each of four competencies—self-assessed computer knowledge, attentiveness to politics, self-assessed political knowledge, and objective political knowledge—were regressed on gender (female coded 1), format (RST coded 1), and a gender-format interaction term. With one exception (attentiveness to politics), the female dummy alone achieves statistical significance.
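The specification described above can be sketched in code. The following is purely illustrative: the data are simulated (not the study's survey data), with a hypothetical female effect of −1.5 points and, by construction, no format effect and no interaction. It estimates score = b0 + b1·female + b2·RST + b3·(female × RST) by ordinary least squares.

```python
import numpy as np

# Illustrative sketch only: OLS with a gender-format interaction term,
# mirroring the specification described in the text. All values here are
# simulated assumptions, not the authors' data.
rng = np.random.default_rng(0)
n = 200
female = rng.integers(0, 2, size=n)   # 1 = female
rst = rng.integers(0, 2, size=n)      # 1 = enrolled in the RST section
score = 12.0 - 1.5 * female + rng.normal(0.0, 1.0, size=n)

# Design matrix: intercept, female dummy, RST dummy, interaction
X = np.column_stack([np.ones(n), female, rst, female * rst])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(dict(zip(["intercept", "female", "rst", "female_x_rst"], beta.round(2))))
```

With real survey data one would also compute standard errors (for example, via a regression package) to judge which coefficients reach statistical significance, as the text reports for the female dummy.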

Format Selection Variables Related to Gender

Indeed, in terms of format selection, key technical skills and cognitive resources were strongly related to gender. Female students in both formats were far less likely than their male counterparts to consider themselves competent with computers. Men professed somewhat higher levels of attentiveness to politics and gave themselves significantly higher assessments of current political knowledge. These assessments were borne out in responses to the pre-survey knowledge questionnaire: regardless of format, males in both sections returned similar, higher scores, while females scored lower.

Clearly, females brought fewer skills and resources to both pedagogical settings. But did males and females respond differently to the two learning environments? If so, in what ways? We address these questions in two parts. First, we examine scores for objective political knowledge at the beginning and end of the semester. Second, we report changes in self-assessed technical skills and political interest.

Objective Political Knowledge

How did men and women improve their objective knowledge in the two instructional settings? Figure 1 illustrates pre- and post-semester performance on the 18-item measure of political knowledge, by format and gender, and shows how many questions each group answered correctly.

Figure 1. Pre-semester and Post-semester Political Knowledge Scores, by Format and Gender

Women knew less about politics at the beginning of the semester than men did, regardless of the format. Indeed, women enrolled in the RST section knew the least, scoring by far the lowest before classes began. All groups had increased their knowledge by the end of the term, but not by the same amount. The traditional format shows differential effects for males and females. Males gained a little (a 1 point increase), while females gained more (a 2 point increase), which brought them to rough parity with males by the end of the term. Although both of these increases are statistically significant, neither is particularly striking. The RST format, though, shows no differential effects between genders. The gains posted by males (a 2.8 point increase) and females (a 3 point increase) are both large—and virtually identical. Thus, controlling for initially large male-female discrepancies in political knowledge, female students and male students benefited in equal measure from the RST environment.

In absolute terms, RST females achieved post-semester parity with non-RST students of either gender. That is, the largest learning gain was posted by RST women, who increased their objective knowledge score by a full point more than women enrolled in the traditional format.

Skills and Resources

The RST format had similarly salutary effects on females' self-assessed technical skills and political interest. Table 3 reports pre- and post-semester survey data on computer literacy and attention to politics. The changes in computer literacy are especially noteworthy. The traditional section, which did not require a working knowledge of the Internet, provides a solid basis of comparison: males and females in the non-RST section began and ended the term with virtually the same level of competence. For RST students, the gains were large and, again, very similar for men and women. The percentage of males who assessed their computer literacy as an “A” increased from 43% to 56%; for females, it jumped from 26% to 36%. To be sure, the absolute values of these percentages continued to display a sizable gap, but the relative increases are hardly differentiated by student gender: a 9 percentage point difference for women (comparing RST and non-RST) compared to 10 percentage points for men.

The patterns for political attentiveness are interesting, if not as systematic as the findings for objective knowledge and computer literacy. Even so, RST females posted larger gains than non-RST females in attentiveness: a 16 percentage point increase for RST females versus a 9 percentage point increase for non-RST females. This is quite remarkable considering that male students in the traditional section increased their political attentiveness by a whopping 20 points, compared to a 6 point increase in the RST section. Men seem to have been more inspired to follow politics through traditional classroom instruction, which is consistent with the literature pointing to gender differences in the classroom. Women, on the other hand, developed more attentiveness when the class was partially online.

Table 3. Pre-semester and Post-semester Computer Literacy and Attention to Politics, by Format and Gender

Discussion and Conclusion

Clearly, partial or complete substitution of classroom time with online learning has implications for students' learning processes. Here, we have been particularly concerned with gender differences in the classroom. For the American Government GEP course, women appear to be less likely than men to choose a section that is partially taught over the web, which might be related to their lower self-perceived computer literacy. Our study shows that partial online instruction has overall positive effects on learner outcomes for both men and women compared with the traditional format. In particular, it can increase both computer literacy and political attentiveness for female students and can help narrow the gender gap in higher education. Even if differences between men and women persist, female students enrolled in the RST section posted larger relative gains in objective knowledge, computer literacy, and political attentiveness than women registered for the traditional section. This is especially important in GEP courses, which students generally take early in their college education. Skill formation, such as computer competency, during the first year(s) of college can have long-term benefits for students in other classes. The results are all the more remarkable given that about one third of the students in the RST section chose that format because it was the “only section available” and may therefore have resented that particular learning environment.

These results have been confirmed by other studies. For example, Moskal and Dziuban (2003) found that at the University of Central Florida, and perhaps at many other institutions of higher education, women enroll disproportionately in online courses. Thus, if gender equality in education is considered a desirable goal, it is pertinent to analyze in more detail how online instruction affects women. According to existing studies, women benefit from online instruction: when measured in terms of success (defined as a grade of A, B, or C), women in online courses had an 85% success rate compared to 77% for men; in comparison, the success rate in face-to-face courses was 84% for women and 82% for men. In courses that delivered the content partially online and partially in the classroom, 91% of women succeeded in contrast to 84% of men. Dziuban et al. (2003) found that between fall 1997 and fall 2000, women's success rates were consistently 5% to 12% higher than men's per semester. Women, then, are more likely than men to succeed in courses that are at least partially delivered online. A similar trend is evident in withdrawal rates: 6% of women withdraw from online courses compared to 8% of men (versus 4% of women and 5% of men in face-to-face courses) (Moskal and Dziuban 2001). Our study contributes to this literature by conducting a controlled comparison between traditional face-to-face and RST sections of the same course and by examining both substantive, discipline-specific knowledge and skill building.

Finally, we want to make a more differentiated argument concerning the effects of online teaching (whether partially or entirely online) on skill building on the one hand and learner outcomes on the other. Gains in computer literacy, a skill, can be obtained in any class that uses some form of online instruction: if students are required to use a computer in ways they had not previously, they will learn new skills regardless of the substance of the assignment. Substantive gains, such as political knowledge, however, are likely to be more closely linked to the nature of the online assignments. Assignments that require students to engage in active learning (interactive activities, discussion, writing, and critical reflection) are more likely to result in increased knowledge, whether in an online or in-class environment, than instruction that puts the student in a more passive role (see Hamann and Wilson 2003). Thus, we conclude, it is not necessarily the online medium as such that benefits students. Rather, the online environment provides the opportunity to engage a large number of students in active learning. The opportunities for active learning that the online environment provides in large classes are thus the likely cause of the observed gains in political knowledge, rather than computer use in and of itself.

Biographies

Bruce M. Wilson (Ph.D., Washington University) is associate professor of political science at the University of Central Florida. His scholarship of teaching and learning research has been published in PS: Political Science and Politics and the Journal of Political Science Education, among others. His research on Latin American politics and judicial reform has been published in many journals including Comparative Political Studies, Comparative Politics, and the Journal of Latin American Studies. He is currently editor of The Latin Americanist.

Philip H. Pollock, professor of political science at the University of Central Florida, specializes in American public opinion, voting behavior, and techniques of quantitative analysis. His recent publications include a methods textbook (The Essentials of Political Analysis) and accompanying workbook (An SPSS Companion to Political Analysis), both by CQ Press. His Stata Companion to Political Analysis, also by CQ Press, will appear in May 2006.

Kerstin Hamann (Ph.D., Washington University) is associate professor of political science at the University of Central Florida. Her research on West European politics has been published in journals such as Comparative Political Studies, British Journal of Industrial Relations, and Industrial Relations Journal. Her research on teaching and learning has been published in the Journal of Political Science Education, among others. She currently serves as the vice chair of the APSA Undergraduate Education section.

References

Barrett, E., and V. Lally. 1999. “Gender Differences in an On-line Learning Environment.” Journal of Computer Assisted Learning 15: 48–60.
Blum, Kimberly Dawn. 1999. “Gender Differences in Asynchronous Learning in Higher Education: Learning Styles, Participation Barriers and Communication Patterns.” Journal of Asynchronous Learning Networks 3 (1): 46–66.
Diller, Ann, Barbara Houston, Kathryn Pauly Morgan, and Maryann Ayim. 1996. The Gender Question in Education: Theory, Pedagogy, and Politics. Boulder: Westview Press.
Drew, Todd L., and Gerald G. Work. 1998. “Gender-Based Differences in Perception of Experiences in Higher Education: Gaining a Broader Perspective.” Journal of Higher Education 69: 542–556.
Dziuban, Charles, et al. 2003. “Developing a Web-based Instructional Program in a Metropolitan University.” In Web Wise Learning: Wisdom from the Field, eds. B. Geibert and S. H. Harvey. Philadelphia: Xlibris Publishing.
Gunn, Cathy, et al. 2003. “Dominant or Different? Gender Issues in Computer Supported Learning.” Journal of Asynchronous Learning Networks 7 (1): 14–30.
Hamann, Kerstin, and Bruce M. Wilson. 2003. “Beyond Search Engines: Enhancing Active Learning Using the Internet.” Politics & Policy 31 (3): 533–553.
Herring, Susan. 1993. “Gender and Democracy in Computer-Mediated Communication.” Electronic Journal of Communication 3 (2). Available at http://shadow.cios.org:7979/journals%5CEJC%5C003%5C2%5C00328.html.
Herschel, Richard T. 1994. “The Impact of Gender Composition on Group Brainstorming Performance in a GSS Environment.” Computers in Human Behavior 10 (2): 209–222.
Hum, Susan. 2002. “Performing Gendered Identities: A Small-group Collaboration in a Computer-Mediated Classroom Interaction.” Journal of Curriculum Theorizing 18 (2): 19–38.
Kramarae, Cheris. 2001. The Third Shift: Women Learning Online. Washington, D.C.: American Association of University Women Educational Foundation.
Lee, Donna. 2003. “New Technologies in the Politics Classroom: Using Internet Classrooms to Support Teaching and Learning.” Politics 23 (1): 66–73.
McCormack, Colin, and David Jones. 1998. Building a Web-Based Education System. New York: John Wiley & Sons.
Moskal, Patsy D., and Charles D. Dziuban. 2001. “Present and Future Directions for Assessing Cyber-Education: The Changing Research Paradigm.” In CyberEducation: The Future of Long Distance Learning, eds. Larry Vandervert, Larisa Shavinina, and Richard Cornell. Larchmont, NY: Mary Ann Liebert Publishers.
Palloff, Rena M., and Keith Pratt. 1999. Building Learning Communities in Cyberspace. San Francisco: Jossey-Bass.
Pollock, Philip H., Kerstin Hamann, and Bruce M. Wilson. 2005. “Teaching and Learning Online: Assessing the Effects of Gender Context on Active Learning.” Journal of Political Science Education 1 (1): 1–16.
Pollock, Philip H., and Bruce M. Wilson. 2002. “Evaluating the Impact of Internet Teaching: Preliminary Evidence from American National Government Classes.” PS: Political Science and Politics 35 (3): 561–566.
Savicki, Victor, Dawn Lingenfelter, and Merle Kelley. 1996. “Gender Language Style and Gender Composition in Internet Discussion Groups.” Journal of Computer-Mediated Communication 2 (3). Available at www.ascusc.org/jcmc/vol2/issue3/index.html.
Wilson, Bruce M., Philip H. Pollock, and Kerstin Hamann. 2000. “The Best of Both Worlds? Web-Enhanced or Traditional Instruction in American National Government.” Political Chronicle 12 (2): 65–75.
Wolfe, Joanna L. 1999. “Why Do Women Feel Ignored? Gender Differences in Computer-Mediated Classroom Interactions.” Computers and Composition 16: 153–166.
Wolfe, Joanna L. 2000. “Gender, Ethnicity, and Classroom Discourse.” Written Communication 17 (4): 491–519.