Democratization of Education for Whom? Online Learning and Educational Equity

With the advent of Massive Open Online Courses (or MOOCs), it is theoretically possible for anyone with an Internet connection to access course materials from elite universities—a possibility that some commentators have hailed as a “democratization” of education. Of course, MOOCs are very new, and it is not yet clear how they will affect postsecondary access or attainment, particularly among traditionally underserved populations. To provide some perspective on how MOOCs and other online delivery methods might affect educational equity, in this article I examine recent research on online college coursework (that is, courses in which most or all of the learning experience takes place online) and discuss potential implications for postsecondary access and learning.

Online Education and Postsecondary Access

Online coursework is thought to improve postsecondary access by allowing some students to enroll in college who otherwise would be unable to do so, and by allowing enrolled students to take more courses than they otherwise could. Unfortunately, there are no empirical studies that address either of these points directly. However, some indirect evidence suggests that these improvements in access benefit only certain segments of the population.

According to large-scale studies of online learning conducted in two different community college systems, students who enroll in at least one online course are quite different from those who opt for an entirely face-to-face schedule (Jaggars 2012). As one might expect, students in online courses are older, more likely to have dependents, and more likely to be employed full-time. Yet they are also more advantaged: they are less likely to be ethnic minorities, less likely to be low-income, and less likely to be academically underprepared at college entry. In part, these students’ inclination to enroll in online courses could be rooted in their greater comfort with, and access to, computers and technology. The “digital divide” is still very real in the United States. A recent federal study found that only 55 percent of African American households and 56 percent of Hispanic households had broadband Internet at home, compared with 74 percent of white households and 81 percent of Asian American households; similarly, only 58 percent of rural households had home broadband, compared with 72 percent of urban households (US Department of Commerce 2013, 26).

Moreover, among community college students who take online courses, most take only one or two per semester, filling the remainder of their schedule with face-to-face courses (Jaggars 2012). In qualitative interviews, Virginia community college students enrolled in online courses—many of whom had children and full-time jobs—explained that one or two online courses per semester allowed them to maintain a full-time college schedule, which otherwise would be difficult or impossible (Jaggars, forthcoming). Still, very few of these students were interested in taking all their courses online. Similarly, in a recent Public Agenda (2013) survey of community college students taking online courses, respondents were more likely to say they wanted to take fewer classes online (41 percent) than more classes online (20 percent).

When we asked online students why they preferred to take some courses face-to-face, their responses implied that they did not learn the material as well online as they did face-to-face (Jaggars, forthcoming). Similarly, students in the Public Agenda (2013) survey were more likely to say that they learned less in an online course (42 percent) than that they learned more (3 percent). These students’ perceptions may accord with reality.

Student Success in Online Coursework

In 2010, a US Department of Education study synthesized the results of dozens of rigorous studies of online learning and concluded that online courses were just as effective as face-to-face courses in terms of student learning (Means et al. 2010). At the time, however, the research on online learning in college was largely confined to studies of well-prepared students attending selective universities who were enrolled in small online courses identified as particularly worthy of study (Jaggars and Bailey 2010). More recently, researchers have begun to consider the wider landscape of online coursework, and to disaggregate results among different types of students.

In particular, Di Xu and I recently conducted two large-scale studies examining outcomes for tens of thousands of students enrolled in hundreds of thousands of courses at fifty-seven community colleges in Virginia and Washington State. In Virginia, completion rates were 81 percent in face-to-face courses versus 68 percent in online courses; in Washington, completion rates were 90 percent for face-to-face versus 82 percent for online (Jaggars 2012). Students who completed an online section also tended to earn lower grades in the course than they would in a face-to-face section. For example, in math courses in Virginia, only 67 percent of online completers earned a C or better, compared with 73 percent of face-to-face completers (Xu and Jaggars 2011). Students who took more online courses were also less likely to graduate or transfer to a four-year school (Jaggars 2012).

Proponents of online learning suggest that the relatively high withdrawal rates and poor grades in online courses are due to the characteristics of the students who take them (see, for example, Howell, Laws, and Lindsay 2004). To test this hypothesis, we conducted analyses comparing courses taken by the same student, and found that the typical student performs less well in online courses than in face-to-face courses, even after controlling for course subject and difficulty (Xu and Jaggars, forthcoming). We also speculated that a student might choose online coursework during semesters when her life is more complicated—for example, when she is taking a higher credit load or employed more hours per week. Our investigations revealed that this does happen for some students; but even after taking those life circumstances into account, the negative effects of online learning remain (Xu and Jaggars 2013).

Faced with these findings, we wondered whether some types of students might actually perform better in online than in face-to-face courses, while others struggle. Looking separately at different types of students (based on ethnicity, gender, age, and previous academic performance) and different academic subject areas, we found that all subgroups tended to perform worse in online courses (Xu and Jaggars, forthcoming). However, some students—in particular, males, African American students, and students with lower levels of academic preparation—had much more difficulty in online courses than they did in face-to-face courses. These results are consistent with smaller-scale studies suggesting that the gap between online and face-to-face outcomes is wider among males, students with financial aid, those with lower prior grade point averages, and Hispanic students (Brown and Liedholm 2002; Coates et al. 2004; Figlio, Rush, and Yin 2013; Kaupp 2012). Thus the performance gaps that some demographic groups experience in face-to-face classrooms become even wider in online courses—a troubling finding for those concerned with educational equity.

When we turned to student age, we found more nuanced results (Xu and Jaggars, forthcoming). While both older and younger students performed more poorly in online courses than in face-to-face courses, the decline in performance among older students was not as strong as it was among younger students. As a result, the small performance gap between younger and older students began to flip: within face-to-face courses, older students’ dropout rates were 1 percentage point higher than those of younger students; however, within online courses, older students’ dropout rates were 1 percentage point lower than those of younger students. While older students still performed more poorly in online than in face-to-face courses, for this population a slight decrease in performance may represent a rational trade-off for the ability to enroll in more courses overall. For younger students, however, the academic costs may not be worth the added flexibility—particularly if the student is already struggling academically.

Supporting Diverse Students’ Learning and Achievement

In an attempt to understand why community college students tend to perform more poorly online, we conducted a qualitative study of twenty-three online courses in Virginia, interviewing faculty and a sample of enrolled students. Students told us that they received less instructor guidance, support, and encouragement in their online courses; as a result, they did not learn the material as well (Jaggars, forthcoming).

For highly confident, highly motivated, and high-achieving students, this relative lack of interpersonal connection and support may not be particularly problematic. However, low-income, ethnic minority, or first-generation students—that is, most community college students—are often anxious about their ability to succeed academically, and this anxiety can manifest in counterproductive strategies such as procrastinating, not turning in assignments, or not reaching out to professors for help (see, for example, Cox 2009). An array of studies suggests that instructors’ caring, connection, encouragement, and guidance are critical to help alleviate these students’ anxiety, build their academic motivation, and support their success (see, for example, Barnett 2011). Accordingly, one might suspect that in order to support diverse students’ learning and achievement, online courses need to incorporate stronger interpersonal connections and instructor guidance than most currently do.

How can instructors integrate such connection and guidance into their online courses? Certainly, it helps to keep online class sizes small. Certain technologies may also help. In our qualitative study, when students discussed the value of technologies such as video- or audio-taped narrations, they mentioned not only the technologies’ intrinsic value in building skills (e.g., by visually walking the student through specific activities), but also the fact that the technologies personalized the instructor, allowing the student to see the instructor’s image and hear the instructor’s voice (Jaggars and Xu 2013). Perhaps most importantly for students who lack confidence in their academic abilities, instructors in our study who expertly leveraged interactive technology tools did so in ways that made clear that they cared about their students.

Community college leaders recognize the importance of instructor connection and encouragement for their students, and thus many are skeptical of the “massive” nature of MOOCs. However, some are experimenting with applying MOOC content within a hybrid or “flipped” classroom model, in which students learn material on their own using an online interface, and then attend small-enrollment face-to-face class sessions to review and apply material with an instructor. To the extent that these flipped classrooms incorporate strong connections between the instructor and students, they could be as effective as—or even more effective than—traditional face-to-face courses. However, college administrators and practitioners must keep in mind that the content and activities that motivate students at the elite universities for which MOOC materials were initially designed may not motivate students at other colleges (see, for example, Bear 2013).

Overall, recent research on online education suggests that MOOCs might indeed improve access to college-level learning among technology-savvy working adults who hope to upgrade their skills. We do not yet have evidence, however, that such methods of delivery will improve both access and success among other traditionally underserved populations.

References

Barnett, Elisabeth A. 2011. “Validation Experiences and Persistence among Community College Students.” The Review of Higher Education 34 (2): 193–230.

Bear, Charla. 2013. “Is Online Education Widening the Digital Divide?” MindShift. http://blogs.kqed.org/mindshift/2013/08/is-online-education-widening-the-digital-divide/.

Brown, Byron W., and Carl E. Liedholm. 2002. “Can Web Courses Replace the Classroom in Principles of Microeconomics?” The American Economic Review 92 (2): 444–48.

Coates, Dennis, Brad R. Humphreys, John Kane, and Michelle A. Vachris. 2004. “‘No Significant Distance’ between Face-to-Face and Online Instruction: Evidence from Principles of Economics.” Economics of Education Review 23: 533–46.

Cox, Rebecca D. 2009. The College Fear Factor: How Students and Professors Misunderstand One Another. Cambridge, MA: Harvard University Press.

Figlio, David N., Mark Rush, and Lu Yin. 2013. “Is It Live or Is It Internet? Experimental Estimates of the Effects of Online Instruction on Student Learning.” Journal of Labor Economics 31 (4): 763–84.

Howell, Scott L., R. Dwight Laws, and Nathan K. Lindsay. 2004. “Reevaluating Course Completion in Distance Education.” Quarterly Review of Distance Education 5 (4): 243–52.

Jaggars, Shanna S. 2012. “Online Learning in Community Colleges.” In Handbook of Distance Education (3rd ed.), edited by Michael G. Moore, 594–608. New York: Routledge.

———. Forthcoming. “Choosing between Online and Face-to-Face Courses.” The American Journal of Distance Education 28 (1).

Jaggars, Shanna S., and Thomas Bailey. 2010. Effectiveness of Fully Online Courses for College Students: Response to a Department of Education Meta-Analysis. New York, NY: Community College Research Center, Teachers College, Columbia University.

Jaggars, Shanna S., and Di Xu. 2013. “Predicting Online Student Outcomes from a Measure of Course Quality.” Working Paper 57. New York, NY: Community College Research Center, Teachers College, Columbia University.

Kaupp, Ray. 2012. “Online Penalty: The Impact of Online Instruction on the Latino–White Achievement Gap.” Journal of Applied Research in the Community College 12 (2): 1–9.

Means, Barbara, Yukie Toyama, Robert Murphy, Marianne Bakia, and Karla Jones. 2010. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, DC: US Department of Education.

Public Agenda. 2013. What Employers and Community College Students Think about Online Education. New York, NY: Public Agenda.

US Department of Commerce. 2013. Exploring the Digital Nation: America’s Emerging Online Experience. Washington, DC: US Department of Commerce.

Xu, Di, and Shanna S. Jaggars. 2011. “The Effectiveness of Distance Education across Virginia’s Community Colleges: Evidence from Introductory College-Level Math and English Courses.” Educational Evaluation and Policy Analysis 33 (3): 360–77.

———. 2013. “The Impact of Online Learning on Students’ Course Outcomes: Evidence from a Large Community and Technical College System.” Economics of Education Review 37: 46–57.

———. Forthcoming. “Performance Gaps between Online and Face-to-Face Courses: Differences across Types of Students and Academic Subject Areas.” Journal of Higher Education.


Shanna Smith Jaggars is assistant director of the Community College Research Center at Teachers College, Columbia University.