Using Adaptive Learning Courseware as a High-Impact Practice to Improve Students’ Learning Outcomes in General Chemistry II at an HBCU
The Clark Atlanta University (CAU) Department of Chemistry offers general chemistry as a high-enrollment foundational course for STEM majors, the majority of whom are biology majors. Two of the authors have been coteaching general chemistry (Chem I and Chem II) for several years. Despite our employing various pedagogical approaches to improve teaching and learning, these classes continue to yield low pass rates—Chem I varies between 40 and 65 percent, and Chem II hovers around 70 percent—a result that has been emotionally draining at the end of each semester. In this article, we present the development, implementation, and findings from a pilot of a redesigned Chem II course that incorporated adaptive learning courseware (ALC) as part of our efforts to improve learning, retention, and graduation rates of STEM majors.
CAU is a Historically Black College and University (HBCU) that offers bachelor of science and master of science degrees in biology, chemistry, computer science, mathematics, and physics, and doctoral degrees in biology and chemistry. With an enrollment of approximately four thousand students, CAU is the largest private institution among the HBCUs in the state of Georgia. Sixty-one percent of our undergraduate students are from low-income families making $48,000 or less, 70 percent are eligible for Pell Grants, and 35 percent are first-generation students.
CAU, like most institutions of higher education, struggles with low retention rates of undergraduate STEM majors. General chemistry is one of the key courses that pose a significant barrier to success for STEM majors. Though CAU has averaged a pass rate of approximately 70 percent in Chem II over the past three spring semesters, Chem I historically has much lower pass rates (43 percent in fall 2015, 64 percent in fall 2016, and 47 percent in fall 2017). Therefore, it is critical that we continue to assess the quality of our instructional delivery and wraparound support services and examine new approaches to improve students’ learning outcomes. CAU chemistry students have continuously expressed the need for active learning and real-time assistance in identifying areas where they are having challenges with problem solving. The size of classes, the inordinate amount of time required of the instructor(s), the shortage of qualified tutors and teaching assistants, and the cost of such assistance make it nearly impossible to provide adequate real-time personalized interaction for our students.
Digital learning can overcome these limitations. Students today are digital natives, and higher education must embrace the differentiation of instruction for individual students that digital learning enables. Digital courseware is becoming increasingly available as online homework systems grow more sophisticated, with “adaptive learning” (adaptive-responsive) technology that can provide instruction tailored to each student’s needs. In fact, adaptive learning is accepted as one of the three components of the Persistence Framework, the benchmark among best practices for increased retention of STEM majors (Graham et al. 2013). Digital and adaptive learning technology is being implemented at many predominantly White institutions, as demonstrated by the Personalized Learning Consortium of the Association of Public and Land-Grant Universities (2018). HBCUs must also place themselves at the forefront in leading this digital movement to improve student learning outcomes and increase retention and graduation rates, particularly for African American STEM students.
It is against this background that we conducted an active learning project (ALP) with support from the Center for the Advancement of STEM Leadership, which is funded by the National Science Foundation, to introduce ALC in our general chemistry sequence and measure its impact on learning as well as on students’ perceptions of the learning platform. This ALP is part of an institutional initiative called Course Redesign with Technology (CRT), which is supported by the CAU Office of Academic Affairs to integrate innovative digital and adaptive courseware into the curricula to increase student learning, retention, and degree completion rates. The conceptual framework for this ALP is grounded in the three elements of the Association of American Universities Framework for Systemic Change in Undergraduate STEM Teaching and Learning: (1) pedagogy, (2) scaffolding, and (3) culture change (2017). The implementation of ALC as part of the CRT provided scaffolding, an evidence-based technique representing the pedagogical underpinning of the ALP. Scaffolding refers to the support necessary to first incubate and then sustain this evidence-based teaching. Concurrent with the development of this ALP, course redesign with ALC was also undertaken in core courses in biology and mathematics.
All elements of Bolman and Gallos’s (2011) four frames of leadership—structural, human resource, political, and symbolic—were employed in the development, implementation, and institutionalization of this ALP. In this phase of the project, the political and symbolic frames were paramount for addressing faculty and student buy-in. The political frame required us to be compassionate leaders working with an intensely political aspect of academic life as advocates, power brokers, and strategists who engaged in setting agendas, building coalitions, and managing conflicts. Having been introduced to adaptive learning pedagogy based on artificial intelligence, we were easily convinced that it warranted exploration for enhancing student learning. However, some colleagues remained skeptical and viewed this as a futile effort to implement a pedagogical approach that they assumed had not been proven effective. We decided that careful execution of a well-designed project was an important first step toward successfully maneuvering in this political minefield and simultaneously demonstrating symbolic leadership. Following our commitment to redesign Chem II, we established collaboration and communication with our colleagues who were redesigning courses in biology and mathematics in order to build a strong STEM coalition. Increasing student engagement was also crucial to earning students’ buy-in. We communicated to students the importance of the adaptive-learning component of their course for improving their learning outcomes and explained that the courseware would become a portion of their graded assignments.
Adaptive Learning Courseware
Online learning platforms have now altered and augmented learning. However, despite approximately 80 percent of US households owning at least one desktop or laptop computer (File and Ryan 2014; Pew Research Center 2017), educational technology has not met its potential for improving educational outcomes (i.e., test scores), especially in mathematics (National Center for Education Statistics, n.d.; Program for International Student Assessment 2015; Jackson and Kiersz 2016). Yet, there is an encouraging shift on the horizon for two reasons. The first is that educational technology is increasingly able to interact with students in sophisticated ways; the second is the experience of a growing number of schools, like the Khan Lab School, which is not just bolting technology onto the existing way of doing things but is also using new software to change how pupils and teachers spend their time (Hamer 2014; Office of Educational Technology 2012).
Colleges and universities must make changes in instructional delivery by including ALC to increase student retention of knowledge and skills, which will decrease the number of students who fail to master foundational STEM concepts. Newer programs use machine learning to find student-specific patterns of strengths and deficiencies. Key vendors include Assessment and Learning in Knowledge Spaces (ALEKS), Knewton, CogBooks, and DreamBox Learning. These companies use AI techniques to deliver personalized instruction, replacing the one-size-fits-all traditional learning model.
Knewton, used in our Chem II course, is an adaptive learning platform that powers digital education based on a proficiency model that is used to “infer each student’s knowledge state” (Binger 2018). This is accomplished by combining a “knowledge graph,” time-tested psychometric models, and additional pedagogically motivated models. The foundation for the proficiency model is an educational testing theory known as Item Response Theory (IRT). One important aspect of IRT is that it accounts for network effects (the system learns more about the content and the students as more people use it), leading to continually better student outcomes. In addition, it incorporates features like temporality (newer responses are weighted more heavily than older ones), instructional effects (subject matter content read in the system), and multiple concepts (and their interrelationships) in the knowledge graph.
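To make the IRT foundation concrete, the sketch below shows a textbook two-parameter logistic (2PL) IRT model and a simple grid-search maximum-likelihood estimate of a student’s proficiency from a handful of responses. This is a minimal classroom illustration of the general IRT idea, not Knewton’s proprietary proficiency model; the item parameters and responses are invented for the example.

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability that a learner with proficiency theta answers
    an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(items, responses):
    """Grid-search maximum-likelihood estimate of proficiency theta from
    observed responses (1 = correct, 0 = incorrect) to items with known
    (discrimination, difficulty) parameters."""
    grid = [x / 10.0 for x in range(-40, 41)]  # candidate theta in [-4, 4]
    best_theta, best_ll = grid[0], float("-inf")
    for theta in grid:
        ll = 0.0
        for (a, b), r in zip(items, responses):
            p = p_correct(theta, a, b)
            ll += math.log(p if r else 1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

# Invented example: the student answers the two easier items correctly
# and the two harder items incorrectly, so the estimate should land
# between the difficulties of the hardest item passed and easiest missed.
items = [(1.0, -1.0), (1.0, 0.0), (1.0, 1.0), (1.0, 2.0)]
responses = [1, 1, 0, 0]
theta_hat = estimate_theta(items, responses)
```

An adaptive system built on this idea would re-estimate theta after each response and select the next item whose difficulty is most informative near the current estimate.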
Chem I and II are coordinated across years, sections, and instructors. Both are four-credit-hour courses comprising lectures, recitation, homework problems, and laboratories. In the spring 2018 semester, we piloted the redesigned Chem II by incorporating Knewton in one of two sections of the course. There was a common syllabus for both sections, no change in course content, and a common final exam. The Chem II class met three days per week for fifty minutes for traditional lectures and once a week for a ninety-minute recitation during which students engaged in problem-solving exercises under the guidance of a professor and at least one teaching assistant. Fifty-one students enrolled in the pilot section, forty-four of whom completed the course. Thirty Knewton assignments were generated spanning six topical areas worth 10 percent of students’ final grades.
We used a quasi-experimental, interrupted time-series design in which grades were compared between students who used Knewton (Chem II in spring 2018) and students who did not use it in the three prior semesters (Chem II in spring 2015, 2016, and 2017). The study was approved by the CAU Institutional Review Board (IRB number HR2017-11-760-1). To examine the relationship between mastery attained in Knewton and final grades in the course, the researchers calculated a Pearson correlation coefficient, since the scatterplot revealed a linear association between the two variables. Finally, a confidential web-based survey was administered during the last week of the semester to capture students’ attitudes and perceptions of the adaptive learning intervention. The student perception survey included eleven Likert-type statements and five free-response questions.
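The correlation analysis described above can be sketched in a few lines of standard-library Python. The data below are hypothetical values invented purely for illustration, not the study’s actual records; the function itself is the ordinary Pearson product-moment formula.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical illustration only: percent mastery attained in the
# courseware paired with final course averages for eight students.
mastery = [95, 80, 60, 90, 40, 70, 85, 55]
final_grade = [92, 84, 70, 88, 58, 75, 81, 66]
r = pearson_r(mastery, final_grade)
```

With n paired observations, significance can then be judged from the usual t statistic, t = r * sqrt((n - 2) / (1 - r**2)), on n - 2 degrees of freedom, which is how a result such as r(42) with n = 44 students is reported.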
Differences in Grade Distribution
Figure 1 compares outcomes among Chem II students for the past four spring semesters. The spring 2018 pass rate (70.6 percent) was similar to the average (70.3 percent) of the prior three spring semesters. Among students passing the course, there was a significant increase in the percentage of Bs earned (55.6 percent) in the redesigned course compared to the baseline courses (an average of 11.1 percent) and a concomitant decrease in the percentage of Cs (33.3 percent in spring 2018 compared with 82.9 percent in the baseline courses).
The Relationship between Mastery Attained in Knewton and Final Grades
Students averaged 13.4 hours engaging in the Knewton adaptive-learning platform during the semester and, as shown in figure 2, there was a strong correlation between mastery attained in Knewton (defined as students attaining a 90 percent correct response rate on a series of questions in an individual assignment) and students’ final course grades (Pearson correlation coefficient r(42) = 0.77, p < 0.05). This is consistent with students’ survey responses, in which 47 percent agreed that Knewton helped them earn a better grade in Chem II than they would have received without access to the system.
Students’ Perceptions of Knewton
There was a 98 percent response rate for the end-of-semester survey. Most respondents were first-year students and biology majors, and 72 percent were female. Forty-nine percent of respondents indicated that they generally enjoy chemistry, 47 percent found the questions and activities in Knewton to be interesting, and 46 percent indicated that they enjoyed the material that they were studying in Knewton. Forty-seven percent indicated that they spent more time studying for Chem II than for Chem I because they were able to use Knewton. Fifty-four percent indicated that they understood the material in Chem II because they used Knewton, and 72 percent indicated that the questions in Knewton were relevant to what they were learning in the classroom. Of the thirty-seven respondents to the question, “What do you think was the purpose of Knewton in this course,” students indicated that it was to assist in getting a better understanding (sixteen respondents), expanding knowledge (three respondents), or serving as a study aid and providing extra practice (thirteen respondents) for the course material. Twenty-six students indicated that Knewton was beneficial to their learning, whereas nine indicated that it was not, for various reasons (but mainly because of the extended time that was often required to complete the assignments). In general, the feedback indicates that Knewton helped in understanding and learning but that it takes an extended time to complete and attain mastery of assignments.
Our course redesign with ALC did not produce an increase in the percentage of students passing the course; however, it led to substantial progress toward mastery of Chem II concepts among students who successfully completed the class. Most students were initially dismissive of the use of Knewton in the redesigned Chem II, and they preferred to focus on completing Canvas-based assignments to which they had become accustomed in Chem I in fall 2017. Full engagement with Knewton gradually increased, driven by the instructor’s frequent reminders that it contributed significantly to the final grade. Ultimately, students developed a positive response to Knewton and indicated that, had it been available for Chem I, they would have utilized it more consistently and effectively.
Based on the results of this pilot course, the Department of Chemistry implemented redesigned Chem I and Chem II courses incorporating ALC in fall 2018. Chem I students engaged with the system for an average of 43.5 hours (a 225 percent increase) versus the 13.4 hours in the piloted course. Knewton has improved the analytics available to faculty, which allows increased personalized responses to students on designated topics and adjustments to classroom lectures and recitation sessions to better address students’ areas of deficiency. We are now training the staff in our living and learning centers (campus housing) on how to use the ALC analytics so that they can encourage their residents—our students—to further engage with the ALC toward improving student learning outcomes, retention, and degree completion rates.
This work was supported in part by the Association of Chief Academic Officers Digital Learning Initiative sponsored by the Bill and Melinda Gates Foundation, the United Negro College Fund Career Pathways Initiative, and the National Science Foundation through the Historically Black Colleges and Universities-Undergraduate Program (HBCU-UP).
Association of American Universities. 2017. Framework for Systemic Change in Undergraduate STEM Teaching and Learning. https://www.aau.edu/sites/default/files/STEM%20Scholarship/AAU_Framework.pdf.
Association of Public and Land-Grant Universities. 2018. “Projects and Initiatives: Personalized Learning Consortium.” https://www.aplu.org/projects-and-initiatives/personalized-learning-consortium/index.html.
Binger, Michael. 2018. “How does Knewton’s Proficiency Model Estimate Student Knowledge in Alta?” Knewton Blog. April 2, 2018. https://www.knewton.com/blog/mastery/how-does-knewtons-proficiency-model-estimate-student-knowledge-in-alta.
Bolman, Lee G., and Joan V. Gallos. 2011. Reframing Academic Leadership. San Francisco: Jossey-Bass.
File, Thom, and Camille Ryan. 2014. Computer and Internet Use in the United States: 2013. Washington, DC: United States Census Bureau. https://www.census.gov/history/pdf/2013computeruse.pdf.
Graham, Mark J., Jennifer Frederick, Angela Byars-Winston, Anne-Barrie Hunter, and Jo Handelsman. 2013. “Increasing Persistence of College Students in STEM.” Science 341 (6153): 1455−1456.
Hamer, Irving. 2014. “11 Ways to Make Data Analytics Work for K12.” Education Week, October 14, 2014. https://www.edweek.org/ew/articles/2014/10/15/08hamer.h34.html.
Jackson, Abby, and Andy Kiersz. 2016. “The Latest Ranking of Top Countries in Math, Reading, and Science Is Out—and the US Didn’t Crack the Top 10.” Business Insider, December 6, 2016. https://www.businessinsider.com/pisa-worldwide-ranking-of-math-science-reading-skills-2016-12.
National Center for Education Statistics. n.d. “Fast Facts, International Comparisons of Achievement.” Accessed May 22, 2019. https://nces.ed.gov/fastfacts/display.asp?id=1.
National Science Foundation. n.d. “STEM Education Data Trends. How Do U.S. 15-year-olds Compare with Students from Other Countries in Math and Science?” Accessed May 22, 2019. https://www.nsf.gov/nsb/sei/edTool/data/highschool-08.html.
Office of Educational Technology. 2012. Enhancing Teaching and Learning through Educational Data Mining and Learning Analytics: An Issue Brief. Washington, DC: US Department of Education. https://tech.ed.gov/wp-content/uploads/2014/03/edm-la-brief.pdf.
Pew Research Center. 2017. “A Third of Americans Live in a Household with Three or More Smartphones.” Fact Tank, May 25, 2017. https://www.pewresearch.org/fact-tank/2017/05/25/a-third-of-americans-live-in-a-household-with-three-or-more-smartphones/.
Program for International Student Assessment. 2015. “Excellence and Equity in Education.” OECD Publishing. https://www.oecd.org/education/pisa-2015-results-volume-i-9789264266490-en.htm.
Conrad W. Ingram, Assistant Professor, Department of Chemistry; Eric Mintz, Professor, Department of Chemistry; Daniel Teodorescu, Director of Unit Assessment, Department of Educational Leadership—all of Clark Atlanta University