Spring 2007, Vol 9, No. 2

Multiple Drafts of a College's Narrative

By Paul Sotherland, professor of biology and chair of the faculty assessment committee; Anne Dueweke, director of faculty grants and institutional research; Kiran Cunningham, professor of anthropology; and Bob Grossman, professor of psychology--all of Kalamazoo College


Writing a story about how well a college helps its students become better educated is an endless helix of "counting and recounting" (Shulman 2007), yielding a series of narratives that track a college's educational trajectory. When discussed openly, both within and among institutions, these iterative accounts gleaned from measures of student learning can improve undergraduate education by making it more transparent (Bok 2006). In this spirit, we offer part of Kalamazoo College's draft narrative as a case study, based on explorations of information from the Collegiate Learning Assessment (CLA) and the National Survey of Student Engagement (NSSE), and invite colleagues at other institutions to share insights from their own investigations.

Results from the CLA and NSSE can be enlightening, challenging, and affirming. Trying to understand our students' CLA performance has led us to examine features of our curriculum that might bring about changes we see in students between matriculation and graduation. In so doing, we are addressing questions, expressed by Hersh (2006), about how we might learn from the CLA. A similar approach to interrogating NSSE results revealed patterns that corroborated our hunches about variation in CLA data. Through these analyses we are finding that at least some of our students' experiences seem to have a "value-added" effect, and we are beginning to discern how this effect might be expanded to reach more students.

Performance of Kalamazoo College Students on the CLA

Through a grant from the Teagle Foundation, and as part of an assessment collaboration with Colorado College and Earlham College, we administered the CLA to first-year students and seniors during the 2005--6 academic year. Our first-years' mean CLA performance fell at the 80th percentile (the lower end of the "at expected" range), even though their mean SAT scores were at the 92nd percentile relative to other first-years who took the CLA in 2005--6. Our seniors' mean CLA performance fell at the 99th percentile (the upper end of the "above expected" range), whereas their mean SAT scores were at the 92nd percentile relative to other seniors who took the CLA. The "value-added" of a Kalamazoo College education (mean senior CLA score minus mean first-year CLA score) was "well above expected."

Two questions guided our examination of these CLA results: (1) What attributes of a Kalamazoo education might account for this overall performance? (2) What variations in students' educational pathways might account for differences in CLA performance at Kalamazoo? To explore these questions we employed several approaches, including comparing "typical" indicators of students' academic abilities (i.e., GPA and SAT) to CLA performance, disaggregating CLA scores by academic division, performing similar analyses of NSSE data, and interviewing students about their college experiences.

Indicators of Academic Ability and CLA Scores

We began with the easiest comparisons, looking for correlations of CLA performance with SAT scores and cumulative GPAs. CLA scores of both first-years and seniors were positively, but weakly, correlated with SAT scores (r = 0.37 and 0.24, respectively) (fig. 1). Similarly, our seniors' cumulative GPAs showed a weakly positive correlation with CLA scores. Thus, students across a range of "abilities" performed well (and not as well) on the CLA, suggesting that students selected for admission to an institution perhaps should be those most likely to thrive in the college's environment and not just those with the (presumably) highest academic ability. However, to find out why some students seemed to thrive more than others, as measured by the CLA, we had to dig deeper.

[Figure 1. CLA scores plotted against SAT scores for first-year students and seniors]
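For readers who wish to replicate this first step with their own data, the following is a minimal sketch of the correlation check in Python, using SciPy's Pearson correlation; the score lists are hypothetical placeholders, not Kalamazoo's data.

# Minimal sketch of the correlation check described above.
# The score lists are hypothetical placeholders, not actual Kalamazoo data.
from scipy.stats import pearsonr

sat_scores = [1180, 1260, 1310, 1390, 1450]   # hypothetical SAT scores
cla_scores = [1120, 1230, 1190, 1400, 1380]   # hypothetical CLA scores

r, p_value = pearsonr(sat_scores, cla_scores)
print(f"r = {r:.2f}, p = {p_value:.3f}")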

A Disaggregated View of Kalamazoo's CLA Performance

In post-CLA surveys and interviews, our seniors described educational experiences that they believed contributed to their CLA performance, but our attempts to identify predictors of CLA performance through analyses of academic transcripts and comparisons of scores by academic division revealed little about what might cause some students to perform well and others to perform less well. While acknowledging that these analyses probably suffer from our small sample size and that the CLA was designed to yield one aggregated score for each institution, we were disappointed with our lack of insight.

Because CLA scores tend to increase with higher SAT scores (as illustrated in fig. 1 and in the CLA Institutional Report; see www.kzoo.edu/ir), we needed to account for variation in SAT scores when interpreting the CLA performance of our students. So, instead of using actual CLA scores (i.e., scores earned by students), we computed "adjusted" CLA scores (AdjCLA) by calculating each student's "expected" CLA score using the equation from the interinstitutional regression of CLA score on SAT score (CLA = 0.69(SAT) + 448), and then subtracting it from that student's actual score (AdjCLA = Actual CLA - Expected CLA). Thus, a student with a positive AdjCLA had a CLA score above the interinstitutional regression line and a student with a negative AdjCLA had a CLA score below the interinstitutional regression line. Adjusting CLA data in this way presumably attenuates variation in CLA scores attributable to variation in SAT scores and thereby exposes other potential sources of variation in CLA scores, such as educational experiences. This method of identifying students who "over-performed" and "under-performed" on the CLA revealed interesting patterns.
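As a worked illustration of this adjustment, here is a minimal sketch in Python. The regression coefficients come from the interinstitutional equation above; the sample student values are hypothetical.

def adjusted_cla(actual_cla, sat):
    """Return the actual CLA score minus the CLA score expected from SAT."""
    expected_cla = 0.69 * sat + 448      # interinstitutional regression line
    return actual_cla - expected_cla     # positive = above the line, negative = below

# Hypothetical student: SAT of 1300, actual CLA score of 1400
print(adjusted_cla(actual_cla=1400, sat=1300))   # 55.0, i.e., above the regression line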

We created three categories similar to those used in the institutional report for grouping institutional scores-- "below expected" (AdjCLA more than one standard error below "expected" CLA), "at expected" (within one standard error below or above "expected"), and "above expected" (more than one standard error above "expected")--and sorted student CLA performance into these groups. (We used data from the interinstitutional regresion for these analyses because the "nationally normed individual regression" data were unavailable to us, so this was the best available and most consistant way to explore variations in students' CLA performance.) The mean SAT score of students in the "below expected" group was about 5 percent greater than the mean SAT score of students in the "above expected" group, but we found no statistically significant differences among SAT scores of the students in the three groups. However, "above expected" students had CLA scores that were 24 percent greater than those of "below expected" students, and CLA scores varied significantly among all three groups. And we were pleasantly surprised to discover seemingly "less capable" students (i.e., those with SATs and GPAs below the college mean) among those in the "above expected" group with high actual CLA scores. Thus, something more than intellectual ability, as measured by the SAT, seems to have led to high CLA performance for some students. With this new way of looking at students' performance, we set out once again to look for patterns. This time, we had more success.
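The grouping itself is simple to express. The sketch below follows the one-standard-error rule described above; the standard error value (se) is an assumed input, since the figure we used is not reproduced here.

def performance_group(adj_cla, se):
    """Classify an adjusted CLA score relative to one standard error of the regression."""
    if adj_cla < -se:
        return "below expected"
    elif adj_cla > se:
        return "above expected"
    return "at expected"

# Illustrative use with a hypothetical standard error of 80 points
for score in (-120, 10, 95):
    print(score, "->", performance_group(score, se=80))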

At Kalamazoo College, CLA performance seems to vary with the academic division in which students majored. Adjusted CLA scores differed significantly among divisions, even though actual CLA scores did not, with students in the natural sciences having the lowest AdjCLA. This observation is corroborated by the distribution of students among the three performance categories. The natural sciences showed a bimodal distribution (fig. 2), with eight "below expected," three "at expected," and eleven "above expected" scores, whereas all other divisions showed unimodal distributions, with the vast majority of scores in the "at expected" and "above expected" ranges. The bimodal distribution in the natural sciences led to hypotheses about causes for the "below expected" performance of some science majors and prompted us to examine NSSE results more closely.

[Figure 2. Distribution of student CLA performance categories by academic division, showing the bimodal pattern in the natural sciences]

Interdivisional Differences in NSSE Performance

We hypothesized that student engagement in "programs and activities that institutions provide for their learning and personal development" (nsse.iub.edu/html/quick_facts.cfm) would correlate positively with CLA scores. However, data from seniors who completed both the NSSE and the CLA (n = 48) revealed no significant correlations between any measures of engagement (benchmarks or individual questions) and performance on the CLA. In retrospect, these results are not surprising given that NSSE data are self-reported whereas CLA data are direct measures of abilities. And our analyses again probably suffer from the small sample size and a relatively homogeneous group of students. (Homogeneity, in this case, is in terms of experiences--for example, all Kalamazoo students complete a language requirement, take comprehensive examinations, and complete a senior project, and over 80 percent study abroad.) However, our success with comparing adjusted CLA scores among academic divisions led us to perform similar analyses of NSSE data from a larger sample of seniors.

We reexamined data from all seniors who took the NSSE in 2005--6 (the response rate was 76 percent) by comparing responses from students majoring in each of the five academic divisions. We found that the "Level of Academic Challenge" (LAC) benchmark differed significantly among divisions. The LAC "score" for natural sciences was significantly lower than scores for humanities and for social sciences, prompting us to examine responses to each question comprising this benchmark. Students in humanities and social sciences scored significantly higher than students in natural sciences in three areas: (1) number of written papers between five and nineteen pages; (2) number of assigned textbooks; and (3) making judgments about the value of information. If these responses truly highlight different experiences of students in these disciplines, then we might be seeing reasons for interdivisional differences in CLA performance and possibilities for improving our curriculum. Students who write well and who have had more experience making judgments about the value of information would theoretically perform better on the CLA.
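As a hedged sketch of the kind of among-division comparison described here, one could run a one-way ANOVA on benchmark scores, as below; the division labels and Level of Academic Challenge values are hypothetical placeholders rather than Kalamazoo's data.

# Illustrative among-division comparison using a one-way ANOVA in SciPy.
from scipy.stats import f_oneway

# Hypothetical Level of Academic Challenge (LAC) benchmark scores by division
lac_humanities       = [62, 68, 71, 65, 70]
lac_social_sciences  = [60, 66, 69, 64, 67]
lac_natural_sciences = [52, 55, 58, 54, 57]

f_stat, p_value = f_oneway(lac_humanities, lac_social_sciences, lac_natural_sciences)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")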

Insights from Student Interviews

Interviews with Kalamazoo seniors provide additional information about the effects of various educational experiences. Students in a qualitative research methods course administered, transcribed, and analyzed interviews with thirty-one seniors who took the CLA. Examining the interview transcripts from students with high CLA scores and students with low CLA scores revealed intriguing intergroup differences that corroborate insights gained from examining disaggregated CLA and NSSE scores. The following "patterns" emerged: foreign language proficiency seemed to correlate positively with CLA scores; students who used phrases like "personal initiative" generally did better on the CLA; and some science majors seemed to get "lost" in their major, but those who did explore other disciplines tended to do well on the CLA.

The interviews also caused us to wonder about transformational learning at Kalamazoo. We are intrigued by Kiely's (2006) finding that transformational learning may be catalyzed by experiences of "high-intensity dissonance" that essentially force students to change the parameters of their thinking. We wonder if Kalamazoo's distinctive focus on integrated, experiential learning might provide students not only with many opportunities to encounter high-intensity dissonance, but also with critically important structures for processing these experiences so that transformational learning is captured. In the interviews we found evidence of transformational learning occurring through, for example, challenging courses, service learning, and long-term, immersive study abroad programs. Moreover, the interviews suggest that students who perform well on the CLA might be those with the confidence, initiative, and (with regard to study abroad) language ability to place themselves in situations where they not only experience high-intensity dissonance, but experience it in such a way that they develop habits of mind that help them perform well in situations like those encountered on the CLA.

Preliminary Inferences

Clearly, a college education enhances critical thinking, analytical reasoning, and effective writing, and the trajectories students take through that education seem to affect the degree to which those abilities develop. Although small sample sizes preclude our reaching definitive conclusions about factors affecting CLA performance, at this point in our explorations we surmise the following: a high "value-added" education emphasizes all skills measured by the CLA and creates opportunities for students to experience, reflect on, and learn from "high-intensity dissonance." Analytical reasoning and critical thinking are essential for performing well on the CLA, but without effective writing students cannot fully demonstrate those skills.

Several questions remain. What causes some "high-ability" students to under-perform on the CLA, and what experiences help students with "weaker" academic records perform above expected? Could it be that some natural science students do not get as much practice with writing as students in other divisions (as noted in Bok 2006), and are therefore unable to demonstrate their abilities to think and reason on the CLA? If encounters with "high-intensity dissonance" bring about developmental leaps, how do we ensure that all students benefit from those experiences? Moreover, what are the conditions under which encounters with high-intensity dissonance actually lead to transformational learning? And how can we best use lessons learned from investigations like those described here to inform curricular decisions?

Data and stories from assessment of student learning provide "ground truth" that allows our heads to believe what our hearts tell us. We in the academic realm live, at some level, in the cerebral sphere of influence that makes us skeptical of hunches born outside of our heads. And yet, we "know" in our hearts--from noticing changes in demeanor, new twinkles in eyes, and more conviction in voices--that we effect significant growth in our students. Assessment of student learning helps cause the spheres of the head and heart to fuse into a powerfully convincing whole. Through that fusion, we find affirmation of the learning that takes place during college and develop the impetus for writing the next draft of our institution's narrative.


References

Bok, D. 2006. Our underachieving colleges: A candid look at how much students learn and why they should be learning more. Princeton, NJ: Princeton University Press.

Hersh, R. H. 2006. What now? What can we do once we have the CLA results? www.cae.org/content/pro_collegiate_reports_publications.htm

Kiely, R. 2006. A transformative learning model for service-learning: A longitudinal case study. Michigan Journal of Community Service Learning 12 (1): 5--22.

Shulman, L. S. 2007. Counting and recounting: Assessment and the quest for accountability. Change 39 (1): 20--25.
