Tool Kit Resources: Campus Models & Case Studies

Hamline University

Minnesota Collaborative Builds Campus Cultures of Assessment

In the summer of 2014, representatives from ten Minnesota institutions came together with a singular purpose: to build and strengthen a culture of assessment on their campuses.

These campuses formed the Minnesota Collaborative as an offshoot of the Multi-State Collaborative to Advance Learning Outcomes Assessment (MSC)—a partnership among the Association of American Colleges and Universities (AAC&U), the State Higher Education Executive Officers (SHEEO), twelve state higher education systems, and eighty-eight public institutions in those states. Unlike the MSC, which only includes public institutions, the Minnesota Collaborative includes private and public, four-year and two-year institutions.

Participating Institutions

Four-Year Private Institutions

  • Augsburg College
  • Gustavus Adolphus College
  • Hamline University
  • St. Olaf College
  • The College of St. Scholastica

Four-Year Public Institutions

  • University of Minnesota-Morris
  • St. Cloud State University
  • Southwest Minnesota State University

Two-Year Public Institutions

  • Minneapolis Community and Technical College
  • North Hennepin Community College (2014–2015)
  • Inver Hills Community College (2015–Present)

Faculty, staff, and administrators from the ten institutions gathered for their first workshop in the summer of 2014, where they began brainstorming strategies for collecting, assessing, and distributing artifacts across their campuses using AAC&U’s VALUE rubrics—free downloadable rubrics that guide institutions in evaluating student learning and performance on certain learning outcomes.

One of the first tasks at that meeting was to identify which VALUE rubrics the project would focus on. Using a shared white board, representatives indicated the rubrics that were most relevant to their campuses. This exercise allowed participants “to identify our priorities for the collaborative based on where the majority of our institutions already had some of those outcomes integrated into our curriculum,” said Caroline Hilk, director and faculty development coordinator of the Center for Teaching and Learning at Hamline University.

Hilk added that the exercise involved not just identifying “what all of us were already doing,” but also “noticing what all of us were not doing well and picking that up [and saying], . . . ‘This looks like a gap on our campus and other campuses, and we want to try to figure out a way to do this in a meaningful way and then share it back with the other members of the Minnesota Collaborative.’”

After their initial meeting, the collaborative agreed to (1) choose six LEAP Essential Learning Outcomes and their associated VALUE rubrics (Civic Engagement, Critical Thinking, Ethical Reasoning, Intercultural Knowledge and Competence, Quantitative Literacy, and Written Communication); (2) collect authentic artifacts from their campuses that related to those outcomes; (3) send faculty and administrators to training sessions in scoring VALUE rubrics; and (4) hold annual meetings to discuss and brainstorm methods of improving assessment on campus.

A report on initial data from AAC&U’s VALUE initiative, including the Minnesota Collaborative, was released in February 2017.

A Diversity of Voices

According to Mike Reynolds, associate dean in the College of Liberal Arts at Hamline University, the mix of institution types in the Minnesota Collaborative, their different stages of implementing assessment initiatives, and the range of representatives who attend meetings (including provosts, assistant provosts, institutional researchers, assessment committee members, and faculty) all contribute to an invaluable diversity of voices.

“I do think we were like the fabled blind men circling the elephant,” Reynolds said. “We came at this from slightly different ways of thinking about what this thing was, so what was exciting in the initial meetings was having different sorts of institutional representatives from very different sorts of institutions say, ‘This is what VALUE means to us, or this is how we’re thinking about it.’”

Hilk agreed. “It’s a professional development opportunity for all of us,” she said, “in terms of hearing the perspectives from directors of institutional research, directors from centers on teaching and learning, faculty who are heading up certain efforts on their campus. And when we come together to have some of these conversations, we get to see how the decisions made by an assessment committee affect how institutional research is able to collect or analyze the data, and those individuals are able to see . . . how that data are likely to be used in conversations [with faculty] about curriculum and development. And so to me, it’s a rare opportunity for all of these different players across different institutions to gather and talk about something as important as learning outcomes.”

Bringing Innovations Back to Campus

Hamline University (Private, Four-Year)

For many participating institutions, the most valuable aspect of the collaborative was the opportunity to hear about how other institutions implemented their assessment programs.

“I think one of the things that has been so valuable about [this collaborative] is it’s always embedded in what we’re already doing on our campus,” Hilk said, “and it’s finding ways to have the VALUE project complement what we’re doing.”

“Written communication matters to everybody, and we all had our own rubrics that were relatively robust compared to [those measuring] some of the other general education skills,” Reynolds added. “Nonetheless, we assumed different things about how to evaluate it and what we expected from our students, and that [conversation] became really invaluable.”

Hilk also mentioned that the Minnesota Collaborative meetings have led to a university-wide revamping of “our general education, which is called the Hamline Plan.” Hamline was struggling with how to identify artifacts to assess using the rubrics, each of which included many different components. “We were able to, in essence, prioritize the six rubrics that we were using as part of the Minnesota Collaborative and say, ‘This year for us is the year of diversity and civic engagement,’” Hilk said. “We are hopeful that the information, the training, the faculty development we’re doing on campus is going to . . . influence what we send on to [the national evaluation team], and then when we get that data back from AAC&U, we’re going to be able to provide another layer of depth to the assessment work that we’ve done on campus.”

Reynolds added that it was also valuable to learn from schools that had more experience with assessment. “I’ll also say that we are greener than [St. Olaf College] and when we began, their faculty director of assessment was very useful for us to think about how they had built this robust assessment structure that moved from pretty effective program-level assessment to a cross-program general education system, which we were just trying to figure out how to do. And those first conversations with them really informed us.”

St. Olaf College (Private, Four-Year)

“We have a pretty robust program that’s intended to be very comprehensive in covering the academic programs as well as the GE [general education] curriculum on an annual basis,” said Laura Maki, associate director of educational research and assessment at St. Olaf College, a private four-year institution. “And so we do collect quite a bit of data on what faculty are doing either individually in their classrooms or in their programs or as academic departments.”

And, like Hamline, “annually we have a focus. This year is majors, and every academic program with a major will undertake an assessment project of their choosing.”

However, she added that faculty often use data that the college has collected over the last eight or nine years to look at programs “more holistically” and evaluate staffing, course sequencing, or program-wide curricula. Maki said that these faculty might say, “OK, this is what we know we’re seeing in the program, what experiences our students are having, and we’re going to make this change based on the data. . . . And that’s pretty empowering. . . . The faculty in those programs have a lot of flexibility to do work that improves student learning at that program level.”

Maki specifically cited the Math, Statistics, and Computer Science department, which began using a statistical software manual “because they realized that students didn’t have quite the competence level at using this software as they wanted [them] to have when they graduated. Or, Chemistry started using a lab manual that was really important for safety procedures in their chemistry labs.”

Inver Hills Community College (Public, Two-Year)

In the spring of 2016, the Minnesota Collaborative met at Normandale Community College, a two-year college located in Bloomington, Minnesota. The meeting featured an assessment charrette by Laura M. Gambino, professor and faculty scholar for teaching, learning, and assessment at Guttman Community College in New York, in which faculty shared and provided feedback about authentic assignments from their classes.

Steven Hartlaub, Spanish and French instructor and chair of one of the two assessment subcommittees at Inver Hills Community College, adapted (with his subcommittee’s help) Gambino’s half-day presentation into a one-hour assessment charrette and offered it as a companion to an “assessment salon” held by Carrie Naughton, mathematics instructor and head of Inver Hills’s other assessment subcommittee.

In Hartlaub’s charrette, faculty shared assignments from their courses and offered advice on improving them to better assess student learning. Hartlaub tailored charrette questions toward “what we wanted to accomplish on campus, which was to give faculty the opportunity to provide each other feedback regarding their assignments and to better align them with the VALUE rubrics.” 

In her one-hour salon, Naughton asked four faculty members who used assessment throughout their courses to discuss their methods: “So we had one instructor who discussed how he’s been working to use [specifications] grading in his class, we had our English instructor talk about how she does formative assessment, another instructor talked about incorporating assessments and student reflections into her assignments, and then we also had one whole department talk about how they are collecting data . . . and working alongside career services, counseling, and advising to figure out if students are on the right pathway.”

One of the most important aspects of successful assessment initiatives “is to have faculty score their own artifacts,” Naughton said. Then, faculty reflect on and report those results during their program review process and begin “making choices about what can be changed to better impact success the next time around. For example, in my own class, I gave an assignment . . . related to quantitative literacy. And it didn’t go that great, so I made changes to the lectures that I did, I made changes to when I introduced the assignment, and how I had students talking about it.”

In the salon and charrette, “the faculty really enjoyed . . . sharing what they were doing in their course, how they were organizing their courses, how they were designing assignments, how they were assessing students, and a significant number said they would like to do something like this again,” Hartlaub said. “I think that it was a really positive experience for all involved, and I think it helped to develop further what we’re trying to do here to . . . grow a stronger culture of assessment at the college.”

University of Minnesota-Morris (Public, Four-Year)

In 2016, the University of Minnesota-Morris sent faculty and staff to scorer training for the Written Communication VALUE rubric. According to Kristin Lamberty, associate professor of computer science, “That was something that helped us as a campus, because we had mostly been using indirect measures of assessment. We had been having students self-report as they were graduating or as they were entering about our general education requirements, including writing, but rather than just having the self-reported indirect measures, we now have a better idea of how to incorporate direct measures.”

At Morris, the push to create a campus-wide program of direct assessment is relatively new. There, assessment data haven’t been very helpful yet because, according to Melissa Bert, senior director of institutional effectiveness, “We’re just beginning to understand the importance of creating assignments that help [students] meet the criteria of the rubric. . . . And I think it’s useful to think about, ‘Oh, they specifically have a criterion about evidence in written communication,’ and, ‘Oh! Maybe I haven’t been expecting my students to know what to do there, and if I want them to actually score better on this rubric I might actually need to take a moment, pause, and teach that or emphasize that on an assignment.’”

The Value of Conversation

At Morris, the scorer training experience also prompted participants to spread the word about assessment on campus. “To have those scorers come back and be sort of evangelists, or be willing to share their experience about that calibration process, and about scoring together, that was something that we have tried to use,” Lamberty said.

Bert agreed that the ability to talk about assessment is one of the biggest advantages of Morris’s participation with the Minnesota Collaborative. “Some of the feedback we’ve gotten is that it’s given faculty the opportunity to have conversations they wouldn’t normally have, because you have folks from different disciplines that don’t normally interact, they’re talking about assessment and really thinking through how it is that they approach assignments.”

Others participating in the Minnesota Collaborative agree that starting conversations about assessment and building or strengthening their campuses’ cultures of assessment are among the greatest benefits of their participation.

“The VALUE project has not caused this [culture of assessment] on our campus,” Hilk said of Hamline University. “But our participation in VALUE has to some extent scaffolded the conversation that we’re having on campus, and it has propelled the work that we’re doing to make it more meaningful. And I think that has been the value for us. We’d be having these conversations, but by participating in these conversations with others in the Minnesota Collaborative, participating in the national norming sessions, and really digging into the VALUE rubrics, we’ve been able to add a greater depth to the conversations we’re having with faculty on our own campus.”