Toolkit Resources: Campus Models & Case Studies

The VALUE Institute: Assessment as Transformative Faculty Development

Figure 1. The Sixteen VALUE Rubrics, grouped by outcome category: Intellectual and Practical Skills; Personal and Social Responsibility; and Integrative and Applied Learning.

In the ten years since AAC&U released the VALUE (Valid Assessment of Learning in Undergraduate Education) rubrics, they have helped to redefine higher education assessment practices.

“That was a time when everybody thought standardized testing of various kinds was the solution,” said Daniel F. Sullivan, senior fellow at AAC&U and president emeritus of St. Lawrence University. But educators across the country wanted alternatives to testing that could be personalized to their campus and would examine students’ authentic work products.

From 2007 to 2009, teams of faculty and other higher education experts from more than one hundred institutions developed fifteen VALUE rubrics for skills and competencies aligned with AAC&U’s LEAP Essential Learning Outcomes. In 2012, a sixteenth Global Learning VALUE rubric was added (see figure 1). Each rubric contains five or six dimensions as rows and four columns representing progressive levels of skill and knowledge proficiency from benchmark (1) to capstone (4). See figure 2 for a breakdown of the parts of a VALUE rubric.
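The rubric structure described above, five or six dimensions (rows) scored on a four-level scale (columns) from benchmark (1) to capstone (4), can be sketched as a simple data structure. The dimension names and the helper function below are illustrative placeholders, not drawn from an actual VALUE rubric.

```python
# Hypothetical sketch of a VALUE-style rubric: dimensions as rows,
# four proficiency levels as columns, benchmark (1) through capstone (4).
LEVEL_LABELS = {1: "Benchmark", 2: "Milestone", 3: "Milestone", 4: "Capstone"}

# Illustrative dimension names only; each real rubric defines its own.
DIMENSIONS = [
    "Explanation of issues",
    "Use of evidence",
    "Influence of context",
    "Student position",
    "Conclusions",
]

def summarize(scores):
    """Map each dimension's 1-4 score to its proficiency label."""
    for dim, s in scores.items():
        if dim not in DIMENSIONS:
            raise KeyError(f"unknown dimension: {dim}")
        if not 1 <= s <= 4:
            raise ValueError(f"{dim}: score {s} outside the 1-4 range")
    return {dim: LEVEL_LABELS[s] for dim, s in scores.items()}
```

For example, `summarize({"Use of evidence": 4})` labels that dimension "Capstone."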

Since 2014, the rubrics have been downloaded 69,647 times, representing more than 5,895 organizations (including more than 2,188 colleges and universities). The rubrics can be used as assessment and pedagogical frameworks for a variety of needs, including general education courses, cocurricular programs, capstones in the major, institutional accreditation, and state systems. They help educators determine whether, and how well, students are meeting graduation-level mastery in learning outcomes that both employers and faculty consider essential.

“I became convinced a long time ago that the most important thing we need to be focusing on with regard to college is what and how much students learn,” Sullivan said. “It actually matters in finding a job, it matters in living a good life, it matters in how students adjust to a changing economy.”

Figure 2. Parts of a VALUE Rubric

The VALUE Institute: Expanding Access to Nationally Normed Assessment

While the VALUE project began as a grassroots effort on individual campuses, it quickly grew into a nationwide phenomenon through larger statewide and interstate projects like the Multi-State Collaborative (a consortium of twelve states), the Minnesota Collaborative, and the Great Lakes Colleges Association. In these projects, between 2014 and 2017, institutions submitted student work products to be scored by trained scorers from other institutions.

The collaboratives, along with partnerships with State Higher Education Executive Officers (SHEEO) and Watermark, led to “a decade of testing on the reliability and validity of VALUE,” Sullivan said. Additional information about VALUE’s validity study can be found in the 2017 report, On Solid Ground.

“We spent a lot of time in the Multi-State Collaborative, the GLCA, and Minnesota Collaboratives testing how easy or difficult it would be for campuses to incorporate rubrics into a serious assessment of a student learning program on campus,” Sullivan said.

Since fall 2017, the VALUE Institute has made the benefits of participating in a nationwide assessment initiative available to all institutions. In collaboration with Indiana University’s Center for Postsecondary Research (IUCPR), the institute enables any higher education institution, department, program, state system, or consortium, in the United States or abroad, to use the VALUE rubrics approach to assessment. Participants collect and upload samples of student work to a digital repository, and the work is scored by at least two certified VALUE Institute scorers, providing external validation of institutional learning assessment.
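The double-scoring step described above could be sketched as follows. The reconciliation rule here (accept exact or adjacent agreement, flag larger gaps for a hypothetical third read) is an assumption for illustration, not the Institute’s documented protocol.

```python
def reconcile(scorer_a, scorer_b):
    """Compare two scorers' dimension scores on one work sample.

    Exact or adjacent scores are averaged; larger discrepancies are
    flagged (here, hypothetically, for a third scorer to adjudicate).
    """
    agreed, flagged = {}, []
    for dim in scorer_a:
        a, b = scorer_a[dim], scorer_b[dim]
        if abs(a - b) <= 1:  # exact or adjacent agreement
            agreed[dim] = (a + b) / 2
        else:
            flagged.append(dim)
    return agreed, flagged
```

A two-level gap on a dimension would send that dimension to adjudication, while adjacent scores (say, a 3 and a 4) would be averaged to 3.5.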

“We have a model that works,” said Jillian Kinzie, associate director of IUCPR and the National Survey of Student Engagement (NSSE) Institute, which has decades of experience leading national efforts such as NSSE and the Carnegie Classification of Institutions of Higher Education. “One of the things that IUCPR is known for in our work is providing a simple, streamlined process for the exchange of information and sending reports back that are comprehensive, useful to institutions, and speak institutional language.”

Institutions can personalize their involvement in the institute by choosing to participate annually or cyclically (every two to three years), and they can choose to measure one outcome or multiple outcomes for the student work they select for assessment. Once student work is scored, the VALUE Institute compiles an in-depth report that includes student performance and demographic data along with guidelines for interpreting the results.
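Because the report pairs performance data with demographic data, a campus can disaggregate scores to look for equity gaps in the results. A minimal sketch, using hypothetical field names rather than the Institute’s actual report schema:

```python
from collections import defaultdict

def mean_by_group(records, group_field="demographic_group"):
    """Average rubric scores per group (hypothetical schema: each
    record carries a numeric 'score' and a grouping field)."""
    totals = defaultdict(lambda: [0.0, 0])  # group -> [sum, count]
    for r in records:
        t = totals[r[group_field]]
        t[0] += r["score"]
        t[1] += 1
    return {g: s / n for g, (s, n) in totals.items()}
```

Comparing the per-group averages this produces is one way a campus might check, in Kinzie’s words, that all students are achieving at expected levels.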

Reports are designed to help campuses, departments, and faculty translate results into actionable improvements for teaching and learning, address concerns about equity, and ensure “that all students are achieving at the levels that you and your faculty expect,” Kinzie said. “We really encourage campuses to do their own contextualizing and thinking about how this fits with what they hope to accomplish with their student learning or strategic initiatives.”

A Faculty Development Power Tool

Institutions that participate in the VALUE Institute use the experience to facilitate transformative on-campus discussions about teaching and learning.

“In the end, just about all faculty would like their students to learn. They would like them to become better prepared for post-college life,” Sullivan said. “And this gives faculty a handle on how to tackle that process in partnership with a huge number of other people who are struggling with the same questions. AAC&U brings them together, they talk about it, they learn from each other, and they gravitate toward best practices.”

The assessment data garnered from the VALUE Institute help guide faculty to better align and redesign assignments so they mirror the dimensions of learning represented in the VALUE rubrics.

“Not all assignments are the right assignments for the VALUE rubrics,” said Tara Rose, director of assessment at Louisiana State University and a longtime participant in VALUE projects at the institution and state level.

Assignments used with VALUE rubrics should be meaningful to students, represent students’ best work, and demand substantial student investment. More demanding assignments “elicit far better performance from their students,” Sullivan said. Most importantly, assignments need to be aligned with the outcomes and dimensions from the specific VALUE rubrics that will be used for assessment.

“If those are off, then it's no wonder that a student's work might not really represent aspects of either the outcome or the dimensions in the rubric,” Kinzie said. “One of the beauties of this project is that it really shines the spotlight on faculty expertise in terms of designing assignments that elicit good student work.” 

One way to align assignments with rubrics is to annotate assignments to directly tell students about the outcomes and dimensions their work should address. AAC&U, with support from the Sherman Fairchild Foundation, is working with seven private colleges and universities to explore intentional assignment redesign and alignment with rubric dimensions to examine the effects on student learning improvement. Sullivan and Kate D. McConnell, assistant vice president for research and assessment at AAC&U, recently published their research on the importance of focusing on assignment design and rubric alignment in assessment efforts in Change: The Magazine of Higher Learning.

Once institutions receive their reports and raw data files from the VALUE Institute, they have additional opportunities for professional development. Rose encourages institutions and faculty to score their own students’ work using the VALUE rubrics before meeting with her one-on-one to compare their assessment scores with those from certified external scorers. She tells them, “This is how your students did in the VALUE Institute. What does this mean to you? What does this mean to your classroom? And how can you look at your teaching and learning practices in the classroom based on this data?”

“It was really eye-opening for some faculty,” Rose said. “It helps them improve their teaching practices, their pedagogy, their curriculum.”

Most of the faculty who submit student work to the VALUE Institute are working within a larger system, campus, or departmental initiative. However, groups of individual faculty can participate as well. In 2017, Rose convened a consortium of nineteen faculty members from six institutions across Kentucky. They submitted 151 student work samples to the institute. Three institutions in Kentucky also participate in the Multi-State Collaborative.

Whether faculty are participating individually or as part of a larger project, it’s always important to consider their other responsibilities to avoid a sense of “initiative fatigue” on campus.

“Faculty are very busy,” Rose said. “They are overworked, underpaid, and yet we continue to throw more initiatives at them. You've got to find the right initiative, the right person to lead the initiative on your campus, and for the VALUE Institute you've got to find the right assignment. And it has to truly be meaningful for your campus. You have to want to do this.”

A Different Kind of Standardization

To ensure that scores are reliable and generalizable, the VALUE Institute calibrates and certifies scorers, some of whom become experts at using a single rubric.

“The faculty are already experts in evaluating student work, and now you've added this layer of their facility with a tested rubric,” Kinzie said. “You've really just amplified faculty expertise.”

Terry Dean, associate professor of musicology and gender studies at Indiana State University, is a veteran scorer using the Critical Thinking VALUE rubric. When starting to score a new student work product, “I always go back and review the rubric,” he said. “I like to re-ground myself, even if it's been just a couple of days [since I last read the rubric].” 

He also reviews the institute’s scoring guidelines “to make sure that I'm not building or allowing bad habits to set in.”

Identifying information is removed from all student work artifacts to ensure fair and blind scoring.

“We don't want to standardize what students produce, but we want to standardize our process to ensure that data are comparable,” Kinzie said.

The VALUE Institute continues to find new ways to use data and refine its processes. Soon, it plans to draw from the NSSE framework to become a tool for institutions to compare their own data with those from peer institutions across the country.

“There's this tendency to forget we are contributing students to a larger workforce,” Dean said. “We aren't the only institution and our students aren't the only ones. If we can draw from data that we're seeing in reports that are published, we can measure our own local students' data against the national trends and determine where we stand or where we need to make improvements.”

Using VALUE in the Classroom

VALUE rubrics can have substantial effects on faculty teaching and student learning even outside large assessment initiatives. Like many faculty across the country, Dean draws on aspects of the rubrics when creating grading rubrics for his own courses.

“In doing that, I'm able to show my students what I value from the very first time I have them in a course. They can immediately go to the rubric language and say, ‘Okay, this is where my score came from. These must be the issues I'm having. Let's talk about how I can fix those problems.’”

He also draws inspiration from the VALUE Institute’s training for scorers, which is conducted online through conference calls and a video curriculum.

“I've been training faculty to score using rubrics with the same model that AAC&U presented in their VALUE training,” Dean said. “We're starting to see the VALUE rubrics being applied in ways that, on my campus at least, I never anticipated. And that, to me, is really, really exciting.”

Part of VALUE’s broad appeal is its approachable, interdisciplinary nature, which allows rubrics on any outcome to benefit faculty from any discipline.

“The language is pretty easy to understand. They have glossaries that explain concepts,” Dean said. “They are interdisciplinary or multidisciplinary in nature, so you don't have to have expertise in a particular field to be able to use the rubrics not just correctly, but effectively.”