The Missing Piece in Assessing the VALUE of a College Education
I confess that I believe assessment is central to every person’s life. Every day we assess ourselves and the world around us—whether to donate time or money for a cause, what to wear, what to assign students to illustrate their learning, or what words to use in a performance review or even in conversation. Our decisions are reflections of who we are and how others see us. We assess and make judgments constantly as we move through our days.
In higher education, assessment has become the formal catch-all term for lifting up who we are and how well we are doing as educators and institutions. Indeed, assessment has become an industry, an answer, and an integral part of higher education over the past couple of decades in the face of rapidly changing social, economic, and political environments often characterized as “threats.”
Higher education assesses student learning using two primary approaches—indirect and direct measures. Indirect measures have prevailed to this point. There are several surveys of student self-reporting that annually document what students think they are learning. In addition, we have an increasing number of employer surveys, including AAC&U’s periodic nationwide surveys (e.g., “Fulfilling the American Dream: Liberal Education and the Future of Work”), which indicate that graduates are prepared for entry-level jobs but need more if they are to persist and move up in the workplace. The findings from student and employer surveys are decent. Could the results be better? Sure. Are they bad? No.
What about direct measures of learning in college? One of the first large-scale quasi-direct efforts to examine student learning during the first two years of college relied on a standardized test removed from the actual teaching and learning of the program of study. Despite the limitations surrounding the findings, the message from the authors of this effort was that college was delivering “not much.” And since college costs a lot of money and takes a lot of time, is it truly worth the sacrifice and cost?
Despite the studies that show the actual earnings and well-being advantages for people who pursue a higher education, the ensuing headlines both inside and outside higher education continue to focus on the shortcomings in the findings: the non–100 percent achievement levels, the missing pieces.
As an assessment person, I understand. Part of my assessment job is to look at what accounts for why every graduate is not achieving excellence. Are there patterns of success, and are they associated with existing practices among faculty, educators, and institutions? Are there patterns associated with characteristics students bring with them to college that account for disproportionate performance? Can college educators identify actions they can take to create greater equity in desired outcomes?
Still missing in these discussions and studies is the direct evidence of students’ ability to demonstrate learning on the key skills and abilities that educators and employers alike demand from college graduates. Evidence is being produced every day that can be used to address these gaping holes in the measurement of student learning—evidence that allows those inside and outside of higher education to formulate meaningful questions about the state and quality of student learning and begin to formulate responses to these questions. To find this evidence, we need only to look at the artifacts—or real student work—produced in response to assignments in college classes and cocurricular programs.
What students say about their learning is important and revealing, but through their responses to assignments, we can capture what students do with their learning. The VALUE approach to assessment, the VALUE rubrics, and the VALUE Institute were developed to fill this void.
The VALUE approach acknowledges that all students bring an array of learned knowledge, skills, and abilities with them to formal educational arenas. The assets students bring are a ready-made, if often not recognized, foundation for building new knowledge, skills, and abilities. Faculty and other educators have experience and expertise that are critical for creating opportunities and next steps to help students, wherever they start, scaffold the higher-order abilities they need to complete postsecondary credentials. It is these essential learning outcomes that employers and our students need to obtain the first job, the next job, and the jobs that have not yet been named, and that prepare individuals to adapt their current learning to new and unscripted situations.
The VALUE rubrics articulate what those essential learning outcomes look like when observed in student work so that educators and students know they are making progress toward the quality of learning needed for personal, social, and economic success. The rubrics gain meaning through the expertise faculty bring to applying shared measures of quality learning.
The VALUE Institute adds the ability for programs and institutions to have their local quality assessments validated externally by colleagues trained to use the VALUE rubrics and scoring without access to individual student or institutional identities, revealing the patterns of learning exhibited in students’ work. VALUE Institute findings in aggregate reflect the overall generic—yet nuanced—landscape of learning for undergraduate students, creating a context for the rich conversations single institutions, programs, systems, or states can have as they analyze and examine their own results. VALUE Institute scores are not so much about the numbers as about the questions the numbers allow us to address through disaggregation and integration with other measures. VALUE results are not about winning, losing, or rankings, but about creating a shared space for figuring out how to improve student, faculty, and institutional quality and success.
In short, the VALUE approach, rubrics, and Institute are the evidence of the quality of learning in higher education that has been missing. They are assessment at its best.