I have never met a faculty member who was excited about
doing assessment, although rumor has it they exist.
In fact, most have been resistant if not downright hostile
to the notion. I fall in the resistant category. I have
too much work to do to welcome any new task. Surprisingly,
the wrong reason--minimizing the additional work--has
led to the right way to do program assessment. Analyzing
work students produce in the capstone is simply easier
than most other assessment options. Fortunately, it
also provides better measures of student learning. Since
I am not the only faculty member to have come to this
conclusion, capstones are becoming central components
of assessment plans.
Using Capstones to Assess Undergraduate Education
The capstone course provides a venue for "assessing
how successfully the major has attained the overall
goals" (Wagenaar 1993, 214). Indeed, according to Rowles
et al. (2004), assessment is the primary organizing
principle of some capstones. As Black and Hundley note,
when students look back on their four years of college
in a capstone course, they "provide invaluable information
to faculty about the quality of instruction and of programs"
(2004, 3). Many programs are taking advantage of this
rich source of data (Berheide 2001; Brock 2004; Forest
and Keith 2004). National surveys of departments reveal
that in political science as well as in sociology, capstones
are the most common assessment method (Kelly and Klunk 2003;
Spalter-Roth and Erskine 2003).
Henscheid (2000) finds
that almost half of 707 regionally accredited colleges
and universities use capstones as part of their institution's
assessment program. While Henscheid also finds that
smaller colleges and universities are more likely to
use capstones for assessment than larger ones, at the
University of Washington, about 60 percent of the departments
use "some kind of senior experience--including capstone
courses, design courses, and senior seminars--to evaluate
student's learning in the majors" (Beyer 2001, 1). At
Valdosta State University, nineteen of twenty-four academic
units evaluate performance in capstone courses as a
method of assessment, making it the third most frequently
used method behind final exams and evaluation of course
presentations (Yates 2004). Similarly, at Seton Hall,
twenty-two out of thirty-three academic units use capstone
courses as part of their assessment programs. Across
disciplines, private institutions are more likely than
public ones to use products from capstone courses to
assess undergraduate education.
Assessing Capstone Products
Currently departments use capstone products to assess
their majors in a variety of ways, ranging from rudimentary
to rigorous. Beginning at the most basic level, some
departments require students to publicly present their
work as an exhibition, performance, poster, etc. (Bachand
et al. 2006, 21). These displays "provide the most direct and most unfiltered picture of
students' capabilities" (Hartmann 1992, 128). Once the
presentations are judged in some way, the assessment
process has moved to the next stage. For example, some
institutions--including Saginaw Valley State University
and Skidmore College, where I teach--submit projects
for presentation at conferences or to undergraduate
paper contests, providing external validation of the
quality of student work. Some programs, including the
engineering programs at Saginaw Valley State University,
even use external evaluators to "grade" the projects.
Best practice, though, involves going a step further
to analyze the projects systematically for the evidence
they provide about program quality and to use that evidence
to make curricular improvements. For example, the sociology
department at the University of Wisconsin--Milwaukee
uses five Likert scale items to assess how well the
capstone papers demonstrate achievement of the department's
learning goals (University of Wisconsin--Milwaukee Department of Sociology 2006). A more elaborate approach involves
applying an existing rubric, such as Primary Trait Analysis
(Jervis and Hartley 2005), or a locally developed one
(Cappell and Kamens 2002) to capstone products. This
more systematic approach can provide useful insight
into the strengths and weaknesses of the curriculum.
A Case Study
Having dragged our feet as long as we could,
my departmental colleagues and I finally were forced
to conduct an assessment in spring 2003. We reluctantly
agreed to use senior seminar papers for our program
assessment because all the other alternatives looked
like more work. We chose the theory goal because we
were already concerned about the issue. The two sociologists
teaching the required theory course examined one strong,
one average, and one weak paper. This first stab at
assessment led to three main conclusions:
- All three papers, including the weakest one, demonstrated "basic facility with many of the crucial concepts in social theory."
- The theory goal needed to be revised.
- The department needed to teach the connection between
theory and methods not only in the theory and senior
seminar courses, but also in the introductory, methods,
and at least some elective courses. (Brueggemann 2003)
The following year, the sociologists who teach statistics
and research methods evaluated how three more papers
achieved our methodological goal, concluding that "students
generally succeed in achieving our methodological goals"
(Fox and Karp 2004, 7). They made several recommendations
"to strengthen further an already effective program,"
including suggesting that the program revise its goals.
In the third year, the sociologists decided to look
at how well students could articulate how the discipline
contributes to understanding social life, concluding
that "senior sociology majors, at all levels of ability,
are applying sociological perspectives to issues of
concern to them" (Berheide and Walzer 2005, 4). The
2005 assessment identifies two general areas for improvement:
- Encourage students to be even more explicit in linking
their specific concerns with implications for sociological
theory and knowledge.
- Help students to improve their
ability to move from simply cataloguing findings to
writing about them in prose that reflects more synthesis.
(Berheide and Walzer 2005, 4)
Overall, with relatively
little effort, my department has learned a remarkable
amount about what our students know and can do after
majoring in sociology. First, we have learned that at
least on these three goals, we are doing a good job.
Second, we have learned that our theory and methods
goals need some revision. Third, we have learned that
we need to create greater "sequencing" within the major,
especially around theory and methods. Even our minimal
approach to assessment has provided vastly better data
than we typically draw upon for making curricular decisions.
In short, faculty do not have to spend a lot of time
and effort to get very useful data.
Departments in a wide range of disciplines have used capstone products
to assess the majors with favorable results. Some departments,
such as industrial engineering and aeronautics at the
University of Washington, have capstone projects evaluated
by industry experts; others, such as sociology at Bowling
Green State University, have them evaluated by both
department members and outside experts. The sociology department at Bowling
Green has found that the outside evaluator usually,
but not always, agrees with inside evaluators (Bowling
Green State University 2007).
Capstones are not just used
to assess majors; they can also be used to assess general
education. Some institutions, such as Millikin University
and Portland State University, have interdisciplinary
general education capstone requirements (e.g., Brooks,
Benton-Kupper, and Slayton 2004; Rhodes and Agre-Kippenhan
2004). At Southeast Missouri State University, sixty
senior seminar faculty analyzed over three hundred capstone
products to assess general education goals related to
information, thinking, and communication skills. They
concluded that student achievement on these three learning
objectives ranged from performances in which students
were unable to formulate a thesis, produce an edited
writing sample, or cite source material accurately to
artifacts that demonstrated clear mastery of the ability
to locate and use relevant source material, evaluate
others' arguments and construct their own, and produce
polished pieces of writing (Blattner and Frazier 2004).
As a result of this assessment, faculty "have begun
to redesign the writing assignments they give to students
by requiring more than a single draft of papers and
by specifying requirements for citation of sources and
inclusion of reference lists" (Blattner and Frazier 2004).
Capstone experiences in the disciplines can
also be used to assess general education goals. A senior
thesis assessment project at my college revealed that
at the draft stage before their thesis advisers have
provided feedback, students have trouble specifying
the question guiding their thesis, defining key concepts,
and organizing the thesis itself. Simon et al. also conclude that "students
in the sciences and social sciences who have experience
with research come to the senior thesis better prepared
than in those disciplines that do not reinforce research
skills" (2006, 1).
According to Weiss (2002), sociology
department chairs rate work in the capstone course as
the second most valuable assessment tool. Moriarty (2006)
finds that 51 percent of criminal justice programs consider
capstones a very effective assessment instrument. One
reason for the effectiveness of capstone products for
assessment is that they are a direct measure of student
learning. Assessment experts (e.g., Angelo and
Cross 1993; Banta et al. 1996) consider direct methods
of assessment the best way to measure student learning.
Capstone products are also authentic embedded assessment
methods, since they are created as part of normal classroom
activities. Finally, capstone products are an efficient
assessment method, since they take advantage of an existing
source of data. In short, capstone courses provide
a venue for assessing how successful a curriculum is
in achieving its learning objectives.
The final step is to use the data collected about student
performance to improve the major. Yates (2004) finds
that, at Valdosta State University, capstone-based assessment
led most frequently to the addition of new courses and
other changes in curriculum as well as changes in pedagogy
or course format. For example, evaluations of performance in capstone
courses, on final exams, and via licensing exam pass rates, portfolios,
and juried exhibitions led the art department to include visual assessment,
analysis, and writing projects in one of its courses.
Similarly, the University of Indianapolis Department
of Communications has found the capstone to be an excellent
mechanism for assessing the quality of its academic
program. As is the case in my department, evaluating
senior projects has raised concerns about the connections
between the capstone and the rest of the student's course
of study. According to Catchings, "the issues of alignment
among curriculum, learning, and the capstone have prompted
concerted efforts to improve the quality of both the
curriculum and the capstone," including "redesign of
department core curriculum courses in order to reinforce
expectations in writing and oral communication" (2004).
After five years of assessing the capstone, Leach
and Lang report that the department of anthropology
at the University of North Dakota has added methods
and theory courses to the curriculum because "our students
have provided relatively weak evidence of their understanding
of how theory affects observation and interpretation
in scientific and humanistic research." They also report
"an improvement in the clarity and strength of written
and oral communication, as a result of assessment recommendations"
(2006, 5). As these examples demonstrate, departments
that have used capstones to assess their majors have found
that doing so improves student learning and can actually
make faculty work lives easier.
Assessment, therefore, is not an
end in and of itself, but rather a means to an end.
The end is the improvement of student learning at the
individual, program, and institutional levels. Analyzing
capstone projects is an efficient and effective approach
to achieving that end.
References
Angelo, T. A., and K. P. Cross. 1993. Classroom assessment techniques: A handbook for college teachers. San Francisco, CA: Jossey-Bass.
Bachand, D. J., D. Huntley, M. Hedberg, C. Dorne, J. Boye-Beaman, and M. Thorns. 2006. A monitoring report to the Higher Learning Commission on program assessment, general education assessment, and
Banta, T. W., J. P. Lund, K. E. Black, and F. W. Oblander. 1996. Assessment in practice. San Francisco, CA: Jossey-Bass.
Berheide, C. W. 2001. Using the capstone course for assessment of learning in the sociology major. In Assessing student learning in sociology, 2d ed., ed. C. F. Hohm and W. S. Johnson, 164--76. Washington, DC: American Sociological Association.
Berheide, C. W., and S. Walzer. 2005. Sociology assessment: Contributions of sociology to understanding social life. Saratoga Springs, NY: Skidmore College.
Beyer, C. H. 2001. Assessment in the majors.
Black, K. E., and S. P. Hundley. 2004. Capping off the curriculum. Assessment Update 16.
Blattner, N. H., and C. L. Frazier. 2004. Assessing general education core objectives. Assessment Update 16 (4): 4--6.
Bowling Green State University. 2007. Assessment reports. www.bgsu.edu/offices/
Brock, P. A. 2004. From capstones to touchstones: Preparative assessment and its use in teacher education. Assessment Update 16.
Brooks, R., J. Benton-Kupper, and D. Slayton. 2004. Curricular aims: Assessment of a university capstone course. The Journal of General Education 53:275--87.
Brueggemann, J. 2003. Assessment in sociology: Theory component. Saratoga Springs, NY: Skidmore College.
Cappell, C. L., and D. H. Kamens. 2002. Curriculum assessment: A case study in sociology. Teaching Sociology 30:467--94.
Catchings, B. 2004. Capstones and quality: The culminating experience as assessment. Assessment Update 16 (1): 6--7.
Forest, J., and B. Keith. 2004. Assessing proficiency in engineering and technology within a multidisciplinary curriculum. Assessment Update 16 (4): 9--10.
Fox, W., and D. Karp. 2004. Sociology assessment: Undergraduate training in research methods and statistics. Saratoga Springs, NY: Skidmore College.
Hartmann, D. J. 1992. Program assessment in sociology: The case for the bachelor's paper. Teaching Sociology 20:25--28.
Henscheid, J. M. 2000. Professing the disciplines: An analysis of senior seminars and capstone courses. Columbia, SC: University of South Carolina.
Jervis, K. J., and C. A. Hartley. 2005. Learning to design and teach an accounting capstone. Issues in Accounting Education 20:311--39.
Kelly, M., and B. E. Klunk. 2003. Learning assessment in political science departments: Survey results. PS: Political Science and Politics.
Leach, M., and G. C. Lang. 2006. The not-so-stony path to program assessment and, along the way, transforming a senior capstone seminar in anthropology. University of North Dakota Assessment Committee Newsletter (November): 1--7.
Moriarty, L. J. 2006. Investing in quality: The current state of assessment in criminal justice programs. Justice Quarterly 23:409--27.
Rhodes, T. L., and S. Agre-Kippenhan. 2004. A multiplicity of learning: Capstones at Portland State University. Assessment Update 16 (1): 4--5, 12.
Rowles, C. J., D. C. Koch, S. P. Hundley, and S. J. Hamilton. 2004. Toward a model for capstone experiences: Mountaintops, magnets, and mandates. Assessment Update 16 (1).
Simon, L., D. Curley, M. A. Foley, R. Ginsberg, M. Hockenos, and D. Smith. 2006. Senior thesis assessment workshop. Saratoga Springs, NY: Skidmore College.
Spalter-Roth, R. M., and W. B. Erskine. 2003. How does your department compare? A peer analysis from the AY 2001--2002 Survey of Baccalaureate and Graduate Programs in Sociology. Washington, DC: American Sociological Association.
University of Wisconsin--Milwaukee Department of Sociology. 2006. Assessment of undergraduate and graduate programs in sociology. University of Wisconsin--Milwaukee.
Wagenaar, T. C. 1993. The capstone course. Teaching Sociology 21:209--14.
Weiss, G. L. 2002. The current status of assessment in sociology departments. Teaching Sociology.
Yates, C. B. C. 2004. Executive summary of the Report on Assessment Methods at Valdosta State University. www.valdosta.edu/sra/