Diversity and Democracy

Assessing the Practices of Public Scholarship

Despite the growing prominence of campus–community engagement and public scholarship in higher education, little has changed since Cruz and Giles (2000) first questioned the absence of community perspectives in the service-learning research literature. As Stoecker and Tryon (2009) note, asking community partners to define what impact should look like is especially rare. Yet the question of how academic institutional priorities like civic engagement depend on and relate to community aspirations is a pressing ethical and political issue.

Who participates—in setting agendas, defining goals, and creating values? Who benefits? How do we know? These questions must be central to the assessment of community-engaged teaching and research. Assessing the Practices of Public Scholarship (APPS), a working/research group of higher education professionals and cultural practitioners, is ensuring this centrality by elaborating and promoting an integrated approach to assessment.

APPS is sponsored by Imagining America (IA), a national consortium of higher education institutions and allied organizations. Founded in 2000, IA is committed to advancing the public purposes of the cultural disciplines—the arts, humanities, and design—by transforming university research and teaching to promote community partnership and development. APPS emerged in 2009 in response to findings from IA's Curriculum Project (Goldbard 2008), which exposed the lack of sustained, reciprocal partnerships in community cultural development—an approach that recognizes arts and culture as major assets in building community capacities, envisioning alternative futures, and catalyzing social change.

Core Values

The APPS approach to assessment integrates questions about community impact into project and partnership designs, involves community stakeholders meaningfully and collaboratively, and invites evaluation of university practices in relation to mutually defined goals. To support this approach, APPS has articulated a set of five guiding values for assessment: collaboration, reciprocity, generativity, rigor, and practicability (see sidebar).

Guiding Values for Integrated Approaches to Assessment

The Assessing the Practices of Public Scholarship (APPS) approach, elaborated at www.imaginingamerica.org, centers on the following values:

  • Collaboration: Community and university stakeholders define meaningful outcomes from the outset and throughout project implementation.
  • Reciprocity: Community and university stakeholders engage in mutual and transformative exchange, including reflection, feedback, and critique.
  • Generativity: Assessment activity feeds the project, program, or course, while also looking beyond these units and inviting stakeholders to evaluate the long-term relationships at the heart of public work.
  • Rigor: Assessment activity uses sound methods and practices.
  • Practicability: Methods and practices are proportionate to the project and to available resources.

—Miriam Bartha and Georgia Nigro

These core values are not unique: they borrow from multiple literatures and fields (see, for example, Bandy 2012). But centering them offers an intervention of sorts. While IA members have expressed strong desires for practical tools to help them negotiate various demands for assessment, they have also upheld the value of context and narrative, favored qualitative over quantitative methods, and evidenced less concern with generalizability than many social science researchers. The core values offer a flexible and adaptable framework that provides a process to engage in and with rather than a model or tool to apply.

The core values also offer grounds for engaging critically with the multiple institutional agendas that drive assessment—and for renegotiating their terms. Institutional mandates for assessment are often tied to funding and program continuation, making high-stakes, short-term, and relatively inflexible evaluation methods imperative for stakeholders. Significantly, these mandates tend to focus assessment efforts on making the case for particular projects or programs instead of on forging deeper understandings of the processes or strategies that facilitate or impede programmatic goals. An integrated approach to assessment recognizes these realities and provides alternatives, countervalues, and guiding questions that can reframe assessment demands.

APPS's preliminary research suggests that the core values coexist in dynamic and productive tensions—both with one another, and with the practical realities that community and higher education partners inhabit. For instance, stakeholders engaging in assessment guided by collaboration and reciprocity may challenge each other's understanding of scholarly rigor. Similarly, assessment practices that live up to the highest collaborative ideal may not be practicable at a given moment or scale.

An integrated approach to assessment holds these tensions in balance, emphasizing both the practicable and the generative, the near-term and the long-term, the imperfect process and the aspirational goal. It opens the timeframe of impact assessment and allows space for examining the effects of sustained engagement, documenting mistakes as well as successes, and reflecting collaboratively and rigorously on both.

Case Studies

What do integrated approaches to assessment look like in action? Given the challenges inherent in developing and assessing collaborative partnerships, what does it mean to try to actualize these values? APPS has sought to answer these questions and deepen discussion through a series of case studies examining projects and partnerships within the IA membership network and beyond. The IA website features these case studies, which illustrate integrated assessment principles and the challenges of realizing them.

One example comes from MIT@Lawrence, a sustained partnership between the Massachusetts Institute of Technology (MIT) and the city of Lawrence, Massachusetts, that involves collaborative projects ranging from neighborhood revitalization to property management. In MIT@Lawrence, collaboration, reciprocity, and generativity have emerged—but not right away, and not without colliding with other values, such as practicability. To document the project's long-term outcomes, MIT students videotaped interviews with Lawrence residents, yielding over three hundred hours of footage for analysis. Project director Lorlene Hoyt registered some discomfort with the collaborative balance: because Lawrence residents did not have time to serve as editors, students effectively shaped the story. Practicability dictated this outcome. Yet screenings of the resulting film afforded opportunities for reflection and dialogue with Lawrence residents, illustrating the core value of reciprocity.

APPS case studies are exercises in critical self-analysis that promote discussion, learning, and exchange among IA members seeking to deepen their own engagement through assessment. At IA's annual meetings, discussions of assessment have moved beyond questions of expediency and self-congratulation to engage challenges like how to report problematic findings to partners and sponsors, how to value and measure intangibles like empathy and spontaneity, and how to document dynamic processes and changes that emerge over time. APPS is identifying and creating new venues for exploring these topics, including a recent webinar on the challenges of coordinating and aligning assessments at the level of the university, the school or department, and the project or course.

Evolving Approaches

APPS suggests that an integrated approach to assessment informed by core values and current research can activate, deepen, and sustain organizational conversations about the practices of publicly engaged scholarship and community–campus partnerships. This work is ongoing, iterative, and evolving. But within the field of engaged cultural research and teaching, APPS and IA have problematized assessment as an ethical and epistemological challenge, to be met critically and creatively.

The authors would like to thank Sylvia Gale, Pam Korza, and Joe Bandy for their contributions to this article, which relies heavily on the collective work of APPS members: Joe Bandy (Vanderbilt University), Miriam Bartha (University of Washington), Adrienne Falcon (Carleton College), Sylvia Gale (University of Richmond), Pam Korza (Animating Democracy, Americans for the Arts), Georgia Nigro (Bates College), Susan Schoonmaker (Imagining America), Gladys Palma de Schrynemakers (Long Island University), and Stephani Etheridge Woodson (Arizona State University).


Bandy, Joe. 2012. "Empowering Community by Assessing and Developing Service Learning Partnerships." Paper presented at the Professional and Organizational Development Network Annual Conference, Seattle, Washington.

Cruz, Nadinne I., and Dwight E. Giles, Jr. 2000. "Where's the Community in Service-Learning Research?" Michigan Journal of Community Service Learning Special Issue: 28–34.

Goldbard, Arlene. 2008. "The Curriculum Project Report: Culture and Community Development in Higher Education." http://imaginingamerica.org/wp-content/uploads/2011/05/08.CP_.report.pdf.

Stoecker, Randy, and Elizabeth A. Tryon, eds. 2009. The Unheard Voices. Philadelphia: Temple University Press.

Miriam Bartha is associate director of the Simpson Center for the Humanities at the University of Washington, and Georgia Nigro is professor of psychology at Bates College.
