Peer Review

Actual and Potential Uses of STIRS Case Studies in Courses and Curricula

Case studies are valuable, engaging pedagogical tools that provide realistic examples of problems to be solved and help students develop a variety of important skills (Peden 2015; Herreid 2005; Bonney 2015; Barnes, Christensen, and Hansen 1994). They can also help students mature toward producing what AAC&U refers to as “signature work,” significant projects through which students assimilate and apply what they have learned (Peden 2015). The AAC&U Scientific Thinking and Integrative Reasoning Skills (STIRS) Project has produced sixteen peer-reviewed cases that instructors can use to incorporate the STIRS framework into their courses (Stanford, Byrne, and Hunting 2016; Riegelman and Hovland 2012). In this article, we describe the intended use of these cases, examples of how they have been used thus far, and potential future use and analysis of these cases.

Using Case Studies to Implement the STIRS Framework

The STIRS framework centers on evidence-based thinking, including understanding what evidence is and learning how to gather and evaluate evidence, how to use evidence to solve problems, and how to use evidence to make decisions. To encourage students’ evidence-based thinking, it is important to integrate these components into a general education curriculum to ensure that undergraduates practice these skills throughout their college experience. In the STIRS framework, this is achieved through the implementation of scaffolded “cornerstone,” “connector,” and “capstone” experiences. In brief, cornerstone experiences establish a foundation in evidence-based thinking, connector courses help students draw connections between their general education and major-related education, and capstone experiences promote integration of knowledge and application to real-world problems.

The cornerstone component of the STIRS framework can be implemented using a range of what AAC&U calls high-impact practices. These include first-year seminars and experiences, learning communities, writing intensive courses, collaborative assignments and projects, as well as diversity and global learning (Kuh, O’Donnell, and Reed 2013). The connector component of the STIRS framework can be fulfilled in research methods courses, which help students to think about how to gather and evaluate evidence. The capstone component of the STIRS framework is where students complete signature work. STIRS case studies are a tool that can be implemented in these cornerstone, connector, and capstone experiences.

Some of the STIRS case studies can be used as part of cornerstone, high-impact practices. For example, “The Two-Sex System: Fact or Fallacy?” (Bauer 2015) recommends its use in “collaborative, student-centered assignments in which students work together in class to solve problems. . . . The case study in its entirety addresses diversity issues in that the readings and documentary encourage students to explore life experiences that may differ from their own. This interdisciplinary case study also contains a writing intensive component, another high-impact practice. Finally, this case study would be an ideal component of high-impact practices such as first-year seminars and/or learning communities, particularly those that address diversity issues.”

A number of the case studies can contribute to teaching the connector component of the STIRS framework. Singer-Freeman’s case, “MMR Vaccine and Autism: Scientific Inquiry, Ethics, and Evidence-Based Problem Solving,” for instance, “teaches topics in scientific thinking and evidence-based reasoning, including the consideration of ways in which evidence can be used to advance knowledge, the application of design and statistical reasoning principles to the evaluation of evidence, and the analysis of ethical issues which are inherent in research.” These skills are appropriate to introduce in the context of a research-methods course, as well as in other connector course contexts.

The capstone component of the STIRS framework focuses on putting evidence-based problem solving and decision making into practice. Evidence-based decision making requires students to frame options and make decisions while taking into account the probability of benefits and harms and the importance and timing of these outcomes. Several of the STIRS case studies could be used to introduce a capstone or signature work curriculum. For instance, “Preventing Spina Bifida and other Neural Tube Defects” (Riegelman 2015) models the STIRS approach to evidence-based problem solving and illustrates the potential impact of the simple intervention of increasing folic acid consumption. Another example is the case “To Drill or Not to Drill? A Dilemma in the Context of Climate Change in the Arctic” (Singh 2015), which illustrates issues that arise when making politically, economically, and environmentally difficult decisions, and demonstrates how evidence can be used to more effectively guide the debate.

The cases developed as part of the STIRS initiative are not intended to be comprehensive or complete in addressing the STIRS framework in all curricular contexts. However, they do illustrate the multiple ways that cases can provide opportunities to engage students in evidence-based thinking within the curricula of an integrative bachelor’s degree.

Intended Uses of STIRS Case Studies as Envisioned by the Authors

STIRS case studies have been intentionally designed to teach students to use evidence-based thinking in the context of societally relevant topics and content from specific disciplines. Above all, the cases are meant to help instructors support students’ development of critical thinking skills. As a result, there are many similarities in the ways that the STIRS case study authors envisioned their cases being used by instructors. Most authors envisioned that case materials could be used in a wide range of course contexts, from introductory-level general education courses to more advanced specialty courses. To allow implementation in upper-level, discipline-specific courses, the facilitator guides suggest additional activities that prompt students to think about the case topic in greater detail, using more advanced terms and theories.

Case facilitator guides commonly include suggestions about how to tailor a case to different audiences, and about how to adapt the case in ways such as modifying case length, including specific pedagogical strategies, and incorporating student and/or instructor-specific learning goals. When writing the cases, authors did not assume that students or instructors would have prior knowledge of particular fields, methods, or approaches, and thus they provided helpful suggestions and materials to support case incorporation with different audiences and in diverse course contexts. Facilitator guides also include references for obtaining additional information about the topics. Taken together, the intended uses for the STIRS case studies are quite broad, permitting use in many course contexts.

Actual Use of Case Studies

The STIRS case studies were first published online in January 2015. In their first year of publication, there were 237 unique downloads of STIRS facilitator guides by 102 distinct individuals from at least 87 unique institutions across the United States, plus one download from an institution in Canada. Accessing facilitator guides requires registration through the AAC&U website, so tracking downloads of these guides offers a reliable way to gauge the number of people with a genuine interest in using the cases. The individuals who downloaded the cases represent a range of academic professions, including all faculty levels (adjunct faculty and visiting professors through full professors); administrators (department chairs, deans, and program directors); graduate students; postdoctoral fellows; consultants; and staff involved in curriculum design or support (outreach specialist, science consultant, lab coordinator, library director, and teaching fellows directors). They also come from a range of institutions, including community colleges, small liberal arts schools, large state-funded universities, historically black colleges and universities, and large research universities. Each case was downloaded at least five times, four cases were downloaded twenty or more unique times, and the maximum number of unique downloads for any single case was thirty-four.

Although the STIRS case studies were published online in January 2015, the first case downloads occurred in May 2015, after marketing efforts by the AAC&U communications department. These actions included sending an email blast to 25,000 faculty and academic affairs administrators to advertise the case studies and publishing an article on the STIRS case studies in the May 2015 AAC&U News, an electronic newsletter. Over 60 percent of the total downloads in the first year of publication occurred in May and June 2015, corresponding with this marketing push. Interestingly, there was no appreciable increase in downloads when the STIRS case studies were described at sessions presented by the case authors and others at the January 2015 and January 2016 AAC&U annual meetings.

Anecdotally, some of the case study authors have been using their cases in their own courses, and STIRS case studies have begun to be discussed on professional education listservs (e.g., the Association for Assessment of Learning in Higher Education listserv). The authors who have used their cases have had positive experiences thus far. Zerr used his case “Congressional Apportionment: Constitutional Questions, Data, and the First Presidential Veto” in a general education course focused on quantitative reasoning (Zerr 2015). Although data were not gathered to evaluate student learning, he noted that student reactions and engagement were very positive. This case juxtaposes content areas that students do not often see as clearly connected (history and historical writings with quantitative information), and students remarked that the case provided an unexpected context in which to examine quantitative evidence. Similarly, Carmichael’s use of her case study “People, Places, and Pipelines: Debating Tar Sands Oil Transmission” in an interdisciplinary, first-year learning community elicited favorable responses from students, who requested more opportunities for case study activities (Carmichael 2015). In this context, she felt that the case study worked very well to coalesce interdisciplinary perspectives on complex world problems with the use of evidence-based reasoning to inform decision making about those problems. As another example, Adele Wolfson and Justin Armstrong used their cases “Blood Doping: Cheating or Leveling the Playing Field?” and “Different Times of the Month: A Cross-Cultural Analysis of Menstruation Taboos,” respectively, in a team-taught first-year seminar course focused on scientific and cultural aspects of blood (Wolfson 2015; Armstrong 2015). They felt that the cases were successful in engaging the students but found that working through the cases took more time than expected.

At least two of the authors (Stanford and Byrne) have collected preliminary data from courses in which they incorporated the cases that they authored. Byrne incorporated his case, “Exploring Lawns and Gardens as Complex Socio-Ecological Systems,” into an introduction to sustainability studies course that enrolled first-year through senior undergraduates from diverse majors (Byrne 2015). Byrne surveyed students from this class, before and after using the case, about their ability to complete the primary learning outcomes from the case, using Likert-type items to assess student perception of their learning (twenty-seven respondents). After completing the case study, student responses indicated statistically significant improvements in their self-perceived abilities to achieve the learning outcomes (Byrne unpublished data). The increase in students’ perception of their understanding suggests that the case study is well-placed in the context of Byrne’s course, and it effectively promoted student learning in this context. This fits with Byrne’s personal observations that students were engaged, performed well on case-related activities, and used case-related examples in discussions throughout the rest of the course. Stanford similarly studied student perceptions of their ability to complete course learning outcomes in a course that introduces the biology of cancer to nonbiology majors. She compared results from a survey conducted after two offerings of the course, one that did not use her case study “Cell Phones and Cancer: Evaluating the Evidence to Assess Potential Association” (sixteen respondents) and one that did use the case study (Stanford 2015). While students reported a greater ability to complete course learning objectives after engaging in a course offering that included the case, this outcome was not statistically significant, potentially due to the small sample size studied thus far (Stanford unpublished data).

Of note, in both iterations of the course, the average response indicated that students felt they could effectively complete course learning objectives. Importantly, we do not want to overemphasize these data, as they come from preliminary studies with small numbers of students, conducted by individual case authors to understand the effects of incorporating their cases into their specific course contexts. However, these data are consistent with the anecdotal evidence described above, which suggests that these cases can contribute to student learning. Additional work is needed to determine whether the observed outcomes are replicable in these and other course contexts.

Planned Institutional Uses of Case Studies by STIRS Fellows

While the STIRS project began with the development of case studies, it has now progressed to the development of complete, STIRS-based curricula by five AAC&U STIRS Fellows. These individuals were chosen to help their institutions reimagine their general education curricula to incorporate the STIRS framework of scaffolded cornerstone, connector, and capstone experiences. The STIRS resources and methodologies provide models for strengthening evidence-based learning and creating more intentional learning pathways for undergraduates, whether the first step in curricular reform is to use case study assignments in particular courses or to attempt complete curricular revision.

As the institutional phase of the STIRS initiative has taken shape over the course of the past year, evidence-based reasoning has been emphasized as a key curricular component. Of the four institutions involved in the second phase of the STIRS initiative (Mercer University, Middlesex Community College, Oregon Institute of Technology, and University of North Dakota), three have either used the STIRS case studies as a model for incorporating evidence-based reasoning into key assignments or have incorporated the case studies into connector courses that bridge initial learning experiences to capstone signature learning projects. At the University of North Dakota, the STIRS initiative has led to a newly renovated Interdisciplinary Studies major that has a focus on helping students to understand evidence-based reasoning across disciplines.

This major incorporates a vibrant and successful first-year learning program with a required general education capstone. Connector courses focus on exposing students to the types of questions asked broadly across disciplines, the methods used to answer those questions, and the points of intersection between disciplinary methodologies and frameworks. STIRS case studies are intended as an integral part of this curriculum. Because the STIRS case studies are interdisciplinary, accessible, and varied, they are well suited to engaging students in the work of evidence-based reasoning across disciplines. And because all of the cases have been carefully constructed and peer reviewed, their quality is consistent with regard to promoting student development of evidence-based reasoning. In this institutional context, case studies will be selected by individual course instructors based on how well a given case advances his or her learning outcomes.

Other STIRS Fellows are also looking to the STIRS case studies as part of their evidence-based curriculum revision efforts. The Middlesex Community College curriculum revision centers on using primary-source empirical research articles and other research-based evidence to develop students’ evidence-based decision-making skills. After an initial round of assessment information is collected, curricular development will progress to the design and incorporation of another set of learning activities; the STIRS case studies are being carefully explored as a viable option for this continued development. Similarly, at Mercer University, STIRS case studies are being integrated within a pilot, introductory-level curriculum. Taken together, STIRS case studies are helping to facilitate institutional curricular changes intended to help students develop evidence-based reasoning skills.

Approaches to Future Evaluation of Case Studies

At this time, the STIRS case studies have been published for fewer than two years. The very early outcomes regarding the use of STIRS case studies suggest that there is interest in using these cases in undergraduate curricula. Taking into account that the majority of case downloads took place after significant marketing activities by AAC&U, we hope that this issue of Peer Review will compel others to consider use of these free, peer-reviewed resources at their institutions.

Future evaluation of case studies should aim to involve multiple mechanisms of collecting data (Lundeberg and Yadav 2006; Bonney 2015). Using the information collected from downloads of STIRS facilitator guides, we can reach out to those who have downloaded these guides to identify who has actually used the cases and learn about their experiences. This would allow us to assess the scope of use and develop a greater understanding of how effective users have been in implementing these cases in their own courses. Data collection can initially occur through surveys and be followed up with interviews to construct a more complete understanding of the instructor experience. Ideally, we would also be interested in developing mechanisms to help instructors assess student competencies before and after implementation of a case. Not only could this serve as a mechanism to gather data on whether these cases are effective learning tools, but it could also help faculty demonstrate the effectiveness of their teaching (e.g., for promotion and tenure portfolios).

Additional work to assess outcomes of the STIRS case studies is warranted. Though other case-study resources exist, the STIRS cases were developed using a unique framework and approach, with a specific focus on evidence-based thinking. While case-based learning has been established as a valuable learning approach in many contexts (Barnes, Christensen, and Hansen 1994; Miller and Tanner 2015; Hung, Jonassen, and Liu 2008), the specific approach used in developing STIRS case studies still needs to be evaluated. Careful assessment of the use and outcomes of the STIRS case studies will give us a better understanding of whether these tools are useful resources for teaching within the STIRS framework. If so, it is our hope that this will justify the development of additional resources of this type. Tangibly supporting faculty in promoting scientific thinking and integrative reasoning in diverse classroom environments with readily available, high-quality pedagogical resources is essential to facilitate the incorporation of this type of thinking into the undergraduate classroom. Real change in the way that students are being taught will require sustained effort in curriculum development and sharing of curricular materials to support faculty teaching. These types of efforts could help institutions overcome the significant barriers that inhibit curricular change at the undergraduate level (Bok 2006; Wieman 2012; Brownell and Tanner 2012; Austin 2011).


The authors would like to thank Katherine Hunting, who provided significant mentoring and editorial support to the STIRS case study authors; Kevin Hovland, Bethany Sutton, Kathy Wolfe, Lisa Russell O’Shea, and David Paris for supporting development of the STIRS case studies; Elizabeth Dickens for her help with data collection; AAC&U for financial support for the STIRS project through an anonymous donor; and especially the STIRS Scholars who authored the STIRS case studies and STIRS Fellows who are exploring how to effectively integrate these cases into curricula.


Armstrong, Justin. 2015. “Different Times of the Month: A Cross-Cultural Analysis of Menstruation Taboos.” Association of American Colleges and Universities.

Austin, Ann E. 2011. “Promoting Evidence-Based Change in Undergraduate Science Education.” Washington, DC: National Academies National Research Council Board on Science Education.

Barnes, Louis B., C. Roland Christensen, and Abby J. Hansen. 1994. Teaching and the Case Method: Text, Cases, and Readings, 3rd ed. Boston: Harvard Business School Press.

Bauer, Angela. 2015. “The Two-Sex System: Fact or Fallacy?” Association of American Colleges and Universities.

Bok, Derek. 2006. Our Underachieving Colleges: A Candid Look at How Much Students Learn and Why They Should Be Learning More. Princeton, NJ: Princeton University Press.

Bonney, Kevin M. 2015. “Case Study Teaching Method Improves Student Performance and Perceptions of Learning Gains.” Journal of Microbiology & Biology Education 16 (1): 21.

Brownell, Sara E., and Kimberly D. Tanner. 2012. “Barriers to Faculty Pedagogical Change: Lack of Training, Time, Incentives, and . . . Tensions with Professional Identity?” CBE-Life Sciences Education 11 (4): 339−346.

Byrne, Loren B. Unpublished Data.

——. 2015. “Exploring Lawns and Gardens as Complex Socio-Ecological Systems.” Association of American Colleges and Universities.

Carmichael, Tami S. 2015. “People, Places, and Pipelines: Debating Tar Sands and Shale Oil Transmission.” Association of American Colleges and Universities.

Herreid, Clyde Freeman. 2005. “Using Case Studies to Teach Science.” American Institute of Biological Sciences.

Hung, Woei, David H. Jonassen, and Rude Liu. 2008. “Problem-Based Learning.” In Handbook of Research on Educational Communications and Technology, edited by J. Michael Spector, M. David Merrill, Jeroen van Merrienboer, and Marcy P. Driscoll, 3rd ed., 485−506. New York: Taylor and Francis Group.

Kuh, George D., Ken O’Donnell, and S. Reed. 2013. “Ensuring Quality and Taking High-Impact Practices to Scale.” Washington, DC: Association of American Colleges and Universities.

Lundeberg, Mary A., and Aman Yadav. 2006. “Assessment of Case Study Teaching: Where Do We Go From Here? Part I.” Journal of College Science Teaching 35 (5): 10−13.

——. 2006. “Assessment of Case Study Teaching: Where Do We Go From Here? Part II.” Journal of College Science Teaching 35 (6): 8.

Miller, Sarah, and Kimberly D. Tanner. 2015. “A Portal into Biology Education: An Annotated List of Commonly Encountered Terms.” CBE-Life Sciences Education 14 (2): fe2.

Peden, Wilson. 2015. “Signature Work: A Survey of Current Practices.” Liberal Education 101 (1/2): 22−29.

Riegelman, Richard. 2015. “Preventing Spina Bifida and Other Neural Tube Defects.” Association of American Colleges and Universities.

Riegelman, Richard K., and Kevin Hovland. 2012. “Scientific Thinking and Integrative Reasoning Skills (STIRS): Essential Outcomes for Medical Education and for Liberal Education.” Peer Review 14 (4): 10.

Singer-Freeman, Karen. 2015. “MMR Vaccine and Autism: Scientific Inquiry, Ethics, and Evidence-Based Problem Solving.” Association of American Colleges and Universities.

Singh, Vandana. 2015. “To Drill or Not to Drill? A Dilemma in the Context of Climate Change in the Arctic.” Association of American Colleges and Universities.

Stanford, Jennifer S. 2015. “Cell Phones and Cancer: Evaluating the Evidence to Assess Potential Association.” Association of American Colleges and Universities.

——. Unpublished Data.

Stanford, Jennifer S., Loren Byrne, and Katherine Hunting. 2016. “Promoting Evidence-Based Thinking through the STIRS Case Studies.” Peer Review 18 (4): 14−18.

Wieman, Carl. 2012. “Applying New Research to Improve Science Education.” Issues in Science & Technology 29 (1): 25−32.

Wolfson, Adele J. 2015. “Blood Doping: Cheating or Leveling the Playing Field?” Association of American Colleges and Universities.

Zerr, Ryan J. 2015. “Congressional Apportionment: Constitutional Questions, Data, and the First Presidential Veto.” Association of American Colleges and Universities.

Jennifer S. Stanford, assistant professor, department of biology, codirector of the Center for the Advancement of STEM Teaching and Learning Excellence (CASTLE), Drexel University; Tami Carmichael, professor and director of humanities and integrated studies, University of North Dakota; Ryan J. Zerr, professor, department of mathematics, director of essential studies, University of North Dakota; Loren Byrne, associate professor, department of biology, marine biology, and environmental science, Roger Williams University; and Richard K. Riegelman, professor and founding dean, department of epidemiology and biostatistics, The Milken Institute School of Public Health, The George Washington University
