Liberal Education

Spreading Innovations into the Mainstream: Building Strong Foundations

Hours before he passed away, Sir Donald Wolfit, the British actor and theater manager, was asked what it was like to die. He quipped, “Dying is easy . . . comedy is hard!”

Developing and validating a teaching innovation may be “easy.” But weaving it into the institutional fabric of teaching and learning? That’s really hard.

At the University System of Maryland (USM), we have taken a close look at a recent series of initiatives piloting course redesign (2006–2014).1 The courses targeted were multi-section developmental and gateway courses with a history of high DFW rates (i.e., high rates of students receiving a grade of D or F, or withdrawing). The strategy was to improve learning by applying backward design and making use of active and interactive teaching techniques, such as online tutorials and assessments and small-group work in class, often supported by undergraduate learning assistants.2 The hope was that equal or better results could be achieved through the use of such learning-centered practices,3 even though fewer faculty would be required to teach the courses. It was also hoped that, if these initial redesigns were successful, institutions would continue to redesign courses without further assistance or support from the system office. A matching grant of $20,000 was provided for each course, along with a series of faculty workshops and consulting help.

With support from the Bill & Melinda Gates Foundation, the USM’s William E. Kirwan Center for Academic Innovation conducted a qualitative study of these initiatives, focusing on three questions:

1. Was the success of the redesigns sufficient to persuade USM institutions to continue and expand this kind of academic transformation once they had to provide all the funds, rather than just half?

2. Did certain cultural and organizational factors make it difficult to sustain and expand course redesign?

3. If that was the case, then how should universities work on those factors in order to foster sustainable, scalable improvements in teaching?

The USM offers an especially good opportunity to explore such questions. The system is made up of eleven degree-granting universities, as well as regional centers and a research institute. It currently serves over 125,000 undergraduates in the United States and abroad, and over 41,000 graduate students. The USM comprises several institutional types that are quite different from one another: research-intensive institutions, three historically black institutions, regional comprehensives, and one of the largest online universities in the United States. The smallest USM institution enrolls about 3,000 students, while the largest serves about 53,000.

To make a long story short, the initiatives met their immediate goals.4 DFW rates improved by 7 percentage points across the 57 redesigned courses. During the 2013–14 academic year, over 143,000 students received more active and supportive learning experiences, and the equivalent of more than $5 million in faculty time and adjunct expenditure was freed for other purposes, such as teaching upper-division courses.

Beyond these immediate benefits, the initiative’s successes helped the system obtain increased funding from the legislature for academic transformation, including continued work on course redesign at some institutions, the creation of the Kirwan Center to support academic innovation across the USM, and the establishment of new roles in all system institutions to provide leadership for work on academic transformation. The Kirwan Center organized the Academic Transformation Advisory Council (ATAC), which is composed of those institutional leaders. ATAC has led a variety of initiatives since the center’s founding, ranging from the Maryland Open Source Textbook initiative to efforts that resulted in policy changes affecting the use of social media in courses.

This was not an avalanche of change sweeping all before it, however. In the system-supported initiatives, only about 10 percent of the eligible courses were actually proposed for redesign grants; the large majority of faculty held back, even though many of them were interested enough to investigate the option. Even today, the numerous changes sparked by the pilots are vulnerable to the departure of their champions or the next round of budget cuts.

Our interviews with faculty and administrators suggest that seven institutional foundations made a difference for redesign. When a foundation was strong, they often remarked, the work was easier; when a foundation was weak, fostering redesign was harder. We concluded that it was important to ask seven questions to understand why a successful redesign had, or hadn’t, sparked a sustained, expanding pattern of change at an institution:

1. Were senior administrators and department chairs visibly and continually demonstrating that improving learning outcomes was a priority?

2. Was there an institutional history of pragmatically working across silos in order to solve problems and seize educational opportunities?

3. Did many faculty have beliefs about teaching, learning, and their own instructional roles that were consistent with the activities required for redesign?

4. Did large numbers of faculty already have experience with at least a few of the elements of redesigning a course and then teaching it?

5. Were necessary infrastructure and support systems already available?

6. Did the institution already offer assessment-related services to help faculty use evidence to guide student learning?

7. Did faculty personnel policies and practices encourage faculty to work on innovation, or discourage them from doing so?

As we have already described, there was indeed a wave of subsequent changes across the system.

Today, the focus in many institutions and systems across the country is on making demonstrable improvements in graduation outcomes. Redesigning a single course here and there is unlikely to have that effect. To create a pervasive enough pattern of change, far more faculty will need to be engaged in reconsidering how courses and academic programs might improve.

Typically such calls for change have resulted in direct action, such as the USM course redesign initiatives. Our findings suggest that, for such initiatives to spread, institutions also need to strengthen those seven foundations. Following are suggestions for how each foundation might be made stronger.

1. Senior leadership

USM faculty often commented on the benefits of strong leadership from the top—the chancellor of the system, senior administrators at individual institutions, and department chairs. The faculty wanted to see that their leaders were visibly and consistently backing this kind of improvement.

To foster a supportive climate for improving graduation outcomes through academic transformation, senior leaders need to recognize

  • that it is essential for the institution to improve graduation outcomes, especially for students from underserved economic and ethnic backgrounds;5
  • that such improvement can be achieved by changing how students learn (a point some will see as obvious, while others will object that good students will learn and bad students will not, regardless of how they are taught);
  • that changes in how students learn will require changes in how students, faculty, and the institution normally use time and money (contrary to the assumption that reform necessarily implies adding something alongside normal practice, with the support of extra money);
  • that to accomplish this level of change, the institution needs to take the long view (recognizing that demonstrable improvements in graduation outcomes can easily take a decade of incremental, cumulative steps).

2. Cross-silo relationships

Our interviews revealed that improving even one course requires collaboration among people who might not have met previously and who have no direct authority over one another—for example, faculty, facilities managers, and information technology support staff.

Improving graduation outcomes will require a more extensive set of relationships, some of which must already have been developed through the institution’s normal work. If such collaboration is not already happening on a small scale, it should be encouraged in order to build the working relationships needed for larger-scale and more sustained initiatives. For example, are the teaching center staff, department chairs, online learning office staff, library personnel, and disability support specialists working together yet to help faculty improve their teaching online and on campus?

It takes time to develop the mutual understanding needed for work under pressure. Without such an understanding, people may first overestimate and then, after a disappointment, underestimate the capabilities of their collaborators.6 The difficulties of collaboration are exacerbated by the fact that everyone in a university feels over-committed and under-resourced, though they may not believe that their colleagues in another silo feel the same. Everyone has priorities that can easily delay or displace the collaborative effort, unless the focus of the effort is of great importance to all. This is why a culture of collaboration can only be produced by persistent effort over many years.

3. Core faculty beliefs about teaching and learning

Interviewees sometimes mentioned that at least a few colleagues objected in principle to the work of redesign. For example, they might object to the proposition that how students are taught in college has a powerful role in determining how well they learn. (Counter-proposition: No, student learning in college is almost entirely determined by the kind of people they are, not by how they are taught.) Or they might object to the proposition that faculty sometimes need to agree on course goals, core content, or assessment techniques in order to promote student success. (Counter-proposition: No, for faculty even to ask one another to agree on course goals, core content, or methods of assessment would be an unacceptable violation of their academic freedom.)

A large-scale reform effort to improve graduation outcomes can quickly derail if even a large minority of faculty refuse to participate because they believe that its goals are foolish or its methods are out of bounds. At a minimum, faculty need to become comfortable discussing and debating such propositions. For example, when discussing appointments, promotion, and tenure, it should be acceptable to explore candidates’ past efforts to improve student learning outcomes.

4. Faculty experience with learning-centered practice

Faculty teaching redesigned courses told us that many of their colleagues were probably reluctant to take the plunge into course redesign because they lacked experience with many of its elements, such as backward design, managing collaborative work in the classroom, using evidence to guide their teaching, or using technology to enable them to teach in a way they’d prefer. Before faculty in a department can take on something as ambitious as improving a degree program, it would help if most of them were, at a minimum, comfortable with at least some such practices.

Institutions and grantmakers ought to experiment with new strategies for engaging faculty on a large scale. For example, imagine that one-fourth of the faculty already use a simple collaborative learning activity called “think pair share,” that a quarter of the faculty would be dead set against it, and that the remaining half of the faculty would be willing to try it in the right circumstances. What might those circumstances be? What strategy might a university use to engage that half of its faculty with this comparatively simple, powerful, easy-to-try, low-risk element of learning-centered teaching?

5. Institutional infrastructure and support systems

Interviewees in our study often mentioned elements of infrastructure such as appropriate technology support and appropriate classroom facilities. They mentioned technology support because their redesigned courses often relied on several kinds of digital tools and resources working together smoothly. They mentioned classrooms because their redesigns depended on students’ use of computing and their ability to shift easily from small-group work to full-class work and back again.

For an institution taking on the challenge of improving graduation outcomes, at least two other elements of infrastructure are likely to be quite useful. The first of these concerns the training and rewarding of undergraduate learning assistants (ULAs). Our study discovered that ULAs, depending on whether and how they were employed, had an enormous impact on DFW rates.7 For example, ULAs enabled faculty to use more active and collaborative learning strategies in their courses. But almost all the faculty leaders had to reinvent the wheel in terms of recruiting ULAs, preparing them, and making sure they were adequately rewarded.

The second useful element concerns faculty members’ need for proactive support in program mapping and design, in improving assessment, and in doing research on what graduates are able to do. A well-staffed teaching center, or the equivalent, needs to provide those supports. In our study, it appeared that some course redesigns could have benefited from more such support.

6. Assessment-related services

This entire essay deals with teaching improvements guided by evidence of learning. For such learning-centered practices to become the norm, institutions need to provide several kinds of assessment support.

Redesign often is intended to develop students’ abilities to work on unscripted problems. Faculty are sometimes reluctant to make such assignments because grading them and providing feedback can be time-consuming. Fortunately, there are time-saving methods for doing so; unfortunately, few faculty are aware of them. The institutional capability to help large numbers of faculty experiment with simple elements of learning-centered practice (foundation 4) ought to be used for this purpose. One example: helping faculty adapt and share rubrics that describe an assignment’s goals and then use those rubrics to provide grades and feedback.

To improve graduation outcomes, student learning needs to be monitored as students work from course to course. Techniques and services for monitoring student learning range from day-one assessments to uses of learning analytics.

Finally, there is a need for robust student feedback (or evaluation) forms. For example, the institution should offer tools faculty can use to gather anonymous student feedback midway through a course. Such a process empowers both faculty and students, and it lays the groundwork for end-of-course student evaluations.

In discussing the first foundation above, we suggested that senior leaders need to reinforce the idea that rethinking teaching implies rethinking how time, money, space, and other resources are used. But, as our interviews suggested, few faculty begin with much of an understanding of how those resources are currently being used in their courses and departments. How much time and money is needed to provide technology support for a course? How much faculty time is used for grading? Answering questions such as these requires doing some information gathering and spreadsheet-level math. For example, some years ago, faculty at the University of Pennsylvania wanted to make undergraduate engineering laboratories both more effective and more efficient. They asked faculty members and graduate students managing labs to report on how much time they spent on various tasks and, specifically, which of those activities were fulfilling and which were a burden. Their findings led to a successful reconceptualization of lab facilities, staffing, and methods.8
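To make the idea of "spreadsheet-level math" concrete, here is a minimal sketch in Python of the kind of back-of-the-envelope calculation a department might run for one large multi-section course. Every figure in it (section count, grading hours, hourly cost, support cost) is a hypothetical assumption for illustration, not a number from the USM study or the Penn labs.

```python
# A minimal sketch of "spreadsheet-level math" for one multi-section course.
# All numbers below are hypothetical assumptions, chosen only for illustration.

SECTIONS = 10                    # sections offered per term (assumed)
GRADING_HOURS_PER_SECTION = 60   # faculty hours spent grading, per section (assumed)
FACULTY_HOURLY_COST = 75.0       # loaded cost of faculty time, dollars per hour (assumed)
TECH_SUPPORT_COST = 4000.0       # per-term technology support for the course (assumed)

def grading_cost(sections: int, hours_per_section: float, hourly_cost: float) -> float:
    """Dollar value of faculty time spent grading across all sections."""
    return sections * hours_per_section * hourly_cost

total = grading_cost(SECTIONS, GRADING_HOURS_PER_SECTION, FACULTY_HOURLY_COST) + TECH_SUPPORT_COST
print(f"Estimated per-term cost: ${total:,.0f}")  # prints: Estimated per-term cost: $49,000
```

Even a toy calculation like this surfaces the questions that matter: which activities consume faculty time, and which of them a redesign might reallocate.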

7. Faculty personnel policies and practices

As we probed for factors that might discourage faculty participation in course redesign, perhaps the most frequent remark was “well, there’s the reward system, of course.” The discussions then went on to include other issues related to faculty personnel policies and practices.

Some interviewees mentioned that the use of data from course evaluation forms sometimes discouraged faculty innovation, because no allowance was made for transient drops in student evaluations when work became more challenging. Others pointed out that defining “teaching load” in terms of whole courses subtly discouraged faculty from sharing the work of teaching a course or breaking a course into smaller chunks of instruction. Alternative ways of defining teaching responsibilities are more flexible—for example, through agreements that allocate a certain percentage of each faculty member’s time to teaching-related activities.

Interviewees also pointed to problems caused by the nature of contracts with part-time faculty. The materials and student tools for redesigned courses tended to be updated frequently, requiring section leaders to undergo training—sometimes off campus. Typically, there was no allowance in adjunct contracts for such extra preparation.

Synergy between direct action and the strengthening of foundations

The story leading to the course redesign initiatives at the USM began with leadership, when the regents and the chancellor created the Efficiency and Effectiveness initiative in 2004. Two years later, as part of that effort, the first course redesign initiative was funded in order to transform a course in each institution across the system. Early successes and new grants encouraged the regents and the chancellor to set “academic transformation” as one of five major themes of the ten-year strategic plan for the system that was adopted in 2010. From then on, presidents would be assessed annually, in part, on the basis of institutional progress in this area.

System provosts began using “academic transformation” as a regular topic for their monthly meetings. This leadership, along with continuing changes in the larger world, encouraged USM provosts to assign people to coordinate institutional work on academic transformation, strengthening leadership still further. Those same conditions also encouraged the legislature to increase funding for academic transformation. Meanwhile, the system office created a center for academic innovation, which then organized those institutional coordinators into the Academic Transformation Advisory Council. Together, the council and the center provided the additional infrastructure needed to promote student success across the system. They began work on removing policy barriers to innovation and on helping institutions work together to explore new strategies, such as the use of learning analytics (foundation 6), competency-based education, and open-source textbooks.

The synergy between direct action and the strengthening of foundations occurred within institutions as well as across them. For example, the faculty fellows who helped run the later iterations of the course redesign initiatives also developed the mutual understanding that would prove useful in subsequent initiatives. Within institutions, course redesign plans sometimes led to renovating learning spaces (infrastructure), new patterns of collaboration, and fresh experience with learning-centered practices for faculty teaching the many sections of these giant courses. In fact, the USM course redesign initiatives seemed to strengthen all seven foundations, while the seven foundations were helping the initiatives succeed.


In the past, universities have tried to improve learning outcomes by altering curricula or improving advising. But experience and research in many quarters, including our research on course redesign across the University System of Maryland, suggest that altering courses and programs has a better chance of success if the institution strengthens seven foundations:

1. Seek and retain senior administrators and department chairs who allocate their time and resources to improving learning outcomes.

2. Where needed, work across silos to solve problems and seize opportunities, in the process developing relationships that later can be used for larger-scale, more sustained efforts.

3. Encourage faculty discussion and debate about core beliefs about teaching, learning, and their own instructional roles.

4. Help a large fraction of faculty gain experience with at least a few of the elements of learning-centered teaching.

5. Provide necessary infrastructure and support systems for more learning-centered and more technology-intensive approaches to teaching.

6. Provide the kinds of assessment-related services needed to guide teaching and learning.

7. Examine faculty personnel policies and practices to make sure that they do not subtly discourage faculty, full-time and part-time, from working to improve student learning.


1. Earlier work by the National Center for Academic Transformation (NCAT) inspired USM’s course redesign initiatives. In the NCAT definition of “course redesign,” equal priority is given to improving quality and cost saving (by which they mean freeing faculty resources for other purposes). USM institutions eventually used the term only to refer to the rethinking of a course, often supported in part with technology, to improve learning outcomes.

2. For information about backward design, see Grant Wiggins and Jay McTighe, Understanding by Design (Alexandria, VA: Association for Supervision and Curriculum Development, 2005). In this article, “teaching” includes anything done intentionally to support the learning of others. When faculty members create instructional materials or remain silent to encourage students to talk, they are teaching. When students in a small group assignment help each other get unstuck, they are participating in teaching, too.

3. “Learning-centered teaching practices” are chosen, refined, and judged by actual learning, not just by teaching intentions. The evidence of learning derives from educational research and/or from informal observation and analysis of learning in a particular course or program.

4. See Stephen C. Ehrmann and M. J. Bishop, Pushing the Barriers to Teaching Improvement: A State System’s Experience with Faculty-Led, Technology-Supported Course Redesign (Adelphi, MD: William E. Kirwan Center for Academic Innovation, the University System of Maryland, 2015).

5. While half of all people from high-income families have a bachelor’s degree by age twenty-five, just one in ten people from low-income families do. See Martha J. Bailey and Susan M. Dynarski, “Inequality in Postsecondary Attainment,” in Whither Opportunity? Rising Inequality, Schools, and Children’s Life Chances, ed. Greg Duncan and Richard Murnane (New York: Russell Sage Foundation, 2011), 117–32.

6. For an early anthropological account of this kind of cross-silo overestimation followed by underestimation, see Margaret Luszki, Interdisciplinary Team Research: Methods and Problems (New York: New York University Press, 1958).

7. See Ehrmann and Bishop, Pushing the Barriers, 41.

8. See Rita M. Powell, Helen Anderson, Jan Van der Spiegel, and David P. Pope, “Using Web-based Technology in Laboratory Instruction to Reduce Costs,” Computer Applications in Engineering Education 10, no. 4 (2002): 204–14.

Stephen C. Ehrmann is associate director for research and evaluation at the University System of Maryland’s Kirwan Center for Academic Innovation, and M. J. Bishop is director of the center.

