Liberal Education

Transparency in Teaching: Faculty Share Data and Improve Students' Learning

Faculty rarely have opportunities to research their students’ views about how their best learning happens in college or graduate school. Even less common are the means for teachers to gather such information from colleagues on a large scale and distill it into pragmatic insights about teaching practices best suited to their own particular students. The Illinois Initiative on Transparency in Learning and Teaching is a grassroots assessment project doing just that, and it demonstrably enhances students’ learning. The project has two main goals: (1) to promote students’ conscious understanding of how they learn; and (2) to enable faculty to gather, share, and promptly benefit from data about students’ learning by coordinating their efforts across disciplines, institutions, and countries.

Statistically significant early results indicate distinct current and future learning benefits of particular teaching and learning methods that are specific to discipline, class size, level of expertise, and student demographics. Reporting of the results helps faculty identify and adopt the learning and teaching method(s) best suited to achieving the desired outcomes for the specific population of students in their courses. And ongoing analysis suggests that benefits for underrepresented and nontraditional students might be leveraged to promote higher retention and graduation rates for these groups, and even increased participation of diversely prepared students in master’s and doctoral degree programs.

The Transparency Initiative complements existing assessments of content mastery and teaching performance by asking students about their perceptions of the current and future learning benefits they are gaining. And it reimagines the scope of impact by sharing the aggregate data and findings (anonymously and with the approval of an institutional review board) across the institutional and national confines that usually circumscribe such research. Since 2010, the initiative has involved more than twenty-five thousand students, one hundred sixty courses, and twenty-seven institutions in seven countries.

The practices tested have several features in common: they are transparent, requiring explicit conversation between teachers and students about the processes of learning and the rationale for required learning activities; they involve relatively minor adjustments to any teacher’s current practice; and they are consistent with research-based best practices in higher education.

Studying transparent teaching practices

With faculty and students as a starting point, the Transparency Initiative developed from a desire to research a phenomenon that faculty reported anecdotally in a series of pedagogy seminars at Harvard University, the University of Chicago, and the University of Illinois: students’ learning outcomes improved when they understood how and why instructors had structured their learning experiences in particular ways (Winkelmes 2013). Prior research on metacognition demonstrated that students learn more and retain that learning longer when they have an awareness of and some control over how they are learning (Cohen 1980; Dunlosky and Metcalfe 2009; Francis, Adams, and Noonan 1998; Light 1990; Nelson and Dunlosky 1991; Perry, Hall, and Ruthig 2007). Research also suggests that training students to understand how to have more agency in their learning increases their academic success (Perry, Hall, and Ruthig 2007; Gynnald, Holstad, and Myrhaug 2008), and that monitoring students’ understanding of their learning can enrich assessment practice (Micari et al. 2007).

To find out more about their students’ experience of this phenomenon, a small faculty cohort at the University of Chicago worked with the Center for Teaching and Learning there to develop a questionnaire in order to gather students’ perceptions of their learning experiences. The first online survey was tested for reliability, validity, and factor loadings in the spring of 2009. In 2010, it was analyzed, revised, and retested at the University of Illinois, and then revised again. Analysis of the data gathered by the current survey not only benefits faculty participants—providing prompt data about practices that enhance their students’ learning—but it also contributes to the state of research in three ways. First, it adds a measure of students’ learning perceptions that complements assessments of students’ mastery of intellectual skills or disciplinary content. Second, to research on the importance of metacognition, it adds new information about how students’ awareness of their learning process relates to their perception of learning outcomes. And third, to research on diversity and learning, it adds new data about learning practices that benefit underrepresented undergraduate and graduate students.
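To make the reliability testing mentioned above concrete, the brief sketch below (in Python, using invented data; it is not the initiative’s own analysis code) illustrates one standard internal-consistency check of the kind typically run when piloting a survey, Cronbach’s alpha, which gauges whether a set of related survey items measures a single underlying construct consistently. The function name and the sample responses are hypothetical.

```python
# Illustrative sketch only: Cronbach's alpha on hypothetical Likert-scale survey data.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """responses: rows = respondents, columns = survey items (Likert scores)."""
    n_items = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)      # variance of each item
    total_variance = responses.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from eight students to four related items (1-5 scale).
survey = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 5, 4, 4],
    [3, 2, 3, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
])
# Values near or above 0.8 are conventionally read as good internal consistency.
print(f"Cronbach's alpha: {cronbach_alpha(survey):.2f}")
```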

Faculty role

Instructors are essential not only to the design of the survey but also to the implementation of the initiative. Data gathering is facilitated voluntarily by teachers at the course level, rather than by institutions. Faculty participants have identified several common teaching practices that can enhance students’ metacognition when the instructor and students address them explicitly together as part of the course’s work.1 The list grows as participants identify new methods to be investigated. As faculty join the project, they usually choose to make one small change in their teaching, at their own discretion, and then students complete a four-to-five-minute online survey at the end of the semester. Because many of these practices are already familiar, it usually takes little time and minimal adjustment to put them to use and discuss them explicitly with students.

While the project tracks the frequency with which instructors implement their chosen methods, it does not seek to enforce consistency in implementation. Instead, it provides instructors with several possible examples of each method and aims to measure the effects on students’ learning that can be expected when a teacher uses one of the methods at her or his own discretion. Transparency staff arrange permissions with ethical research boards, keeping faculty identities confidential and students’ identities anonymous, so that teachers can focus their time instead on their teaching and their students’ learning. End-of-term reports offer each faculty participant an analysis of students’ learning in their course relative to the learning experiences reported by students in similar courses in the study, along with an overview of findings about the most effective methods with respect to discipline, class size, level of expertise, and some student demographics. Analyses in the reports can immediately be applied to improving students’ learning in the courses these faculty teach.

Benchmarks and related studies

Any study that relies on students’ self-reported perceptions of their learning experiences must account for an important limitation of students’ self-reports: lack of alignment between students’ self-assessments of mastery and their actual performance. To avoid unreliable student self-reports of their overall achievement, the Transparency survey questions focus not on self-assessments of mastery, but instead on students’ reports of how much (if at all) their learning experience in a particular course affected their mastery of content and critical thinking skills.2 To further ensure reliability, the project surveys students according to the conditions under which their self-assessments are most reliable—the answers are known to them, the questions are clear, the questions concern recent activities, the respondents regard the questions seriously, and there is no negative consequence to responding (Kuh 2001, 3–4).

National benchmarks are important for establishing that survey respondents do not over-report or overestimate their learning experiences. Several questions on the Transparency survey are intentionally similar to questions on both the Personal and Social Responsibility Inventory (PSRI) and the National Survey of Student Engagement (NSSE).3 Comparisons indicate that undergraduate students at US institutions in the control group surveyed by the Transparency Initiative do not overestimate their ability to learn or their learning mastery in comparison with undergraduate students surveyed by the PSRI or NSSE. Because the Transparency survey asks about the learning benefits of a single course, while the PSRI and NSSE surveys ask about a year’s or four years’ worth of courses, the control group’s Transparency survey responses ought to be somewhat less positive, and this is indeed the case.

Demographically, the population of US undergraduates responding to the Transparency survey is very similar to the Department of Education’s analysis of undergraduate student demographics overall (Aud et al. 2012).

Benefits of transparent teaching and learning methods

Transparent teaching methods can offer benefits for both current and future learning.4 Several survey questions address aspects of student learning that are directly tied to course activities, and the responses to these help identify benefits for the current learning experience. To help determine future benefits, several survey questions focus students on identifying lifelong learning skills that will be useful to them after the course is completed.

In humanities courses at the introductory undergraduate level, two practices seem to benefit students’ current course learning experiences:

  • Discuss assignments’ learning goals and design rationale before students begin each assignment (in classes ranging in size from thirty-one to sixty-five students).
  • Debrief graded tests and assignments in class (in classes ranging in size from sixty-six to three hundred students).

In social science courses at the introductory undergraduate level, several transparent methods have statistically significant benefits for students’ current course learning experiences:

  • Discuss assignments’ learning goals and design rationale before students begin each assignment (in classes ranging in size from thirty-one to sixty-five students, and in those containing three hundred or more students).
  • Gauge students’ understanding during class via peer work on questions that require students to apply concepts you’ve taught (in classes ranging in size from thirty-one to sixty-five students, and in those containing three hundred or more students).
  • Debrief graded tests and assignments in class (in classes ranging in size from thirty-one to sixty-five students).

In addition, students indicated significant future learning benefits from debriefing graded tests and assignments in class in the courses containing three hundred or more students. As class size in introductory undergraduate social science courses increases—from classes of thirty-one to sixty-five students to classes of three hundred or more students—transparency about the learning goals and design rationale for assignments appears to become more effective for students’ current course learning experiences.

In introductory courses in the STEM fields (science, technology, engineering, and mathematics), with class sizes ranging from sixty-six to three hundred students, the following transparent methods have statistically significant benefits for students’ current course learning experiences and for their future learning:

  • Explicitly connect “how people learn” data with course activities when students struggle at difficult transition points.
  • Gauge students’ understanding during class via peer work on questions that require students to apply concepts you’ve taught.
  • Discuss assignments’ learning goals before students begin each assignment.

Students at the intermediate and advanced levels in STEM courses (containing sixty-six to three hundred students) indicated that the following methods are helpful to their current and future learning:

  • Gauge students’ understanding during class via peer work on questions that require students to apply concepts you’ve taught.
  • Debrief graded tests and assignments in class.

Underrepresented and nontraditional students

Some of the practices tested are especially beneficial for underrepresented students, both at the undergraduate and graduate levels. These students are an important focus of continuing Transparency research in the 2013–14 academic year. At present, the project’s sample sizes allow for some analysis of significant benefits for first-generation students, non-Caucasian students, and transfer students.

In humanities courses at the intermediate and advanced undergraduate levels (ranging in size up to thirty students) that implemented transparency around the learning goals and design rationale for assignments, students who identified themselves as either first-generation college students or transfer students responded more positively than similar students in control group courses in this category to the question, “How much has this course helped you in improving your ability to learn effectively on your own?” Transfer students in introductory humanities courses (ranging in size from thirty to sixty-five students) where the instructor provided commentary about the disciplinary methods and thought processes in use during class responded more positively than non-transfer students to the question, “As a result of taking this course, are you better or worse at recognizing when you need help with your academic work, or has the course made no difference?”

Transfer students in intermediate and advanced undergraduate social sciences courses (ranging in size up to thirty students) using transparency around grading practices responded more positively than non-transfer students to the question, “As a result of taking this course, are you more or less confident about your ability to succeed in school, or has the course made no difference?”

Students who described their racial/ethnic groups as other than Caucasian reported greater gains in academic self-confidence than did their Caucasian peers in courses containing both graduate students and advanced undergraduates in the STEM disciplines (ranging in size up to thirty students) when courses offered transparency around the learning goals and design rationale for assignments. The non-Caucasian students in these courses responded more positively to the question, “As a result of taking this course, are you more or less confident about your ability to succeed in school, or has the course made no difference?” In addition, these same non-Caucasian students responded more positively than their Caucasian peers in these courses to the question, “As a result of taking this course, are you better or worse at recognizing when you need help with your academic work, or has the course made no difference?”

Non-Caucasian students in graduate-level courses in the social sciences (ranging in size up to thirty students) where instructors explicitly involved students in developing the agendas for class meetings responded more positively than their Caucasian peers in these courses to the question, “As a result of taking this course, are you better or worse at recognizing when you need help with your academic work, or has the course made no difference?”

While the numbers of underrepresented and nontraditional students participating in the Transparency Initiative have not yet allowed for additional disaggregation of underrepresented students, the initiative aims to gather data in 2013–14 that can be used to enhance the success and graduation rates of underrepresented students in higher education by revealing more about practices that advance their learning. It might be possible to leverage these forthcoming data in order to promote higher retention and graduation rates for underrepresented and nontraditional students, and even increased participation of diversely prepared students in master’s and doctoral degree programs.

Large-enrollment courses

Some of the methods tested seem to enhance students’ learning experiences particularly in large-enrollment courses. While the project has not yet tested large courses in all disciplines at all levels, there are already significant findings regarding practices that benefit students’ learning in large classes. The following practices are associated with increased current learning benefits for students in large-enrollment courses in the Transparency study (ranging from sixty-six to three hundred students in humanities and STEM courses; three hundred or more students in social science courses):

  • Discuss assignments’ learning goals and design rationale before students begin each assignment (introductory social sciences).
  • Gauge students’ understanding during class via peer work on questions that require students to apply concepts you’ve taught (introductory social sciences, introductory STEM, intermediate and advanced undergraduate STEM).
  • Debrief graded tests and assignments in class (introductory humanities, introductory social sciences, intermediate and advanced STEM).

The following practices were associated with increased future learning benefits for students in large-enrollment courses:

  • Discuss assignments’ learning goals and design rationale before students begin each assignment (introductory STEM).
  • Gauge students’ understanding during class via peer work on questions that require students to apply concepts you’ve taught (intermediate and advanced undergraduate STEM).
  • Debrief graded tests and assignments in class (introductory social sciences, intermediate and advanced undergraduate STEM).

In large-enrollment courses where the following transparent methods were used, students responded more positively than students in similar control-group courses (where no transparent methods were employed) to the question, “How much does the instructor value you as a student?”:

  • Debrief graded tests and assignments in class (introductory humanities, introductory social sciences, intermediate and advanced undergraduate STEM).
  • Gauge students’ understanding during class via peer work on questions that require students to apply concepts you’ve taught (introductory social sciences, introductory STEM, intermediate and advanced undergraduate STEM).
  • Explicitly connect “how people learn” data with course activities when students struggle at difficult transition points (introductory STEM).

Faculty benefits

The Transparency Initiative removes many of the common barriers to participation by faculty and instructors in assessment of students’ learning, including resistance, lack of control, lack of expertise, insufficient time, lack of short-term benefits to teaching and learning practices, and concerns about privacy. Faculty join the initiative voluntarily and choose methods to implement at their own discretion, because they find their participation beneficial and the resulting data useful to their teaching practice. Participation requires very little adjustment or time from faculty and students.

Faculty can gather information about how their students and similar students at other institutions are learning, and respond to the findings in the next semester. Some report benefits in the same semester that they participate, due to their increased communications with students about learning and teaching methods. The statistical significance of learning benefits of each method is recalculated every semester in order to ensure that the findings from one year to the next remain significant. And reporting of the results helps faculty identify and adopt the learning and teaching method(s) best suited to achieving the desired learning outcomes for the specific population of students in their courses.
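As one hedged illustration of how such a per-semester recalculation might look, the sketch below (in Python, with invented Likert-scale data; it is not the initiative’s actual procedure) compares responses to a single learning-benefit survey item from courses that used a transparent method against responses from control-group courses, using a Mann-Whitney U test, a reasonable choice for ordinal survey data.

```python
# Illustrative sketch only: testing whether a method's learning benefit remains
# statistically significant in a given semester. All data here are invented
# Likert-scale responses (1 = much worse ... 5 = much better).
from scipy.stats import mannwhitneyu

method_group  = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]   # hypothetical: courses using the method
control_group = [3, 3, 4, 2, 3, 4, 3, 2, 3, 3]   # hypothetical: similar control courses

# One-sided test: do method-group responses tend to be higher than control responses?
stat, p_value = mannwhitneyu(method_group, control_group, alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Benefit remains statistically significant this semester.")
else:
    print("Benefit is not statistically significant this semester.")
```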

Implementing best practices

Not only does the Transparency Initiative apply data to practice, but it also implements good practice while collecting data. The “transparent” practices are compatible with the Principles of Excellence and high-impact practices defined by the Association of American Colleges and Universities’ Liberal Education and America’s Promise initiative (AAC&U 2007), and with research-based practices identified in recent and longstanding publications (Ambrose et al. 2010; Chickering and Gamson 1987). The demonstrated, positive impact of this project on students’ learning and faculty’s beneficial use of assessment data has great potential.

To complement existing and ongoing research on best practices, the Transparency Initiative continues to gather information about which practices enhance students’ current and future learning the most with respect to discipline, level of expertise, class size, and demographics. Already, instructors can benefit from the findings by adopting transparent methods that have been most effective for enhancing students’ learning in courses like theirs, where their colleagues have implemented particular transparent methods at their own discretion. For individuals and institutions that offer general education or other large-enrollment courses, experimenting with transparent methods that promote students’ learning in large courses in their disciplines might lead to increased current and future learning benefits for students.

Instructors or institutions interested in comparing students’ learning perspectives in on-site, online, and blended courses might make use of the Transparency survey as well. Individual faculty members or faculty development organizations might benefit from adding to this collaborative international project’s data on best practices for enhancing students’ learning experiences. While the research is underway, the benefits accrue directly to the faculty and students involved. Faculty share data that inform their choices about the learning and teaching methods best suited to their disciplines, and to the expertise and demographics of their students, while those students gain an enhanced awareness of their learning.

References

AAC&U (Association of American Colleges and Universities). 2007. College Learning for the New Global Century: A Report from the National Leadership Council for Liberal Education and America’s Promise. Washington, DC: AAC&U.

Ambrose, S. A., M. W. Bridges, M. DiPietro, M. C. Lovett, and M. K. Norman. 2010. How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco: Jossey-Bass.

Aud, S., W. Hussar, F. Johnson, G. Kena, E. Roth, E. Manning, X. Wang, and J. Zhang. 2012. The Condition of Education 2012. Washington, DC: US Department of Education, National Center for Education Statistics.

Chickering, A. W., and Z. F. Gamson. 1987. “Seven Principles for Good Practice in Undergraduate Education.” AAHE Bulletin 39 (7): 3–7.

Cohen, P. A. 1980. “Effectiveness of Student-Rating Feedback for Improving College Instruction: A Meta-Analysis of Findings.” Research in Higher Education 13 (4): 321–41.

Dunlosky, J., and J. Metcalfe. 2009. Metacognition. Thousand Oaks, CA: SAGE Publications.

Francis, G. E., J. P. Adams, and E. J. Noonan. 1998. “Do They Stay Fixed?” The Physics Teacher 36 (8): 488–90.

Gynnald, V., A. Holstad, and D. Myrhaug. 2008. “Identifying and Promoting Self-Regulated Learning in Higher Education: Roles and Responsibilities of Student Tutors.” Mentoring & Tutoring 16 (2): 147–61.

Kuh, G. D. 2001. Conceptual Framework and Overview of Psychometric Properties. Bloomington, IN: Indiana University, Center for Postsecondary Research, http://nsse.iub.edu/pdf/psychometric_framework_2002.pdf.

Light, R. J. 1990. The Harvard Assessment Seminars: Explorations with Students and Faculty about Teaching, Learning, and Student Life. First Report. Cambridge, MA: Harvard University.

Micari, M., G. Light, S. Calkins, and B. Streitwieser. 2007. “Assessment Beyond Performance: Phenomenography in Educational Evaluation.” American Journal of Evaluation 28 (4): 458–76.

Nelson, T. O., and J. Dunlosky. 1991. “When People’s Judgments of Learning Are Extremely Accurate at Predicting Subsequent Recall: The Delayed JOL Effect.” Psychological Science 2 (4): 267–70.

Perry, R., N. C. Hall, and J. C. Ruthig. 2007. “Perceived (Academic) Control and Scholastic Attainment in Higher Education.” In The Scholarship of Teaching and Learning in Higher Education: An Evidence-Based Perspective, edited by R. P. Perry and J. C. Smart, 477–552. Dordrecht, The Netherlands: Springer.

Winkelmes, M. 2013. “Transparency in Learning and Teaching: Faculty and Students Benefit Directly from a Shared Focus on Learning and Teaching Processes.” NEA Higher Education Advocate 30 (1): 6–9.

Notes

1. For specific examples of these modes of transparency, see http://go.illinois.edu/transparentmethods.
2. The full set of survey questions can be viewed online at https://illinois.edu/sb/sec/5647574.
3. Originally developed by the Association of American Colleges and Universities through its Core Commitments initiative, the Personal and Social Responsibility Inventory (PSRI) surveys faculty, students, student affairs professionals, and academic administrators regarding key dimensions of personal and social responsibility; more information about the PSRI is available online at http://www.psri.hs.iastate.edu. The National Survey of Student Engagement (NSSE) is an instrument designed to help college and university administrators gauge students’ levels of engagement with their learning; more information about NSSE is available online at http://nsse.iub.edu.
4. For explanations and examples of all the transparent methods mentioned here, see http://go.illinois.edu/transparentmethods. For statistical significance of their impact, see http://www.teachingandlearning.illinois.edu/transparency.html.


Mary-Ann Winkelmes is administrative provost fellow and campus coordinator for programs on teaching and learning at the University of Illinois at Urbana-Champaign. The author thanks Elisa Mustari and Yeonwoo Rho for assisting with the statistical analysis presented here.


To respond to this article, e-mail liberaled@aacu.org, with the author’s name on the subject line.
