Rethinking the Student Course Evaluation
How a Customized Approach Can Improve Teaching and Learning
The standard evaluation forms used in most college and university courses are ill suited to assess courses and instructors whose goals and pedagogy differ significantly from the “chalk and talk” approach to undergraduate instruction. This conclusion became clear as I struggled to create an instrument that would be useful for evaluating a newly designed proficiencies-oriented elementary economics course. The evaluation instrument that emerged downplays the narrow focus on the surface attributes of courses and instructors. Instead, by asking students to reflect on course content, design, and pedagogy, it focuses on what students learn. The illuminating responses have provided useful insights about the effectiveness of my teaching and have proved helpful in fine-tuning course content, design, and pedagogy to enhance student learning.
The mandated course and instructor evaluation system currently in place at my university is deeply embedded in academic life, even though little evidence exists as to the accuracy and meaningfulness of the results it produces. Student government organizations, which initiated the system, regularly publish evaluation results obtained from departments for the purpose of helping students identify the most popular courses and instructors. Departments produce voluminous compilations of evaluation results each semester to inform department recommendations for merit salary increases, promotions, teaching awards, and the like. Campus-wide faculty promotion committees, wanting to communicate their concern about good teaching, require that course evaluation results be included in department-compiled dossiers of faculty members recommended for promotion to tenured positions.
The machine-scored evaluation questionnaire long used in the economics department includes
- four student data questions about year in school, plans for majoring in economics, reasons for taking the course, and proportion of class sessions attended;
- seven faculty evaluation questions, asking about the proportion of well-prepared and clearly presented lectures, proportion of interesting and/or thought-provoking lectures, usefulness of assigned readings, and effectiveness of exams in measuring knowledge of course material;
- three course/instructor questions that constitute the heart of the form and ask about the relative difficulty of course, whether a respondent would recommend the course to a friend, and how the respondent would rate the professor’s performance in the course.
What useful information is conveyed by such summative evaluations remains in doubt. Many faculty members privately express skepticism, wondering whether high rankings reflect superior teaching, easy grading standards, or simply instructor “popularity.” In addition, considerable numbers of students either do not complete the forms or do so carelessly. Even more important, because the forms lack a formative dimension, they yield little information that can help interested faculty members improve their teaching. At the very least, any useful evaluation system should be both summative and formative.
A proficiencies-based course
When it became clear a decade before retiring that I would be teaching regularly, among other courses, a moderately sized (40–110 students) general economics course, I decided to take a proficiencies-based approach. I first developed this approach to teaching economics in the early 1980s, after having experimented with its various components during the previous decade.1 The idea underlying this approach can be framed by the following question: how do we want our economics majors to be able to demonstrate what they learned the day after they graduate, when they are no longer disciplined by class attendance, reading assignments, homework exercises, examinations, and course grades? Based on extensive contacts with officials in government, private sector leaders, colleagues, and recently graduated economics majors, I devised a set of seven proficiencies expected of economics majors:
- accessing and organizing existing knowledge
- displaying command of existing knowledge
- interpreting existing knowledge
- interpreting and manipulating quantitative data
- applying existing knowledge
- creating new knowledge
- questing for knowledge and understanding
The challenge came in figuring out how to begin developing these proficiencies in what would be the first and perhaps only economics course many students would take. Clearly, not enough time would be available in this single introductory course to master all the proficiencies expected of graduating economics majors. Instead, emphasis was placed on the importance of nurturing these proficiencies sufficiently to help students, largely freshmen and sophomores, succeed in this course and at the same time begin to acquire proficiencies that would be of value to them no matter what major they might choose.
As a start, students needed to understand the proficiencies and be actively involved in their own learning; thus, the course packet included an assigned reading that describes these proficiencies. The texts selected and the contents of the reading packet were designed to highlight and reinforce the key concepts and principles being taught within the context of a course that emphasized contemporary economic issues and policies. Consistent with this objective, considerable attention was given to enhancing economic literacy, by linking the concepts being studied to current newspaper articles that illustrate how these economic concepts are being applied in the outside world. To reinforce the importance of the proficiencies, I tried to model them in lectures and also in weekly discussion sections that were structured to involve students and challenge them in new ways.
The design of the course encompassed the following major components: setting out the course objectives, the content knowledge to be learned, and the proficiencies to be developed; deciding on the appropriate instructional materials; selecting the pedagogies to be used in the course; giving students practice in applying what they were learning; requiring students to demonstrate not only what they learned (the economics content knowledge) but also their ability to use that knowledge (the proficiencies); and encouraging students to reflect more actively on their learning.
The importance of gaining practice in the proficiencies was emphasized by teaching the course as a writing-intensive course. The importance of students being able to apply their learning to current economic issues was emphasized by several major writing assignments; these assignments required students to select recent newspaper articles and analyze them for the economic concepts and principles embedded in them. To prepare for the discussion sections, students had to write either a summary or précis of each assigned reading or three types of questions to guide these discussions. Finally, students were challenged to submit two questions (and their own answers to them) suitable for inclusion in each of the three exams, including the final exam. In short, the course was designed to help students develop the proficiencies by engaging them in a wide array of learning activities.
Officially, this was a three-credit course, but the campus timetable listed three lectures (Monday, Wednesday, Friday) along with four discussion sections (Thursday or Friday). Because I would have to miss several lecture periods, and because I wanted to teach the discussion sections, these sections did not meet every week. On average, the class met about three and a half times per week. I was careful to explain the heavier workload during the opening session of the course and in the syllabus. I also indicated that I, rather than a teaching assistant, would meet the discussion sections, and consequently I, too, would be working harder than I was required to do. I went on to explain quite candidly that I wanted to work closely with them in a variety of ways to help them enhance both their content knowledge and their proficiencies in economics.
About the course and its evaluation
The most immediate challenge was to figure out how best to describe the proficiencies-oriented course to students during the opening week of classes. I devoted considerable time in each of the first two class meetings to explaining the proficiencies approach and its implementation. I indicated my keen interest in helping students learn more and doing it better and faster by engaging them in a variety of learning activities that would be new to most of them. I made it clear that the course would be demanding and that it would include a larger than usual number of assignments. Finally, I indicated my interest in learning from them how they perceived the course, what advantages it had for them, and what additional costs in time and effort were involved.
Early on, students got the flavor of my interest from an evaluation conducted at the end of the fourth week. The evaluation format was not new. I had always found it useful to get early feedback on the course. To add to its legitimacy, I asked a group of student volunteers to administer the evaluation, tabulate and summarize the results, and discuss them with me prior to the next class meeting. After the volunteers conveyed the results to me, I summarized them at the next class meeting. Most of the results were familiar to me from past semesters—poor handwriting on the blackboard, talking too fast, requiring too much work, and so on. I told them I would try to deal positively with their suggestions. On some matters, I had to explain the importance of some facet of the course—for example, the writing assignments—and why it was important in helping them learn economics. Students were usually disarmed by my openness, and I did try to overcome my obvious shortcomings.
In conducting this early evaluation, I try to enlist students to make a stronger commitment to the course and the challenges of learning. What always surprises me is how little time students say they devote not only to my course but also to their other courses. When the modal response is usually three to four hours per week for my course, I remind them of the old rule of thumb, which is to study two hours outside of class for every hour spent in class. Most students have never heard of the rule; I explain that all freshmen heard this rule in an earlier age when grading standards were much tougher. This gives me a chance to recommend that students devote more time not only to my course but to all their other courses, if they are to make the most of their undergraduate experience.
I also take the occasion to give them a brief economics lesson. They need to be reminded of not only the “sticker price” cost of their education but also the “opportunity cost” of attending college, namely, the earnings they forego while attending college. I close by noting the substantial “tuition subsidy” they receive from state taxpayers, who include their parents, a subsidy that in the last years I taught was still equal to the amount of tuition they paid.
Toward the end of the semester, I explained to my students that I would need their assistance in conducting a more detailed end-of-the-semester evaluation. They were receptive to this idea because by this time it was apparent that they appreciated the course and how it was conducted. Meanwhile, the department agreed to allow me, rather than a department secretary, to administer the evaluation. By being present, I signaled my interest in having students respond thoughtfully to the evaluation questions—not only the standard questions, but also the much more detailed questions specific to course content, design, and pedagogy. Students were given more time to respond (about thirty minutes, rather than the usual ten minutes). When the students completed their evaluation, I designated a student to collect the forms, place them in a sealed envelope, and deliver them to the department office for later processing. To ensure confidentiality, these forms as well as a summary of the machine-read responses were turned over to me only after I had submitted my course grades.
Creating the new evaluation form
I gave considerable thought to the kinds of student feedback that would be useful in evaluating my efforts to enhance students’ learning in economics, develop their proficiencies, and increase their awareness of what and how they were learning. The newly designed evaluation form sought answers to seven major questions.2
1. Did the course deliver what it promised to deliver? The focus here was on the four major content objectives cited in the syllabus.
2. How effective were the instructional materials and pedagogy described in the syllabus and the course reading packet in helping students learn the subject matter? The purpose was to ascertain the perceived effectiveness of each of fifteen different types of instruction-related materials and assignments, including the instructional materials used (texts, reading packet, handouts), pedagogy employed (lectures, class discussions, and structured group discussion), learning activities utilized (preparing questions on readings, writing assignments, preparing study questions for exams), and type of exams employed (essay and short-answer questions). This long list provided useful information to me; in addition, it was included to stimulate student thinking about the usefulness of the proficiencies approach to learning.
3. How effective were the varied learning activities in improving the ability of students to use their content knowledge of economics? The list included ten different kinds of learning activities, all of them related to the proficiencies, though not on a one-to-one basis.
4. Which of fifteen categories of instruction-related materials and assignments were most helpful in learning the subject matter? To sharpen their responses, students were asked to check no more than four of the fifteen items.
5. How much emphasis should be given in the following semester to each of ten different kinds of learning activities in order to help students improve their ability to use their knowledge? Students were asked to respond to each item.
6. How did students evaluate this course or instructor based on the key questions included in the department’s standard evaluation form? These questions ask (a) about the relative difficulty of the course, (b) whether a student would recommend the course to a friend, and (c) how the student would rate the professor’s performance in the course. These questions had to be included so that the evaluation results for this course would be comparable to those for the rest of the department’s courses. I assumed the answers to these mandated questions would be better informed because students had already been asked to respond to a more complex group of questions that pushed them to think more deeply about the distinctive features of the course and what they learned.
7. What additional observations did students offer in the open-ended questions that were an integral part of the evaluation form? Students were asked to elaborate on two sentence-completion statements and to respond to three specific questions. The two sentence-completion statements were (a) “The thing I liked most about this course is. . .” and (b) “The thing I liked least about this course is. . . .” The three specific items were (c) “Please describe how you think your ability to write has been improved as a result of this course,” (d) “What part of the course proved to be most interesting/stimulating?,” and (e) “Should the course materials (books, reading packet, handouts) be changed?” Students also had the option (f) of writing additional comments on the back of the evaluation form.
Interpreting the evaluation results
This teaching/learning approach to evaluation was used regularly over a five-year period. Because the questionnaire evolved as I gained experience using it, I report here on the results for the two most recent semesters, one class with fifty-two students and the other with forty-one students. What did this new approach to evaluation tell me that I needed to know about course content, design, and pedagogy? And, how did I react to the students’ comments?
Overall, students felt the course’s content objectives had been met, and they most valued the student-teacher interaction in the lectures and discussion sections. They appreciated the “active learning” nature of the course, particularly “learning how to relate course concepts to current economic events.” Student interest was most stimulated by the same activities that contributed most to their learning. Those who had already commented favorably on the importance of “learning how to relate course concepts to current economic issues” and “learning to think more critically” wanted these activities to be even more heavily emphasized in the future. The high ranking given in response to the department’s evaluation question, “how would you rate the professor’s performance in this course?,” occurred despite the “more difficult” nature of the course and a somewhat greater reluctance to recommend the course to others.
The responses to the open-ended questions were positive and constructive. Despite the demanding nature of the course, student comments indicated that they appreciated the course, how the instructor organized and taught it, and what and how they learned.
The major criticism concerned the heavy workload; students regarded it as too great for the three credits they earned. Students objected to the amount of writing required and how it was evaluated, even though the course carried a “writing-intensive” label in the course catalog. Some students also commented that more should have been done to demonstrate how to write an analysis of a news article dealing with an economic issue. Students were not happy with the two question-writing activities, but still gave a high ranking to studying from the student-prepared exam questions that resulted from an activity they did not profess to enjoy. In addition, students not only did not like the book on writing but also objected to buying it in light of the minimal use made of it.
Now, my reactions. Student concerns about the workload had some merit. Yes, the course was much more demanding than most other courses, at least those in the social sciences and humanities. While it would have been easy to reduce the workload and the number of writing assignments, I am not sure that doing so would have been in the best interests of the students; many of them needed to sharpen their learning skills and experience. Yes, the grading was tough, but I attribute student concern in considerable part to often-rampant grade inflation in other disciplines. I viewed the escalating grade inflation as no reason for me to relax my grading standard. Yes, the reading packet should have contained a guide with examples of how to write a news analysis. Though guides were provided to help students develop other skills, such as writing discussion questions, I had failed to develop this particular type of guide. Yes, the question-writing activities were demanding, but students needed to develop this skill. Yes, the writing book was not used enough to warrant its purchase, even though every student should possess a writing book for general reference; in future offerings of the course, I would still require a writing book but make greater use of it.
My principal conclusion is that the course accomplished the goals I set for it, but that some fine-tuning was required. Most important was my need to do a better job of explaining to students the importance of what we were doing, why we were doing it, and how the various elements of the course fit together and reinforced one another.
Developing and implementing this new instrument produced useful knowledge about the untapped potential of a teaching/learning approach to student evaluation of courses and instructors. By incorporating required questions from standard evaluation forms, individual faculty members can create evaluation forms that meet their own needs as well as those of their departments.
What most impressed me was the willingness of students to respond to the many questions in this much more detailed end-of-semester evaluation form. Not only did students respond to all fifty-five items, but they also devoted considerable time and thought to the open-ended questions. I was also impressed by the willingness of students to indicate through their detailed comments that they really cared about what they were learning and how the course and instructor contributed to their learning. There was no reason to believe these students were special in any sense. The course itself had no special prerequisites; it was one of three options open to students—a small honors course, a large lecture course with discussion sections led by teaching assistants, and this course.
What made the difference? It was the combination of a more challenging course, an evaluation instrument geared explicitly to the course, and an expectation that the responses would be used to improve the course. This experience demonstrates that by taking a teaching/learning approach to evaluation, faculty members can obtain helpful feedback to use in altering course design, objectives, and pedagogy. They also can identify barriers to student learning and discover much about what their teaching is doing to enhance student learning.
Beyond this, what else did I learn? Most important, students are willing, even eager, to engage more fully in mastering course content and simultaneously improving their learning skills. They are willing to do so even with a heavier course workload and a tougher grading regime because they perceive positive benefits in terms of both what and how they learned. As the open-ended comments revealed, students felt they mastered the economic content and improved their ability to apply their learning to understanding and interpreting current economic issues and problems. They enhanced their writing and discussion skills through the wide range of learning activities; they increased their ability to think, write, and discuss analytically and critically; and they appreciated the instructor’s efforts to help them learn more and become better learners.
Readers interested in the “bottom line” will ask how the results presented above square with the responses to the three questions from the standard evaluation form. As is apparent from the preceding discussion, this course was perceived as more difficult than comparable entry-level courses in economics—i.e., large lecture courses and small honors courses. Yet compared to other entry-level economics courses, this course received a slightly higher ranking on recommending the course to others and a much more favorable ranking for the professor.
The proficiencies-oriented course was viewed as slightly more difficult than earlier versions of this same course, with its difficulty rating moving up from 3.7 to 3.8 on the five-point scale. The “recommend the course to a friend” rating dropped from 4.0 to 3.6, but was still higher than that for other comparable courses in the department. Finally, the “how would you rank the professor’s performance” rating rose slightly from 4.4 to 4.5, thereby increasing the gap between this course and other comparable courses in the department. These differences underestimate the impact of shifting to a full-fledged proficiencies course, because in the pre-proficiency versions many elements of the proficiencies approach were being gradually implemented and tested.
These additional results can be interpreted as follows. The heavier workload and greater difficulty of the course decreased its attractiveness. The negligible increase with respect to the “rate the professor” question can be attributed in considerable part to the instructor’s already high 4.4 rating in his prior teaching. The drop in the rating on recommending the course to others is probably attributable to the heavier workload. Although these results apply to a single course and a single instructor, they accord with the traditional view that more difficult courses are generally viewed as less attractive notwithstanding a high regard for the professors who teach them.
To equip students to practice their learning long after they graduate, individual faculty members must find new ways of combining course content, design, and pedagogy in order to engage students in their own learning more fully. They must give greater emphasis to what and how students learn, and what they can do with their learning after acquiring it. Above all, this means focusing on what students are learning and how their intellectual development is being stimulated in the process. Instituting a proficiencies-based approach to instruction and learning offers an effective means of accomplishing these objectives. The approach to course evaluation described here is an essential building block in constructing an effective proficiencies-based course.
1. For additional information about the development of my expected proficiencies approach in the economics major, see W. Lee Hansen, “What Knowledge Is Most Worth Knowing—For Economics Majors?,” American Economic Review 76, no. 2 (1986), 149–53; Hansen, “Reinvigorating Liberal Education with an Expected Proficiencies Approach to the Academic Major,” in Educating Economists: The Teagle Discussion on Re-evaluating the Undergraduate Economics Major, ed. D. Colander and K. McGoldrick (Northampton, MA: Edward Elgar, 2009), 188–96.
2. For more information on the structure of the course evaluation form as well as the detailed student responses for the course described in this article, see W. Lee Hansen, “Creating a Teaching/Learning Evaluation Instrument for Proficiencies-Based Economics Courses,” August 18, 2014, http://www.ssc.wisc.edu/~whansen/?p=314.
W. Lee Hansen is professor emeritus of economics at the University of Wisconsin–Madison.