Winter online evaluations set to start

As the winter semester course-evaluation period begins this week, those responsible for the transition to an online teaching questionnaire remind faculty and students that the success of the program hinges on their participation.

The numbers
Student participation was 62 percent in fall 2008, compared with 64 percent of students who filled out paper questionnaires in class during winter 2008.

A comparison of median responses on universitywide questions shows little change between fall ’07 (paper) and fall ’08 (online). On a five-point scale where 5 indicates “Strongly Agree” and 1 indicates “Strongly Disagree,” the median ratings were:

• Overall, this was an excellent course: fall ’07 = 4.19; fall ’08 = 4.13
• Overall, the instructor was an excellent teacher: fall ’07 = 4.50; fall ’08 = 4.50
• I learned a great deal from this course: fall ’07 = 4.21; fall ’08 = 4.20
• I had a strong desire to take this course: fall ’07 = 3.92; fall ’08 = 4.00

During the fall term, students responded enthusiastically to the online evaluation program, even though they had to use their own time to fill out the questionnaires, says James Kulik, director and research scientist, Office of Evaluations and Examinations.

"Keeping the momentum, however, means faculty must continue to encourage student participation. Students, too, must remember their role in improving teaching," Kulik says.

"The amount of student participation last semester can be attributed to the efforts made by faculty and GSIs to assure students that their thoughtful comments on teaching methods can help improve the classroom experience," Kulik says.

"It will be important that those who teach continue to stress the importance of student feedback."

To that end, those responsible for administering the system have continued a communications campaign stressing the importance of student responses, using the theme: "Log in/Speak up: Your feedback matters to faculty and GSIs."

In addition, program leaders have equipped faculty with a PowerPoint presentation they can use in class, and have developed more ways to track student input while maintaining the confidentiality of the evaluation process.

A new dashboard feature in CTools allows faculty to receive in-process response counts. This will allow faculty members to know how many students have completed the teaching questionnaire at any point during the evaluation period so that they can tailor their in-class encouragement accordingly.

Another added feature on the dashboard, based on feedback from the fall experience, is evaluation previews for teachers, so they can take a look at exactly what students will see when they log in.

The fall 2008 launch of online teaching evaluations was declared an important success by Lester Monts, senior vice provost, who says one very positive outcome of the process was that the worst fears of faculty and administrators did not materialize.

"As we looked at the experience of other universities that had made this transition, we saw that their early efforts resulted in reduced participation and generally lower satisfaction ratings. At Michigan, students embraced their roles in providing positive feedback and constructive criticism, and the result was participation very close to previous levels and satisfaction measures comparable to past evaluations," Monts says.

"We can't stress enough that continued support and encouragement of this process by faculty and GSIs will help us maintain and strengthen Michigan's long tradition of taking student evaluations seriously," Monts says.

The new system was not without a few glitches, Kulik notes, including some delays in getting results to faculty members and departments. The Office of Evaluations and Examinations, MAIS and CTools are making every effort to ensure those problems are resolved this time around, he says.