Photo: Professor Eva von Dassow showed images of clay tablets in a class on the Near East in her office. (Kyndell Harkness, Star Tribune)
Photo: Eva von Dassow, an associate professor of classical and Near Eastern studies, says there’s always a risk that reviews will be skewed. (Kyndell Harkness, Star Tribune)
U of M may break secrecy and go public with course ratings
- Article by: Maura Lerner
- Star Tribune
- April 14, 2014 - 11:16 AM
For years, students at the University of Minnesota have been dutifully filling out evaluations at the end of each semester, rating their courses and professors.
But the results, for the most part, have been a closely guarded secret. Starting this fall, that may change.
Under a proposal before the Faculty Senate, students would for the first time be able to look up any course on the U website and see what other students have to say about it. If approved, it will be the culmination of a decade-long campaign by student leaders to try to pry the ratings open. “There is a huge demand by students to have more information,” said Nicholas Ohren, 18, a freshman from Eau Claire, Wis., who has taken up the cause. If he knew a course was getting bad reviews, he said, “it would be a lot easier to avoid.”
Traditionally, student evaluations have been internal documents — read and used by the faculty themselves. But increasingly, colleges and universities have come under pressure to post them publicly, in part as a response to sites like RateMyProfessors.com, where students can share their uncensored opinions online.
Today’s students have grown up expecting this kind of openness, said Valkyrie Jensen, 19, a U sophomore and student representative. “If there’s information, it should be shared, especially in a university setting.”
Yet traditionalists argue that consumer reviews, in a university setting, make little sense.
“The evaluations will tell you, indirectly, how well a professor dresses, how well he or she tells jokes,” said Stuart Rojstaczer, a former Duke University engineering professor who runs a website called GradeInflation.com. Or worse, he said, they reward easy graders over demanding ones. “When you start putting evaluations out before the public, what you’re conveying is a message that student satisfaction is more important than quality of education. And that to me sends the wrong signal.”
Going beyond advisers, sites
Last summer, when Ohren was signing up for first-semester classes, he had little information beyond his adviser’s recommendation, he said. He picked one dance class that turned out to be nothing like he expected. “For me, it was frustrating not knowing about the courses I was going to take,” he said.
He checked the popular website RateMyProfessors.com, but came away unimpressed. “I think most students understand how unreliable it really is,” he said. Often, the handful of comments are on the extremes — either rave reviews or bitter rants.
Since the 1990s, the U’s student government has been badgering administrators to disclose some, if not all, of the data from the class evaluations. The forms, which have been collected for decades, ask students to rate both the courses and instructors on a series of questions. Among them: Was the instructor well prepared? How many hours did you spend on homework? Would you recommend this course to other students?
In a 1997 survey, 95 percent of students said the ratings would be useful in picking courses.
The U made several attempts to mollify the students, creating a subset of the evaluation form — known as the “Student Release Questions” — that could be shared publicly. But it was up to the instructors to release their own ratings.
In practice, few did.
“It’s always been in the single digits, 5 or 6 percent,” said David Langley, director of the U’s Center for Teaching and Learning. “Most faculty were simply not opting in.”
Faculty leaders say it was more a matter of benign neglect than outright resistance.
“It wasn’t that high a priority,” said Jennifer Goodnough, a member of the Faculty Senate and associate professor of chemistry at the Morris campus. “When it’s an opt-in system, it’s really very difficult to get us to do that extra step, even if we have no problem releasing the information.”
Last year, the Faculty Senate started floating the idea of taking it out of the instructor’s hands, and releasing the ratings automatically.
No teacher ratings
One big exception: Students would not be able to see teacher ratings, only the course ratings.
Prof. Will Durfee, who chairs the Faculty Senate leadership committee, said he knows that may disappoint many students. But according to university attorneys, the teacher evaluations are considered private data under state law, because they play a “significant role” in personnel decisions such as tenure and promotion.
Even so, Durfee said, he thinks the course ratings, based on hundreds of thousands of student evaluations, will be more useful than a handful of reviews on the Internet. “We can guarantee a much higher response rate.”
Nationally, the very idea of students rating professors has gotten some pushback. Last year, an essay in the Chronicle of Higher Education compared it to online bullying.
“Teachers should evaluate the teaching skills of other teachers,” wrote Spurgeon Thompson, an English instructor at Fordham University. “Leaving it to students is almost absurd.”
Eva von Dassow, an associate professor of classical and Near Eastern studies at the U, says there’s always a risk that student reviews will be skewed. “The chances are that students who are doing poorly are likely to think the teacher’s doing poorly.”
But as a professor, developing a thick skin is part of the job, she said, and there seems to be little opposition among faculty members to the change.
For students, it’s a step in the right direction, says Ohren, a member of the student government. But even if the ratings are posted, he said, they won’t replace the time-tested practice of asking your friends.
“Word-of-mouth is always going to matter.”
Maura Lerner • 612-673-7384
© 2016 Star Tribune