Georgetown University’s Newspaper of Record since 1920

The Hoya

Web Evaluations Met With Qualms in Debut

The inaugural run of online faculty evaluations for certain courses has been met with mixed reviews from students, with some wary of questionable anonymity and others missing the personal connection of in-class forms.

“I’ve heard of people who don’t take the online surveys because they don’t think they’re anonymous,” Christian Ortiz (CED ’12) said.

Completing the Internet surveys does require a NetID and password, but the website assures students that the responses are still anonymous.

“I worry about whether my responses can be traced back to me when they come with an e-mail address,” said Corinne Seals (GRD), a teaching assistant in the linguistics department.

Some students say they feel less invested in the surveys now that some have moved to the web.

“When I do it online I can feel less personal about giving honest feedback,” Glenda Dieuveille (COL ’14) said.

An email from the Office of the Registrar offered incentives for students hesitant to complete the voluntary evaluations, advertising the chance to win two lower-level tickets to a Georgetown men’s basketball game, an iPod or gift certificates.

Still, the question remains whether the marketing push can halt a slide in participation. As the system stands, the online surveys are not mandatory, and students in courses that elect to be rated online are no longer handed hard copies of the evaluation forms in class.

“At the end of the semester, I don’t have the time to do a survey unless the school makes time for it,” Seals said. “I wouldn’t give my students online surveys if I was given the option.”

According to the Registrar’s website, “Evaluation results are used in decision making for rank and tenure.” Although Georgetown publishes the aggregate results online, some of the survey data is withheld.

Associate Professor Maria Donoghue of the biology department, a member of the Faculty Senate, expressed displeasure at what seemed to be an antiquated reporting process.

“I don’t know why the results of student evaluations posted on the Registrar’s website aren’t more current,” she said. “I think my last evaluation is from 2007, and when I look at the list of evaluated faculty, many of the professors aren’t teaching the listed courses anymore or aren’t even at Georgetown any longer.”

According to John Pierce, University Registrar and evaluations coordinator, Georgetown has been conducting faculty evaluations at each semester’s end for over 25 years. He said he does not recall any modifications to the questionnaire since the 1990s. The generic paper questionnaires feature a fill-in-the-bubble front side and open-ended questions on the back.

Some students don’t put much faith in this evaluation style.

“I don’t feel the evaluations are very personal,” Ann-Marie Jacoby (COL ’13) said. “And because of that, I don’t put in much effort.”

Jacoby has another reason for doubting the evaluations, citing a professor she said has undermined the process. This week, according to Jacoby, the professor said to the class, “I would like you to take the survey, but it’s not going to affect me.”

For Jacoby, professor apathy toward student feedback can be disenchanting.

“When a professor says something like that, it makes me feel like the evaluations aren’t serious and my voice isn’t being heard.”

Ryan Crowe (COL ’12) said that, for her, evaluations usually come down to whether she liked her professor.

“I only take them seriously if I don’t like the teacher,” Crowe said. “If I hated the teacher I might reflect that in my evaluation of the course.”

Professors share similar concerns about the outcome of the new evaluations.

“My concern is that my evals might be lower than the paper evaluations. Why? I think when you are in class and the professor is right there that a student might feel more inclined to review the class positively,” Elizabeth Stephen, associate professor of demography, wrote in an email. “If a student is sitting in his/her room, she/he might think, ‘Oh, man, that class was a lot of work’ or ‘I didn’t really like how the prof. graded my paper.’”

Other professors said that they hoped the new system would prove successful, despite fears that students would not respond if not forced to fill out the form in class.

“I really hope students will use the online forms – if the response rate is lower, then I will probably switch back,” John Corcoran, a Ph.D. candidate who is teaching a European history course this semester, wrote in an email. “I’d like to think they will be more effective, because students will be able to take a little more time when the evaluations aren’t squeezed into one of the last class meetings.”

Most students interviewed by THE HOYA were unaware that the results are published online. Donoghue believes students could get more from the evaluations for course selection, particularly for general education requirements, if the published results were more current and accurate.

“Instead students have to rely on ratemyprofessor.com, which I think includes only the extreme points of view and tends to miss the middle, since some effort is required to post,” Donoghue said.

Many colleges around the country supplement student evaluations of the faculty with other forms of assessment. At Williams College, for example, students are also interviewed, and departing seniors give a more detailed evaluation.

Associate Professor Anna De Fina of the Italian department, a Faculty Senator, noted that Georgetown has other forms of evaluation, citing faculty peer observation, review of teaching materials and other in-department methods as a few examples.

Since the Georgetown University Student Association pushed for the original implementation of the evaluation system, the body has had access to the survey results, according to Pierce.

GUSA President Calen Angert (MSB ’11), though, has never seen the evaluations used by GUSA in his four years of service.

Angert conceded that he was “hazy” on the history of the evaluations, and he was not aware that some results were not published.

“We obviously have a limited amount of time to address a limited amount of issues,” Angert said when asked why GUSA has not advocated for a more complete publication of the results.

“We can’t address every issue under the sun. If someone were to approach us, we would explore it.”

Joan Leach, rank and tenure committee president, and Wayne Davis, Faculty Senate president, could not be reached for comment.

Last spring, prior to the online evaluation system’s launch this semester, some classes tested out the web surveys in a pilot program.
