
From Wikipedia, the free encyclopedia
RateMyProfessors.com

Web address: ratemyprofessors.com
Type of site: Review site
Available in: English
Users: About 800,000 visitors/month
Owner: mtvU
Created by: John Swapceinski

RateMyProfessors.com, LLC (RMP) is a review site, founded in May 1999 by John Swapceinski, a software engineer from Menlo Park, California, which allows college and university students to assign ratings to professors and campuses of American, Canadian, and United Kingdom institutions. The site was originally launched as TeacherRatings.com and converted to RateMyProfessors.com in 2001. RateMyProfessors.com was acquired in 2005 by Patrick Nagle and William DeSantis.[1] Nagle and DeSantis later resold the site in 2007 to Viacom's mtvU, MTV's college channel.[2] RateMyProfessors.com is the largest online destination for professor ratings, with more than 8,000 schools and over 1,000,000 ratings.[3]

Ratings and reviews

Users who have taken or are currently taking a particular professor's course may post a rating and review of any professor already listed on the site, and may create a listing for any individual not already listed. To post, a rater must rate the course and/or professor on a 1-5 scale in the following categories: "easiness", "helpfulness", "clarity", the rater's "interest" in the class prior to taking it, and the degree of "textbook use" in the course. The rater may also share the grade they received in the course, rate the professor on their "hotness", and include comments of up to 350 characters.

According to the website's FAQ page, "The Overall Quality rating [that the professor ends up with] is the average of a teacher's Helpfulness and Clarity ratings...." The professor's Overall Quality rating determines whether his or her name, on the list of professors, is accompanied by a smiley face ("Good Quality"), a frowny face ("Poor Quality"), or an in-between, expressionless face ("Average Quality"). A professor's name is accompanied by a chili pepper icon if the sum of his or her "hot" ratings is greater than zero (a "hot" rating counts as +1; a "not hot" rating, or one left blank, counts as −1).
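
The scoring rules above can be expressed directly in code. The following is a minimal illustrative sketch in Python; the function names, data layout, and face-icon cutoffs are assumptions made for illustration, since the site does not publish its implementation:

    # Illustrative sketch of the rating rules described above.
    # Names and the face-icon cutoffs are assumptions; RateMyProfessors
    # does not publish its actual implementation.

    def overall_quality(helpfulness, clarity):
        """Average of a professor's Helpfulness and Clarity ratings, per the FAQ."""
        scores = list(helpfulness) + list(clarity)
        return sum(scores) / len(scores)

    def quality_icon(overall):
        """Map Overall Quality to a face icon; the cutoffs here are hypothetical."""
        if overall >= 3.5:
            return "smiley"          # "Good Quality"
        if overall >= 2.5:
            return "expressionless"  # "Average Quality"
        return "frowny"              # "Poor Quality"

    def has_chili_pepper(hot_votes):
        """hot_votes holds +1 for each "hot" rating and -1 for each "not hot"
        or blank rating; the chili pepper appears when the sum is positive."""
        return sum(hot_votes) > 0

    # Example with three raters:
    q = overall_quality([4, 5, 3], [4, 4, 3])
    print(round(q, 2))                     # 3.83
    print(quality_icon(q))                 # smiley
    print(has_chili_pepper([+1, -1, +1]))  # True (sum = +1 > 0)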

Top Lists

Each year, RateMyProfessors compiles Top Lists of the Highest Rated Professors, Hottest Professors, and Top Schools in the U.S., based on ratings and comments from students.

Along with the release of its 2011-2012 Top Lists, the site debuted its "Fun Lists" for the first time.

Professor Rebuttals

Since mtvU took over the website, RateMyProfessors has added a rebuttal feature that allows professors to rebut students' comments. Professors must register with the website using an ".edu" e-mail address in order to post rebuttals. The site also has a "Professors Strike Back" segment, which presents videos of professors responding to specific ratings they received on RateMyProfessors.[4]

School ratings

Students can also comment on and rate their school by visiting its page on the site. School rating categories include Academic Reputation, Location, Campus, School Library, Food, Clubs & Activities, Social Events, and Happiness.

Other features

RateMyProfessors regularly updates the site to meet student preferences. In 2009, it introduced an iPhone app, and in late 2011, professors were given the ability to make their Twitter handles available on their profile pages for students to follow.


In 2008, RateMyProfessors was recognized by Time magazine as one of the 50 Best Websites of 2008.[5]

Student evaluations of professors from RateMyProfessors account for 25% of a school's rating in Forbes' annual "America's Best Colleges" listing.[6]

Validity versus formal in-class student evaluations

Using data for 426 instructors at the University of Maine, researchers examined the relationship between RateMyProfessors.com (RMP) indices and formal in-class student evaluations of teaching (SET). The two primary RMP indices correlate substantially and significantly with their respective SET items: RMP overall quality correlates at r = .68 with the SET item "Overall, how would you rate the instructor?", and RMP ease correlates at r = .44 with the SET item "How did the work load for this course compare to that of others of equal credit?" Further, RMP overall quality and RMP ease each correlate with the corresponding SET factor derived from a principal components analysis of all 29 SET items (r = .57 and .51, respectively). This led to the authors' conclusion: "While these RMP/SET correlations should give pause to those who are inclined to dismiss RMP indices as meaningless, the amount of variance left unexplained in SET criteria limits the utility of RMP."[7]
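
For readers less familiar with the statistic, Pearson's r quantifies the linear association between paired scores on a scale from −1 to +1. A small worked computation with made-up ratings (not the study's data) follows:

    import math

    def pearson_r(xs, ys):
        """Pearson product-moment correlation between two paired score lists."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical RMP overall-quality scores vs. SET "overall instructor"
    # scores for five instructors (illustrative only, not the study's data).
    rmp = [4.5, 3.0, 2.5, 4.0, 3.5]
    set_overall = [4.2, 3.1, 2.0, 4.4, 3.0]
    print(round(pearson_r(rmp, set_overall), 2))  # 0.92 for this toy data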


Evaluation bias

The main criticism of RMP is that there is little reason to think that the ratings accurately reflect the quality of the professors rated.[8][9]

For one thing, ratings have been shown to reflect gender bias toward the professors evaluated.[10] Also, "easiness", "clarity", and "helpfulness" are the only components taken into consideration.[11][12] Edward Nuhfer says that RMP and a similar rating site "are transparently obvious in their advocacy that describes a 'good teacher' as an easy grader. ... Presenter Phil Abrami...rated the latter as 'The worst evaluation I've seen' during a panel discussion on student evaluations at the 2005 annual AERA meeting."[13] Studies of RMP ratings conducted by Felton et al. found that "the hotter and easier professors are, the more likely they'll get rated as a good teacher."[14]

Edward Nuhfer has argued, "Pseudo-evaluation damages the credibility of legitimate evaluation and victimizes individuals by irresponsibly publishing comments about them derived from anonymous sources. This is voyeurism passed off as 'evaluation', and examples lie at [RateMyProfessors.com and a similar site]. Neither site provides evaluation of faculty through criteria that might be valuable to a student seeking a professor who is conducive to their learning, thinking or intellectual growth."[15]

Multiple ratings per person

Single individuals are able to make multiple separate ratings of a single professor on RMP.[16] RMP admits[17] that while it does not allow multiple ratings from any one IP address, it has no control over raters who use several different computers or who "spoof" IP addresses. There is also no way of verifying that those who rate a professor's course have actually taken the course in question, making it possible for professors to rate themselves and each other.[18]
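
A minimal sketch of per-IP duplicate filtering of the kind described above shows why it is easy to defeat; this is purely illustrative and not RMP's actual code:

    # Reject a second rating of the same professor from the same IP address.
    # A rater who switches computers or spoofs a new IP passes the check.
    seen = set()

    def accept_rating(professor_id, rater_ip):
        key = (professor_id, rater_ip)
        if key in seen:
            return False  # duplicate from this address
        seen.add(key)
        return True

    print(accept_rating("prof42", "203.0.113.5"))  # True
    print(accept_rating("prof42", "203.0.113.5"))  # False: same IP blocked
    print(accept_rating("prof42", "203.0.113.9"))  # True: new IP defeats the check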

Rating relevancy

Critics state that a number of the ratings focus on qualities they see as irrelevant to teaching, such as physical appearance.

It is common at universities and colleges for faculty (especially junior faculty) to be called on by their departments to teach courses outside their area(s) of expertise, which can earn them poor RMP ratings that do not reflect their ability to teach the subjects they are better qualified in.[citation needed] RateMyProfessors, though it lets students identify the course they took with the professor, lumps together the ratings for all courses taught by each professor instead of providing separate rating averages for each course.

Permanent vs part-time faculty

Part-time (also known as adjunct) faculty are not always readily identifiable or verifiable, as part-time professors often work at multiple schools or maintain employment outside the school.[citation needed]


References

  1. ^ Wired magazine, 2005.
  2. ^ "MTV Networks' mtvU Agrees to Acquire RateMyProfessors.com".
  3. ^ "About". RateMyProfessors.com.
  4. ^ "Professors Strike Back on mtvU - As Seen on Rate My Professors". mtvU.
  5. ^ "50 Best Websites 2008". Time. 17 June 2008.
  6. ^ "Methodology". Forbes. 13 August 2008. Retrieved 18 November 2011.
  7. ^ "RateMyProfessors.com versus formal in-class student evaluations of teaching".
  8. ^ Pfeiffer, Sacha (September 20, 2006). "Ratings sites flourish behind a veil of anonymity". Boston Globe Online.
  9. ^ Westhues, Kenneth (December 2006). "Stephen Berman: Scapegoat".
  10. ^ Huntsberry, William (February 23, 2015). "How We Talk About Our Teachers". WNYC Morning Edition.
  11. ^ Lang, James M. (December 1, 2003). Chronicle of Higher Education.
  12. ^ Fritz Machlup and T. Wilson, cited in Paul Trout, "Deconstructing an Evaluation Form". The Montana Professor, Vol. 8, No. 3, Fall 1998. Accessed 7 May 2008.
  13. ^ Nuhfer, Edward B. (2005). "A Fractal Thinker Looks at Student Evaluations". Accessed 10 May 2008.
  14. ^ Epstein, David. "'Hotness' and Quality". Inside Higher Ed, 8 May 2006. Accessed 10 May 2008.
  15. ^ Nuhfer, Edward (2010). "A Fractal Thinker Looks at Student Evaluations" (PDF). Retrieved April 25, 2010.
  16. ^ Montell, Gabriela. "The Art of the Bogus Rating". Chronicle of Higher Education, September 27, 2006.
  17. ^ Pfeiffer, "Ratings sites flourish behind a veil of anonymity".
  18. ^ Montell, "The Art of the Bogus Rating".
