

RateMyProfessors.com
Type of site: Review site
Available in: English
Owner: Viacom
Created by: RateMyProfessors.com, LLC
URL: www.ratemyprofessors.com
Users: About 800,000 visitors/month
Launched: May 1999

RateMyProfessors.com (RMP) is a review site, founded in May 1999 by John Swapceinski, a software engineer from Menlo Park, California, which allows college and university students to assign ratings to professors and campuses of American, Canadian, and United Kingdom institutions.[1] The site was originally launched as TeacherRatings.com and converted to RateMyProfessors in 2001. RateMyProfessors.com was acquired in 2005 by Patrick Nagle and William DeSantis.[2] Nagle and DeSantis later resold RateMyProfessors.com in 2007 to Viacom's mtvU, MTV's college channel.[3]

According to the site, RateMyProfessors.com is the largest online destination for professor ratings, listing more than 8,000 schools, 1.7 million professors, and over 19 million ratings.[4]

Ratings and reviews

Users who have taken or are currently taking a particular professor's course may post a rating and review of any professor already listed on the site, and may create a listing for any individual not already listed. To post, a rater must rate the course and/or professor on a 1-5 scale in two categories: "overall quality" and "level of difficulty". The rater may also indicate whether they would take the professor again, whether the class was taken for credit, whether attendance is mandatory, whether the textbook is used, and what grade they received in the course; rate the professor's "hotness"; and include a comment of up to 350 characters. Raters may also select up to three tags that describe the professor from a list of 20.[5][6]
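
The submission described above can be summarized as a simple record. The following is a minimal, illustrative Python sketch; the field names, types, and validation rules are assumptions made for clarity and do not reflect the site's actual implementation.

from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative model of a single rating submission as described above.
# Field names and validation are assumptions; RateMyProfessors'
# actual data model is not public.
@dataclass
class Rating:
    overall_quality: int                  # required, 1-5 scale
    level_of_difficulty: int              # required, 1-5 scale
    would_take_again: Optional[bool] = None
    taken_for_credit: Optional[bool] = None
    attendance_mandatory: Optional[bool] = None
    textbook_used: Optional[bool] = None
    grade_received: Optional[str] = None
    hot: Optional[bool] = None            # the appearance-related "hotness" vote
    comment: str = ""                     # limited to 350 characters
    tags: List[str] = field(default_factory=list)  # up to 3 chosen from 20 preset tags

    def __post_init__(self):
        if not (1 <= self.overall_quality <= 5 and 1 <= self.level_of_difficulty <= 5):
            raise ValueError("quality and difficulty must be rated on a 1-5 scale")
        if len(self.comment) > 350:
            raise ValueError("comments are limited to 350 characters")
        if len(self.tags) > 3:
            raise ValueError("at most 3 tags may be selected")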

According to the website's Help page,[7] "A professor's Overall Quality rating should reflect how well a professor teaches the course material, and how helpful he/she is both inside and outside of the classroom".[citation needed] The Overall Quality rating determines whether a professor's name on the list of professors is accompanied by a small smiley face (meaning "Good Quality"), a frowning face ("Poor Quality"), or an in-between, expressionless face ("Average Quality").[citation needed] A professor's name is accompanied by a chili pepper icon if the sum of his or her "hot" ratings is greater than zero (a "hot" rating counts as +1; a "not hot" rating, or one left blank, counts as −1).[citation needed]
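
A minimal Python sketch of the icon logic described above: the chili pepper rule follows the +1/−1 sum given in the text, while the numeric cutoffs used for the quality faces are hypothetical, since the help page does not publish them.

# Sketch of the profile-icon logic described above. The chili pepper
# rule follows the +1/-1 sum in the text; the quality-face cutoffs
# (3.5 and 2.5) are hypothetical examples, not published values.

def quality_face(average_overall_quality: float) -> str:
    """Map an average Overall Quality score (1-5) to a face icon."""
    if average_overall_quality >= 3.5:   # assumed cutoff
        return "smiley face (Good Quality)"
    if average_overall_quality >= 2.5:   # assumed cutoff
        return "expressionless face (Average Quality)"
    return "frowning face (Poor Quality)"

def shows_chili_pepper(hot_votes: list) -> bool:
    """Each rating counts +1 for "hot" and -1 for "not hot" or left blank."""
    score = sum(1 if vote == "hot" else -1 for vote in hot_votes)
    return score > 0

# Example: three "hot" votes and one blank vote give a sum of +2,
# so the chili pepper icon is shown.
print(quality_face(4.2), shows_chili_pepper(["hot", "hot", "", "hot"]))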

Validity

RateMyProfessors.com versus formal in-class student evaluations

Using data for 426 instructors at the University of Maine, researchers examined the relationship between RateMyProfessors.com (RMP) indices and formal in-class student evaluations of teaching (SET). The two primary RMP indices correlate substantively and significantly with their respective SET items: RMP overall quality correlates r = .68 with the SET item "Overall, how would you rate the instructor?", and RMP ease correlates r = .44 with the SET item "How did the work load for this course compare to that of others of equal credit?". Further, RMP overall quality and RMP ease each correlates with its corresponding SET factor derived from a principal components analysis of all 29 SET items: r = .57 and .51, respectively. This led the author to conclude: "While these RMP/SET correlations should give pause to those who are inclined to dismiss RMP indices as meaningless, the amount of variance left unexplained in SET criteria limits the utility of RMP."[8]
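
For reference, the r values above are correlation coefficients. Assuming the standard Pearson product-moment statistic (the study's wording does not require a different reading), it is defined as

r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^{2}}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^{2}}}

where x_i and y_i would be the paired RMP and SET scores for instructor i; values near 1 indicate strong agreement between the two measures, and values near 0 indicate little linear relationship.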

Positive correlation between easiness of class and rating of professor

Research on in-class evaluations shows that professor ratings increase when students rate the course as easy.[9] The same relationship has been shown for RMP. In an article in the journal Assessment and Evaluation in Higher Education, Clayson investigated what RMP actually rates and concluded that "students will give higher evaluations to instructors they judge as being easy. There is also a suggestion in these findings that, if students like an instructor (for whatever reason), then the easiness of the class becomes relatively irrelevant."[10] Clayson concluded that "the majority of the evidence indicates that [ratemyprofessors.com] is biased by a halo effect, and creates what most accurately could be called a 'likeability' scale." Other analyses of RMP class ratings have come to similar conclusions,[11][12] and some have concluded that professor attractiveness is also positively correlated with evaluation scores on RMP.[13]

Criticism

Evaluation bias

The main criticism of RMP is that there is little reason to think that the ratings accurately reflect the quality of the professors rated.[14][15]

The analogy between rating a plumber and rating a professor rests on false grounds: a student's poor performance in a class, which may prompt an unfair review, is commonly not the professor's fault, while nothing comparable in rating a plumber would lead a customer to produce an unfair review.[citation needed]

The website has an economic incentive to side with students, simply because there are many more students than professors. That incentive alone skews the website's reviews toward students trying to hurt professors who have done their jobs and graded fairly.[citation needed]

Ratings have been shown to reflect gender bias toward the professors evaluated.[16] In addition, "easiness", "clarity", and "helpfulness" are the only components taken into consideration.[17][18] Edward Nuhfer says that both Pickaprof.com and RMP "are transparently obvious in their advocacy that describes a 'good teacher' as an easy grader. ... Presenter Phil Abrami ... rated the latter as 'The worst evaluation I've seen' during a panel discussion on student evaluations at the 2005 annual AERA meeting."[19] Studies of RMP ratings conducted by Felton et al. found that "the hotter and easier professors are, the more likely they'll get rated as a good teacher."[20]

Edward Nuhfer has argued, "Pseudo-evaluation damages the credibility of legitimate evaluation and victimizes individuals by irresponsibly publishing comments about them derived from anonymous sources. This is voyeurism passed off as 'evaluation' and examples lie at PickAProf.com and RateMyProfessors.com. Neither site provides evaluation of faculty through criteria that might be valuable to a student seeking a professor who is conducive to their learning, thinking or intellectual growth."[21]

Multiple ratings per person

Single individuals are able to make multiple separate ratings of a single professor on RMP.[22] RMP admits[23] that while it does not allow such multiple ratings from any one IP address, it has no control over raters who use several different computers or who "spoof" IP addresses. There is also no way of knowing that those who rate a professor's course have actually taken the course in question, making it possible for professors to rate themselves and each other.[24]

Rating relevancy

Critics state that a number of the ratings focus on qualities they see as irrelevant to teaching, such as physical appearance, even though the appearance-related "hotness" score is not included in the calculation of overall professor quality. Many students, when rating, interpret hotness as a measure of how a professor makes the subject "spicy" by using interactive or unconventional teaching elements.[citation needed]

It is common at universities and colleges for faculty (especially junior faculty) to be called on by their departments to teach courses outside their area(s) of expertise, which can earn them poor ratings on RMP that do not reflect their ability to teach the courses they are better qualified to teach.[citation needed] RateMyProfessors, though it lets students identify the course they took with the professor, lumps together the ratings for all courses taught by each professor instead of providing separate rating averages for each course.

Permanent vs adjunct faculty

Adjunct faculty are not always readily identifiable or verifiable, as such professors may work at multiple universities, change universities frequently, or maintain employment outside an academic setting.[citation needed]

Data breach

On January 11, 2016, RMP notified its users via email (and with a small notification link on its website) that a decommissioned version of RMP's website suffered a data breach affecting email addresses, passwords, and registration dates.[25] According to the California Department of Justice website, the security breach occurred six weeks earlier on or about November 26, 2015.[26]

Other features

Updates to website

RateMyProfessors.com regularly updates the site to meet student preferences. In late 2011, professors were given the ability to display their Twitter handle on their professor profile pages for students to follow. In 2014, RateMyProfessors debuted a new responsive site design, and in 2015 the site introduced custom URLs, allowing professors to create a custom URL for their ratings page.[citation needed]

Professor Notes

Since mtvU took over the website, RateMyProfessors.com has added a notes feature that allows professors to reply to students' comments. Professors must register with the website using an ".edu" e-mail address in order to post notes. The site also had a feature called "Professors Strike Back", which featured videos of professors responding to specific ratings they received on RateMyProfessors.[27]

In 2015, the site debuted a new series, "Professors Read Their Ratings".[28] The series features professors reading and reacting to their Rate My Professors ratings. Students can also submit their own videos on the site.[29]

School ratings

Students can also comment on and rate their school by visiting its RateMyProfessors.com school page. School rating categories include Academic Reputation, Location, Campus, School Library, Food, Clubs & Activities, Social Events, and Happiness.[citation needed]

Recognition

RateMyProfessors.com was recognized by Time magazine as one of the 50 best websites of 2008.

In 2008, student evaluations of professors from RateMyProfessors.com accounted for 25% of a school's rating in Forbes' annual "America's Best Colleges" listing. However, as of 2018 this is no longer the case.[30]

Students are consumers who, ostensibly at least, attend college to learn and acquire knowledge and skills. The core dimension of the learning experience comes from attending classes taught by instructors. Asking students what they think about their courses is akin to what agencies such as Consumer Reports or J.D. Power and Associates do when they provide information on various goods and services.[31]

In 2015, the site won two People's Voice Webby Awards after an extensive site overhaul.[32]

References

  1. ^ "About RateMyProfessors.com".
  2. ^ Wired Magazine - 2005
  3. ^ "MTV Networks' mtvU Agrees to Acquire RateMyProfessors.com".
  4. ^ "About RateMyProfessors.com".
  5. ^ http://www.ratemyprofessors.com/AddRating.jsp?tid=1458112
  6. ^ http://www.ratemyprofessors.com/help.jsp#tally
  7. ^ http://www.ratemyprofessors.com/help.jsp
  8. ^ "RateMyProfessors.com versus formal in-class student evaluations of teaching".
  9. ^ Mau, Ronald R., & Opengart, Rose A. (2012). Comparing Ratings: In-Class (Paper) vs. out of Class (Online) Student Evaluations. Higher Education Studies, 2(3), 55-68.
  10. ^ Dennis E. Clayson (2013) What does ratemyprofessors.com actually rate?, Assessment & Evaluation in Higher Education, 39:6, 678-698, DOI: 10.1080/02602938.2013.861384
  11. ^ Legg, Angela & H. Wilson, Janie. (2012). RateMyProfessors.com offers biased evaluations. Assessment & Evaluation in Higher Education. 37. 89-97. 10.1080/02602938.2010.507299.
  12. ^ http://www.apa.org/gradpsych/features/2007/ratings.aspx
  13. ^ James Felton, Peter T. Koper, John Mitchell, and Michael Stinson. Attractiveness, easiness and other issues: student evaluations of professors on ratemyprofessors.com. Assessment & Evaluation in Higher Education, 33(1):45–61, 2008
  14. ^ Pfeiffer, Sacha (September 20, 2006). "Ratings sites flourish behind a veil of anonymity". Boston Globe Online.
  15. ^ Westhues, Kenneth (December 2006). "Stephen Berman: Scapegoat". UWaterloo.ca.
  16. ^ Huntsberry, William (February 23, 2015). "How We Talk About Our Teachers". WNYC Morning Edition.
  17. ^ Lang, James M. (December 1, 2003). "RateMyBuns.com". Chronicle of Higher Education.
  18. ^ See Fritz Machlup and T. Wilson, cited in Paul Trout, "Deconstructing an Evaluation Form", The Montana Professor, Vol. 8 No. 3, Fall 1998, accessed 7 May 2008.
  19. ^ Edward B. Nuhfer, 2005, "A Fractal Thinker Looks at Student Evaluations", accessed 10 May 2008.
  20. ^ David Epstein, "‘Hotness’ and Quality", Inside Higher Ed, 8 May 2006, accessed 10 May 2008.
  21. ^ Nuhfer, Edward (2010). "A Fractal Thinker Looks at Student Evaluations" (PDF). Retrieved April 25, 2010.
  22. ^ Gabriela Montell, "The Art of the Bogus Rating", Chronicle of Higher Education, September 27, 2006 [1]
  23. ^ Pfeiffer, "Ratings sites flourish behind a veil of anonymity".
  24. ^ Montell, "The Art of the Bogus Rating", Chronicle of Higher Education.
  25. ^ "RateMyProfessors.com – Find and rate your professor or campus". www.ratemyprofessors.com. Retrieved 2016-01-12.
  26. ^ https://oag.ca.gov/ecrime/databreach/reports/sb24-59576
  27. ^ Professors Strike Back on mtvU - As Seen on Rate My Professors
  28. ^ http://www.ratemyprofessors.com/blog/video/albion-college-professors-read-their-ratings-part-2/
  29. ^ http://www.ratemyprofessors.com/blog/video/submit-to-ratemyprofessors-com
  30. ^ https://www.forbes.com/sites/cartercoudriet/2017/08/02/top-colleges-2017-the-methodology/
  31. ^ "Methodology". Forbes. 13 August 2008. Retrieved 18 November 2011.
  32. ^ http://www.ratemyprofessors.com/blog/buzzpost/weve-won-two-peoples-voice-webby-awards/