Rate My Professors

Rate My Professors is an online platform founded in May 1999 by software engineer John Swapceinski in Menlo Park, California, initially under the name TeacherRatings.com, which facilitates anonymous student evaluations of postsecondary instructors through numerical ratings and textual comments on teaching effectiveness, course difficulty, and overall quality. The site rebranded to RateMyProfessors.com in 2001 and has since expanded to cover over 1.7 million professors at approximately 8,000 institutions, accumulating more than 19 million reviews that influence student course selections. Key features include aggregated scores for overall quality, clarity, helpfulness, and a "would take again" percentage, derived from user-submitted ratings without institutional oversight, enabling rapid but largely unverified feedback. Ownership transitioned through acquisitions, including by Viacom's MTV Networks in 2007 and later by Cheddar in 2018, reflecting its commercial evolution amid growing student usage for pre-enrollment decisions. While valued for democratizing access to peer insights on academic experiences, Rate My Professors faces scrutiny for reliability, as anonymous postings invite subjective biases, including correlations between high ratings and perceived easiness or instructor attractiveness rather than substantive merit, per empirical analyses of rating patterns. Studies highlight moderate alignment with official evaluations but underscore distortions from self-selected, grade-influenced submissions, positioning the platform as a supplementary rather than authoritative tool amid broader debates on teaching evaluation in higher education.

History

Founding and Early Years

Rate My Professors was founded in May 1999 by John Swapceinski, a software engineer based in Menlo Park, California, who had studied at San Jose State University. The platform initially launched as TeacherRatings.com, allowing students to submit anonymous evaluations of college instructors based on criteria such as clarity, helpfulness, and overall quality. Swapceinski developed the site in response to his own unsatisfactory experience with an instructor at San Jose State, aiming to provide peers with candid insights into teaching effectiveness absent from official channels. In 2001, the website rebranded to RateMyProfessors.com, expanding its scope to include searchable databases of professors across U.S. and Canadian institutions while maintaining user anonymity to encourage honest feedback. Early adoption was driven by word-of-mouth among students seeking alternatives to limited administrative evaluations, with the site's simple rating scales, typically a 1-5 point system for attributes like easiness and hotness, facilitating quick submissions. By early 2004, it had amassed nearly 1.5 million ratings covering professors at almost 4,000 schools, reflecting organic growth amid rising internet access on campuses. The platform's early years saw steady expansion in rating volume, quadrupling to over six million ratings by 2007, though this period also introduced debates over review validity due to potential biases in anonymous postings. Swapceinski emphasized the site's role in empowering students with data-driven choices, positioning it as a consumer-oriented tool in higher education.

Expansion and Acquisition

Following its relaunch as RateMyProfessors.com in 2001, the platform experienced rapid early expansion driven by increasing adoption among college students, evolving from a niche tool into a national resource covering thousands of institutions. By the mid-2000s, it had grown to encompass ratings for over one million professors based on millions of student submissions, reflecting steady year-over-year traffic increases amid broader web-based community trends. In 2005, the site was acquired by software entrepreneurs Patrick Nagle and William DeSantis, marking its first major ownership change and enabling operational scaling under private management. Two years later, on January 17, 2007, Viacom's MTV Networks subsidiary purchased it from Nagle and DeSantis, integrating the platform into its college-focused media ecosystem to leverage synergies with campus programming and expand user engagement among undergraduates. Under Viacom ownership, the site continued to accumulate ratings, reaching approximately 6.8 million total submissions by 2009. Viacom held the platform until October 25, 2018, when digital media company Cheddar acquired it for an undisclosed sum, aiming to reposition it as a utility for student decision-making rather than purely media content. At the time of the Cheddar deal, RateMyProfessors.com featured over 20 million ratings for 1.8 million professors across more than 8,000 schools, with monthly active users exceeding 6 million (peaking seasonally at 7 million) and 125,000 new ratings added each month. Post-acquisition, Cheddar invested in site redesigns, enhanced search functionalities, and planned premium analytics tools for educators, further supporting platform growth and monetization. By late 2018, annual revenue stood at roughly $3.4 million, underscoring its established scale in the education review sector.

Core Features and Functionality

Ratings and Review System

Users submit ratings and reviews for professors anonymously via the Rate My Professors platform, with submissions required to originate from individuals who have taken or are currently enrolled in the professor's course; only one review per user per course is permitted. The core numerical rating is Overall Quality, assessed on a 1-5 scale, where scores reflect the professor's effectiveness in teaching the course material and their helpfulness, including factors such as availability, approachability, and communication clarity; ratings of 3.5-5 denote good quality (accompanied by a smiley-face icon), 2.5-3.4 average (neutral face), and 1-2.4 poor (frowny face). The platform's displayed Overall Quality score for a professor is the arithmetic mean of all valid numerical submissions for that instructor. Supplementary ratings include Level of Difficulty, rated from 1 (easiest) to 5 (hardest), which gauges the perceived rigor of the course without influencing the Overall Quality score. Users also indicate whether they "Would Take Again," a yes/no option introduced for ratings after May 25, 2016, with the aggregate displayed as the percentage of "yes" responses among qualifying submissions; this metric similarly does not affect Overall Quality. An additional field notes textbook usage, but it carries no numerical weight in aggregations.

Prior to May 18, 2016, Overall Quality was derived by averaging separate 1-5 scores for Clarity (clarity of lectures and explanations) and Helpfulness (respectfulness, concern for students, and accessibility), but the system shifted to a unified Overall Quality input thereafter. Accompanying each submission is an optional textual comment, limited to remarks on the academic experience, including pros and cons, while adhering to strict guidelines that prohibit profanity, hate speech, personal attacks (such as references to appearance or private life), unsubstantiated claims of illegal activity, hyperlinks, or non-English text (except for Canadian institutions). All reviews undergo moderator review prior to posting, with non-compliant content removed; users may flag existing reviews for re-evaluation, though flagged content is retained unless it violates the rules. Professors cannot access submitter identities, preserving anonymity, and the platform relies on self-attested enrollment rather than institutional verification to keep ratings relevant to actual coursework.
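
The aggregation logic described above can be made concrete with a short sketch. The following Python snippet is a minimal illustration under assumed names and data shapes, not RMP's actual implementation: it averages Overall Quality submissions, buckets the result into the site's stated display labels, and computes the "Would Take Again" percentage from the yes/no responses that carry it.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Submission:
    overall_quality: float          # 1-5 scale, user-entered
    difficulty: float               # 1-5 scale, displayed but not folded in
    would_take_again: bool | None   # None for ratings predating the option

def quality_label(score: float) -> str:
    # Thresholds as described: 3.5-5 good, 2.5-3.4 average, 1-2.4 poor.
    if score >= 3.5:
        return "good"
    if score >= 2.5:
        return "average"
    return "poor"

def aggregate(subs: list[Submission]) -> dict:
    overall = mean(s.overall_quality for s in subs)
    wta = [s.would_take_again for s in subs if s.would_take_again is not None]
    return {
        "overall_quality": round(overall, 1),
        "label": quality_label(overall),
        "difficulty": round(mean(s.difficulty for s in subs), 1),
        # Percentage of "yes" among qualifying submissions only.
        "would_take_again_pct": round(100 * sum(wta) / len(wta)) if wta else None,
    }

print(aggregate([
    Submission(5, 4, True),
    Submission(4, 3, True),
    Submission(2, 5, False),
]))
# -> {'overall_quality': 3.7, 'label': 'good', 'difficulty': 4.0, 'would_take_again_pct': 67}
```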

Search and Additional Tools

The search functionality on Rate My Professors centers on a primary query interface where users enter an institution's name to access school-specific pages, enabling subsequent searches for professors by name or navigation through departments and courses. This school-first approach structures results around the more than 8,000 covered institutions, facilitating targeted lookups within relevant academic contexts. Direct name searches are also supported platform-wide, though results often require manual refinement due to common names or incomplete indexing. Filtering options remain basic, with users able to sort professor listings by metrics such as overall quality rating (on a 1-5 scale), difficulty level, or "would take again" percentage within a school's listings. No robust advanced search with multi-criteria sliders or Boolean operators is officially provided, leading some users to rely on external extensions or manual department browsing for deeper analysis. Development analyses of similar platforms highlight potential for enhanced filters by subject or rating thresholds, but Rate My Professors prioritizes simplicity over granular controls. Additional tools extend beyond search to include interactive review engagement via "Thumb Wars," where users upvote or downvote individual ratings to boost visibility of helpful feedback. Registered accounts allow management and editing of one's own submitted reviews, promoting user accountability while maintaining anonymity in public displays. The mobile app mirrors search capabilities with push notifications for new ratings on followed professors, though the platform lacks further utilities such as syllabus repositories or advisor recommendation engines. These features support course planning but do not include verified integrations with university systems or data export tools for comparative analysis.
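
Because the platform exposes only basic sorting, the multi-criteria filtering that users improvise externally can be sketched as follows. This is a hypothetical illustration over an assumed list-of-dicts shape; the field names and records are invented, not an official RMP API or schema.

```python
# Hypothetical professor records, as one might assemble from manual browsing;
# field names are assumptions, not an official data format.
professors = [
    {"name": "A. Rivera", "quality": 4.6, "difficulty": 3.8, "would_take_again": 92},
    {"name": "B. Chen",   "quality": 3.1, "difficulty": 2.2, "would_take_again": 61},
    {"name": "C. Okafor", "quality": 4.2, "difficulty": 4.5, "would_take_again": 78},
]

def filter_and_rank(records, min_quality=4.0, max_difficulty=5.0):
    """Keep professors meeting a quality floor and difficulty ceiling,
    then rank by quality, breaking ties with the would-take-again percentage."""
    kept = [r for r in records
            if r["quality"] >= min_quality and r["difficulty"] <= max_difficulty]
    return sorted(kept, key=lambda r: (r["quality"], r["would_take_again"]),
                  reverse=True)

for r in filter_and_rank(professors, min_quality=4.0, max_difficulty=4.0):
    print(r["name"], r["quality"])   # -> A. Rivera 4.6
```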

Rating Methodology

Evaluation Criteria and Scales

Rate My Professors employs numerical ratings on a scale of 1 to 5 for its primary evaluation criteria, where 1 represents the lowest assessment and 5 the highest. The core criteria include Overall Quality, which measures a professor's effectiveness in teaching the course material and providing help inside and outside the classroom; Level of Difficulty, which gauges the perceived rigor of the course and its workload; and Would Take Again, which indicates whether the rater would enroll in another class with the same instructor. These ratings are submitted by self-identified students and aggregated into averages displayed on professor profiles, with interpretive labels such as "Good" for Overall Quality scores of 3.5–5, "Average" for 2.5–3.4, and "Poor" for 1–2.4, alongside analogous thresholds for Level of Difficulty. Prior to May 18, 2016, the Overall Quality score was derived by averaging separate Clarity (effectiveness in explaining material) and Helpfulness (assistance with learning and queries) ratings, both on the 1–5 scale; this methodology was consolidated into the single Overall Quality metric to streamline user input while maintaining focus on instructional efficacy. Users also provide qualitative comments and select from predefined tags (e.g., "Inspirational," "Tough Grader," "Get Ready to Work") to supplement numerical data, though these do not contribute to the scaled averages. Aggregate scores require a minimum number of ratings for reliability, but no explicit weighting is applied for factors like recency or user verification beyond basic enrollment attestation. The platform's scales emphasize subjective student perceptions rather than objective metrics, such as learning outcomes or peer-reviewed teaching evaluations, potentially introducing variability tied to individual expectations and course contexts. For instance, Difficulty ratings often correlate inversely with Overall Quality in empirical analyses, reflecting a tendency for students to favor less rigorous courses independent of pedagogical merit. No adjustments are made for disciplinary differences, where humanities fields typically receive lower Difficulty scores than STEM disciplines, whose averages cluster around 3.0–3.5 on the 1–5 scale.
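
The 2016 consolidation can be expressed concretely. The sketch below contrasts the legacy derivation (Overall Quality as the mean of the separate Clarity and Helpfulness inputs) with the current single-input scheme; function names are assumptions chosen for illustration.

```python
def legacy_overall_quality(clarity: float, helpfulness: float) -> float:
    """Pre-May-2016 scheme: Overall Quality was the average of the
    separate Clarity and Helpfulness ratings (each on a 1-5 scale)."""
    return (clarity + helpfulness) / 2

def current_overall_quality(overall: float) -> float:
    """Post-May-2016 scheme: a single user-entered Overall Quality score;
    Difficulty and Would Take Again are recorded but never folded in."""
    return overall

# A rater who found lectures very clear (5) but the instructor only
# moderately helpful (3) would have produced an overall score of 4.0:
print(legacy_overall_quality(5, 3))  # -> 4.0
```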

User Submission Process and Anonymity

Users submit reviews on Rate My Professors by searching for a specific professor affiliated with their school, selecting the relevant course if multiple are listed, and completing a rating form that includes numerical scores on a 1-5 scale for overall quality (reflecting teaching effectiveness and student satisfaction) and difficulty (assessing course rigor), a "would take again" indicator, optional tags such as "inspirational" or "clear grading criteria," and a free-text comment limited to factual experiences without profanity or personal attacks. Submissions require users to affirm they have attended or are attending the class, though no verification such as enrollment proof is enforced, allowing posting without account registration. A dedicated moderation team reviews every submission prior to publication to ensure compliance with site guidelines, which prohibit off-topic content, profanity, or disclosure of identifying information, while permitting only one review per user per professor per course to prevent duplicates or manipulation. Approved reviews appear publicly under the professor's profile without any identifying details about the submitter, such as name, email, or IP address, preserving anonymity even for registered users, who may opt to track their own contributions privately. This policy explicitly states that professors cannot identify individual raters, aiming to encourage candid feedback without fear of retaliation. The anonymity feature, while facilitating open expression, relies solely on self-reported attendance and moderator discretion rather than technological or institutional verification, which has raised questions about data integrity in scholarly analyses, though the platform maintains anonymity as essential for unfiltered student input. Users can edit or delete their own reviews if registered, but once posted, content cannot be altered by professors or third parties except through flagged removal for guideline violations.
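
The one-review-per-user-per-professor-per-course rule amounts to enforcing a uniqueness key on submissions. A minimal sketch follows, with hypothetical identifiers; RMP's real backend is not public, so this only illustrates the stated constraint.

```python
# Duplicate suppression under the stated rule: at most one review per
# (user, professor, course) triple. Identifiers here are invented.
seen: set[tuple[str, str, str]] = set()

def accept_submission(user_id: str, professor_id: str, course_code: str) -> bool:
    key = (user_id, professor_id, course_code)
    if key in seen:
        return False          # duplicate: reject before moderation
    seen.add(key)
    return True               # pass along to the moderation queue

assert accept_submission("u1", "p9", "CS101") is True
assert accept_submission("u1", "p9", "CS101") is False  # same triple blocked
assert accept_submission("u1", "p9", "CS102") is True   # new course allowed
```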

Empirical Analysis

Correlations with Official Student Evaluations

Studies have investigated the extent to which ratings on RateMyProfessors (RMP) align with official student evaluations of teaching (SETs), typically administered by institutions. These analyses generally reveal moderate to strong positive correlations, indicating that RMP serves as a partial proxy for institutional evaluations, though differences in sampling, item wording, and respondent bases introduce variability. For instance, a 2009 study of 126 professors at a mid-sized U.S. university found a correlation of r = 0.51 between RMP's overall quality rating and SET overall effectiveness scores, with r = 0.54 for RMP clarity and SET communication clarity; however, correlations were weaker (r < 0.40) for aspects like workload fairness, suggesting RMP emphasizes perceived ease and entertainment over rigorous pedagogy. Further research corroborates these findings. A 2007 analysis reported strong correlations between RMP ratings and in-class SETs, particularly for helpfulness (r > 0.60) linked to instructor availability. Similarly, a study matching RMP data to formal SETs yielded r = 0.68 for RMP overall quality against corresponding SET items, with substantive alignment on clarity (r = 0.62) and interest/stimulation (r = 0.55), based on data from multiple institutions. A 2017 examination of publicly available evaluations across U.S. universities identified a strong overall association between RMP scores and institutional SETs, though institutional policies on SET administration and timing influenced the strength.
| Study | Sample | Key Correlations | Notes |
| --- | --- | --- | --- |
| Snyder (2009) | 126 professors, one university | Overall quality: r = 0.51; Clarity: r = 0.54 | Lower for workload-related items; supports convergence but highlights RMP's focus on subjective appeal. |
| Coladarci & Kornfield (2007) | In-class SETs vs. RMP | Helpfulness: r > 0.60, linked to availability | Emphasizes behavioral factors over content mastery. |
| You & Baek (2013) | Multiple institutions | Overall quality: r = 0.68; Clarity: r = 0.62 | Stronger alignment on engagement metrics. |
These correlations persist despite RMP's voluntary, self-selected submissions contrasting with mandatory SETs, implying shared priorities like perceived instructor friendliness; however, both instruments face criticism for conflating easiness with quality, as evidenced by lower correlations with objective learning outcomes in meta-analyses of SET validity. Recent work (post-2020) notes that while the correlations hold, RMP's aggregation of fewer reviews per professor can amplify noise compared to comprehensive SET datasets.
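
The statistics cited throughout this section are plain Pearson product-moment coefficients. For reference, the computation on a toy paired sample looks like this; the numbers below are invented for illustration, not data from any of the studies above.

```python
from math import sqrt

def pearson_r(xs: list[float], ys: list[float]) -> float:
    """Pearson product-moment correlation between paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented toy data: RMP overall quality vs. official SET scores
# for six hypothetical professors.
rmp = [4.5, 3.8, 2.9, 4.1, 3.2, 4.8]
set_scores = [4.2, 3.5, 3.1, 4.4, 2.8, 4.6]
print(round(pearson_r(rmp, set_scores), 2))  # -> 0.92
```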

Identified Biases and Predictive Factors

Empirical analyses of RateMyProfessors data reveal that perceived easiness is the strongest predictor of overall quality ratings, with correlation coefficients of 0.60 to 0.61 in datasets encompassing millions of reviews. This relationship intensifies for higher-rated professors, where quality scores above 4.25 exhibit a steeper slope relative to easiness, suggesting students conflate instructional clarity or low difficulty with effective teaching. Perceived physical attractiveness serves as another key predictor, correlating at 0.31 with quality ratings; professors deemed "hot" receive boosts of approximately 0.6 to 0.8 points in overall scores, an effect persisting even after accounting for other variables. Gender emerges as a modest predictive factor, with female professors averaging 0.04 to 0.05 points lower on quality but 0.03 points higher on easiness compared to males across large samples. These disparities widen in certain disciplines, with males scoring 3.69 versus 3.46 for females in some fields (p < 0.001), while no fields show a reverse advantage for females. Disciplinary context also predicts ratings, with STEM fields yielding lower quality (3.47) and easiness scores than humanities fields (3.90). Biases in the system include a hotness premium that equally benefits male and female instructors but stems partly from a halo effect in reverse, whereby high-quality teaching inflates attractiveness perceptions (e.g., a 17.7% increase in the probability of a "hot" designation per quality point). These gender-specific halo effects differ in mechanism, driven by clarity for males and helpfulness for females. Social influence introduces herding bias, as prior positive or negative ratings skew subsequent ones, with experimental evidence showing anchoring effects on new reviewers. Selection and negativity biases further distort data, as reviews skew toward dissatisfied students and are harsher than official evaluations. Academic entitlement among reviewers correlates with biased evaluations, though controlling for it reveals persistent gender stereotypes in perceptions of instructional competence.
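
The predictor effects summarized above are the kind of estimates a multiple linear regression recovers. The sketch below fits ordinary least squares on synthetic data whose planted coefficients loosely echo the reported directions (a dominant easiness slope, a "hot" bonus, a small female penalty); the data, effect sizes, and variable names are all illustrative assumptions, not estimates from any cited study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic predictors, loosely echoing the variables in the RMP literature.
easiness = rng.uniform(1, 5, n)      # 1-5 perceived easiness
hot = rng.integers(0, 2, n)          # 1 if rated "hot" (chili-pepper era)
female = rng.integers(0, 2, n)       # 1 if instructor coded female

# Generate quality with planted, illustrative effect sizes (not clipped
# to the 1-5 scale, for simplicity): strong easiness slope, a ~0.7-point
# hotness bonus, and a small female penalty, plus noise.
quality = (1.0 + 0.6 * easiness + 0.7 * hot - 0.05 * female
           + rng.normal(0, 0.5, n))

# OLS via least squares on [1, easiness, hot, female].
X = np.column_stack([np.ones(n), easiness, hot, female])
coef, *_ = np.linalg.lstsq(X, quality, rcond=None)
for name, b in zip(["intercept", "easiness", "hot", "female"], coef):
    print(f"{name}: {b:+.3f}")
# The recovered coefficients approximate the planted values, mirroring how
# published analyses apportion rating variance among these predictors.
```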

Criticisms and Limitations

Correlation with Perceived Ease Over Instructional Quality

Multiple empirical studies have demonstrated a strong positive correlation between overall ratings and perceived easiness on RateMyProfessors, indicating that student evaluations prioritize course leniency over pedagogical merit. In an analysis of ratings for 3,190 professors across 25 universities, the correlation between quality and easiness was 0.61, with easiness and perceived "sexiness" together explaining approximately half of the variance in quality scores. This suggests that higher ratings often reward professors who deliver less demanding coursework, potentially at the expense of deeper instructional rigor. Further large-scale examinations reinforce this pattern. A study of nearly 8 million ratings from over 190,000 U.S. professors found persistent positive correlations between reported instruction quality and easiness, with quality scores rising alongside lower perceived difficulty levels. Reviews frequently conflate authentic instructional challenges (such as rigorous assessments that foster learning) with artificial barriers like unclear expectations, leading to lower quality ratings for demanding but effective teaching. For instance, high-quality profiles often describe difficult material as "hard but fair" when accompanied by supportive practices, whereas low-quality ones criticize "impossible" tests tied to inadequate guidance. These correlations suggest a preference among reviewers for outcomes aligned with minimal effort, rather than metrics of knowledge transmission or skill development. Such biases undermine the site's utility as a proxy for instructional excellence, as evidenced by the consistent weighting of subjective comfort over objective learning gains in aggregated data. Academic entitlement may exacerbate this, with entitled students disproportionately penalizing professors who enforce standards, further skewing ratings toward perceived accommodation.

Anonymity-Induced Issues and Review Integrity

The anonymous review system on Rate My Professors enables submissions without user verification or accountability, fostering potential inaccuracies and manipulations that compromise review integrity. The platform explicitly states that it cannot adjudicate disputes over review truthfulness, as anonymity prevents identification of reviewers or confirmation of their claims against professors' assertions, leading to a default policy of non-removal unless specific guidelines on prohibited content are violated. This structure lacks mechanisms to ensure reviewers are actual students or to detect multiple entries from the same individual, allowing unverified or fabricated input that distorts ratings. Empirical analysis underscores these vulnerabilities: a study by Katrompas and Metsis examined Rate My Professors data and found that anonymity in self-reporting amplifies biases and inaccuracies, with ratings exhibiting a pronounced negativity bias relative to official evaluations and no compensatory validation processes. Quantitatively, the research identified a strong negative correlation (r = -0.65) between perceived course difficulty and quality scores, suggesting reviews prioritize subjective ease over pedagogical merit, further eroded by the absence of reviewer verification. Such patterns indicate invalid data, as non-students or repeat submitters could contribute without detection, undermining the site's ostensible goal of aggregating genuine feedback. These anonymity-driven flaws manifest in reported manipulations, including faculty self-posting positive reviews to counter negatives and students submitting unsubstantiated criticisms, though systematic quantification remains limited due to the platform's opacity. Overall, the lack of accountability incentivizes selective or retaliatory posting, reducing review integrity and rendering ratings susceptible to manipulation rather than reflective of genuine teaching quality.

Demographic and Disciplinary Biases

Studies have identified systematic biases in Rate My Professors (RMP) ratings, with female professors consistently receiving lower overall quality scores than male professors, even after accounting for factors like course difficulty and student satisfaction. Legg and Wilson (2012) analyzed RMP data alongside university evaluations and found that ratings favor professors perceived as "easy" graders and attractive, criteria that disadvantage women more than men due to stereotypical expectations of nurturing versus authoritative teaching styles. This bias manifests in linguistic patterns across 14 million reviews, where descriptors of intellectual excellence such as "brilliant," "genius," and "smart" appear more frequently for male professors in all 25 analyzed disciplines, while women are more often labeled with terms like "bossy," "nurturing," or fashion-related adjectives such as "frumpy." Racial and ethnic biases also emerge, particularly against professors perceived as non-white. Empirical analysis of RMP reviews shows that instructors with Asian-sounding surnames receive lower ratings, correlated with complaints about accents rather than pedagogical effectiveness, suggesting prejudice tied to language and perceptions of foreignness. Broader examinations confirm that racial minority faculty, including Black and Asian professors, face more negative evaluations than white counterparts, with Black male professors experiencing the most severe penalties, independent of gender effects alone. Disciplinary differences amplify these demographic biases, as ratings in STEM fields are systematically lower than in the humanities or social sciences, reflecting student preferences for perceived easiness over rigor rather than objective instructional quality. Women in quantitative disciplines encounter compounded disadvantages, with gender bias intensifying where expectations of leniency clash with field demands. Such patterns indicate that RMP's anonymous, self-selected reviews prioritize subjective student experiences over verifiable teaching outcomes, perpetuating inequities across demographics and fields.

Notable Incidents

2016 Data Breach

On or about November 26, 2015, unauthorized hackers accessed a decommissioned version of the RateMyProfessors.com database, compromising email addresses and hashed passwords associated with some registered users of the active site. The breach did not affect the site's operational data, such as professor ratings or user-submitted reviews, but exposed credentials that could potentially be used for credential-stuffing or account takeover attempts on the live platform. RateMyProfessors.com detected the incident during an internal security review and notified affected users via email on January 6, 2016, with public disclosure following shortly thereafter. The company advised users to change their passwords immediately, enable two-factor authentication where available, and monitor accounts for suspicious activity, emphasizing that passwords were stored in hashed format, which provides some protection against direct cracking. No evidence emerged of widespread misuse of the stolen data, and the incident was reported to state authorities, including California's Attorney General, as required under data breach notification laws. The episode highlighted vulnerabilities in legacy systems, as the accessed database was no longer in active use but retained sensitive data without adequate purging or decommissioning safeguards. It was cataloged in annual data breach summaries, such as the Identity Theft Resource Center's 2016 report, which classified it under business sector compromises involving an unknown number of records. No lawsuits or regulatory penalties directly stemming from the breach were publicly documented, though it prompted standard post-breach remediation efforts by the company.

Instances of Manipulated Reviews and Responses

In late 2016, shortly after the U.S. presidential election, education professor Joshua A. Cuevas at the University of North Georgia faced coordinated cyberharassment that included approximately 60 fabricated reviews on his RateMyProfessors profile. These reviews, generated using dummy accounts by individuals linked to white supremacist groups, featured uniformly vulgar and racially motivated content, including false claims about his coursework and references to bestiality. The platform expeditiously removed the reviews upon Cuevas's reports, though some reappeared temporarily before being purged, highlighting vulnerabilities to organized manipulation via anonymous postings. A protracted case of review manipulation occurred beginning in winter 2019, when a stalker identifying as "the Lurker" (later traced to a former student named S.) targeted criminology professor Janani Umamaheswar at Southern Connecticut State University. The perpetrator posted multiple defamatory entries, including 1/5-rated reviews falsely claiming the course textbook exclusively covered "crimes of the poor" and discriminated against students, as well as accusations of educational dishonesty (e.g., fabricating Canadian schooling) and of soliciting bribes via email for grade inflation. To evade detection and removal, the individual reposted content under altered class codes, persisting into early 2020 and demonstrating how anonymity facilitates repeated, baseless sabotage. While RateMyProfessors prohibits professors from submitting self-reviews or encouraging biased student postings, anecdotal reports from forums indicate occasional attempts by faculty to inflate ratings through proxies or self-submissions, though verified cases remain scarce compared to harassment-driven fakes. The site's professor response feature, introduced to allow rebuttals to reviews, has been used defensively but lacks documented instances of systematic abuse; studies suggest responses can mitigate negative perceptions without altering core ratings.

Reception and Impact

Usage Patterns Among Students and Faculty

A 2010 survey of 550 undergraduate students found that 90% had used RateMyProfessors.com at least once, primarily to inform decisions on course and instructor selection by accessing peer ratings on clarity, helpfulness, and overall quality. Students reported moderate trust in the site's information (mean score of 3.46 on a 1-5 scale), valuing it as a quick supplement to informal sources like friends' recommendations, though ratings on "easiness" were considered less essential than instructional attributes. Subsequent research confirms ongoing influence on course choices, with sophomores and juniors consulting the site more frequently than freshmen or seniors due to greater scheduling flexibility. No significant differences in usage emerged between education majors and other fields, suggesting broad applicability across disciplines for evaluating professor quality and class difficulty. Faculty engagement with RateMyProfessors remains limited compared to student usage, with many professors viewing the platform sporadically for informal feedback rather than formal assessment. Since 2014, certified professors have been able to claim profiles and post responses to reviews, facilitating targeted rebuttals to specific comments and indicating that a subset of faculty actively monitors content to address perceived inaccuracies. However, widespread skepticism toward the site's anonymous nature and documented biases, such as correlations with perceived easiness over pedagogical effectiveness, often discourages routine reliance, with some faculty reporting emotional impact from negative entries but minimal changes to teaching practices.

Broader Effects on Higher Education

Rate My Professors has significantly shaped student course selection processes, with surveys indicating that a substantial portion of undergraduates consult the site when registering for classes, often prioritizing professors rated highly for "easiness" and overall positivity over those praised for instructional depth. A 2023 study of college students found that sophomores and juniors, who have greater scheduling flexibility, rely on the site more heavily than freshmen or seniors to gauge instructor quality and course demands, thereby directing enrollment toward favorably reviewed instructors. This selective behavior can concentrate student numbers in certain sections, potentially leading to under-enrollment in rigorous courses and affecting departmental offerings. The platform's emphasis on anonymous, student-driven metrics correlates with broader incentives for faculty to adopt lenient grading and accommodating policies, as positive ratings, frequently tied to perceived ease rather than pedagogical effectiveness, boost enrollment intentions well beyond expected norms. Experimental research shows that exposure to high RMP ratings increases students' self-reported likelihood of enrolling by over threefold compared to low ratings, influencing not only individual choices but also aggregate university revenue tied to class sizes and faculty retention. Such dynamics contribute to documented grade inflation trends, where professors face indirect pressure to prioritize student satisfaction to maintain viable enrollments, as students increasingly treat instruction as a consumer service evaluable on platforms like RMP. These patterns raise concerns about diminished academic rigor in higher education, as RMP's visibility amplifies evaluations' role in rewarding leniency over challenge, potentially eroding incentives for demanding instruction. While proponents argue it empowers informed decision-making, critics highlight how the site's biases toward leniency, evident in correlations between high ratings and lower difficulty scores, may homogenize teaching toward lower standards, with ripple effects on institutional priorities favoring popularity metrics over learning outcomes. Administrators rarely incorporate RMP data formally, yet its pervasive use underscores a cultural shift wherein crowd-sourced rating mechanisms, unchecked by verification, indirectly govern course viability and pedagogical evolution.

Competitors

Several platforms compete with Rate My Professors by providing student-generated reviews and ratings of college instructors, though most operate on smaller scales or with differing verification methods. Coursicle stands out as a direct alternative, offering verified professor ratings tied to enrolled students to reduce fake reviews and manipulation, with department-wide comparison features available as of June 2025. Uloop enables campus-specific professor reviews alongside course planning tools, aggregating student feedback for universities across the United States. In early 2025, Professor Index emerged as a newer entrant, developed by Louisiana State University computer science professor Nash Mahmoud to facilitate more structured evaluations of professors and courses, addressing perceived shortcomings in anonymous review integrity. Other sites like RateMyTeachers.com focus more on secondary education but extend to some higher education reviews with a similar rating system emphasizing class difficulty and teacher quality. Traffic data from Similarweb indicates Coursicle as the leading similar site by user engagement as of September 2025. These competitors generally maintain lower user bases compared to Rate My Professors, which dominates due to its established network effects, but they appeal to users seeking verified or niche feedback amid criticisms of anonymity-driven inaccuracies on the primary platform.

References

  1. "Ratemyprofessors.com – Case Study."
  2. "Rate My Professors." LinkedIn.
  3. "Rate My Professor: What professors think." The Reflector, October 1, 2024.
  4. "Professors Rate RateMyProfessor.com." The DePauw, December 7, 2021.
  5. "Guidelines." RateMyProfessors (www.ratemyprofessors.com).
  6. "Does ratemyprofessor.com really rate my professor?" July 23, 2008.
  7. "RateMyProfessors.com offers biased evaluations." ResearchGate.
  8. "A Comparison of Rate-My-Professor and Department Results." June 25, 2024.
  9. "Vol-121-Iss-4." The GW Hatchet (Issuu).
  10. "How Does Your Teacher Rate?" The Washington Post, March 29, 2003.
  11. "Prof vs. Student Smackdown." The Eyeopener, September 24, 2008.
  12. "How RateMyProfessors Works." HowStuffWorks, July 14, 2009.
  13. "How Do We Rate? An Evaluation of Online Student Evaluations …" (PDF).
  14. "Everybody's a Critic." The New York Times, April 23, 2006.
  15. "Cheddar buys a user-generated content biz, Rate My Professors …" October 25, 2018.
  16. "MTV to buy RateMyProfessors.com." CNET, January 17, 2007.
  17. "Cheddar Buys 'Rate My Professors' Website From Viacom." Variety, October 25, 2018.
  18. "Can you explain the rating scale?" Rate My Professors Help Center, June 26, 2023.
  19. Rate My Professors (official site).
  20. "Rate My Professors: Helping Students Evaluate Faculty."
  21. "Best 10 Professor Rating Sites in 2025: Which One Should You Trust?" June 20, 2025.
  22. "Rate my professor advanced search." Reddit, May 21, 2024.
  23. "Develop Teachers Rating Website like Rate My Professors." June 29, 2024.
  24. "Will my professor know that I have rated them?" Rate My Professors Help Center, January 1, 2021.
  26. "An empirical test of the validity of student evaluations of teaching made on RateMyProfessors.com." September 2, 2009.
  27. "Student Consensus on RateMyProfessors.com" (PDF).
  28. "RateMyProfessors.com versus formal in-class student evaluations of …"
  29. "Correlations, trends and potential biases among publicly accessible …" (PDF). January 8, 2017.
  30. "Student Evaluations of Teaching Encourages Poor …" May 13, 2020.
  31. "Academic entitlement and Ratemyprofessors.com evaluations bias …" April 30, 2024.
  33. "Hot or Not: The Role of Instructor Quality and Gender on the …" (PDF).
  34. "Is RateMyProfessors.com Unbiased? A Look at the Impact of Social …" October 16, 2017.
  35. "Academic entitlement and Ratemyprofessors.com evaluations bias …"
  36. "The relations between perceived quality, easiness and sexiness." September 14, 2010.
  37. "Web-Based Student Evaluations of Professors: The Relations …" ResearchGate.
  38. "Correlations, trends and potential biases among publicly …" (PDF). January 2, 2018.
  39. "Analyzing How Students Define Difficulty in RateMyProfessor" (PDF). ERIC.
  40. "Web-based student evaluations of professors: the relations between …" (PDF). September 14, 2010.
  41. "Will Rate My Professors remove a review I flag as false or defamatory?" December 31, 2020.
  43. "RateMyProfessors.com LLC | BBB Complaints." Better Business Bureau.
  44. "New analysis of Rate My Professors finds patterns in words used to …" February 8, 2015.
  45. "Study finds students give lower ratings to Asian professors." March 5, 2015.
  46. "The Role of Perceived Race and Gender in the Evaluation of …" (PDF).
  47. "A Study Of Bias and Inaccuracies In Anonymous Self-Reporting." May 6, 2025.
  48. "From: Rate my Professors [mailto:noreply@ratemyprofessors.com]" (PDF data breach notification).
  49. "Download Full Data Breach List (CSV)."
  50. "Data Breach Reports" (PDF). Identity Theft Resource Center, October 21, 2016.
  51. "A New Reality? The Far Right's Use of Cyberharassment against …"
  52. "The lurker: an obsessive tormentor who made professors' lives …" October 25, 2023.
  53. "The Case for Striking Back." Inside Higher Ed, July 28, 2015.
  54. "Examining How Students Perceive and Use RateMyProfessors.com" (PDF).
  55. "Student Use of the Rate My Professor Website in Course Selection." September 14, 2023.
  56. "Do professors care about their rating on websites such as …" December 23, 2017.
  57. "Ratemyprofessors now allows teachers to respond to students' reviews." January 23, 2014.
  58. "Effects of Ratemyprofessors.com and University Student Evaluations …" (PDF). September 1, 2020.
  59. Bar, Talia. "Online Posting of Teaching Evaluations and Grade Inflation" (PDF).
  60. "Do the reviews on RateMyProfessors have any effect on a …" Quora, January 25, 2023.
  61. "Rate My Professor Alternative | Verified Professor Ratings & Reviews." Coursicle, June 26, 2025.
  62. "Professor Ratings for College Students." Uloop.
  63. "LSU professor launches competitor to Rate My Professors website." February 20, 2025.
  64. "Rate My Teachers."
  65. "ratemyprofessors.com Competitors – Top Sites Like …" Similarweb, September 2025.