Rankings of universities in the United States
Numerous organizations produce rankings of universities in the United States each year. A 2010 study by the University of Michigan found that university rankings in the United States significantly affect institutions' applications and admissions.
American Council of Trustees and Alumni
In 2009, the American Council of Trustees and Alumni (ACTA) began grading colleges and universities based on the strength of their general education requirements. In ACTA's annual What Will They Learn? report, colleges and universities are assigned a letter grade from "A" to "F" based on how many of seven subjects are required of students. The seven subjects are composition, mathematics, foreign language, science, economics, literature and American government or history. The 2011-2012 edition of What Will They Learn? graded 1,007 institutions. In the 2011-2012 edition, 19 schools received an "A" grade for requiring at least six of the subjects the study evaluated. ACTA's rating system has been endorsed by Mel Elfin, founding editor of U.S. News & World Report’s rankings. New York Times higher education blogger Stanley Fish, while agreeing that universities ought to have a strong core curriculum, disagreed with some of the subjects ACTA includes in the core.
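ACTA's rule can be sketched as a simple count of required core subjects. Only the "A" cutoff (at least six of the seven subjects) is documented above; the lower cutoffs in this sketch are illustrative assumptions, not ACTA's actual scale.

```python
# Sketch of ACTA's "What Will They Learn?" grading rule.
# Only the "A" cutoff (>= 6 of 7 subjects) comes from the report;
# the B/C/D cutoffs below are assumed for illustration.

SUBJECTS = {"composition", "mathematics", "foreign language", "science",
            "economics", "literature", "American government or history"}

def acta_grade(required: set) -> str:
    """Letter grade based on how many of the seven core subjects are required."""
    n = len(required & SUBJECTS)
    if n >= 6:
        return "A"   # per the 2011-2012 report
    if n >= 4:
        return "B"   # assumed cutoff
    if n == 3:
        return "C"   # assumed cutoff
    if n == 2:
        return "D"   # assumed cutoff
    return "F"
```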
Faculty Scholarly Productivity rankings
Forbes College rankings
In 2008, Forbes.com began publishing an annual list of "America's Best Colleges", prepared by the Center for College Affordability and Productivity. The Forbes rankings use student evaluations from RateMyProfessors.com, self-reported alumni salaries from PayScale, four-year graduation rates, the numbers of students and faculty receiving "nationally competitive awards," and four-year accumulated student debt. The list emphasizes tuition costs, which boosts the ratings of the zero-cost United States service academies, and disregards public reputation, which causes some colleges to score lower than in other lists.
The Top American Research Universities
The Center for Measuring University Performance has ranked American research universities in the Top American Research Universities since 2000. The methodology is based on data such as research publications, citations, recognitions, and funding, as well as measures of undergraduate quality such as SAT scores. The information used can be found in publicly accessible materials, reducing the possibility of manipulation. The methodology is generally consistent from year to year, and changes are explained in the publication along with references from other studies.
TrendTopper MediaBuzz College Guide
TrendTopper MediaBuzz College Guide is an American college guide based on what it calls "Internet brand equity," measured from data collected from the Internet and global media sources. It ranks the top 300 United States colleges and universities. The guide includes specialty and for-profit schools, including art, business, design, music, and online education. The TrendTopper MediaBuzz College Rankings are produced twice a year by the Global Language Monitor of Austin, Texas.
Time Magazine described internet brand equity as "a measure of who's talking about you online, based on Internet data, social media, blogs and the top 75,000 print and electronic media outlets."
GLM ranks the schools "according to their online presence, or internet brand equity ... By focusing on online presence, the Monitor hopes to avoid the biases that characterize other rankings, which commonly rely on the opinions of university officials and college counselors rather than those of the greater public." GLM believes the rankings provide an up-to-date perspective on which schools have the most popular brand. The resulting rankings gauge the relative value of the various institutions and how they change over time.
U.S. News & World Report College and University rankings
Referred to as the "granddaddy of the college rankings", America's best-known college and university rankings have been compiled since 1983 by U.S. News & World Report and are widely regarded as the most influential.
The US News rankings are based on data that U.S. News collects from each educational institution, either from an annual survey or from the school's website, along with opinion surveys of university faculty and administrators outside the school. The rankings have been published every year since 1983, except 1984.
The US News listings have gained such influence that some universities have made it a specific goal to reach a particular level in the US News rankings. Belmont University president Bob Fisher stated in 2010, "Rising to the Top 5 in U.S. News represents a key element of Belmont’s Vision 2015 plan." Clemson University made it a public goal to rise to the Top 20 in the US News rankings, and made specific changes, including reducing class size and altering the presentation of teacher salaries, so as to perform better in the statistical analysis by US News. And at least one university, Arizona State, has tied the university president's pay to an increase in the school's placement in the US News rankings.
U.S. News's precise methodology has changed many times, and not all of the data are available to the public. The presentation of the rankings has changed as well. For many years, the magazine divided each category of post-secondary institutions into quartiles, with the schools in the highest quartile ("First Tier") ranked from 1 to about 50; schools in the lower three quartiles were merely identified as being in the "Second Tier", "Third Tier", and "Fourth Tier". The system changed dramatically starting with the 2011 rankings: all schools in the top three quartiles (ranked 1 to 194) are now "First Tier" universities, and the bottom quartile, the schools in the bottom 25%, is labeled "Second Tier".
The following are elements in the US News rankings.
- Peer assessment: a survey of the institution's reputation among presidents, provosts, and admissions deans of other institutions (15%)
- Guidance Counselor assessment: a survey of the institution's reputation among approximately 1,800 high school guidance counselors (7.5%)
- Retention: six-year graduation rate and first-year student retention rate (20%)
- Faculty resources: average class size, faculty salary, faculty degree level, student-faculty ratio, and proportion of full-time faculty (20%)
- Student selectivity: standardized test scores of admitted students, proportion of admitted students in upper percentiles of their high school class, and proportion of applicants accepted (15%)
- Financial resources: per-student spending (10%)
- Graduation rate performance: difference between expected and actual graduation rate (7.5%)
- Alumni giving rate (5%)
U.S. News determines the relative weights of these factors and has changed them over time. The National Opinion Research Center reviewed the methodology and stated that the weights "lack any defensible empirical or theoretical basis". The first four of the listed factors account for the great majority of the U.S. News ranking (80%, according to U.S. News's 2005 methodology), and the "reputational measure" (which surveys high-level administrators at similar institutions about their perceived quality ranking of each college and university) is especially important to the final ranking (accounting by itself for 25% of the ranking according to the 2005 methodology).
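As an illustration of how such a weighted composite might be computed, the following is a minimal sketch using the percentage weights listed above; it is not U.S. News's actual formula, and it assumes each factor has already been normalized to a 0-100 scale.

```python
# Sketch of a weighted-composite ranking using the listed U.S. News weights.
# Assumes factor values are pre-normalized to a common 0-100 scale.

WEIGHTS = {
    "peer_assessment": 0.15,
    "counselor_assessment": 0.075,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_performance": 0.075,
    "alumni_giving": 0.05,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the weights total 100%

def composite_score(factors: dict) -> float:
    """Weighted sum of normalized (0-100) factor scores."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

def rank(schools: dict) -> list:
    """Return school names ordered by descending composite score."""
    return sorted(schools, key=lambda s: composite_score(schools[s]), reverse=True)
```

Because the final number is just a weighted sum, small changes to the weights can reorder schools without any change in the underlying data, which is the crux of several criticisms discussed later in this article.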
A New York Times article reported that, given the U.S. News weighting methodology, "it's easy to guess who's going to end up on top: the Big Three, Harvard, Yale and Princeton round out the first three essentially every year. When asked how he knew his system was sound, Mel Elfin, the rankings' founder, often answered that he knew it because those three schools always landed on top. When a new lead statistician, Amy Graham, changed the formula in 1999 to one she considered more statistically valid, the California Institute of Technology jumped to first place. Ms. Graham soon left, and a modified system pushed Princeton back to No. 1 the next year."
Research at the University of Michigan analyzed the effects of the U.S. News & World Report rankings, showing a lasting effect on college applications and admissions by students in the top 10% of their class. In addition, they found that rankings influence survey assessments of reputation by college presidents at peer institutions, such that rankings and reputation are becoming much more similar over time.
United States National Research Council Rankings
Washington Monthly national universities rankings
The Washington Monthly's "National Universities Rankings", most recently published in 2013, began as a research report in 2005, with rankings appearing in the September 2006 issue. It offers American university and college rankings based upon "contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country)."
Revealed preference rankings
Avery et al. pioneered the use of choice modeling to rank colleges, using a statistical analysis of the decisions of 3,240 students who applied to college in 1999. MyChances.net adopted a similar approach starting in 2009. The study analyzed students admitted to multiple colleges: the college a student attended became the winner, and the others became the losers. An Elo rating system was used to assign points based on each win or loss, and the colleges were ranked by their Elo points. A useful consequence of using Elo points is that they can be used to estimate how often students admitted to two schools will choose one over the other.
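The Elo procedure described above can be sketched as follows. The K-factor, starting rating, and sample matriculation decisions are illustrative assumptions, not values from the Avery et al. study; only the standard Elo update rule is taken as given.

```python
# Sketch of an Elo-based revealed-preference ranking.
# K and START are assumed values; the update formula is the standard Elo rule.

K = 32            # rating adjustment step (assumed)
START = 1500.0    # initial rating for every school (assumed)

def expected_win(r_winner: float, r_loser: float) -> float:
    """Probability the Elo model assigns to the winner's victory."""
    return 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))

def update(ratings: dict, winner: str, loser: str) -> None:
    """Winner = school the student attended; loser = a school turned down."""
    e = expected_win(ratings[winner], ratings[loser])
    ratings[winner] += K * (1.0 - e)
    ratings[loser] -= K * (1.0 - e)

# Hypothetical matriculation decisions: (school attended, school turned down).
decisions = [("A", "B"), ("A", "B"), ("B", "C"), ("A", "C")]

ratings = {school: START for school in "ABC"}
for winner, loser in decisions:
    update(ratings, winner, loser)

ranking = sorted(ratings, key=ratings.get, reverse=True)
```

After the updates, `expected_win` applied to two final ratings estimates the frequency with which students admitted to both schools would choose one over the other, which is the property the text highlights.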
Other
Other organizations that rank US institutions include the Fiske Guide to Colleges and College Prowler. Many specialized rankings are available in guidebooks, considering individual student interests, fields of study, geographical location, financial aid, and affordability.
Among the rankings dealing with individual fields of study is the Philosophical Gourmet Report or "Leiter Report", a ranking of philosophy departments. This report has attracted criticism from different viewpoints. Notably, practitioners of continental philosophy, who perceive the Leiter report as unfair to their field, have compiled alternative rankings.
The Gourman Report, last published in 1996, ranked the quality of undergraduate majors and graduate programs.
The Higher Education Rankings, developed and managed by the New York City consulting company IV Research, is a commercial product that provides both general rankings as well as personalized rankings based on a complicated assessment of 6 criteria and 30 indicators.
Global Language Monitor produces a "TrendTopper MediaBuzz" ranking of the top 300 United States colleges and universities semiannually, publishing overall results for both university and college categories. It uses the Carnegie Foundation for the Advancement of Teaching’s classifications to distinguish between universities and liberal arts colleges. The rankings list 125 universities and 100 colleges, the change in rankings over time, a "Predictive Quantities Indicator" (PQI) index number (for relative rankings), rankings by momentum (yearly and 90-day snapshots), and rankings by state. The most recent ranking appeared on November 1, 2009, covering 2008. The PQI index is produced by Global Language Monitor's proprietary PQI algorithm, which some linguists have criticized for its use in counting the total number of English words. The Global Language Monitor also sells a TrendTopper MediaBuzz reputation-management service through which "colleges and universities can enhance their standings among peers", while stating that it "does not influence the Higher Education rankings in any way".
The Princeton Review annually publishes a book of best colleges; in 2011, it was titled The Best 373 Colleges. Phi Beta Kappa has also sought to establish chapters at the best schools, lately numbering 280.
Criticisms
American college and university ranking systems have drawn criticism from within and outside higher education in Canada and the United States. Institutions that have objected include Reed College, Alma College, Mount Holyoke College, St. John's College, Earlham College, MIT, Stanford University, University of Western Ontario, and Queen's University.
Critics have charged that U.S. News intentionally changes its methodology every year so that the rankings change and it can sell more magazines. A San Francisco Chronicle article argues that "almost all of US News factors are redundant and can be boiled down to one characteristic: the size of the college or university's endowment."
Some higher education experts, like Kevin Carey of Education Sector, have argued that U.S. News and World Report's college rankings system is merely a list of criteria that mirrors the superficial characteristics of elite colleges and universities. According to Carey, "[The] U.S. News ranking system is deeply flawed. Instead of focusing on the fundamental issues of how well colleges and universities educate their students and how well they prepare them to be successful after college, the magazine's rankings are almost entirely a function of three factors: fame, wealth, and exclusivity." He suggested more important characteristics are how well students are learning and how likely students are to earn a degree.
On 19 June 2007, during the annual meeting of the Annapolis Group, members discussed a letter to college presidents asking them not to participate in the "reputation survey" section of the U.S. News survey (this section comprises 25% of the ranking). As a result, "a majority of the approximately 80 presidents at the meeting said that they did not intend to participate in the U.S. News reputational rankings in the future." However, the decision to fill out the reputational survey was left to each individual college. The group's statement said that its members "have agreed to participate in the development of an alternative common format that presents information about their colleges for students and their families to use in the college search process." This database was outlined and developed in conjunction with higher education organizations including the National Association of Independent Colleges and Universities and the Council of Independent Colleges.
U.S. News and World Report editor Robert Morse issued a response on 22 June 2007, stating:
"in terms of the peer assessment survey, we at U.S. News firmly believe the survey has significant value because it allows us to measure the "intangibles" of a college that we can't measure through statistical data. Plus, the reputation of a school can help get that all-important first job and plays a key part in which grad school someone will be able to get into. The peer survey is by nature subjective, but the technique of asking industry leaders to rate their competitors is a commonly accepted practice. The results from the peer survey also can act to level the playing field between private and public colleges."
In reference to the alternative database discussed by the Annapolis Group, Morse argued:
"It's important to point out that the Annapolis Group's stated goal of presenting college data in a common format has been tried before [...] U.S. News has been supplying this exact college information for many years already. And it appears that NAICU will be doing it with significantly less comparability and functionality. U.S. News first collects all these data (using an agreed-upon set of definitions from the Common Data Set). Then we post the data on our website in easily accessible, comparable tables. In other words, the Annapolis Group and the others in the NAICU initiative actually are following the lead of U.S. News."
Knowing that universities—and, in most cases, the statistics they submit—change little from one year to the next, I can only conclude that what are changing are the formulas the magazine's number massagers employ. And, indeed, there is marked evidence of that this year. In the category "Faculty resources," even though few of us had significant changes in our faculty or student numbers, our class sizes, or our finances, the rankings' producers created a mad scramble in rank order [...data...]. Then there is "Financial resources," where Stanford dropped from #6 to #9, Harvard from #5 to #7. Our resources did not fall; did other institutions' rise so sharply? I infer that, in each case, the formulas were simply changed, with notification to no one, not even your readers, who are left to assume that some schools have suddenly soared, others precipitously plummeted.
References
- Bowman, Nicholas and Michael Bastedo, "Getting on the Front Page: Organizational Reputation, Status Signals, and the Impact of U.S. News and World Report Rankings on Student Decisions." personal.umich.edu Retrieved June 29, 2010.
- ACTA. "What Will They Learn?". Retrieved 14 September 2010.
- "ACTA Gives College Highest Possible Academic Ranking". Thomas Aquinas College. September 1, 2011. Retrieved May 23, 2012.
- McGurn, William (November 1, 2011). "What's Your Kid Getting From College?". Wall Street Journal. Retrieved April 9, 2012.
- Daniel L. Bennett (19 August 2009). "What Will They Learn?". Center for College Affordability and Productivity. Retrieved 9 February 2010.
- Stanley Fish (24 August 2009). "What Should Colleges Teach?". The New York Times. Retrieved 9 February 2010.
- "FSP Index Top Performing Schools". Academic Analytics, LLC. 2008-08-22. Archived from the original on 2008-08-22. Retrieved 2009-09-28.
- "The Faculty Scholarly Productivity Index, a means of assessing doctoral programs".
- Center for College Affordability and Productivity
- "America's Top Colleges". Forbes. Retrieved 29 October 2011.
- Noer, Michael (3 August 2011). "America's Top Colleges (Methodology)". Forbes. Retrieved 29 October 2011.
- "The Top American Research Universities". The Center for Measuring University Performance. Retrieved 2009-08-23.
- The Most Buzzed-About University?
- Harvard, Yale Beaten
- The Most Buzz Worthy Schools
- "America's Best Colleges". U.S. News and World Report. 2007.
- A review of US News ranking by NORC
- Thompson, Nicholas (2003): "The Best, The Top, The Most"; The New York Times, August 3, 2003, Education Life Supplement, p. 24
- Bastedo, Michael N. and Nicholas A. Bowman. "The U.S. News and World Report College Rankings: Modeling Institutional Effects on Organizational Reputation." personal.umich.edu Retrieved June 29, 2010.
- NAP.edu report
- "Assessment of Research-Doctorate Programs". Sites.nationalacademies.org. Retrieved 2010-06-08.
- "The Washington Monthly's Annual College Guide"
- The Washington Monthly "2013 National Universities Rankings"
- A Revealed Preference Ranking of U.S. Colleges and Universities. SSRN 601105.
- "2009–2010 College Rankings: National Universities". Retrieved 2010-07-17.
- "New College Rankings". Retrieved 2010-07-17.
- Founded by Brian Leiter, then of the University of Texas at Austin, now of the University of Chicago.
- IVRI. "Overview & Methodology". Retrieved 10 September 2012.
- "Harvard Number One University in Eyes of Public". Gallup.com. 2003-08-26. Retrieved 2010-06-08.
- "College Rankings". Global Language Monitor. Retrieved 2009-11-03.
- "PQI". Global Language Monitor. Retrieved 2009-11-03.
- Goldsmith, Belinda (2009-06-10). Fahmy, Miral, ed. Web 2.0 crowned one millionth English word. Los Angeles, CA: Reuters. Retrieved 2009-11-03
- Zimmer, Benjamin (2009-01-03). "The "million word" hoax rolls along". Language Log, Linguistic Data Consortium. Retrieved 2009-11-03.
- Walker, Ruth (2009-01-02). "Save the date: English nears a milestone". The Christian Science Monitor. Retrieved 2009-01-14.
- Sutter, John D. (2009-06-10). English gets millionth word on Wednesday, site says. CNN. Retrieved 2009-11-03
- "TrendTopper enhances college reputation". Global Language Monitor. Retrieved 2009-11-03.
- "College Rankings". The Global Language Monitor. Retrieved 2009-11-10.[dead link]
- Rojstaczer, Stuart (2001-09-03). "College rankings are mostly about money". San Francisco Chronicle. Retrieved 2006-12-11.
- Carey, Kevin. "College Rankings Reformed". educationsector.org Retrieved July 28, 2009.
- Jaschik, Scott (20 June 2007). "More Momentum Against ‘U.S. News’". Inside Higher Ed.
- "Annapolis group statement on rankings and ratings". Annapolis Group. 19 June 2007.
- Morse, Robert (22 June 2007). "About the Annapolis Group's Statement". U.S. News and World Report.
- "Criticism of College Rankings – September 23, 1996". Stanford.edu. 1996-09-23. Retrieved 2010-06-08.