U.S. News & World Report Best Colleges Ranking
In 1983, U.S. News & World Report published its first "America's Best Colleges" report. The rankings have been compiled and published annually since 1985 and are the most widely quoted of their kind in the United States. The rankings are based on data that U.S. News & World Report collects from each institution through an annual survey, as well as on opinion surveys of university faculty and administrators at other institutions.
The popularity of U.S. News & World Report's Best Colleges rankings is reflected in its 2014 release, which brought 2.6 million unique visitors and 18.9 million page views to usnews.com in one day. Traffic came from over 3,000 sites, including Facebook and Google. U.S. News & World Report continues to publish comprehensive college guides in book form. Robert Morse created the U.S. News Best Colleges rankings methodology and continues to oversee its application as chief data strategist at U.S. News. In 2014, The Washington Post featured a profile of Morse, exploring his 30-year career with the publication. In October 2014, U.S. News & World Report published its inaugural "Best Global Universities" rankings. Inside Higher Ed noted that U.S. News was entering an international college and university rankings area already "dominated by three major global university rankings", namely the Times Higher Education World University Rankings, the Academic Ranking of World Universities, and the QS World University Rankings. Morse stated that "it's natural for U.S. News to get into this space" and that U.S. News "will also be the first American publisher to enter the global rankings space."
The magazine's rankings are based on information it collects from educational institutions via an annual survey, government and third-party data sources, and school websites. It also considers opinion surveys of university faculty and administrators outside the school. The college rankings were first published in 1983 and have appeared every year since, except 1984.
The U.S. News listings have gained such influence that some universities have made reaching a particular level in the rankings a specific goal. Belmont University president Bob Fisher stated in 2010, "Rising to the Top 5 in U.S. News represents a key element of Belmont's Vision 2015 plan." Clemson University made it a public goal to rise into the Top 20 and made specific changes, including reducing class sizes and altering the presentation of faculty salaries, in order to perform better in the U.S. News statistical analysis. At least one university, Arizona State, has tied its president's pay to an increase in the school's placement in the rankings.
The following elements, with their weights, make up the U.S. News rankings:
- Peer assessment: a survey of the institution's reputation among presidents, provosts, and admissions deans of other institutions (15%)
- Guidance counselor assessment: a survey of the institution's reputation among approximately 1,800 high school guidance counselors (7.5%)
- Retention: six-year graduation rate and first-year student retention rate (20%)
- Faculty resources: class sizes, faculty salary, faculty degree level, student-faculty ratio, and proportion of full-time faculty (20%)
- Student selectivity: standardized test scores of admitted students, proportion of admitted students in upper percentiles of their high school class, and proportion of applicants accepted (15%)
- Financial resources: per-student spending related to academics and public service (10%)
- Graduation rate performance: comparison between modeled expected and actual graduation rate (7.5%)
- Alumni giving rate (5%)
U.S. News determines the relative weights of these factors and has changed them over time. The National Opinion Research Center reviewed the methodology and stated that the weights "lack any defensible empirical or theoretical basis". The first four of the listed factors account for the great majority of the U.S. News ranking (62.5%, according to U.S. News's 2017 methodology), and the "reputational measure" (which surveys high-level administrators at similar institutions about their perceived quality ranking of each college and university) is especially important to the final ranking (accounting by itself for 22.5% of the ranking according to the 2017 methodology).
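Mechanically, the ranking is a weighted sum of the factors listed above. The sketch below illustrates that arithmetic only: the factor names, the 0–100 normalized scores, and the example school are invented for this illustration, and U.S. News's actual normalization of each indicator is omitted.

```python
# Illustrative sketch of a U.S. News-style weighted composite score.
# Weights come from the list above; everything else here is hypothetical.
WEIGHTS = {
    "peer_assessment": 0.15,
    "counselor_assessment": 0.075,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_performance": 0.075,
    "alumni_giving": 0.05,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights total 100%

def composite_score(factors: dict[str, float]) -> float:
    """Weighted sum of normalized (0-100) factor scores."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

# A hypothetical school scoring 80 on every normalized factor scores 80 overall.
example = {name: 80.0 for name in WEIGHTS}
print(composite_score(example))  # about 80.0, up to floating-point rounding
```

Because the composite is linear in the weights, small weight changes by U.S. News can reorder schools whose scores are close, which is one reason the methodology changes discussed below can reshuffle the top of the list.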
A New York Times article reported that, given the U.S. News weighting methodology, "it's easy to guess who's going to end up on top: the Big Three, Harvard, Yale and Princeton round out the first three essentially every year. When asked how he knew his system was sound, Mel Elfin, the rankings' founder, often answered that he knew it because those three schools always landed on top. When a new lead statistician, Amy Graham, changed the formula in 1999 to one she considered more statistically valid, the California Institute of Technology jumped to first place. Ms. Graham soon left, and a modified system pushed Princeton back to No. 1 the next year."
A 2010 study by the University of Michigan found that university rankings in the United States significantly affect institutions' applications and admissions. The research analyzed the effects of the U.S. News & World Report rankings, showing a lasting effect on college applications and admissions by students in the top 10% of their class. In addition, they found that rankings influence survey assessments of reputation by college presidents at peer institutions, such that rankings and reputation are becoming much more similar over time.
A 2014 study published in Research in Higher Education removed some of the mystique of the U.S. News ranking process by producing a model that faithfully recreated U.S. News outcomes and quantified the inherent "noise" in the rankings for all nationally ranked universities. The model provided detailed insight into the ranking process, allowing the impact of changes to U.S. News subfactors to be studied in the presence of variation between universities and within subfactors. Numerous simulations were run with the model to estimate how much change a university would need to improve its rank or move into the top 20. The results show that for a university ranked in the mid-30s, it would take a significant amount of additional resources, directed in a very focused way, to become a top-ranked national university, and that rank changes of up to ±4 places should be considered "noise".
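The study's central idea, that measurement noise alone shuffles closely spaced ranks, can be illustrated (not reproduced) with a toy Monte Carlo simulation. All numbers below are invented; this is not the published model or U.S. News data.

```python
import random

random.seed(0)

# Toy illustration of rank "noise": 50 hypothetical schools with composite
# scores spaced 1 point apart; a small random measurement error is added and
# the largest resulting shift in ordinal rank is recorded.
N_SCHOOLS, N_TRIALS, ERROR_SD = 50, 1000, 1.5
true_scores = [100 - i for i in range(N_SCHOOLS)]  # school i has true rank i+1

max_shift = 0
for _ in range(N_TRIALS):
    noisy = [(s + random.gauss(0, ERROR_SD), i) for i, s in enumerate(true_scores)]
    noisy.sort(reverse=True)  # re-rank by perturbed score
    for new_rank, (_, i) in enumerate(noisy):
        max_shift = max(max_shift, abs(new_rank - i))

print(f"largest rank shift from noise alone: {max_shift}")
```

With the error spread comparable to the gap between adjacent schools, shifts of several places arise from randomness alone, which is consistent in spirit with the study's conclusion that ±4-place movements should not be read as real change.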
| Top national universities | Rank | Top liberal arts colleges | Rank |
| --- | --- | --- | --- |
| Princeton University | 1 | Williams College | 1 |
| Harvard University | 2 | Amherst College | 2 |
| University of Chicago | 3 | Bowdoin College | 3 |
| Yale University | 3 | Swarthmore College | 3 |
| Columbia University | 5 | Wellesley College | 3 |
| Massachusetts Institute of Technology | 5 | Middlebury College | 6 |
| Stanford University | 5 | Pomona College | 6 |
| University of Pennsylvania | 8 | Carleton College | 8 |
| Duke University | 9 | Claremont McKenna College | 8 |
| California Institute of Technology | 10 | Davidson College | 10 |
| Dartmouth College | 11 | Washington and Lee University | 10 |
| Johns Hopkins University | 11 | Colby College | 12 |
| Northwestern University | 11 | Colgate University | 12 |
| Brown University | 14 | Harvey Mudd College | 12 |
| Cornell University | 14 | Smith College | 12 |
| Rice University | 14 | United States Military Academy | 12 |
| Vanderbilt University | 14 | Vassar College | 12 |
| University of Notre Dame | 18 | Grinnell College | 18 |
| Washington University in St. Louis | 18 | Hamilton College | 18 |
| Georgetown University | 20 | Haverford College | 18 |
During the 1990s, several educational institutions in the United States were involved in a movement to boycott the U.S. News & World Report college rankings survey. The first was Reed College, which stopped submitting the survey in 1995. The survey was also criticized by Alma College, Stanford University, and St. John's College during the late 1990s. SAT scores play a role in the U.S. News & World Report college rankings even though U.S. News has no ability to formally verify or recalculate the scores that schools report. Since the mid-1990s the popular press has documented many instances in which schools lied about their SAT scores in order to obtain a higher ranking. An exposé in the San Francisco Chronicle stated that the elements of the U.S. News & World Report methodology are redundant and can be reduced to one thing: money.
On June 19, 2007, during the annual meeting of the Annapolis Group, members discussed a letter to college presidents asking them not to participate in the "reputation survey" section of the U.S. News & World Report survey (this section comprises 25% of the ranking). As a result, "a majority of the approximately 80 presidents at the meeting said that they did not intend to participate in the U.S. News reputational rankings in the future". The statement also said that its members "have agreed to participate in the development of an alternative common format that presents information about their colleges for students and their families to use in the college search process". The database was to be web-based and developed in conjunction with higher-education organizations including the National Association of Independent Colleges and Universities and the Council of Independent Colleges. On June 22, 2007, U.S. News & World Report editor Robert Morse issued a response in which he argued, "in terms of the peer assessment survey, we at U.S. News firmly believe the survey has significant value because it allows us to measure the 'intangibles' of a college that we can't measure through statistical data. Plus, the reputation of a school can help get that all-important first job and plays a key part in which grad school someone will be able to get into. The peer survey is by nature subjective, but the technique of asking industry leaders to rate their competitors is a commonly accepted practice. The results from the peer survey also can act to level the playing field between private and public colleges". In reference to the alternative database discussed by the Annapolis Group, Morse also argued, "It's important to point out that the Annapolis Group's stated goal of presenting college data in a common format has been tried before [...] U.S. News has been supplying this exact college information for many years already.
And it appears that NAICU will be doing it with significantly less comparability and functionality. U.S. News first collects all these data (using an agreed-upon set of definitions from the Common Data Set). Then we post the data on our website in easily accessible, comparable tables. In other words, the Annapolis Group and the others in the NAICU initiative actually are following the lead of U.S. News".
Some higher education experts, such as Kevin Carey of Education Sector, have asserted that U.S. News and World Report's college rankings system is merely a list of criteria that mirrors the superficial characteristics of elite colleges and universities. According to Carey, the U.S. News ranking system is deeply flawed. Instead of focusing on the fundamental issues of how well colleges and universities educate their students and how well they prepare them to be successful after college, the magazine's rankings are almost entirely a function of three factors: fame, wealth, and exclusivity. He suggests that there are more important characteristics parents and students should research to select colleges, such as how well students are learning and how likely students are to earn a degree.
The question of college rankings and their impact on admissions gained greater attention in March 2007, when Michele Tolela Myers, the former president of Sarah Lawrence College, revealed in an op-ed that when U.S. News & World Report is not given SAT scores for a college, it simply assigns the school an invented SAT score approximately one standard deviation (roughly 200 SAT points) below those of its peer colleges. The stated reasoning is that SAT-optional schools will, because of their test-optional policies, admit a higher number of less academically capable students.
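The alleged imputation is simple arithmetic, shown in the sketch below. The peer averages are invented for illustration; only the "about 200 points below peers" rule comes from the op-ed as described above.

```python
# Toy illustration of the imputation Myers described (all scores invented).
peer_sat_scores = [1380, 1350, 1420, 1400]        # hypothetical peer-college averages
peer_mean = sum(peer_sat_scores) / len(peer_sat_scores)
imputed = peer_mean - 200                          # ~one SD below peers, per the op-ed
print(peer_mean, imputed)  # 1387.5 1187.5
```

The penalty is mechanical: a test-optional school is scored as if its students tested 200 points worse than its peers', regardless of its actual applicant pool.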
In a 2011 article regarding the Sarah Lawrence controversy, Peter Sacks of The Huffington Post criticized the U.S. News rankings' reliance on test scores and denounced the magazine's "best colleges" list as a scam:
In the U.S. News worldview of college quality, it matters not a bit what students actually learn on campus, or how a college actually contributes to the intellectual, ethical and personal growth of students while on campus, or how that institution contributes to the public good [...] and then, when you consider that student SAT scores are profoundly correlated [to] parental income and education levels – the social class that a child is born into and grows up with – you begin to understand what a corrupt emperor 'America's Best Colleges' really is. The ranking amounts to little more than a pseudo-scientific and yet popularly legitimate tool for perpetuating inequality between educational haves and have nots – the rich families from the poor ones, and the well-endowed schools from the poorly endowed ones.
- U.S. News Pulls Social Levers to Break Records for 'Best Colleges' Package Archived 2015-01-23 at the Wayback Machine.
- "Amazon's listings of U.S. News "College Guides"". Amazon.com. Retrieved 2015-01-17.
- The U.S. News college rankings guru
- "'U.S. News' to Issue New Global University Rankings". Inside Higher Ed.
- "America's Best Colleges". U.S. News and World Report. 2007.
- A review of US News ranking by NORC Archived 2011-06-05 at the Wayback Machine.
- Thompson, Nicholas (2003): "The Best, The Top, The Most"; The New York Times, August 3, 2003, Education Life Supplement, p. 24
- Bowman, Nicholas and Michael Bastedo, "Getting on the Front Page: Organizational Reputation, Status Signals, and the Impact of U.S. News & World Report Rankings on Student Decisions." personal.umich.edu. Retrieved June 29, 2010.
- Bastedo, Michael N. and Nicholas A. Bowman. "The U.S. News & World Report College Rankings: Modeling Institutional Effects on Organizational Reputation." personal.umich.edu Retrieved June 29, 2010.
- Gnolek et al. (2014): "Modeling Change and Variation in U.S. News & World Report College Rankings: What would it really take to be in the Top 20?"; Research In Higher Education, May 18, 2014.
- Christopher B. Nelson, "Why you won't find St. John's College ranked in U.S.News & World Report Archived 2007-09-27 at the Wayback Machine.", University Business: The Magazine for College and University Administrators.
- Diver, Colin. "Is There Life After Rankings". The Atlantic. November 1, 2005.
- Rojstaczer, Stuart (September 3, 2001). "College Rankings are Mostly About Money". San Francisco Chronicle.
- Jaschik, Scott (20 June 2007). "More Momentum Against 'U.S.News'". Inside Higher Ed.
- "Annapolis Group Statement on Rankings and Ratings". Annapolis Group. 19 June 2007.
- Morse, Robert (22 June 2007). "About the Annapolis Group's Statement". U.S. News & World Report. Archived from the original on 2 July 2007.
- Carey, Kevin. "College Rankings Reformed" (PDF). educationsector.org. Archived from the original (PDF) on August 23, 2009. Retrieved July 28, 2009.
- Tolela Myers, Michele (11 March 2007). "The Cost of Bucking College Rankings". The Washington Post.
- Sacks, Peter (May 25, 2011). "America's Best College Scam". The Huffington Post. Retrieved April 26, 2016.