College and university rankings

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 128.189.137.90 (talk) at 04:47, 10 October 2008 (Academic Ranking of World Universities). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In higher education, college and university rankings are listings of universities and liberal arts colleges in an order determined by any combination of factors. Rankings can be based on subjectively perceived "quality," on some combination of empirical statistics, or on surveys of educators, scholars, students, prospective students, or others. Rankings are often consulted by prospective students and their parents in the university and college admissions process.

In addition to rankings of institutions, there are also rankings of specific academic programs, departments, and schools. Rankings are conducted by magazines and newspapers and in some instances by academic practitioners. (See, for example, law school rankings in the United States.)

Rankings may vary significantly from country to country. A Cornell University study found that the rankings in the United States significantly affected colleges' applications and admissions[1]. In the United Kingdom, several newspapers publish league tables which rank universities.

International rankings from regional organizations

Several regional organizations provide worldwide rankings, including:

The Times Higher Education - QS World University Rankings

Times Higher Education, a British publication that reports specifically on issues related to higher education, in association with Quacquarelli Symonds, annually publishes the THES - QS World University Rankings, a list of 500 ranked universities from around the world.[2] In comparison with other rankings, many more non-American universities (especially British) populate the upper tier of the THES ranking. The THES - QS ranking faces criticism due to the more subjective nature of its assessment criteria, which are largely based on a 'peer review' system of over 3000 scholars and academics in various fields [3].

Academic Ranking of World Universities

The much-publicised Academic Ranking of World Universities [3] is compiled by Shanghai Jiao Tong University. It began as a large-scale Chinese project to provide independent rankings of universities around the world, primarily to measure the gap between Chinese and "world class" universities. The results have often been cited by The Economist magazine in ranking universities of the world [4]. As with all rankings, there are issues of methodology, and one of the primary criticisms of the ranking is its bias towards the natural sciences and English-language science journals over other subjects. This is evidenced by the inclusion of criteria such as the volume of articles published in Science or Nature (both journals devoted to the natural sciences and published in English), the number of Nobel Prize winners (predominantly awarded in the natural sciences), and the number of Fields Medalists (mathematics). In addition to these criticisms, a 2007 paper in the peer-reviewed journal Scientometrics found that the results of the Shanghai university rankings are irreproducible.[4]

Newsweek

In August 2006, the U.S. magazine Newsweek published a ranking of the Top 100 Global Universities, utilising selected criteria from two existing rankings (the Academic Ranking of World Universities by Shanghai Jiao Tong University and the Times Higher Education ranking), with the additional criterion of library holdings (number of volumes). It aimed at 'taking into account openness and diversity, as well as distinction in research'.[5]

Webometrics

The Webometrics Ranking of World Universities is produced by the Cybermetrics Lab (CINDOC), a unit of the National Research Council (CSIC), the main public research body in Spain. It offers information about more than 4,000 universities according to their web-presence (a computerised assessment of the scholarly contents and visibility and impact of the whole university webdomain).

The Webometrics Ranking is built from a database of over 15,000 universities and more than 5,000 research centers. The Top 4,000 universities are shown in the main rank, but even more are covered in the regional lists. Institutions from developing countries benefit from this policy as they obtain knowledge of their current position even if they are not World-Class Universities.

The ranking started in 2004 and is based on a combined indicator that takes into account both the volume of web content and the visibility and impact of these web publications according to the number of external inlinks they receive. The ranking is updated every January and July, providing web indicators for universities and research centres worldwide. This approach takes into account the wide range of scientific activities represented in academic websites, which are frequently overlooked by bibliometric indicators.

Webometric indicators are provided to show the commitment of institutions to web publication. Thus, universities of high academic quality may be ranked lower than expected due to a restrained web-publication policy.

G-Factor

One refinement of the Webometrics approach is the G-Factor methodology, which counts the number of links only from other university websites. The G-Factor is an indicator of the popularity or importance of each university's website from the combined perspectives of the creators of many other university websites. It is therefore a kind of extensive and objective peer review of a university through its website - in social network theory terminology, the G-Factor measures the centrality of each university's website in the network of university websites.
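In social-network terms, the G-Factor as described above amounts to in-degree centrality over the network of university websites. A minimal sketch of the idea follows; the domain names and link data are entirely hypothetical, and the actual G-Factor data collection and normalisation are not specified here:

```python
# Sketch of the idea behind the G-Factor: rank university websites by how
# many *other* university websites link to them (in-degree centrality).
# The sites and link data below are hypothetical, for illustration only.

inlinks = {
    # site: set of other university sites that link to it
    "uni-a.edu": {"uni-b.edu", "uni-c.edu", "uni-d.edu"},
    "uni-b.edu": {"uni-a.edu"},
    "uni-c.edu": {"uni-a.edu", "uni-b.edu"},
    "uni-d.edu": set(),
}

def g_factor_rank(inlinks):
    """Order sites by the number of distinct university sites linking to them."""
    return sorted(inlinks, key=lambda site: len(inlinks[site]), reverse=True)

print(g_factor_rank(inlinks))  # uni-a.edu comes first: three inlinks
```

Counting only links from other university sites, rather than the whole web, is what distinguishes this from a raw popularity count: each inlink acts as one "vote" from a peer institution's webmasters.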

Professional Ranking of World Universities

In contrast to the Academic Ranking of World Universities, the Professional Ranking of World Universities, established in 2007 by the École nationale supérieure des mines de Paris, intends to measure the efficiency of each university on a professional basis. Its main compilation criterion is the number of Chief Executive Officers (or equivalent top executives) among the "500 leading worldwide companies", as measured by revenue, who studied at each university. This is based on the Fortune Global 500 2006 ranking. The Academic Ranking of World Universities and the Professional Ranking of World Universities can be considered complementary rather than exclusive: the first measures a university's ability to train academically preeminent people, while the second measures its ability to train economically preeminent ones.

Performance Ranking of Scientific Papers for World Universities

The "Performance Ranking of Scientific Papers for World Universities" is a bibliometrics-based ranking produced by the Higher Education Evaluation and Accreditation Council of Taiwan (HEEACT). The 2007 performance measures are composed of nine indicators: articles over 11 years, current articles, citations over 11 years, current citations, average citations, h-index, highly cited papers, high-impact journal articles, and fields of excellence. The 2008 measures are composed of eight indicators: the "fields of excellence" indicator used in the 2007 ranking has been removed and the weightings of the remaining indicators adjusted accordingly. The indicators represent three criteria of scientific-paper performance: research productivity, research impact, and research excellence. The project employs bibliometric methods to analyze and rank the scientific-paper performance of the top 500 universities in the world.[6]

Since 2008, HEEACT has also provided rankings of world universities in six subject fields: Agriculture & Environment Sciences (AGE), Clinical Medicine (MED), Engineering, Computing & Technology (ENG), Life Sciences (LIFE), Natural Sciences (SCI), and Social Sciences (SOC). The indicators employed in the field-based rankings, and their respective weightings in the composite measures, are identical to those used in the overall performance ranking.

Wuhan University

Another ranking is produced by the Research Center for Chinese Science Evaluation at Wuhan University. It is based on Essential Science Indicators (ESI), which provides journal-article publication counts and citation frequencies for over 11,000 journals worldwide in 22 research fields. The website for this global university ranking has been translated into English.

Regional and national rankings

The following regional and national rankings are presented based on alphabetical ordering of the respective ranking's focal country followed by its title.

Canada

Maclean's, a Canadian news magazine, annually ranks Canadian universities in the Maclean's University Rankings. [5] The criteria used by the magazine include characteristics of the student body, classes, faculty, finances, the library, and reputation. The rankings are split into three categories: primarily undergraduate (schools that focus on undergraduate studies with few to no graduate programs), comprehensive (schools with extensive undergraduate studies and an extensive selection of graduate programs), and medical doctoral (schools with a professional medical program and a selection of graduate programs).

These rankings have received scrutiny and criticism from universities. For example, the University of Calgary produced a formal study examining the methodology of the ranking, illuminating the factors that determined the university's rank and criticizing certain aspects of the methodology. The University of Alberta and the University of Toronto have also expressed displeasure over Maclean's ranking system. A notable difference from rankings in the United States is that Maclean's does not include privately funded universities; however, the vast majority of Canadian universities, including the best known, are publicly funded.

Beginning in September 2006, a number (over 20) of Canadian universities, including several of the largest and most prominent, jointly refused to participate in Maclean's survey. [6] The president of the University of Alberta, Indira Samarasekera, wrote of this protest that Maclean's initially filed a "Freedom of Information" request but that "it was too late" for the universities to respond. Samarasekera further stated, "Most of [the universities] had already posted the data online, and we directed Maclean’s staff to our Web sites. In instances where the magazine staff couldn’t find data on our Web site, they chose to use the previous year’s data."[7]

India

In India there is no formal system of ranking colleges and universities. Various rankings are carried out by interested organizations and associations, mostly newspapers and magazines, which conduct their rankings independently, based on factors they consider important. The most followed and best-regarded ranking is by the popular national weekly magazine India Today, which reviews the performance of more than 150 universities and colleges in India each year and publishes rankings titled Top 10 Colleges in India in each of arts, commerce, engineering, science, law and medicine. In certain areas, even city-wise rankings are provided.

The ranking process of India Today starts in mid-March, and the rankings are generally published in the second-week edition of May each year, right after the results of the various boards are announced and students are deciding which colleges and universities to apply to. The rankings take into account seven major factors: (1) reputation, (2) curriculum, (3) quality of academic input, (4) student care, (5) admission process, (6) infrastructure, and (7) job prospects. Each college is ranked on each factor, while the overall ranking is determined by an undisclosed formula that assigns different weights to each factor. Besides the top 10 colleges, the weekly also lists other good colleges and universities which failed to make the top 10 but carry considerable prospects. The top 10 engineering colleges in India (latest ranking, updated 25 August 2008) are:
1. IIT (Kharagpur)
2. IIT (Kanpur)
3. IIT (Madras)
4. IIT (Delhi)
5. BITS (Pilani, Goa)
6. IIT (Kolkata)
7. IIT (Roorkee)
8. PSG Tech (Coimbatore)
9. CEG, Anna University (Chennai)
10. IIT (Guwahati)

Other rankings, generally of business schools, are carried out by various business magazines such as Business World, Business Today, and Business Review Weekly.

European Union

The European Commission also weighed in on the issue, when it compiled a list of the 22 European universities with the highest scientific impact [7], measuring universities in terms of the impact of their scientific output. This ranking was compiled as part of the Third European Report on Science & Technology Indicators [8], prepared by the Directorate General for Science and Research of the European Commission in 2003 (updated 2004).

Being an official document of the European Union (from the office of the EU commissioner for science and technology) that took several years of specialist effort to compile, it can be regarded as a highly reliable source (the full report, containing almost 500 pages of statistics, is available for free download from the EU website). Unlike the other rankings, it explicitly considers only the top European institutions, but ample comparison statistics with the rest of the world are provided in the full report. The report says that "University College London comes out on top in both publications (the number of scientific publications produced by the university) and citations (the number of times those scientific publications are cited by other researchers)"; however, the table lists the top-scoring university as "Univ London", indicating that the authors have confused the University of London with its constituent colleges.

In this ranking, the top two European universities are Oxford and Cambridge, as in the Jiao Tong and Times rankings. This ranking, however, places more stress on the scientific quality of the institution, as opposed to its size or perceived prestige. [citation needed] Thus smaller, technical universities, such as Eindhoven (Netherlands) and München (Germany), are ranked third, behind Cambridge, followed by the University of Edinburgh in the UK. The report does not provide a direct comparison between European and US/world universities, although it does compute a complex scientific-impact score measured against a world average.

France

Le Nouvel Observateur[8] and other popular magazines occasionally offer rankings (in French) of universities, "Grandes écoles" and their preparatory schools, the "Prépas".

Germany

CHE UniversityRanking The English version of the German CHE University Ranking is provided by the DAAD.

CHE ExcellenceRanking In December 2007, a new ranking was published in Germany from the Centre for Higher Education Development. The CHE "Ranking of Excellent European Graduate Programmes" (CHE ExcellenceRanking for short) included the disciplines of biology, chemistry, mathematics and physics. The ranking is designed to support the search for master’s or doctoral programmes at higher education institutions (HEIs). Alongside this, the CHE wants to highlight the research strengths of European HEIs and provide those HEIs listed in the ranking with ideas for the further improvement of their already excellent programmes.

CHE ResearchRanking Every year, the CHE also publishes a ResearchRanking showing the research strengths of German universities. The CHE ResearchRanking is based on the research-related data of the CHE UniversityRanking.

Ireland

The Sunday Times compiles a league table of Irish universities [9] based on a mix of criteria, for example:

  • Average points needed in the Leaving Certificate (end-of-secondary-school examination) for entry into an undergraduate course
  • Completion rates, staff-student ratio and research efficiency
  • Quality of accommodation and sports facilities
  • Non-standard entry (usually mature students or students from deprived neighbourhoods)

Italy

Every year La Repubblica, in collaboration with CENSIS, compiles a league table of Italian universities. http://www.repubblica.it/speciale/2007/guida_universita/index.html

Philippines (Asia)

Academic rankings in the Philippines are conducted by the Professional Regulation Commission and the Commission on Higher Education, based on the average passing rates of all Philippine colleges and universities in the board examinations across all courses.[9][10]

Switzerland

The swissUp Ranking provides a ranking for Swiss university and polytechnic students. The rankings are based on comparisons with German and Austrian universities.

UK

HESA (the Higher Education Statistics Agency) oversees three yearly statistical returns (financial, student and staff) which must be compiled by every HEI in the UK. These are then distilled into usable statistics which make up a major part of HE rankings, e.g. student-staff ratio, number of academic staff with doctorates, and money spent on student services. HESA also conducts the Destinations of Leavers from Higher Education survey, which is widely used in league tables as a measure of the employability of graduates.

The Research Assessment Exercises (RAE) are attempts by the UK government to evaluate the quality of research undertaken by British universities. Each subject, called a unit of assessment, is given a ranking by a peer-review panel. The rankings are used in allocating the funding each university receives from the government. The last assessment was made in 2001. The RAE provides quality ratings for research across all disciplines. Panels use a standard scale to award a rating for each submission; ratings range from 1 to 5*, according to how much of the work is judged to reach national or international levels of excellence. Higher education institutions (HEIs) which take part receive grants from one of the four higher education funding bodies in England, Scotland, Wales and Northern Ireland.

The Guardian has an Institution Ranking Table based on RAE results.

Standards of undergraduate teaching are assessed by the Quality Assurance Agency for Higher Education (QAA), an independent body established by the UK's universities and other higher education institutions in 1997. The QAA was under contract to the Higher Education Funding Council for England to assess quality at universities in England in a system of subject review. This replaced a previous system of Teaching Quality Assessments (TQAs), which aimed to assess the administrative, policy and procedural framework within which teaching took place rather than directly assessing teaching quality. As this system of universal inspection was hugely burdensome, it was replaced by a system of information provision, one part of which is a national student survey that has been run three times and publishes scores which have been used by the league-table industry. The rankings have had to create artificial differences, however, as students are generally very satisfied.

Other European countries

Ukraine

The Ministry of Education and Science of Ukraine performs official yearly university evaluations.[11] The newspaper Zerkalo Nedeli ranked the top 200 Ukrainian universities in 2007.[12]

USA

Center for College Affordability & Productivity (CCAP) College and University rankings

The Center for College Affordability & Productivity (CCAP), a two-year-old research organization based in Washington, DC, evaluates schools [10] based on student ratings (posted on RateMyProfessors.com), graduation rates, and the percentage of students winning Rhodes or Fulbright scholarships. For vocational success, it turns to Who's Who in America. The focus is to evaluate schools based on the success of individuals affiliated with each institution.

U.S. News & World Report College and University rankings

The best-known American college and university rankings [11] have been compiled since 1983 by the magazine U.S. News & World Report. They are based upon data which U.S. News collects from each educational institution, either from an annual survey sent to each school or from the school's website, and upon opinion surveys of university faculty and administrators outside the school.[13] The college rankings were not published in 1984 but have been published every year since. The precise methodology used by the U.S. News rankings has changed many times, and not all of the data are available to the public, so peer review of the rankings is limited. As a result, many other rankings have arisen and seriously challenged the results and methodology of the U.S. News ranking, as shown in the section on other rankings of US universities below.

Top 40 "National Universities" according to US News & World Report, 2007

The U.S. News rankings, unlike some other such lists, create a strict hierarchy of colleges and universities in their "top tier," rather than ranking only groups or "tiers" of schools; the individual schools' order changes significantly every year the rankings are published. The most important factors in the rankings are:

  • Peer assessment: a survey of the institution's reputation among presidents, provosts, and deans of admission of other institutions
  • Retention: six-year graduation rate and first-year student retention rate
  • Student selectivity: standardized test scores of admitted students, proportion of admitted students in upper percentiles of their high-school class, and proportion of applicants accepted
  • Faculty resources: average class size, faculty salary, faculty degree level, student-faculty ratio, and proportion of full-time faculty
  • Financial resources: per-student spending
  • Graduation rate performance: difference between expected and actual graduation rate
  • Alumni giving rate

All these factors are combined according to statistical weights determined by U.S. News. The weighting is often changed by U.S. News from year to year, and is not empirically determined (the National Opinion Research Center methodology review said that these weights "lack any defensible empirical or theoretical basis"). Critics have charged that U.S. News intentionally changes its methodology every year so that the rankings change and they can sell more magazines. The first four such factors account for the great majority of the U.S. News ranking (80%, according to U.S. News's 2005 methodology), and the "reputational measure" (which surveys high-level administrators at similar institutions about their perceived quality ranking of each college and university) is especially important to the final ranking (accounting by itself for 25% of the ranking according to the 2005 methodology).[14]
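The weighted combination described above can be sketched as follows. The only weight taken from the text is the 25% peer-assessment share from the 2005 methodology; the remaining weights and all factor scores are illustrative assumptions, not U.S. News's actual figures:

```python
# Sketch of a weighted composite score of the kind U.S. News describes.
# Only the 25% peer-assessment weight comes from the 2005 methodology
# cited above; the other weights and all scores are hypothetical.

weights = {
    "peer_assessment": 0.25,       # stated for the 2005 methodology
    "retention": 0.20,             # the rest are assumptions
    "faculty_resources": 0.20,
    "selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_performance": 0.05,
    "alumni_giving": 0.05,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

def composite(scores):
    """Combine per-factor scores (each normalised to 0-100) into one number."""
    return sum(weights[f] * scores[f] for f in weights)

example = {f: 80.0 for f in weights}  # a school scoring 80 on every factor
print(composite(example))  # a uniform 80 yields a composite of exactly 80.0
```

This also makes the critics' point concrete: changing the `weights` dictionary reorders schools without any change in the underlying data, which is why the year-to-year weight adjustments attract so much criticism.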

A New York Times article reported that, given the U.S. News weighting methodology, "it's easy to guess who's going to end up on top: Harvard, Yale and Princeton round out the first three essentially every year. In fact, when asked how he knew his system was sound, Mel Elfin, the rankings' founder, often answered that he knew it because those three schools always landed on top. When a new lead statistician, Amy Graham, changed the formula in 1999 to what she considered more statistically valid, the California Institute of Technology jumped to first place. Ms. Graham soon left, and a slightly modified system pushed Princeton back to No. 1 the next year."[15] A San Francisco Chronicle article argues that almost all of U.S. News's factors are redundant and can be boiled down to one characteristic: the size of the college or university's endowment.[16]

Faculty Scholarly Productivity rankings

The Faculty Scholarly Productivity Index by Academic Analytics ranks universities based on faculty publications, citations, research grants and awards.[17][18] A total of 354 institutions are studied.

The Top American Research Universities

A research ranking of American universities is researched and published in the Top American Research Universities by The Center for Measuring University Performance. The list has been published since 2000. The measurement used in this report is based on data such as research publications, citations, recognitions and funding. The information used can be found in public-accessible materials, reducing the possibility of manipulation. The research method is consistent from year to year and any changes are explained in the publication itself. References from other studies are cited.

Washington Monthly College rankings

The Washington Monthly's "College Rankings" began as a research report in 2005 and introduced its first official rankings in the September 2006 issue. It offers American university and college rankings [19] based upon the following criteria:

  • a. "how well it performs as an engine of social mobility (ideally helping the poor to get rich rather than the very rich to get very, very rich)"
  • b. "how well it does in fostering scientific and humanistic research"
  • c. "how well it promotes an ethic of service to country" [20].

Global Language Monitor Internet-based rankings

In September 2008, the Global Language Monitor ranked the nation's colleges and universities "according [to] their appearance on the Internet, throughout the Blogosphere, as well [as] in the global print and electronic media" [21].

The schools were also ranked according to ‘media momentum’ defined as having the largest change in media citations over the last year, among other criteria.

The purpose of the methodology was to perceive the schools through the eyes of the world at large, since "[p]rospective students, alumni, employers, and the world at large believe that students who are graduated from such institutions will carry on all the hallmarks of that particular school" [22].

GLM used its proprietary Predictive Quantities Indicator (PQI) software for what it called its TrendTopper Media Buzz Analysis. It employed the Carnegie Foundation for the Advancement of Teaching’s classifications to distinguish between Universities and Liberal Arts Colleges. The schools were ranked according to their positions in early September, a mid-year snapshot, and used the last day of 2007 as the base.

Other rankings of US universities

Other organizations which compile general US annual college and university rankings include the Fiske Guide to Colleges, Princeton Review, and College Prowler. Many specialized rankings are available in guidebooks for undergraduate and graduate students, dealing with individual student interests, fields of study, and other concerns such as geographical location, financial aid, and affordability.

One commercial ranking service is Top Tier Educational Services. [12] Student-centred criteria are used; although the full study is completely redone every two years, the rankings are updated every quarter from new input data. The criteria use subjective data, such as peer assessment and desirability, and objective data, such as SAT scores and GPA.

Such ranking schemes measure what decision makers think, as opposed to why. They may or may not augment these reputational statistics with hard, quantitative information. The authors discuss their ranking system and methodology with students but do not share their specific research tools or formulas. The problem with a ranking that uses subjective opinions is that it is prone to personal bias, prejudice and bounded rationality. Public universities are also penalized because, besides an academic mission, they have a social mission: they simply cannot charge as much money, or be as selective, as private universities. Finally, the fact that the ranking service is a commercial company raises the question of whether there are hidden business motives behind its rankings.

Among the rankings dealing with individual fields of study is the Philosophical Gourmet Report or "Leiter Report" (after its founding author, Brian Leiter of the University of Texas at Austin), a ranking of philosophy departments. This report has been at least as controversial within its field as the general U.S. News rankings, attracting criticism from many different viewpoints. Notably, practitioners of continental philosophy, who perceive the Leiter report as unfair to their field, have compiled alternative rankings.

Avery et al. recently published a working paper for the National Bureau of Economic Research titled "A Revealed Preference Ranking of U.S. Colleges and Universities." Rather than ranking programs by traditional criteria, their analysis uses a statistical model based on applicant preferences. They based their data on the applications and outcome of 3,240 high school students. The authors feel that their ranking is less subject to manipulation compared to conventional rankings (see criticism below).

The Gourman Report, which was last published in 1996, ranked the quality of undergraduate majors.

There also exist Gallup polls that ask American adults, "All in all, what would you say is the best college or university in the United States?"[13]

Boeing has announced it will begin ranking universities by matching employee valuations with information about the colleges its engineers attended. This will help show which colleges have produced the workers it considers most valuable. These rankings will be shared with 150 universities, along with critiques based on the work records of their graduates. Boeing has stated that these rankings would not be made public.[23]

Criticism (North America)

American college and university ranking systems have drawn criticism from within and outside higher education in Canada and the United States. Some institutions critical of the ranking systems include Reed College, Alma College, Mount Holyoke College, St. John's College, Earlham College, MIT, and Stanford University.

2007 movement

On 19 June 2007, during the annual meeting of the Annapolis Group, members discussed a letter to college presidents asking them not to participate in the "reputation survey" section of the U.S. News and World Report survey (this section comprises 25% of the ranking). As a result, "a majority of the approximately 80 presidents at the meeting said that they did not intend to participate in the U.S. News reputational rankings in the future."[24] However, the decision whether to fill out the reputational survey will be left up to each individual college, as "the Annapolis Group is not a legislative body and any decision about participating in the US News rankings rests with the individual institutions."[25] The statement also said that its members "have agreed to participate in the development of an alternative common format that presents information about their colleges for students and their families to use in the college search process."[25] This database will be web-based and developed in conjunction with higher education organizations including the National Association of Independent Colleges and Universities and the Council of Independent Colleges.

U.S. News and World Report editor Robert Morse issued a response on 22 June 2007, in which he argued, "in terms of the peer assessment survey, we at U.S. News firmly believe the survey has significant value because it allows us to measure the "intangibles" of a college that we can't measure through statistical data. Plus, the reputation of a school can help get that all-important first job and plays a key part in which grad school someone will be able to get into. The peer survey is by nature subjective, but the technique of asking industry leaders to rate their competitors is a commonly accepted practice. The results from the peer survey also can act to level the playing field between private and public colleges."[26] In reference to the alternative database discussed by the Annapolis Group, Morse also argued, "It's important to point out that the Annapolis Group's stated goal of presenting college data in a common format has been tried before [...] U.S. News has been supplying this exact college information for many years already. And it appears that NAICU will be doing it with significantly less comparability and functionality. U.S. News first collects all these data (using an agreed-upon set of definitions from the Common Data Set). Then we post the data on our website in easily accessible, comparable tables. In other words, the Annapolis Group and the others in the NAICU initiative actually are following the lead of U.S. News."[26]

References

  1. ^ Study shows college rankings do matter
  2. ^ the QS rankings
  3. ^ Peer Review Methodology
  4. ^ Răzvan V. Florian (2007). "Irreproducibility of the results of the Shanghai academic ranking of world universities". Scientometrics. 72 (1): 25–32. doi:10.1007/s11192-007-1712-1.
  5. ^ The Top 100 Global Universities - 2006. Retrieved, August 15, 2008.
  6. ^ "Performance Ranking of Scientific Papers for World Universities". Retrieved 2008-09-05.
  7. ^ Samarasekera, Indira (2 April 2007). "Rising Up Against Rankings". Inside Higher Ed.
  8. ^ http://fr.wikipedia.org/wiki/Le_Nouvel_Observateur
  9. ^ Vanzi, Sol Jose. "Xavier University Cagayan beats UP in State Tests Average". Philippine Headline News Online.March 29, 2000.
  10. ^ "UP is no. 1 based on PRC exams". UP Newsletter, Vol. XXVIII, No. 09. September 01, 2007.
  11. ^ "Ministry of Education and Science of Ukraine" (in Ukrainian). Retrieved 2007-09-28.
  12. ^ "200 Best Ukrainian Universities" (in Ukrainian). Retrieved 2007-08-10.
  13. ^ "America's Best Colleges". U.S. News and World Report. 2007.
  14. ^ A review of US News ranking by NORC
  15. ^ Thompson, Nicholas (2003): "The Best, The Top, The Most;" The New York Times, August 3, 2003, Education Life Supplement, p. 24
  16. ^ Rojstaczer, Stuart (2001-09-03). "College rankings are mostly about money". San Francisco Chronicle. Retrieved 2006-12-11.
  17. ^ Academic Analytics
  18. ^ The Chronicle of Higher Education
  19. ^ The Washington Monthly College Rankings
  20. ^ The Washington Monthly's Annual College Guide
  21. ^ [1]
  22. ^ [2]
  23. ^ Basken, Paul (September 19, 2008). "Boeing to Rank Colleges by Measuring Graduates' Job Success". The Chronicle of Higher Education. p. 1.
  24. ^ Jaschik, Scott (20 June 2007). "More Momentum Against 'U.S. News'". Inside Higher Ed.
  25. ^ a b "Annapolis Group Statement on Rankings and Ratings". Annapolis Group. 19 June 2007.
  26. ^ a b Morse, Robert (22 June 2007). "About the Annapolis Group's Statement". U.S. News and World Report.