Research Assessment Exercise
The Research Assessment Exercise (RAE) is an exercise undertaken approximately every five years on behalf of the four UK higher education funding councils (HEFCE, SHEFC, HEFCW, DELNI) to evaluate the quality of research undertaken by British higher education institutions. RAE submissions from each subject area (or unit of assessment) are given a rank by a subject-specialist peer review panel. The rankings are used to inform the allocation of quality-weighted research funding (QR) that each higher education institution receives from its national funding council. Previous RAEs took place in 1986, 1989, 1992, 1996 and 2001. The most recent results were published in December 2008.
Various media have produced league tables of institutions and disciplines based on the 2008 RAE results. Different methodologies lead to similar but non-identical rankings.
The first exercise of assessing University research in the UK took place in 1986 under the Margaret Thatcher Government. It was conducted by the University Grants Committee, a predecessor of the present Higher Education Funding Councils. The purpose of the exercise was to determine the allocation of funding to UK Universities at a time of tight budgetary restrictions. The committee received submissions of research statements from subject areas ("cost centres") within Universities, and issued quality rankings labelled "outstanding", "above average", "average" or "below average". The research funding allocated to Universities (called "quality-related" funding) depended on the quality ratings of the subject areas.
Two subsequent research assessments were conducted in 1989 and 1992 under the name "research selectivity exercise" by the Universities Funding Council. These were followed by "research assessment exercises" conducted in 1996, 2001 and 2008 jointly by the various UK Higher Education Funding Councils.
The 2008 RAE used a four-point quality scale, and returned a profile, rather than a single aggregate quality score, for each unit. The quality levels—based on assessment of research outputs, research environment and indicators of esteem—are defined as:
| Rating | Definition |
| --- | --- |
| 4* | Quality that is world-leading in terms of originality, significance and rigour |
| 3* | Quality that is internationally excellent in terms of originality, significance and rigour but which nonetheless falls short of the highest standards of excellence |
| 2* | Quality that is recognised internationally in terms of originality, significance and rigour |
| 1* | Quality that is recognised nationally in terms of originality, significance and rigour |
| Unclassified | Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment |
Each unit of assessment was given a quality profile, a five-column histogram indicating the proportion of its research that meets each of the four quality levels or is unclassified.
In 1992, 1996 and 2001, the following descriptions were used for each of the ratings.
| 2001 & 1996 rating | 1992 rating | Description (1992 definition in parentheses where it differed) |
| --- | --- | --- |
| 5* | 5* | Research quality that equates to attainable levels of international excellence in more than half of the research activity submitted and attainable levels of national excellence in the remainder. |
| 5 | 5 | Research quality that equates to attainable levels of international excellence in up to half of the research activity submitted and to attainable levels of national excellence in virtually all of the remainder. (Same definition) |
| 4 | 4 | Research quality that equates to attainable levels of national excellence in virtually all of the research activity submitted, showing some evidence of international excellence. (Same definition) |
| 3a | 3 | Research quality that equates to attainable levels of national excellence in over two-thirds of the research activity submitted, possibly showing evidence of international excellence. (Research quality that equates to attainable levels of national excellence in a majority of the sub-areas of activity, or to international level in some) |
| 3b | 3 | Research quality that equates to attainable levels of national excellence in more than half of the research activity submitted. (Research quality that equates to attainable levels of national excellence in a majority of the sub-areas of activity, or to international level in some) |
| 2 | 2 | Research quality that equates to attainable levels of national excellence in up to half of the research activity submitted. (Same definition) |
| 1 | 1 | Research quality that equates to attainable levels of national excellence in none, or virtually none, of the research activity submitted. (Same definition) |
These ratings have been applied to "units of assessment", such as French or Chemistry, which often broadly equate to university departments. Various unofficial league tables have been created of university research capability by aggregating the results from units of assessment. Compiling league tables of universities based on the RAE is problematic, as volume and quality are both significant factors.
The assessment process for the RAE focuses on the quality of research outputs (which usually means papers published in academic journals and conference proceedings), the research environment, and indicators of esteem. Each subject panel determines precise rules within general guidance. For RAE 2008, institutions were invited to submit four research outputs, published between January 2001 and December 2007, for each full-time member of staff selected for inclusion.
In response to criticism of earlier assessments, and to developments in employment law, the 2008 RAE did more to take into account part-time workers and staff who had only recently reached a level of seniority sufficient for inclusion in the process.
The RAE has not been without its critics. Among the criticisms is that it explicitly ignores the publications of most full-time researchers in the UK, on the grounds that they are employed on fixed-term contracts. According to the RAE 2008 guidelines, most research assistants are "not eligible to be listed as research active staff". Publications by researchers on fixed-term contracts are excluded from the exercise unless they can be credited to a member of staff who is eligible for the RAE, even if that member of staff made only a minor contribution to the work. The opposite phenomenon also occurs: staff on permanent contracts who are not research active, such as lecturers primarily responsible for teaching, have found themselves placed under deeper contractual pressure by their employing universities to produce research output. A further criticism is that it is doubtful whether panels of experts have the necessary expertise to evaluate the quality of research outputs, as experts perform much less well as soon as they are outside their particular area of specialisation.
Critics such as the University and College Union argue that the RAE has had a disastrous impact on the UK higher education system, leading to the closure of departments with strong research profiles and healthy student recruitment. On this view it has been responsible for job losses, discriminatory practices, widespread demoralisation of staff, the narrowing of research opportunities through the over-concentration of funding, and the undermining of the relationship between teaching and research.
The official Review of Research Assessment, the 2003 "Roberts Report" commissioned by the UK funding bodies, recommended changes to research assessment, partly in response to such criticisms.
The House of Commons Science and Technology Select Committee considered the Roberts report and took a more optimistic view, asserting that "the RAE had had positive effects: it had stimulated universities into managing their research and had ensured that funds were targeted at areas of research excellence", and concluding that "there had been a marked improvement in universities' research performance". Nevertheless, it argued that "the RAE in its present form had had its day" and proposed a reformed RAE, largely based on Roberts' recommendations.
Planned changes to RAE system
It was announced in the 2006 Budget that, after the 2008 exercise, a system of metrics would be developed to inform future allocations of QR funding. Following initial consultation with the higher education sector, the Higher Education Funding Councils are expected to introduce a metrics-based system of assessment for subjects in science, technology, engineering and medicine. A process of peer review is likely to remain for mathematics, statistics, arts, humanities and social studies subjects.
- "RAE 2008". Research Assessment Exercise. Retrieved 20 August 2013.
- "Definitions". RAE 2008. Retrieved 20 August 2013.
- "RAE 2008 Guidelines", para. 79. RAE 2008.
- Madden, Andrew (19 December 2008). "The researchers the RAE forgot". Guardian. Retrieved 20 August 2013.
- Corbyn, Zoe. "RAE's non-specialist gambit could have led to blunders, says study". Times Higher Education. Retrieved 20 August 2013.
- "RAE 2008". University and College Union. Retrieved 20 August 2013.
- Roberts, Sir Gareth (May 2003). Review of Research Assessment: report to the UK funding bodies.
- Science and Technology Committee (15 September 2004). Eleventh Report.
- Shepherd, Jessica (30 January 2007). "A difficult patch". Guardian. Retrieved 20 August 2013.
- "Research Excellence Framework". REF 2014. Retrieved 20 August 2013.