Research Excellence Framework

The Research Excellence Framework (REF) is a research impact evaluation of British Higher Education Institutions (HEIs). It is the successor to the Research Assessment Exercise and was first used in 2014 to assess the period 2008–2013.[1][2] The REF is undertaken by the four UK higher education funding bodies: Research England, the Scottish Funding Council (SFC), the Higher Education Funding Council for Wales (HEFCW), and the Department for the Economy, Northern Ireland (DfE).

Its stated aims are to:

  • inform the allocation of block-grant research funding to HEIs based on research quality;
  • provide accountability for public investment in research and produce evidence of the benefits of this investment; and
  • provide insights into the health of research in HEIs in the UK.[3]

Critics argue, inter alia, that there is too much focus on the impact of research outside of the university system, and that impact has no real relevance to the quality of research.[citation needed] It is suggested that REF actually encourages mediocrity in published research, and discourages research which might have value in the long term.[citation needed] It has repeatedly been argued that REF does more harm than good to higher education.[4]

The latest REF was in 2021, with results released in May 2022, continuing the previous assessment model of focusing on research outputs, research impact and research environment.[5] This process was slightly delayed because of the COVID-19 pandemic.[6]

In June 2023, it was announced that the next exercise would conclude in 2028, with submissions in 2027.[7]

History

In March 2007 the Higher Education Funding Council for England (HEFCE) issued a circular letter announcing that a new framework for assessing research quality in UK universities would replace the Research Assessment Exercise (RAE), following the 2008 RAE.[8] The following quote from the letter indicates some of the original motivation:

Our key aims for the new framework will be:

  • to produce robust UK-wide indicators of research excellence for all disciplines which can be used to benchmark quality against international standards and to drive the Council's funding for research
  • to provide a basis for distributing funding primarily by reference to research excellence, and to fund excellent research in all its forms wherever it is found
  • to reduce significantly the administrative burden on institutions in comparison to the RAE
  • to avoid creating any undesirable behavioural incentives
  • to promote equality and diversity
  • to provide a stable framework for our continuing support of a world-leading research base within HE.

The letter also set out a timetable for the development of the REF. HEFCE undertook a consultation exercise during September–December 2009, soliciting responses from stakeholders on the proposals.[9] Responses included, for example, those from Universities UK[10] and the University and College Union.[11]

In July 2010 (following the May 2010 general election), the Universities and Science minister David Willetts announced that the REF would be delayed by a year in order to assess the efficacy of the impact measure.[12]

In July 2016, Lord Nicholas Stern's review was published, setting out general guidelines for the next REF in 2021.[13] The review was broadly supportive of the methodology used in 2014 to evaluate universities' research; however, it emphasised the need for more engagement with the general public and for an increase in the number of case studies that take an interdisciplinary approach.[13] The Research-impact.org team at Loughborough University's School of Business and Economics has been experimenting with crowdfunding for research in order to increase the university's researchers' public engagement.[14]

Research impact

REF's impact is defined as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia".[15]

Grading criteria

Submissions are assessed according to the following criteria:[16]

  • Four star: Quality that is world-leading in originality, significance and rigour.
  • Three star: Quality that is internationally excellent in originality, significance and rigour but which falls short of the highest standards of excellence.
  • Two star: Quality that is recognised internationally in originality, significance and rigour.
  • One star: Quality that is recognised nationally in originality, significance and rigour.
  • Unclassified: Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment.

Performance rankings

Two publishers, Times Higher Education (THE)[17] and Research Professional News (RPN; used by The Guardian and other newspapers),[18][19][20] produced overall rankings of institutional results in the 2021 REF based on research power, market share and quality (GPA).

  • The THE institutional GPA is an average of each institution's GPA across the units of assessment, weighted by the number of full-time equivalent (FTE) staff submitted to each unit of assessment.
  • The "research power" measure is this institutional GPA multiplied by the total number of FTE staff submitted by the institution.
  • The "market share" measure uses the weighting applied when calculating block grants from the institutional quality profiles, with 4* grades given a weighting of four, 3* grades a weighting of one, and 2* and below a weighting of zero.[21]

Ranking | THE quality (GPA) | THE research power | THE market share | RPN power rating
--------|-------------------|--------------------|------------------|------------------
1 | Imperial College London | University of Oxford | University of Oxford | University of Oxford
2 | Institute of Cancer Research | University College London | University College London | University College London
3 | University of Cambridge / London School of Economics (joint) | University of Cambridge | University of Cambridge | University of Cambridge
4 | – | University of Edinburgh | University of Edinburgh | University of Edinburgh
5 | University of Bristol | University of Manchester | University of Manchester | University of Manchester
6 | University College London | King's College London | King's College London | King's College London
7 | University of Oxford | University of Nottingham | Imperial College London | Imperial College London
8 | University of Manchester | University of Leeds | University of Bristol | University of Bristol
9 | King's College London | Imperial College London | University of Nottingham | University of Nottingham
10 | University of York / London School of Hygiene and Tropical Medicine (joint) | University of Bristol | University of Leeds | University of Leeds
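
To illustrate how these measures relate to an institution's quality profile, the following minimal Python sketch computes a GPA, a research-power figure and a funding-weighted "market share" from hypothetical data. It simplifies by using a single institution-level profile rather than the per-unit-of-assessment profiles on which the THE tables are actually built; all institution names, percentages and staff numbers are invented for illustration.

    # Minimal sketch of the THE-style measures described above.
    # Assumes a single institution-level quality profile (percentages of output
    # rated 4*, 3*, 2*, 1* or unclassified) rather than the per-unit-of-assessment
    # profiles used in practice. All names and numbers are hypothetical.

    STAR_VALUE = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "U": 0}      # GPA scale
    FUNDING_WEIGHT = {"4*": 4, "3*": 1, "2*": 0, "1*": 0, "U": 0}  # block-grant weights

    profiles = {
        "University A": {"4*": 50, "3*": 40, "2*": 9, "1*": 1, "U": 0, "fte": 1200.0},
        "University B": {"4*": 35, "3*": 45, "2*": 18, "1*": 2, "U": 0, "fte": 400.0},
    }

    def gpa(profile):
        """Grade point average of a quality profile (percentages sum to 100)."""
        return sum(profile[s] * STAR_VALUE[s] for s in STAR_VALUE) / 100

    def research_power(profile):
        """GPA multiplied by the number of full-time-equivalent staff submitted."""
        return gpa(profile) * profile["fte"]

    def weighted_volume(profile):
        """Funding-weighted volume, the basis of the 'market share' measure."""
        return sum(profile[s] * FUNDING_WEIGHT[s] for s in FUNDING_WEIGHT) / 100 * profile["fte"]

    total_volume = sum(weighted_volume(p) for p in profiles.values())
    for name, p in profiles.items():
        share = 100 * weighted_volume(p) / total_volume  # share of the sector total
        print(f"{name}: GPA {gpa(p):.2f}, research power {research_power(p):.0f}, "
              f"market share {share:.1f}%")

On the hypothetical figures above, the sketch reports a GPA of 3.39 and a research power of about 4,068 for "University A"; the published tables are derived from the real REF 2021 quality profiles in the same fashion, but aggregated across units of assessment.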

Controversies and criticism

A particular source of criticism has been the element of the REF that addresses the "impact" of research. The articles cited below raise two objections. The main one is that "impact" has been defined to mean impact outside the academy, and that requiring researchers to pursue this form of impact would undermine academic freedom. The other is that impact, as currently construed, is hard to measure in any way that would be regarded as fair and impartial.[22][23][24]

The Higher Education Funding Council for England argued that its measure of "impact" is a broad one, encompassing impact upon the "economy, society, public policy, culture and the quality of life".[22] However, the assessment structure narrows what impact can practically be claimed: impact case studies are limited to four pages, contain no method section, allow ten impact references and ten research references, and give only one page each to summarise the research and the impact. These strict discursive constraints, together with the REF's dated notion of how research impact functions (teaching-related impact is excluded, a linear model is assumed, and so on), restrict which kinds of impact are practically suited to the assessment.[citation needed]

Another area of criticism, which the REF inherited from the structure of the RAE, is that for most full-time staff members a submission normally consists of four published 'research output items', with no recognition of the difference in research value between a book and an article. The REF system therefore discourages long-term projects that strive for excellence. This problem is particularly evident in the humanities, where most of the ground-breaking research is traditionally not published in articles. Many researchers are consequently pushed towards relatively mediocre activity that allows them to produce one or two books during the assessment period, but not the kind of monograph that would normally need four or five years of research and writing.[citation needed]

Moreover, the system of four published items also discourages long-term projects with relatively high research risk in the sciences, since researchers are reluctant to engage in projects or experiments that may not succeed and may not lead to a publication. Since most of the ground-breaking research in the sciences takes place in precisely such risky and imaginative projects, the type of research activity encouraged by the REF structure is quite conservative. In addition, in the history of both the sciences and the humanities it is not unusual for some time to pass before the full impact of a discovery becomes apparent, whereas the present system has a horizon of only four or five years.[citation needed]

The Times Higher Education also revealed that some universities appeared to be "gaming" the REF system. This included "REF Poaching", in which staff with established research records were headhunted from their universities immediately before the REF, giving the poaching institution full credit for their publications without having taken the risk of supporting the researcher. It also included employing large numbers of staff on 0.2 FTE contracts, the lowest level of employment that qualifies them for REF submission.[25]

In addition to such concerns about what can really be measured by four research output items, and about how impact may be measured, the whole system is often criticised as unnecessarily complex and expensive, whereas quality evaluation in the digital age could be much simpler and more effective.[26]

The system, with its associated financial implications, has also been criticised for diverting resources from teaching. As such, increases in student fees may often not have resulted in more staff time being spent on teaching.[citation needed]

In July 2016, Lord Nicholas Stern's review was published, setting out general guidelines for the next REF in 2021.[27] One of its recommendations was to increase public engagement with research, meaning both enhancing delivery of the benefits from research and making the public more aware of research findings and their implications. One mechanism for public engagement is crowdfunding for research, in which dedicated platforms host crowdfunding campaigns for university research across a range of topics. Crowdfunding for research has two advantages: first, it is a relatively reliable source of funding, with a rate of around 50 per cent; second, it is a very effective tool for engaging the general public.[14]

One problem that the Stern review did not address in relation to research impact assessment is that the case study template on which impact is assessed does not contain a method section, thereby turning the assessment of the claimed impact into a rhetorical game of who can claim the most (cf. Brauer, 2018).[28] Grand claims are thus incentivised by the assessment structure. The problem arises because qualitative judgements of the significance and reach of the impact, made without an account of the underlying method, cement contemporary values into the assessment: "[…] call it socially constructed, mutual learning, social practice whatever, the key is that we can’t separate characteristics of Impact from the process imposed on value and recognise it as such." (Derrick, 2018:160)[29] When the references supporting impact claims were checked, they were often inaccessible (e.g. the relevant websites had been taken down), cited in a way that did not reflect self-authorship, or testimonials from individuals connected to the researcher (Brauer, 2018:142–147). Similarly, Sayer (2014)[30] criticises the overall peer review of the REF process, describing it as a poor simulacrum of standard academic quality, and notes that the assessment is further complicated by the sheer workload involved (p. 35). On a similar note, a RAND study found that the majority of the references were never consulted, that certain assessment panels were discouraged from using the internet, and that the REF's reference help structure sometimes took two weeks to produce the associated references.[31] The external impact focus thereby disciplines the assessment into focusing on external values.[32]

In 2018, it was argued that the REF has negative effects on research in the humanities.[33]

References

  1. ^ "Results & submissions : REF 2014". Retrieved 22 December 2014.
  2. ^ Atkinson, Peter M. (11 December 2014). "Assess the real cost of research assessment". World View. Nature (paper). 516 (7530): 145. Bibcode:2014Natur.516..145A. doi:10.1038/516145a. PMID 25503199.
  3. ^ "Early decisions made for REF 2028". www.ukri.org. 2023-06-15. Retrieved 2023-06-20.
  4. ^ Bishop, Dorothy (2016-03-03). "Clarity of purpose in the TEF and the REF". Times Higher Education (THE). Retrieved 2022-08-05.
  5. ^ Higher Education Funding Council for England. "2017: Funding bodies confirm shape of REF 2021 - REF 2021". www.ref.ac.uk. Retrieved 2018-06-29.
  6. ^ "Further update on coronavirus (COVID-19) and REF timetable - REF 2021".
  7. ^ "Early decisions made for REF 2028". www.ukri.org. 2023-06-15. Retrieved 2023-06-20.
  8. ^ Eastwood, David (6 March 2007). "Future framework for research assessment and funding". HEFCE. circular letter number 06/2007. Archived from the original on 2 February 2010.
  9. ^ "Research Excellence Framework: Second consultation on the assessment and funding of research". HEFCE. September 2009. 2009/38. Retrieved 10 January 2015.
  10. ^ "Universities UK response to HEFCE consultation on the Research Excellence Framework (REF)". Universities UK. 13 December 2009. Archived from the original (.doc) on 16 July 2011.
  11. ^ "Response to the Research Excellence Framework: Second consultation on the assessment and funding of research" (PDF). University and College Union. December 2009.
  12. ^ Baker, Simon (8 July 2010). "REF postponed while Willetts waits for impact 'consensus'". Times Higher Education.
  13. ^ a b Stern, Lord Nicholas; et al. (July 2016). "Building on Success and Learning from Experience" (PDF). gov.uk. UK Government. Retrieved 3 January 2017.
  14. ^ a b Rubin, Tzameret (2017). "Is it possible to get the crowd to fund research, isn't it the government's role?". AESIS. Retrieved 2016-12-23.
  15. ^ McLellan, Timothy (2020-08-25). "Impact, theory of change, and the horizons of scientific practice". Social Studies of Science. 51 (1): 100–120. doi:10.1177/0306312720950830. ISSN 0306-3127. PMID 32842910. S2CID 221326151.
  16. ^ "Assessment framework and guidance on submission" (PDF). Research Excellence Framework. July 2011. p. 43. REF 02.2011.
  17. ^ "REF 2021 Main Online Table". Times Higher Education. 12 May 2022.
  18. ^ "REF 2021: The top 10". Research Professional News. 12 May 2022.
  19. ^ "Oxford and UCL tipped to win lion's share of grants in UK research audit". The Guardian. 12 May 2022.
  20. ^ "North East universities dubbed 'powerhouses' with strongest team of researchers outside London". 12 May 2022.
  21. ^ "REF 2021: Times Higher Education's table methodology". Times Higher Education. 12 May 2022.
  22. ^ a b Shepherd, Jessica (13 October 2009). "Humanities research threatened by demands for 'economic impact'". Education. The Guardian. London.
  23. ^ Oswald, Andrew (26 November 2009). "REF should stay out of the game". The Independent. London.
  24. ^ Fernández-Armesto, Felipe (3 December 2009). "Poisonous Impact". Times Higher Education.
  25. ^ Jump, Paul (26 September 2013). "Twenty per cent contracts rise in run-up to REF". Times Higher Education.
  26. ^ Dunleavy, Patrick (10 June 2011). "The Research Excellence Framework is lumbering and expensive. For a fraction of the cost, a digital census of academic research would create unrivalled and genuine information about UK universities' research performance". London School of Economics.
  27. ^ Stern, L. (2016). Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework.
  28. ^ Brauer, R. (2018): What research impact? Tourism and the changing UK research ecosystem. Guildford: University of Surrey (PhD thesis). available at: http://epubs.surrey.ac.uk/id/eprint/846043
  29. ^ Derrick, G. (2018). The evaluators’ eye: Impact assessment and academic peer review. Berlin: Springer.
  30. ^ Sayer, D. (2014). Rank hypocrisies: The insult of the REF. Sage.
  31. ^ "Evaluating the Submission Process for the Impact Element of REF". www.rand.org. Retrieved 2019-05-03.
  32. ^ "Measuring the Societal Impact and Value of Research". www.rand.org. Retrieved 2019-05-03.
  33. ^ Study International Staff (December 7, 2018). "Beware the 'Research Excellence Framework' ranking in the humanities". SI News. Retrieved September 19, 2019.
