Russell Group response to Lord Stern's review of the Research Excellence Framework (REF)


1. Summary

The REF is a fundamental part of the UK's dual support system for research funding. Whilst there is certainly scope to improve the REF and give more emphasis to critical mass at the institutional level, it should be recognised that the RAE/REF has succeeded in driving up the quality of research in the UK by supporting a research culture focused on excellence. The REF should not be used as a regulatory or compliance tool, as this undermines its focus on excellence.

As a mechanism to underpin the distribution of QR funding, the REF is a relatively cost-effective exercise, although we would like to see costs reduced further. This could be achieved by, for example, extending the period between REF exercises, introducing a red tape challenge to identify simple practical changes, ensuring REF and Research Council requirements align where practical, introducing an institution-wide (but overall less burdensome) environment assessment and streamlining impact assessment.

We would welcome a lighter touch REF overall and there are some areas where metrics can be useful, but peer review should remain at the heart of the process, with metrics used where appropriate to complement and aid human judgement.

By maintaining the broad structure of the REF and broadly similar rules, universities will be able to draw on their experience of REF 2014 in order to prepare submissions more efficiently. Major changes to the exercise will require universities to invest additional time and effort in understanding new structures, processes and rules.

Metrics may be seen at first sight as a quicker and cheaper alternative to peer review, but it is essential to be aware of the many shortcomings in their use. Importantly, metrics lack the element of judgement provided by the long-established and internationally respected peer review process.

2. Purpose of QR and the REF

2.1 The REF serves several important purposes. In particular, the RAE/REF has played a key role in driving up the quality of research in the UK by supporting a research culture focused on excellence.

2.2 The REF is also the only UK-wide assessment of the health of individual disciplines and yields important information about institutional and departmental performance, giving universities a robust view of their research and allowing them to benchmark against each other. The results provide an indicator of research reputation across the UK which has helped to build the profile and status of UK research globally. In turn, this focus on excellence helps businesses and international partners to identify key research groups and universities to engage with, supporting efforts to drive university-business collaboration and inward investment as well as widening the UK's academic links internationally.

2.3 In addition, the REF process provides the public and policy makers with confidence that public investment in research is producing high quality research with tangible impact.

2.4 Of course, the REF is also a fundamental part of the UK's dual support system for research funding and helps to protect quality-related (QR) research funding. The combination of stable core funding and competitively awarded grants provided through the dual support system ensures the diversity and breadth of research in the UK.

2.5 QR funding should not be viewed in isolation, but should be seen as a mechanism which complements funding allocated via the Research Councils (and won competitively from other sources), enabling institutions to plan ahead and invest strategically. QR funding in its current form is vital in providing a stable base of research funding over reasonably long time periods when other areas of funding are uncertain. QR allows universities to respond to new opportunities and challenges, to work constructively with charities and business, and ensures they are not wholly concentrating on shorter-term research needs. Changing the basis on which QR funding is made risks undermining the main strength of this stream: its flexibility.

2.6 It is not yet known which organisation will take responsibility for the next REF and the allocation of QR funding, but it is essential that there is appropriate expertise in the organisation to deal with issues pertaining to research and research funding; QR should continue to be kept at arm's length from Government.

3. Driving excellence

3.1 The REF should recognise and reward the very highest levels of excellence in research and should avoid driving growth in lower quality research. It should also recognise the importance of higher concentrations of research excellence, critical mass and multi-disciplinarity. High concentrations of research excellence across disciplines are a foundation for innovative, interdisciplinary research collaborations often critical to solving global challenges. [1] It would be helpful if greater emphasis could be given to this institutional-level excellence in the next REF.

[1] We have highlighted numerous examples of this in our recent publication and film on the impact of research at Russell Group universities, drawing on REF impact case studies, see: http://www.russellgroup.ac.uk/media/5324/engines-of-growth.pdf

3.2 Increasing international representation on panels and sub-panels would help ensure judgements on what constitutes world-leading quality take a truly international perspective. This could also boost the international visibility and credibility of the REF.

3.3 A key element to bolstering excellence through the REF is ensuring interdisciplinary research is properly recognised. Further clarity on, and encouragement of, cross-referring between panels may help ensure support for interdisciplinary research.

3.4 It would also be helpful for the REF to provide enhanced recognition for research that is done in groups and even collaboratively across institutions. The extent to which this occurs does vary according to discipline but should be a consideration in particular in the sciences.

3.5 There was a notable rise in ratings at the low-middle end in REF 2014, making it difficult to measure the increase in quality at the top end of the scale. For future exercises REF panels should consider steps to reduce the impact of such changes so that the full range of marks is used to truly distinguish the highest levels of excellence.

3.6 There are pros and cons to insisting on 100% submission for all eligible staff in the REF. For now, we would suggest the proportion of eligible staff entered should be indicated in the environment section of the REF. This will enable panels to tell whether there has been a high degree of selectivity and to consider what this may mean in terms of the quality of the research environment (e.g. if it only allows a very small percentage of FTEs to succeed or if the submission is dependent on many small fractional FTEs).

3.7 In order to maintain the link between impact and research excellence, impact case studies should continue to be linked to research rated at least at 2* level.

3.8 There is benefit in carrying out assessment of outputs at the Unit of Assessment (UoA) level, and the level of granularity in REF 2014 was about right. Further reducing the number of UoAs may make it hard to identify excellence within individual fields and could undermine the confidence of some parts of the academic community, who may be concerned their research will be assessed by sub-panels without appropriate expertise. However, efforts should be made to ensure appropriate UoA size and sufficiently coherent subject combinations. A key concern with REF 2014 was the large size of some UoAs under Main Panel A in particular.

3.9 The REF should not be used as a regulatory or compliance tool as this undermines its focus on excellence. For example, pursuing open access must not be allowed to compromise the quality of submissions in the next REF. Research outputs should be judged solely on their own merits (i.e. on research quality factors, not on where or how they are published).

4. Reducing burden, red tape and bureaucracy

4.1 As a mechanism to underpin the distribution of QR funding, the REF is already a relatively cost-effective exercise, although we would still like to see costs reduced further.

4.2 The cost of REF 2014 has been estimated to be roughly 2.4% of the £10.2 billion in QR research funds projected to be distributed between 2015-16 and 2020-21 [2], although we recognise the costs are higher when compared to the change in funding distribution between exercises. The costs of the REF per unit of QR could be reduced by extending the period between REF exercises; other cost-reduction measures can also be considered, as highlighted below, but it is important not to lose the value of the REF in seeking to reduce costs. Alternatively, increasing the amount of QR funding distributed over the same REF time period could reduce the relative burden of a new REF (an illustrative calculation is sketched below).

[2] REF Accountability Review: Costs, benefits and burden. Technopolis for HEFCE (July 2015).
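The point in paragraph 4.2 about relative cost can be illustrated with some simple arithmetic. The sketch below is illustrative only: it reuses the figures quoted above, treats the cost of running an exercise as broadly fixed, and the function and variable names are our own.

    # Illustrative arithmetic only, based on the figures quoted in paragraph 4.2.
    qr_total_bn = 10.2   # QR projected to be distributed 2015-16 to 2020-21 (six years)
    cost_share = 0.024   # REF 2014 cost estimated at roughly 2.4% of that QR total

    exercise_cost_bn = cost_share * qr_total_bn   # approximately 0.24bn per exercise
    annual_qr_bn = qr_total_bn / 6

    def relative_cost(years_between_exercises, annual_qr=annual_qr_bn):
        """Exercise cost as a share of the QR distributed over one REF cycle,
        treating the cost of running an exercise as broadly fixed."""
        return exercise_cost_bn / (annual_qr * years_between_exercises)

    print(f"{relative_cost(6):.1%}")   # ~2.4% on a six-year cycle
    print(f"{relative_cost(7):.1%}")   # ~2.1% if the gap stretches to seven years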
4.3 Much of the additional administrative activity generated as a result of the REF, including the process of internal review, performance management and strategy development, is likely to occur within institutions regardless of the REF, as these mechanisms are a part of good management practice. Therefore, at least some of the quoted costs should be considered part of the normal business of universities in managing their research performance.

4.4 By maintaining the broad structure of the REF, and broadly similar rules, universities should be able to draw on their experience of REF 2014 in order to prepare submissions more efficiently. Similarly, the team running the REF will also be able to draw more readily on REF 2014 experience. Any major changes to the exercise will require universities to invest additional time and effort in understanding new structures, processes and rules (and there will also be costs to HEFCE in developing and putting these changes into operation).

4.5 As well as looking for similar approaches from one REF to the next, the overall burden associated with research management could be reduced by ensuring REF and Research Council requirements align where possible, for example by designing compatible open access rules and making research information systems more interoperable. The Funding Councils and the Research Councils should work closely to ensure they adopt consistent approaches wherever possible which are practical and low-burden for the research-intensive universities that will be affected most.

4.6 As part of its future REF consultation with universities, HEFCE could run a red tape challenge by collating feedback on implementation and administrative issues as well as on the structure and method of assessment. Issues to consider include aligning the deadlines for impact, staff inclusion and publication, with a later deadline for submission including hard copy and redacted outputs.

4.7 We recognise that the transfer of researchers from one institution to another, particularly close to the REF census deadline, can create challenges. Solutions to address this should be considered (such as the possibility of applying a weighting depending on the length of time an individual has been at an institution), provided any new rules do not create undue additional burden in the system or simply move problems to an earlier date.

Outputs

4.8 To reduce the amount of work required by panels, a sampling mechanism could be considered, at least for UoAs with high-volume submissions, whereby panellists could assess just two or three of the four outputs. A pilot could be conducted to test whether assessing a proportion of the submitted outputs would lead to significantly different results from assessing them all, and whether an algorithm could be designed to implement sampling in a fair way (a minimal sketch of one possible approach is given below).
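By way of illustration only, the sketch below shows one way such a sampling rule might work. The use of seeded random sampling per submission is our assumption for discussion, not a proposed REF method, and all names used are hypothetical.

    # Illustrative sketch only (paragraph 4.8): seeded random sampling of outputs.
    # The rule and the names here are hypothetical, not a proposed REF method.
    import random

    def sample_outputs(outputs_by_staff, k=2, seed=2014):
        """Select k of each researcher's submitted outputs for panel review.

        A fixed, published seed makes the selection reproducible and auditable,
        so every submission is sampled under the same rule.
        """
        rng = random.Random(seed)
        sampled = {}
        for staff_id, outputs in sorted(outputs_by_staff.items()):
            if len(outputs) <= k:
                sampled[staff_id] = list(outputs)  # small returns reviewed in full
            else:
                sampled[staff_id] = rng.sample(outputs, k)
        return sampled

    # A pilot could compare sub-profile scores based on sample_outputs(...)
    # with scores from assessing all four outputs per researcher.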

Impact

4.9 A large part of the cost and burden of REF 2014 was due to the novelty of including impact assessment as a new dimension. [3] However, this has yielded valuable insights into the wider social, economic and cultural impacts of research and is also helping to encourage researchers to consider these potential benefits as an important part of their research activity.

[3] As noted in the REF Accountability Review: Costs, benefits and burden.

4.10 Increasing the impact weighting to 25% for the next REF could be considered, but greater flexibility would be welcome around impact case studies themselves, and the overall number that universities have to submit could be reduced. Provided there is at least one impact case study per UoA to which the university is submitting, institutions should be able to choose to submit case studies at the level they consider most appropriate: UoA, Main Panel or institutional level. This could aid the recognition of cross-disciplinarity within institutions and could also reduce the cliff edge issue regarding staff selection in each UoA. [4]

[4] As part of an analysis of a sample of Russell Group REF impact case studies we found that 59% of case studies were interdisciplinary. There is, therefore, a strong argument that impact is, in fact, less tied to a single specific discipline and that moving the impact assessment to the panel level may be beneficial for interdisciplinary research. More information can be found in our report Engines of Growth.

4.11 Impact can mean different things for different disciplines and the one-size-fits-all definition used for REF 2014 should be reviewed. For example, knowledge exchange could be more clearly emphasised in the humanities disciplines, as long as this is linked to excellent research.

4.12 In addition, institutions should be able to re-submit REF 2014 case studies where subsequent impact can be demonstrated. HEFCE should publish guidance on the rules around re-submission as early as possible.

Environment and templates

4.13 The most useful elements of the impact template could be incorporated into the environment template in a simplified form to reduce the workload for both institutions and panel members.

4.14 An overarching environment submission covering the whole institution would also help reduce burden and repetition. Here, metrics could be used more extensively and successfully alongside qualitative information in bullet point form. Sufficient detail would still be needed at disciplinary level, but universities would be able to demonstrate institutional strategy, critical mass of research excellence and key interdisciplinary links across the institution all in one place, rather than repeating this for each UoA.

Time periods

4.15 The cumulative impact of the recent HE Green Paper, Nurse Review and Stern Review will be a disruptive one, and time will be needed for the outcomes of these consultations to be implemented and embedded into a new over-arching architecture for UK higher education research. It is likely HEFCE will also run a subsequent consultation on detailed aspects of the next REF and its implementation. As such, we would suggest considering a delay to the next REF. The gap between RAE 2001 and the subsequent RAE was pushed back to seven years without undermining the system and it is reasonable to assume the interval could be stretched again (rather than reverting to a six-year cycle).

4.16 A short delay may be needed anyway to allow enough time for any successor organisation to HEFCE to run the next REF with confidence. This should also give institutions sufficient time to understand and implement the new guidance. One of the weaknesses of REF 2014 was that guidance was published late in the process and there were difficulties when changes and updates to the guidance came at different points during the exercise. [5] With additional time, HEFCE (or its successor) should be able to minimise the need for any updates to the guidance at a later stage.

[5] Evaluating the 2014 REF: Feedback from participating institutions, report prepared by the Funding Councils (March 2015).

4.17 Since REF time periods are already longer than Parliamentary terms (and may be even longer in future), it would be reassuring to have a cross-party commitment that, if a REF exercise has already been timetabled, it will be seen through to conclusion with concomitant funding to follow. This would ensure the effort put in by institutions and researchers to prepare for a REF is not wasted by the potential cancellation of an exercise due to political change whilst it is already underway.

5. Metrics and peer review

5.1 It is essential the REF methodology is based on the latest thinking about research assessment, utilises peer review and continues to have the confidence of the higher education sector and other stakeholders, in all countries of the UK and internationally.

5.2 A large part of the REF's success is due to the rigour of the assessment process, making it essential that expert peer review is retained at the core of the exercise. Although we would like to see a reduction in the administrative burden associated with the REF, this should not be at the expense of reducing or removing expert peer review, particularly for outputs and impact.

5.3 Whilst metrics may be seen at first sight as a quicker and cheaper alternative to peer review, there are many known shortcomings in their use, for example in relation to equality and diversity issues; differences between and within disciplines that metrics would struggle to capture; and the ability to game data. Importantly, metrics lack the element of judgement that is at the heart of the long-established peer review process, as the independent review on metrics concluded:

"Even if technical problems of coverage and bias can be overcome, no set of numbers, however broad, is likely to be able to capture the multifaceted and nuanced judgements on the quality of research outputs that the REF process currently provides." [6]

[6] The Metric Tide, report of the independent review on the role of metrics in research assessment and management (July 2015).

5.4 We do not see a strong argument for extending the use of metrics for the assessment of outputs beyond what was permitted within REF 2014. The element of judgement that peer review enables is essential when evaluating a diverse range of outputs across the research spectrum, and even more so when considering the quality of interdisciplinary research. A one-size-fits-all approach will not work across UoAs (particularly for the arts and humanities), so we would propose that sub-panels should be able to decide whether they would find it useful to have bibliometric data made available to them to inform peer review. As the field of research metrics continues to evolve, their use in future iterations of the REF should be kept under review.

5.5 There is no convincing evidence that there should be any significant role for metrics in assessing impact; peer review must remain the primary method of assessment here. Again, the diversity of impacts delivered as a result of excellent research makes comparisons and judgements very difficult other than by peer review. However, some of the quantitative data provided within the impact case studies could be more standardised in places to improve comparison between, and analysis of, case studies. In addition, moving to an online format with a degree of standardisation of basic metadata (e.g. name of institution, UoA, disciplines/subject areas involved, funding details) would help facilitate analysis and the use of case studies post-publication (an illustrative sketch of such metadata is appended at the end of this response).

5.6 Metrics could be used much more extensively and successfully in the environment section of the REF. Indeed, this whole section could be simplified radically. Moving more towards metrics here would build on a tested precedent, as some quantitative data is already used for the assessment of the research environment. Using additional existing data sets, such as HEBCI data collected by HESA on collaborative research and contract income, could be considered. Focusing on data that is largely collected already could also allow institutions to benchmark themselves on a more regular basis.

5.7 Nevertheless, it would be helpful to retain the opportunity to provide qualitative contextual information in the environment section. This information should be provided in short bullet points with a fixed word limit in order for the assessment to focus on the content of the information provided rather than the quality of the written presentation.

5.8 It was suggested in the recent HE Green Paper that REF results could be refreshed with metrics in between peer-reviewed exercises. The implications of this idea should be considered very carefully in order not to create perverse incentives in the system, not to undermine the validity and credibility of the peer review exercises and not to add burden and cost inadvertently.

March 2016
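Appended for illustration only: a minimal sketch of the kind of standardised case-study metadata referred to in paragraph 5.5. The field names and structure are assumptions chosen for the example, not a proposed format.

    # Illustrative only (paragraph 5.5): one possible shape for standardised
    # impact case study metadata. Field names are assumptions, not a proposal.
    from dataclasses import dataclass, field

    @dataclass
    class CaseStudyMetadata:
        institution: str                   # name of submitting institution
        unit_of_assessment: str            # UoA under which the case study is returned
        disciplines: list = field(default_factory=list)  # subject areas involved
        funders: list = field(default_factory=list)      # funding details
        title: str = ""
        impact_period: str = ""            # e.g. "2014 to 2020"

    example = CaseStudyMetadata(
        institution="Example University",
        unit_of_assessment="Engineering",
        disciplines=["Materials science", "Public health"],
        funders=["EPSRC", "Wellcome Trust"],
    )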