Evaluation Framework: Research Programmes and Schemes


November 2015

BBSRC, Polaris House, North Star Avenue, Swindon SN2 1UH
www.bbsrc.ac.uk

Why BBSRC Evaluates the Research and Training it Supports

EVALUATION EVIDENCE: Evaluation provides evidence on performance and achievements, covering research quality and economic and societal impacts.

ACCOUNTABILITY: BBSRC has many stakeholders to which it is accountable: government, the research community, the public, industry, charities, other funding agencies, and end-users (e.g. farmers, medical practitioners).

FUNDING DECISIONS: Evidence from evaluation supports funding decisions at a number of levels: the BIS bid to the Treasury, the Science Budget, the BBSRC budget, and internal BBSRC funding decisions.

POLICY AND PRACTICE: BBSRC learns from its evaluations and uses the results to inform operations and decision making, including the development of policies and programmes and maintaining and improving performance.

Overview of BBSRC's Evaluation Framework

BBSRC is committed to the effective evaluation of its investments in research and training, as part of its strategy for evidence-based decision making. Evaluation helps BBSRC to account for the funds that it allocates, informs specific decisions on future research funding, and helps to improve BBSRC's policy and practice.

Evaluation is an important tool for examining the relevance, performance, efficiency and impact of BBSRC's programmes and schemes in relation to their stated objectives and BBSRC's wider strategic aims.

Evaluation provides the evidence required to assess the overall quality of research within the BBSRC portfolio. It provides assurance that BBSRC is funding the highest quality research and that the international standing of the UK in the biosciences is being maintained.

BBSRC also uses evaluation to demonstrate the wider impacts and benefits arising from its investments. This includes identifying the broader economic and societal impacts of the research we fund, as well as examining the interactions of BBSRC-supported researchers with other stakeholders such as industry and the public.

When prioritising evaluations, BBSRC focuses on its major investments. The need to provide evidence of performance in strategically important areas is a significant consideration.

BBSRC evaluations are evidence based. Data on the outputs and outcomes of individual projects are collected from grant holders through researchfish. These provide the basis for indicators that help demonstrate the quality and impact of the BBSRC research portfolio as a whole.

Evidence gathered during evaluations is assessed by a Panel of independent academic, industrial and other experts. The Panel produces a report of its findings and recommendations.

BBSRC evaluations are designed to ensure that lessons learned are fed back into policy-making, programme design and operation. Results from evaluations are reported to the appropriate BBSRC Strategy Panels, which consider the conclusions and whether and how they might best be addressed.

BBSRC ensures that the results of its evaluations are widely disseminated. Evaluation reports are made publicly available on the BBSRC website.

BBSRC has an ongoing evaluation programme and conducts regular evaluations of its research investments and the funding schemes which guide them.

Introduction

BBSRC invests over £450 million in bioscience research, training and facilities each year. As a publicly funded body, BBSRC must account to government and other stakeholders for the funds it allocates and explain the achievements of the research it supports. Formal evaluation is an important means of meeting this requirement. Furthermore, evaluation has a vital role in informing funding decisions, BBSRC's policy, and its mechanisms and processes.

This document sets out BBSRC's approach to evaluation, addresses the question of why such evaluation is important, and describes how BBSRC conducts evaluations of its research programmes and schemes. It complements BBSRC's Assessment Strategy [1], which describes key elements of the Council's approach to rigorous assessment, investment appraisal, on-going monitoring and ex-post evaluation.

General Principles of Evaluation

Evaluation is an essential part of the evidence, experience and expert judgement used to make policy decisions and to manage programmes. Evaluation is a systematic and, as far as possible, objective process examining a programme's or policy's relevance, performance, efficiency and impact (both expected and unexpected) in relation to stated objectives. To be fully effective, evaluations must be designed to ensure that lessons learned are fed back into policy-making, programme design and operation.

Successful evaluations involve collaboration between those carrying out the evaluation, the community whose work is being evaluated (the recipients of the research funding), and the bodies that will consider and implement the recommendations. Furthermore, conducting an evaluation is not just a means to an end: the process itself generates lessons for individuals, for teams, and for the organisation as a whole.

Evaluation is distinct from appraisal of ideas at the outset, which is used to decide which policies and projects are taken forward [2]. It is also distinct from monitoring and audit. Whereas monitoring is continuous and focused on day-to-day activities, and audit concentrates on accounting for resources, evaluation takes a step back from the day-to-day running of a programme, scheme or organisation to look at the programme or scheme as a whole, in terms of its objectives, its achievements, and lessons learned.

The UK government provides guidelines on best practice for conducting evaluations of policies, programmes and projects in the HM Treasury Green Book: Appraisal and Evaluation in Central Government [3] and the HM Treasury Magenta Book: Guidance for Evaluation [4]. BBSRC's evaluation framework is informed by these guidelines, and by the Department for Business, Innovation and Skills evaluation strategy [5].

[1] www.bbsrc.ac.uk/documents/1511-bbsrc-assessment-strategy/
[2] The role of appraisal in BBSRC is not discussed in this document. However, BBSRC makes extensive use of appraisal - for example, the peer review assessment of research proposals.
[3] www.gov.uk/government/publications/the-green-book-appraisal-and-evaluation-in-central-governent
[4] www.gov.uk/government/publications/the-magenta-book
[5] www.bis.gov.uk/policies/economics-statistics/economics/evaluation

Why BBSRC Evaluates the Research and Training it Supports

Evaluation is valuable because it helps BBSRC to account for the research and training funds that it allocates, it informs decisions on future research funding, and it helps to improve BBSRC's research policy and practice. Evaluation provides evidence of the economic and societal impacts arising from BBSRC investments, and provides assurance on the overall quality of research within the BBSRC portfolio.

Accountability: As a publicly funded body, BBSRC needs to account to government and to the public. BBSRC also needs to be accountable to the research community it supports, and to other relevant stakeholders including industry, other funding agencies, charities, and other end-users of research. Evaluation provides evidence of achievements and progress in BBSRC-funded research, enabling the Council to demonstrate its effectiveness.

Funding decisions: Evaluation contributes to future funding decisions at a number of levels, both externally and internally:

- Justifying funding: the Department for Business, Innovation and Skills (BIS) requires Research Councils to bid for future budgets. As part of the negotiations, Councils must submit evidence of the impact of the funds that they previously allocated. Evaluation provides both quantitative and qualitative evidence of achievements and impacts from funded research and training. There is increasing emphasis from BIS and the Treasury on Research Councils demonstrating the economic and societal impact of their research funding, and evaluation contributes to meeting this goal.

- Internal funding decisions: evaluation enables identification of achievements, progress against research objectives, and reasons for unsatisfactory progress. This facilitates the development of a strategic overview, setting the framework for future funding decisions.

The findings from past evaluations played an important role in providing evidence of the impact of BBSRC-funded research in the 2004, 2007 and 2010 Spending Reviews, and contributed to the good allocation BBSRC received from the Science Budget.

Example: BBSRC's Research Equipment Initiative was reviewed in 2012. The Review Panel concluded that the initiative had provided excellent support for mid-range equipment to the UK bioscience community, and represented very good value for money. Lessons from the evaluation were subsequently used to inform the development of the new Advanced Life Sciences Research Technology Initiative.

Example: An evaluation of BBSRC's Follow-on Fund in 2014 demonstrated the success of the scheme in enabling ideas arising from BBSRC research to realise wider benefits through their application. The evaluation contributed to BBSRC's decision to retain the Follow-on Fund scheme, and helped inform the balance of investment between this scheme and the newly introduced Impact Acceleration Accounts.

BBSRC policy and practice: The evidence and strategic insight discussed above is also valuable for BBSRC's policy and practice:

- Development of research policies and specific research programmes: the results of evaluation inform policy decisions and the design of new schemes, programmes and processes.

- Maintaining and improving performance: evaluation enables managers to (i) share with others the lessons they have learnt and the good practice that they have developed; and (ii) identify weaknesses and improve processes, which is especially helpful for ongoing schemes such as responsive mode.

Example: BBSRC's Industrial CASE studentship schemes were reviewed in 2013. The evaluation showed that the schemes supported high-quality training and that the industry placement was an essential component that delivered wide-ranging benefits to the students. However, the evaluation noted that the requirement to participate in a placement was not being met for all students. As a result, BBSRC strengthened its processes for monitoring the uptake of placements. In addition, the evaluation contributed to BBSRC's decision to modify the mandatory requirement for industry to make financial contributions to the studentship, thereby encouraging greater participation from small and medium-sized enterprises.

Indirect benefits: The questioning nature of evaluation and the process of gathering feedback from the scientific community can yield other benefits. Examples include improving the research community's perception of the Research Council ("we value your views") and motivating BBSRC programme staff. In addition, evaluation can help foster changes in behaviour among groups involved in the evaluation process. For example, an increased emphasis on identifying the broader impacts of research may help embed a culture within the scientific community that recognises the importance of deriving economic and societal impact from excellent research.

Example: A rolling evaluation programme of BBSRC Research Committee responsive mode portfolios (completed in 2009) identified a number of issues that were subsequently addressed by BBSRC. For example:

- The strategic Longer and Larger grants (sLoLa) programme was introduced in response to the declining number of research grants of over three years' duration.
- Committee Priority Areas were withdrawn, as they were too numerous and poorly understood by the research community.
- Further impetus was given to the development of systems to collect data on the longer-term impacts of research funding, prompted by the repeated observation that final reports do not fully capture these outcomes. Outcomes data are now collected through the researchfish system for up to five years after the award has ended.

Reporting

The results and recommendations of each evaluation are collated into a report and submitted to BBSRC senior management and the appropriate BBSRC Strategy Panel(s) for consideration.

Two of the main reasons for conducting evaluations are to inform funding decisions and to improve performance. For this to happen, the findings of evaluations need to be fed back into decision-making and strategic planning. BBSRC's senior management and BBSRC Strategy Panels consider the conclusions from evaluation reports and whether and how they might best be addressed. Specifically:

- BBSRC uses findings from evaluations to inform bids for future funding at the level of both the Science Budget and the BBSRC allocation, and as part of the regular reporting of performance.

- BBSRC senior management and BBSRC's Strategy Panels use evaluation findings to inform strategic planning and funding decisions, for example: the funding balance (responsive mode, schemes, etc.); the design of new initiatives, and deciding which initiatives to fund; the design of new schemes (or reform of existing schemes); and the review of existing, and design of new, mechanisms and processes.

Dissemination

BBSRC ensures that the results from its evaluations are widely disseminated. Evaluation reports are made publicly available through publication on the BBSRC website. The reports of past evaluations can be accessed at: www.bbsrc.ac.uk/researchevaluation

Future developments

BBSRC is committed to the evaluation of its research and training portfolio. We continue to strengthen and improve our evaluation methods as we gain more experience of evaluation and as new methods are developed. We also continue to work with the other Research Councils (through RCUK's Performance Evaluation Network) to coordinate our approaches to evaluation and identify best practice.

Related activities

The evaluation programme is closely linked to other BBSRC activities that provide evidence on performance and achievement:

- Institute assessment [6]: BBSRC conducts quinquennial reviews of strategic investments at research institutes.
- Impact evidence reports [7]: a programme which produces qualitative impact evidence in the form of case studies.
- Benefits realisation: a programme which drives and captures the intended positive benefits from the investments BBSRC makes in major capital projects.

[6] www.bbsrc.ac.uk/about/policies/reviews/operational/1210-report-of-iae2011/
[7] www.bbsrc.ac.uk/news/impact/

BBSRC's Mechanisms for Evaluation

Evaluation Methodology

BBSRC's current evaluation methodology is described in Appendix 1. It has been developed both from our previous experience and from international best practice in research evaluation. The methodology is continually improved and updated in the light of experience, with the aims of strengthening our evaluation methods, incorporating the assessment of the economic and societal impacts of research into the evaluation process, and reducing the burden on the research community.

Selecting areas for evaluation

When prioritising evaluations, BBSRC focuses on its major investments. An initiative or research area may be selected to provide evidence of performance against objectives within BBSRC's Strategic Plan, BBSRC's Delivery Plan, or government priorities. Other drivers include how the research will continue to be funded, providing assurance on performance across the whole portfolio, and avoiding excessive burden being placed on the research community by the evaluation programme. BBSRC works with RCUK to link the BBSRC evaluation programme to the wider evaluation of major cross-council programmes (e.g. cross-council funding of bioinformatics and genomics).

Evaluation Timetable

BBSRC has a timetable for the evaluation of its research grants portfolio to ensure that its schemes and programmes are evaluated regularly. Specifically:

- Major research investments: two or three each year. BBSRC makes major research investments through its responsive mode funding and research initiatives (time-limited research funding in strategically important areas). Evaluations of these investments focus on discrete scientific fields or priorities within the portfolio. Where appropriate, BBSRC aims to evaluate similarly themed initiatives and/or responsive mode research together, identifying common lessons and recommendations.

- Other schemes: as resources allow. BBSRC operates a number of other schemes with specific objectives that are not necessarily defined by the scientific field they support (e.g. knowledge exchange schemes, studentships and fellowships, New Investigator, and International Partnering Award schemes). Schemes are selected for evaluation in consultation with colleagues from BBSRC's Science Group and Innovation & Skills Group.

The evaluation timetable is reviewed annually by BBSRC's senior management.

Appendix 1: BBSRC's Evaluation Methodology

BBSRC's research evaluation involves a number of key steps:

1. Defining the purpose and objectives of the evaluation: what is the evaluation for, who is the audience, and how will they use the results?
2. Defining the scope of the evaluation: which programmes or research initiatives will it cover, which grants will be included, and over which period?
3. Designing the methodology: what evidence is needed to fulfil the evaluation's objectives, and what methodologies will be used to gather this evidence?
4. Conducting the evaluation: will the evaluation take place in-house or be commissioned externally, and how will the analysis be conducted and the results reported, to ensure that they are relevant and useful to the target audience?
5. Reporting the results and conclusions to relevant audiences: submission of reports to BBSRC senior management and BBSRC Strategy Panels, and dissemination of the results to the research community and wider public.
6. Ensuring that the conclusions are taken into account in decision making.

Grant holders' submissions to researchfish [8] are a primary source of data on the outcomes and achievements of BBSRC investments.

[8] For details see: www.rcuk.ac.uk/research/researchoutcomes/

Defining the purpose and objectives

The purpose and objectives of the evaluation are decided at the outset and aim to address the reasons for evaluation: assessing performance, identifying achievements, justifying funding, accountability, and informing BBSRC policy and practice. They define the evaluation's terms of reference, which provide the guidelines for the evaluators and peer reviewers throughout the course of the evaluation; a minimal sketch of such terms of reference follows below.
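As a purely illustrative sketch in Python (the class, field names and values here are invented for illustration and are not drawn from BBSRC systems or documents), the terms of reference fixed in steps 1-3 can be thought of as a small structured record agreed before any evidence is gathered:

    from dataclasses import dataclass, field

    # Hypothetical sketch: an evaluation's terms of reference, mirroring
    # steps 1-3 above (purpose, scope, methodology). Illustrative only.
    @dataclass
    class EvaluationTermsOfReference:
        purpose: str                  # step 1: what the evaluation is for
        audience: list                # step 1: who will use the results
        scope_scheme: str             # step 2: which programme or scheme
        scope_years: tuple            # step 2: first and last award years covered
        methods: list = field(default_factory=list)  # step 3: evidence-gathering tools

    tor = EvaluationTermsOfReference(
        purpose="Assess performance against scheme objectives",
        audience=["BBSRC senior management", "Strategy Panel"],
        scope_scheme="Example studentship scheme",  # hypothetical
        scope_years=(2005, 2010),
        methods=["peer review", "grant-holder survey", "bibliometrics"],
    )
    print(tor.purpose)

In practice the terms of reference are a prose document; the point of the sketch is only that purpose, scope and methods are fixed up front and then guide the rest of the process.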

BBSRC uses logic charts to define the framework for evaluation of its research programmes and schemes. Logic charts are diagrams representing the objectives and desired impacts of a project or scheme. They put the scheme in its wider context, showing its links to the longer-term aims of the organisation within which it sits. Logic charts comprise a number of levels:

- Overall objectives: high-level objectives, often found in policy statements.
- Scheme objectives: may be strategic, structural and technical.
- Activities: the actions used to support the scheme's objectives.
- Immediate impacts: outputs and effects expected during the research supported by the scheme.
- Intermediate impacts: expected to be achieved at, or shortly after, the end of the research supported.
- Economic and societal impacts: ultimate effects that may be expected some time after the research supported by the scheme ends.

The levels link hierarchically: immediate impacts result from the activities, intermediate impacts should achieve the scheme objectives, and ultimate economic and societal impacts should demonstrate that the overall objectives have been achieved. Evaluation focuses on the extent to which the three impact levels of the logic chart have been achieved, in relation to the objective and activity levels. Detailed logic charts are prepared for each evaluation.
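To make the hierarchy concrete, the sketch below represents a logic chart as a simple mapping and prints the expected links between levels. It is illustrative only: the level names follow the list above, while the entries are invented examples, not taken from any BBSRC logic chart.

    # Illustrative only: a logic chart as a mapping from level to entries.
    # Level names follow the list above; all entries are invented.
    logic_chart = {
        "overall objectives": ["Advance UK bioscience"],                   # hypothetical
        "scheme objectives": ["Support mid-range equipment"],              # hypothetical
        "activities": ["Fund equipment grants"],                           # hypothetical
        "immediate impacts": ["New instruments in routine use"],           # hypothetical
        "intermediate impacts": ["Publications and collaborations"],       # hypothetical
        "economic and societal impacts": ["New products and processes"],   # hypothetical
    }

    # The hierarchy described above: each lower level should lead to the
    # next, and the impact levels evidence the objective levels.
    chain = [
        ("activities", "immediate impacts"),
        ("immediate impacts", "intermediate impacts"),
        ("intermediate impacts", "scheme objectives"),
        ("economic and societal impacts", "overall objectives"),
    ]
    for source, target in chain:
        print(f"{source} -> should deliver -> {target}")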

Defining the scope

The scope of the evaluation depends on the size and type of programme or scheme to be reviewed, and on the nature of the evaluation's objectives. The primary driver within BBSRC evaluations is to provide evidence of the scientific quality of the research within the portfolio, and of its economic and societal impact. Historically, economic impact has been derived from excellent science, and it is a fundamental assumption that poor quality research will have little positive impact.

Designing the methodology

A variety of evaluation tools are available, each with advantages and disadvantages, and each suited to different evaluation questions. To ensure that the data collected are reliable, a combination of tools is used in order to triangulate the results (see the sketch after the list below). The following descriptions of the available methodologies are annotated with comments relevant to the BBSRC context.

Peer review / expert judgment
Description: opinions and recommendations are sought from experts specific to the field.
Comments: extensively used by BBSRC, both for grant appraisal and for evaluation of responsive mode and initiatives. Will remain at the heart of BBSRC's evaluation procedures.

Survey
Description: asking multiple parties a series of questions, generating both quantitative and qualitative data.
Comments: questionnaire responses from grant holders are one of the primary sources of information in BBSRC evaluations.

Bibliometrics
Description: analysis of publication and citation data, used as an indicator of science quality.
Comments: analyses of publication data have been an important part of previous evaluations, although BBSRC does not use more detailed bibliometric tools.

International benchmarking
Description: comparison of UK research with research quality in other countries, usually by peer review.
Comments: BBSRC undertakes strategic assessment of its research in the international context, through the inclusion of international members on evaluation Review Panels and/or international referees.

Case study
Description: in-depth analysis of a sample of a specific issue or issues.
Comments: increasingly used by BBSRC, particularly to inform evaluation of economic impact.

Economic analysis
Description: identifying the economic benefits and effects of research activities.

Social analysis
Description: identifying and studying the contribution of research to quality of life.

Historical tracing
Description: tracing backward from a research outcome to identify precursor developments.

Comments (on the three methods above): with the current government emphasis on outcomes and impact, these approaches are increasingly useful for demonstrating BBSRC's achievements against the high-level objectives identified in its Royal Charter and Strategic Plan, namely improved quality of life and economic prosperity in the UK. However, these methods are resource-intensive and therefore expensive.

Meta-evaluation
Description: drawing together the results of single evaluations to identify high-level and wide-reaching issues or conclusions.
Comments: likely to be undertaken intermittently by BBSRC.
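As a minimal sketch of what triangulation means in practice (the grant references and verdicts below are invented for illustration), the idea is simply that a finding is stronger where independent evidence sources agree, and worth probing where they diverge:

    # Illustrative triangulation: two independent evidence sources give a
    # verdict per grant (all values invented). Agreement corroborates a
    # finding; divergence flags where the evaluation should dig deeper.
    peer_review = {"BB/X00001/1": "high", "BB/X00002/1": "high", "BB/X00003/1": "low"}
    survey      = {"BB/X00001/1": "high", "BB/X00002/1": "low",  "BB/X00003/1": "low"}

    for grant in sorted(peer_review):
        agreed = peer_review[grant] == survey[grant]
        print(f"{grant}: {'corroborated' if agreed else 'needs follow-up'}")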

When deciding the methodology to be used for a specific evaluation, BBSRC considers what it is possible to measure accurately, given the available time and resources. We aim to gather the most appropriate and reliable data or indicators that can be used to assess research quality and the wider economic and societal impact of the funded research. These data also allow comparisons to be made against expectations or targets, and provide scope for international benchmarking.

For some of the expected outputs of research funding, it is relatively easy to measure performance (e.g. the number of research articles published). For other areas, however, providing reliable indicators is challenging. For example, the economic and societal impacts of research funding often have a significant time-lag, and the most significant impacts can take decades to emerge. There are also difficulties with the attribution of impacts from research: it can prove very difficult to relate major developments back to a single research grant or research group.

To address some of the difficulties in measuring economic and societal impacts, BBSRC measures progress towards achieving these outcomes. Milestones are identified that may reflect the steps needed to realise the greatest impact from our research funding (e.g. development of intellectual property, collaboration with industry, follow-on funding from agencies supporting more strategic research). These serve as proxies and enable progress to be assessed; a minimal counting sketch follows the lists below.

Past evaluations have measured a wide variety of outputs and outcomes.

Research quality:
- research articles and other publications
- submissions to electronic databases
- new resources, tools and technologies
- further funding
- academic collaborations
- international rankings (e.g. citation analysis)

Economic and societal impacts:
- skills development and training
- support for early-career scientists
- new products, processes and technologies
- intellectual property
- spin-out companies
- knowledge exchange with end-users
- industrial collaborations
- contributions to government policy (e.g. the 3Rs)
- science communication and public engagement
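In data terms, proxy milestones of this kind reduce to tallies over per-grant outcome records. The sketch below is illustrative only: the record format, field names and grant references are invented, and do not reproduce the researchfish schema.

    from collections import Counter

    # Invented example records; real outcome data come from grant holders'
    # researchfish submissions, whose actual schema is not reproduced here.
    outcomes = [
        {"grant": "BB/X00001/1", "type": "publication"},
        {"grant": "BB/X00001/1", "type": "intellectual_property"},
        {"grant": "BB/X00002/1", "type": "industrial_collaboration"},
        {"grant": "BB/X00002/1", "type": "further_funding"},
        {"grant": "BB/X00002/1", "type": "publication"},
    ]

    # Milestones treated as proxies for progress towards longer-term impact.
    proxy_types = {"intellectual_property", "industrial_collaboration", "further_funding"}

    tallies = Counter(record["type"] for record in outcomes)
    proxy_total = sum(count for kind, count in tallies.items() if kind in proxy_types)
    grants = {record["grant"] for record in outcomes}

    print(tallies)
    print(f"impact proxies: {proxy_total} across {len(grants)} grants")

Counts like these say nothing about attribution or time-lags; they are progress signals that the expert Panel interprets alongside the other evidence.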

Conducting the evaluation

The evaluation team within the Corporate Policy and Strategy Group is responsible for the evaluation of BBSRC's research. Evaluations are conducted jointly by the evaluation team and the relevant programme team in Science Group or Innovation & Skills Group. In addition, other teams in BBSRC may occasionally commission specific evaluations from external agencies.

Evidence gathered during the evaluation process is presented to a Panel of independent experts from academia and industry, which may include international representation. Panels assess the performance of the programme in relation to its objectives, but also address wider issues, such as whether there were better ways to achieve the stated objectives or better uses for the resources.

Reporting the results

A report including results and recommendations is produced for each evaluation, and made publicly available on BBSRC's website.

Addressing the conclusions

Evaluation reports are submitted to BBSRC's senior management for consideration. Recommendations are taken forward by the most relevant Strategy Panel(s), and the extent to which the recommendations are addressed is monitored by the Corporate Policy and Strategy Group.