Snowball Metrics Webinar


1 Snowball Metrics Webinar
Dr John Green, Chair of Snowball Metrics Steering Committee, University of Cambridge
Scott Rutherford, Director of Research and Enterprise, Queen's University Belfast
Peta Stevens, Head of University Strategy Office, University of Cambridge
Dr Stephen Conway, Associate Director of Research Services, University of Oxford
Anna Clements, Enterprise Architect, University of St Andrews
Dr Lisa Colledge, Snowball Metrics Project Director, Elsevier
28 November

2 Dr John Green, Chair of the Snowball Metrics Steering Committee, University of Cambridge

3 Snowball Metrics are
- Tried and tested methodologies that are available free of charge to the higher education sector
- Defined and agreed by Higher Education Institutions to support their strategic decision making
- Absolutely clear definitions enable apples-to-apples comparisons

4 Vision
Snowball Metrics:
- Give insight into institutional strengths and weaknesses relative to peers
- Aspire to become global standards

5 Scott Rutherford, Director of Research and Enterprise, Queen's University Belfast

6 The current situation
External demand
- Stakeholders ask for evidence of whether their investments have been worthwhile
- Heavy burden to provide data to multiple organisations, each in a different format
Internal demand
- Increasing thirst for information to drive evidence-based strategies
- But useful management information is difficult to find
  - Represents the perspective of external stakeholders
  - Often out of date
- There is no balanced suite of metrics available

7 The opportunity to change things for the better
- Snowball Metrics project partners are working collectively to shape research information management ourselves
- Strategic alignment of resources to strengths and weaknesses can only be done by knowing our position relative to our peers
- A JISC report confirmed that many institutions in the sector feel the same way
  - JISC: Joint Information Systems Committee (UK), the UK's expert on information and digital technologies for education and research
- The value of working with a supplier

8 Snowball Metrics as well as the REF?
CURRENT SITUATION → DESIRED SITUATION
- Snapshot every 5-6 years → Snapshots at least every year
- Focused approach to measuring outputs and impacts → Broad range of measures
- Strategic allocation of researchers → Comparable allocation of researchers
- Changing methodologies → Stable approach
REF: Research Excellence Framework
- Assesses the quality of research in UK higher education institutions
- Will inform the selective allocation of HEFCE's research funding (HEFCE: Higher Education Funding Council for England)
- Provides accountability for public investment in research and evidence of the benefits
- Establishes reputational yardsticks for comparison

9 Peta Stevens, Head of Research Strategy Office, University of Cambridge

10 Benefits of Snowball Metrics within institutions
- Drive accessibility to good quality data
- Remove ambiguity in departmental reviews
- Enable judgment of our size and success in relation to our peers
- Provide a current picture
- Output metrics provide better insight into quality than is available now, Field-Weighted Citation Impact in particular
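For reference, Field-Weighted Citation Impact for a single publication is conventionally defined (this is Elsevier's Scopus/SciVal definition; the precise Snowball recipe should be taken from the Recipe Book) as

\[ \mathrm{FWCI}_i = \frac{c_i}{e_i} \]

where c_i is the number of citations received by publication i and e_i is the average number of citations received, over the same citation window, by all publications of the same subject field, document type and publication year. An FWCI of 1.00 therefore means a publication is cited exactly as often as the world average for comparable publications.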

11 Opportunities for Snowball Metrics outside institutions
- Moving towards bibliometrics for output measures in the STEM subjects in the REF would reduce work
  - Value in being able to backtrack performance
  - STEM: Science, Technology, Engineering, Mathematics
- Funding bodies will be interested in the metrics
  - Quality of outputs as well as funding

12 Dr Stephen Conway, Associate Director of Research Services, University of Oxford

13 Snowball Metrics shared in the Recipe Book
Input Metrics:
- Applications Volume
- Awards Volume
Process Metrics:
- Income Volume
- Market Share
Output Metrics:
- Scholarly Output
- Citation Count
- h-index
- Field-Weighted Citation Impact
- Publications in Top Percentiles
- Collaboration

14 The value of agreeing definitions
- Certainty that we are comparing like with like
- Identify a common framework for comparisons
  - Institutional organizational structures are different and not suitable to provide context
  - Agreed a common framework in the UK to benchmark within: HESA Cost Centres
  - We needed to assign our data to the HESA Cost Centre framework
- HESA Cost Centre: a grouping of researchers by field that allows meaningful comparisons between different types of data within a discipline
- HESA: Higher Education Statistics Agency
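A minimal sketch of what assigning institutional data to that common framework can look like in practice, using invented department names, invented records and an illustrative cost-centre mapping (the authoritative list of HESA Cost Centres is published by HESA):

```python
# Illustrative only: map internal departments onto HESA cost centre labels
# so that records keyed by local structures can be compared across institutions.
DEPT_TO_COST_CENTRE = {
    "Dept of Chemistry": "Chemistry",
    "School of Physics & Astronomy": "Physics",
}

# Hypothetical institutional records tagged with the owning department.
records = [
    {"type": "award", "dept": "Dept of Chemistry", "value": 120_000},
    {"type": "publication", "dept": "Dept of Chemistry"},
    {"type": "award", "dept": "School of Physics & Astronomy", "value": 75_000},
]

def by_cost_centre(items):
    """Re-key records by HESA cost centre rather than by local department."""
    grouped = {}
    for item in items:
        grouped.setdefault(DEPT_TO_COST_CENTRE[item["dept"]], []).append(item)
    return grouped

for centre, items in sorted(by_cost_centre(records).items()):
    print(centre, "-", len(items), "record(s)")
```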

15 Specifying Awards Volume: removing ambiguity (extracts from the complete Recipe)
- Use aggregated values of awards over the award lifetime, not the value (to be) spent in any financial year
- Date used is the date that the award is entered in the institutional grants system
- Include subsequent financial amendments, supplements and reductions
- Do not include non-financial amendments such as no-cost extensions
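As a rough illustration of how those extracts might be applied to raw grants data, the sketch below totals hypothetical award records by the year in which each award was entered in the grants system, counting the full lifetime value plus financial amendments and ignoring non-financial amendments; the record layout and field names are invented for the example rather than taken from the Recipe Book.

```python
from collections import defaultdict

# Hypothetical award records: full lifetime value, date entered in the
# institutional grants system, and any later amendments.
awards = [
    {"id": "A1", "entered": "2011-03-14", "lifetime_value": 250_000,
     "amendments": [{"type": "financial", "value": 30_000},      # supplement
                    {"type": "non-financial", "value": 0}]},     # no-cost extension
    {"id": "A2", "entered": "2011-11-02", "lifetime_value": 90_000,
     "amendments": [{"type": "financial", "value": -10_000}]},   # reduction
    {"id": "A3", "entered": "2012-01-20", "lifetime_value": 400_000,
     "amendments": []},
]

def awards_volume_by_year(records):
    """Sum lifetime award values (plus financial amendments only), keyed by
    the year in which each award was entered in the grants system."""
    totals = defaultdict(int)
    for award in records:
        year = award["entered"][:4]                      # date entered, not spend date
        value = award["lifetime_value"]
        value += sum(a["value"] for a in award["amendments"]
                     if a["type"] == "financial")        # exclude non-financial amendments
        totals[year] += value
    return dict(totals)

print(awards_volume_by_year(awards))   # {'2011': 360000, '2012': 400000}
```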

16-18 Metrics are tested for feasibility
[Charts not reproduced; ordered according to productivity 2011 (data source: Scopus)]

19 Anna Clements, Enterprise Architect, University of St Andrews

20 About euroCRIS
- euroCRIS is dedicated to the development of Research Information Systems and their interoperability
- euroCRIS produces CERIF, a standard data model that supports interoperability between CRIS
- CRIS: Current Research Information System
- CERIF: Common European Research Information Format

21 Incorporating existing standards into Snowball Metrics
- Using existing standards reduces the burden on institutions
  - HESA structure and data
  - euroCRIS's new Indicators Task Group will map Snowball Metrics to CERIF
- CERIF as a common language opens possibilities for global generation, use and sharing to provide context
- HESA: Higher Education Statistics Agency
- CRIS: Current Research Information System
- CERIF: Common European Research Information Format

22 euroCRIS membership is global and covers all interested groups
University of Oxford, University College London, University of Cambridge, Imperial College London, University of Bristol, University of Leeds, Queen's

23 euroCRIS Strategic Partners

24 Dr Lisa Colledge, Snowball Metrics Project Director, Elsevier

25 How can you get involved?
- Engagement by the global sector is essential for Snowball Metrics to become global standards that provide context in institutional strategic planning
- Feedback on your experiences of implementing the recipes will enrich the sector's knowledge
  - Do you share the need for standard metrics to provide context?
  - How relevant to you is the metrics framework as best practice?
  - To what extent do the metrics recipes fit your data?
  - What would help you to adopt Snowball Metrics?

26 Questions?

27 THANK YOU FOR YOUR ATTENTION!