The CWTS Leiden Ranking and other advanced bibliometric tools

The CWTS Leiden Ranking and other advanced bibliometric tools. Paul Wouters (with Nees Jan van Eck, Ed Noyons and Mark Neijssel). Road to Academic Excellence: Russian universities in rankings by subject. Moscow, 9-10 April 2015

Centre for Science and Technology Studies (CWTS): a research center at Leiden University focusing on quantitative studies of science (bibliometrics and scientometrics). Bibliometric contract research: monitoring & evaluation, advanced analytics, training & education. 1

International research front 2

www.leidenranking.com 3


CWTS Leiden Ranking: provides bibliometric indicators of scientific impact and scientific collaboration. Calculated from Web of Science data. Includes the 750 largest universities worldwide in terms of Web of Science publication output. 5

Differences with other university rankings: no aggregation of different dimensions of university performance (research, teaching, etc.) into a single overall performance indicator; exclusive focus on measuring universities' scientific performance; no dependence on survey data or data provided by universities; advanced bibliometric methodology. 6

Advanced bibliometric methodology: percentile-based indicators to properly deal with highly skewed citation distributions; exclusion of publications not aimed at the international research front; normalization for differences between fields in citation and collaboration practices, with field definitions based on an algorithmically constructed publication-level classification system of science and fractional counting of co-authored publications. 7

Average-based vs. percentile-based indicators 8

Average-based vs. percentile-based indicators 9
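The contrast behind the two slides above can be sketched in code: on a highly skewed citation distribution, an average-based indicator is dominated by a single outlier, while a percentile-based indicator such as PP(top 10%) is not. This is an illustrative sketch with invented citation counts, not CWTS code:

```python
def mean_citation_score(citations):
    """Average-based indicator: sensitive to a single outlier."""
    return sum(citations) / len(citations)

def pp_top10(citations, field_citations):
    """Percentile-based indicator: the share of a unit's publications
    that belong to the top 10% most cited of the field (simple
    quantile cut-off for illustration)."""
    threshold = sorted(field_citations)[int(0.9 * len(field_citations))]
    top = sum(1 for c in citations if c >= threshold)
    return top / len(citations)

field = list(range(100))          # hypothetical field: citations 0..99
unit = [2, 3, 5, 8, 95]           # one highly cited paper
outlier_unit = [2, 3, 5, 8, 950]  # same unit, extreme outlier

print(mean_citation_score(unit))          # 22.6
print(mean_citation_score(outlier_unit))  # 193.6 -- the mean explodes
print(pp_top10(unit, field))              # 0.2
print(pp_top10(outlier_unit, field))      # 0.2 -- the percentile is unchanged
```

The outlier multiplies the average by almost nine, yet the percentile-based score does not move: the paper was already in the top 10% of its field.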

Exclusion of publications. The following publications are excluded: non-English publications; publications in national scientific journals, trade journals, and popular magazines; publications in fields with a low citation density; retracted publications. 10
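The exclusion rules above amount to a simple filter. The field names below are assumed for illustration; they are not the actual Leiden Ranking data model:

```python
def eligible(pub):
    """Keep only publications aimed at the international research front."""
    return (
        pub["language"] == "en"
        and pub["journal_scope"] == "international"  # not national/trade/popular
        and not pub["low_citation_density_field"]
        and not pub["retracted"]
    )

pubs = [
    {"language": "en", "journal_scope": "international",
     "low_citation_density_field": False, "retracted": False},
    {"language": "pt", "journal_scope": "national",
     "low_citation_density_field": False, "retracted": False},
]
print([eligible(p) for p in pubs])  # [True, False]
```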

Exclusion of publications and the effect on PP(top 10%) 11

Publication-level classification system of science: fields are defined at the level of individual publications rather than journals. Publications are clustered into ~800 fields based on citation relations, using the smart local moving community detection algorithm. 12
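For a flavour of how fields can emerge from citation relations alone, here is a deliberately naive label-propagation sketch on a toy citation graph. The actual Leiden Ranking uses the smart local moving algorithm, which is considerably more sophisticated; this is only an illustration of the idea of publication-level clustering:

```python
from collections import Counter

def label_propagation(edges, nodes, iterations=10):
    """Cluster publications by repeatedly adopting the most common
    cluster label among each publication's citation neighbours."""
    neigh = {n: set() for n in nodes}
    for a, b in edges:  # treat citation links as undirected relations
        neigh[a].add(b)
        neigh[b].add(a)
    label = {n: n for n in nodes}  # start: each publication is its own field
    for _ in range(iterations):
        for n in sorted(nodes):
            if neigh[n]:
                label[n] = Counter(label[m] for m in neigh[n]).most_common(1)[0][0]
    return label

# Two citation cliques -> two "fields"
edges = [(1, 2), (2, 3), (1, 3), (4, 5), (5, 6), (4, 6)]
fields = label_propagation(edges, nodes=range(1, 7))
print(fields)  # publications 1-3 share one label, 4-6 share another
```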

Full vs. fractional counting: full counting is biased in favor of fields with a lot of collaboration and a strong citation advantage for collaborative publications. 13
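The difference between the two counting methods fits in a few lines; the affiliation data below is invented for illustration:

```python
def count_publications(pubs, university, fractional=False):
    """pubs: one list of university affiliations per publication.
    Full counting credits each participating university with 1;
    fractional counting divides the credit by the number of
    contributing universities."""
    total = 0.0
    for affiliations in pubs:
        if university in affiliations:
            total += 1.0 / len(set(affiliations)) if fractional else 1.0
    return total

pubs = [
    ["Leiden"],                       # single-university paper
    ["Leiden", "Oxford"],             # two-university collaboration
    ["Leiden", "Oxford", "Harvard"],  # three-university collaboration
]
print(count_publications(pubs, "Leiden"))                   # 3.0
print(count_publications(pubs, "Leiden", fractional=True))  # 1 + 1/2 + 1/3, about 1.83
```

Under full counting a heavily collaborating university collects a full credit for every joint paper, which is exactly the bias the slide describes; fractional counting shares the credit out.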

U-Multirank 14

Some conclusions: There is no such thing as "overall university performance"; do not mix up different dimensions of university performance. Use an appropriate bibliometric methodology: use percentile-based indicators, not average-based indicators; do not blindly follow the selection of journals made by database producers (exclude non-scientific journals and journals with a national focus); use fractional counting, not full counting. 15

National research missions 16

Research leaders face key questions: How should we monitor our research? How can we profile ourselves to attract the right students and staff? How should we divide funds? What is our scientific and societal impact? What is actually our area of expertise? How is our research connected across disciplines? 17

Research leaders need more, not less, strategic intelligence. Increasing demand for information about research: hyper-competition for funding; globalization; industry-academic partnerships; interdisciplinary research challenges; institutional demands on research & university management. Increasing supply of data about research: web-based research; a deluge of data-producing machines and sensors; the increased social scale of research (international teams); large-scale databases of publications, data, and applications. 18

New trends in assessment: increased bibliometric services at university level, available through databases; increased self-assessment via free bibliometric tools on the web (h-index, Publish or Perish, etc.); emergence of altmetrics; increased demand for bibliometrics at the level of the individual researcher; societal impact measurements required; career advice: where to publish? 19

Two examples of comparisons (using the new normalization). Leiden Ranking of universities: a university ranking; the only dimension is research; a limited set of the 750 most productive universities worldwide; yearly update. Brazilian Research Ranking: includes all organizations (not only universities); distribution over 5 regions (N, NE, CO, SE and S); selection of 110 organizations meeting the current criterion (P > 100).

Exploring the results: all organizations in Brazil with more than 100 publications registered in the WoS over ten years. The only dimension is research: impact and collaboration. Developed at CWTS in collaboration with Brazilian sandwich PhDs. http://brr.cwts.nl


CWTS Monitor - Meaningful Metrics. A new interactive approach to bibliometric analysis. A powerful web-based application: user-friendly reporting interface; robust, cleaned WoS database run by CWTS; fair and correct benchmarking using state-of-the-art indicators; highly configurable to the client's specific needs; professional bibliometric reporting in your hands. Scientists affiliated with the CWTS institute of Leiden University provide expert support. 23

CWTS Monitor: Select, Visualise, Conclude 24

The Leiden Manifesto (forthcoming)
1. Quantitative evaluation should support qualitative, expert assessment.
2. Measure performance in accordance with the research missions of the institution, group or individual researcher.
3. Enable indicators to promote locally relevant research.
4. Keep data collection and analytical processes open, transparent and simple.
5. Allow those evaluated to verify data and analysis.
6. Account for variation by field in publication and citation practices.
7. Base the assessment of individual researchers on qualitative judgment of their portfolio.
8. Avoid misplaced concreteness and false precision.
9. Recognize systemic effects of assessment and indicators.
10. Scrutinize indicators regularly and update them.
In collaboration with: Diana Hicks (Georgia Tech), Ismael Rafols (SPRU/Ingenio), Sarah de Rijcke and Ludo Waltman (CWTS) 25

The best decisions are taken by combining robust statistics with sensitivity to the mission and nature of the research that is evaluated. Decision making about science must be based on high quality processes informed by the highest quality data. 26