Evaluating your research performance and analysing journal quality using SciVal. Notes and documents following the workshop


DEAKIN UNIVERSITY
FACULTY OF SCIENCE, ENGINEERING AND BUILT ENVIRONMENT
Research Development Workshop Series

Workshop: Evaluating your research performance and analysing journal quality using SciVal
Thursday 29 October, via video conference

Notes and documents following the workshop

Chair: Dr Anne Drake (Manager, Research and Innovation)
Invited speaker for this workshop: Dr Steven Riddell (Research Consultant, Elsevier)
Secretary: Teresa Treffry

Contents
Introduction: Dr Anne Drake (Manager, Research and Innovation) ... 3
Presentation overview ... 4
Quick reference guide to SciVal
Usage Guidebook
SciVal Metrics Guidebook ... 35
SciVal Manual ... 141

All the above are bookmarked for reference. Links are also given in the overview notes on page 4.

This is the final workshop planned for 2015. The notes and slide presentations from all workshops for 2014 and 2015 are available via the following links:
SEBE staff intranet -
SEBE Information for HDR students -

Introduction. Chair: Dr Anne Drake

This is the final presentation in the 2015 SEBE workshop series. Previous workshops have looked in depth at the issues to be considered when entering into an industry partnership, the skills needed for clear scientific writing, and the many opportunities to improve our CVs and track records by applying for prizes and awards. The guest speaker for this session is Dr Steven Riddell (Research Consultant, Elsevier). Steven will demonstrate the applications of SciVal, software recently implemented by the University. Deakin has a subscription to this research performance assessment tool, which uses data from Scopus. SciVal provides easy access to the research performance of 5,500 research institutions and 220 nations worldwide. It enables you to visualise research performance, benchmark relative to peers, develop collaborative partnerships and analyse research trends.

Overview and notes from workshop. Secretary: Teresa Treffry

Presentation: Dr Steven Riddell stressed the importance of maintaining a single profile - for ERA, collaborations and job applications, among other things. As a one-off exercise, from January 2016 every academic will have a Scopus profile through Deakin. Anyone who has a Scopus profile can be brought into SciVal, enabling you to create working groups when collaborating on a project or preparing grant applications. A Scopus login is needed to access SciVal. [If you don't already have one, see the following: create a Scopus login.] A live demonstration of the ways that SciVal can be used covered an overview of the system, usage and metrics. For detailed information see the following links:
Quick reference guide to SciVal
Usage Guidebook
SciVal Metrics Guidebook
SciVal Manual

Elsevier Research Intelligence
SciVal Quick Reference Guide
Version 2.0, June 2015

Elsevier's new generation of SciVal offers quick, easy access to the research performance of 5,500 research institutions and 220 countries worldwide. A ready-to-use solution with unparalleled power and flexibility, SciVal enables you to navigate the world of research and devise an optimal plan to drive and analyze your performance.

Data source
SciVal is based on output and usage data from Scopus, the world's largest abstract and citation database for peer-reviewed publications. The Scopus database covers over 30 million publications from 1996 until the present, drawn from 21,000 serials from 5,000 publishers. These include:
- 20,000 peer-reviewed journals
- 390 trade publications
- 370 book series
- 5.5 million conference papers

Metrics
SciVal offers a broad spectrum of industry-accepted and easy-to-interpret metrics, including Snowball Metrics, a global standard set of metrics defined and agreed by higher education institutions for institutional strategic decision making through benchmarking. Metrics in SciVal help to measure an institution's or a country's productivity, citation impact, collaboration, subject disciplinarity and more. Additionally, SciVal includes usage data from ScienceDirect, the largest scientific full-text database, with more than 2,500 journals and 26,000 books.

Build your views on the world's research

Visualize research performance
Access comprehensive research performance summaries of any desired research entities, and identify their unique research strengths and multidisciplinary research areas.
- Retrieve at-a-glance, standardized reports instantly
- Access competency maps for all institutions and countries

Benchmark your progress
Compare the performance of any institutions, countries, and pre-defined groups, or create your own research areas and monitor progress over time.
- Perform in-depth analyses to meet your specific objectives by selecting any combination of subject areas and metrics from a comprehensive set
- Identify your relative strengths and weaknesses to optimize your strategy

Develop collaborative partnerships
Identify and analyze existing and potential collaboration opportunities based on publication output and citation impact.
- Explore your institution's current and prospective partnerships on Google maps
- Identify your top collaborating institutions and co-authorship by drilling into specific subject areas and self-defined research topics

Analyze research trends
Analyze the research trends of any Research Area with citation and usage data, to discover the top performers, rising stars and current developments in the field.
- View the overall performance of a Research Area, then dig deeper into the activity and impact of the institutions, countries, authors and journals involved, and adjust your research strategy accordingly
- Usage information complements citation data to give a more complete picture of research performance

Visualize research performance
Comprehensive summaries of any desired research entities, such as institutions, countries, research groups and topics.

The Overview tab provides you with at-a-glance research performance overviews of any selected institutions, countries, research topics and more.

The select-entity panel allows you to select any research entities from:
- Institutions and Groups
- Researchers and Groups
- Countries and Groups
- Research Areas and Groups
Add an institution or a country by typing the name in the search box, and SciVal will provide you with a list of pre-defined institutions, countries and groups to select from.

Select year range from:
- 3 years
- 3 years + current year
- 3 years + current year + beyond current year
- 5 years
- 5 years + current year
- 5 years + current year + beyond current year

Filter subject area using 27 top-level and 334 lower-level subject areas based on the Scopus All Science Journal Classification (ASJC).

The metric theme tabs provide a comprehensive understanding of the selected research entity based on:
- Summary
- Publications
- Citations
- Authors (for institutions and research areas)
- Collaboration
- Competencies (circle map, matrix map)
- Institutions (for countries)

Benchmark your progress
Assess your relative strengths and weaknesses by making custom selections of research groups, indicators and subject areas to compare and benchmark against. Select a country, region or the world from Countries and Groups to benchmark your relative performance, or create research areas using journals and subject classifications to benchmark against a research topic.

The Benchmarking tab provides advanced capabilities to perform in-depth analyses by combining a flexible set of entities and metrics.
- Using the entity panel, select any desired combination of research entities you wish to benchmark. Add institutions or countries by typing in the name, and SciVal will provide you with a list of pre-defined institutions and countries to select from.
- Select a year range between 1996 and the current year.
- Filter subject area using 27 top-level and 334 lower-level subject areas based on the Scopus ASJC.
- Select any combination of metrics from the pull-down list.
- Switch view between chart and table.
- Add researchers, publication sets, research areas and groups by creating your own (see pages 8 and onwards).

Identify and evaluate existing and potential collaboration partners
Get access to a list of institutions that you collaborate with, or have the potential to collaborate with. Start with a worldwide view of your institution's collaboration landscape, and then zoom in to individual collaborating institutions and researchers anywhere in the world.

The Current collaboration tab helps you to explore existing collaboration opportunities, providing a ranking of institutions and authors based on output- and impact-related metrics. The Potential collaboration tab identifies institutions that you haven't yet co-authored any publications with.

Select map view to understand an institution's existing collaboration landscape on Google maps. Click on each region to zoom in and view collaboration country-wise or state-wise.

Select year range from:
- 3 years
- 3 years + current year
- 3 years + current year + beyond current year
- 5 years
- 5 years + current year
- 5 years + current year + beyond current year

Filter subject area using 27 top-level and 334 lower-level subject areas based on the Scopus ASJC, or your self-defined Research Areas.

Select table view to access the list of collaborating institutions.
- Search institutions by name.
- Limit collaborating institutions by region, country and segment using the drop-down box.
- Sort collaborating institutions by impact using: Citations; Citations per Publication; Field-Weighted Citation Impact.
- Select institutions to: assess the output and impact of co-authored publications relative to the performance of the entire institution; view the subject area spread of co-authored publications; identify collaborating authors from each institution and see which authors collaborate with each other.

Evaluate your potential collaboration partners
Once you have identified potential institutions and researchers to collaborate with, you can:
- Glance through the Overview module to: gain a comprehensive overview of selected institutions; identify top authors per subject field of your interest; explore the institution's research competencies.
- Compare candidate institutions using the Benchmarking module to: assess unique strengths of selected institutions by combining different metrics; test scenarios by modeling teams with selected researchers; benchmark performance against potential competitors.
- Review their collaboration partners using the Collaboration module to: find out if anyone from your institution has a co-author relationship; understand the top collaborators per discipline and how beneficial those collaborations are.

Analyze research trends
Analyze the research trends of any Research Area with citation and usage data, to discover the top performers, rising stars and current developments in the field.

The Trends tab provides the ability to perform advanced topic-centric analysis of any Research Area with usage and citation data. Using the entity selection panel, select the Research Area you wish to analyse. Either choose one you have defined, or select from the 334 pre-defined Research Areas based on the Scopus journal classifications (ASJC).

Select year range from:
- 3 years
- 3 years + current year
- 3 years + current year + beyond current year
- 5 years
- 5 years + current year
- 5 years + current year + beyond current year

The Summary tab provides an at-a-glance view of your Research Area. Key metrics at the top of the page highlight the overall research performance. The word cloud gives a visual description of the developments within the field.

Entity tabs provide a comprehensive understanding of the selected Research Area based on:
- Institutions
- Countries
- Authors
- Journals
- Keyphrases

How are keyphrases calculated?
SciVal uses the Elsevier Fingerprint Engine to extract distinctive keyphrases within the Research Area. The text mining applies a variety of Natural Language Processing techniques to the titles and abstracts of the documents in the Research Area in order to identify important concepts. Concepts are matched against a set of thesauri spanning all major disciplines. For each document the distinctive keyphrases are selected based on Inverse Document Frequency (IDF), which incorporates a factor that diminishes the weight of words occurring frequently in the document set and increases the importance of words occurring rarely. The top 50 keyphrases with the highest word weight are then selected to populate the word cloud for your chosen Research Area. Each keyphrase is then given a relevance between 0 and 1, with 1 given to the most frequently occurring keyphrase. Remaining keyphrases are given a value based on their relative frequency. The relevance value dictates the size of the keyphrase in the word cloud.
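The weighting and relevance scheme described above can be sketched in a few lines of Python. This is an illustrative simplification, not the Fingerprint Engine itself: it uses plain word tokens instead of thesaurus-matched concepts, and the function name and toy documents are invented for the example.

```python
import math
from collections import Counter

def keyphrase_cloud(documents, top_n=50):
    """Toy IDF-style word cloud: select the top_n terms by TF-IDF
    weight (terms rare across the set are boosted, ubiquitous terms
    damped), then give each selected term a relevance in (0, 1],
    with 1.0 for the most frequently occurring selected term."""
    tokenized = [doc.lower().split() for doc in documents]
    n_docs = len(tokenized)
    tf = Counter()   # total occurrences of each term in the set
    df = Counter()   # number of documents containing each term
    for tokens in tokenized:
        tf.update(tokens)
        df.update(set(tokens))
    # IDF factor: log(N / df) shrinks toward 0 for terms in every document
    weight = {t: tf[t] * math.log(n_docs / df[t]) for t in tf}
    top = sorted(weight, key=weight.get, reverse=True)[:top_n]
    max_freq = max(tf[t] for t in top)
    # relevance drives the displayed size of each keyphrase in the cloud
    return {t: tf[t] / max_freq for t in top}
```

In the real engine the candidates are thesaurus concepts extracted from titles and abstracts rather than raw tokens, but the down-weighting of common terms and the frequency-based 0-1 relevance follow the same pattern.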

Define your own Research Areas
SciVal offers the flexibility to define your own Research Areas, each representing a field of research defined by you. Research Areas can represent a strategic priority, an emerging area of science, or any other topic of interest, using the following building blocks:
- Search terms: search for publication sets using keyword(s)
- Entities: select and combine any of the following: Institutions (+ groups), Countries (+ groups), Journal categories, Journals
- Competencies: select and combine the competencies of any desired institutions or countries

Note: Computation of Research Areas with more than 1,500 publications can take some hours. You will be notified when the Research Area is available.

Research Areas are made available across the platform to:
- Assess your institution's performance within the field
- Identify top institutions and keywords
- See the publication and citation trends
- Explore new and potential collaboration partners

Pre-defined entities
SciVal is a ready-to-use solution with access to 5,500 pre-defined institutions, 220 countries and groups. Several groups of institutions and countries are made available, such as EU27, US states, German Bundesländer and more. Pre-defined Research Areas are available using the 334 subject areas based on the Scopus All Science Journal Classification (ASJC).

Define Researchers and Groups
SciVal allows you to assess "what if" scenarios by modeling your own team and comparing its performance against your peers.

Adding researchers from the entity selection panel: You can define researchers from the entity selection panel by combining and refining Scopus pre-defined Author Profiles.
- Define a new Researcher by name search.
- Select the author name variants of the researcher you are looking for.
- Click Next step and save your researcher, or Review publications to refine the author's publication list before saving. These changes will be reflected in Scopus within two weeks. [Repeat to define each researcher.]
- Define a group of researchers by selecting and combining your self-defined researchers.

When you combine author name variants, these changes will be reflected in Scopus within two weeks. You can also review the list of publications that will be sent to Scopus for further refinement.

Define Researchers and Groups (continued)
Adding researchers from My SciVal: My SciVal allows you to view all your researchers (self-defined or pre-defined by Scopus), where you can choose, tag and add them to your entity panel.
- Select researchers from the list or by name search.
- Click Add to entity selection panel.
- Select whether to add the researchers to your existing list or replace it. Clicking View in Benchmarking will take you directly to the benchmarking module.
- Click Add to panel.

How is my team performing? Once your research teams are defined, you can benchmark them against institutions, countries, or the world's average using metrics such as Field-Weighted Citation Impact.
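Field-Weighted Citation Impact, mentioned above, is the ratio of the citations a publication actually received to the citations expected for publications of the same field, type and publication year, so 1.00 means exactly the world average. A minimal sketch of the calculation follows; the citation counts are invented for illustration.

```python
def fwci(citations, expected_citations):
    """Field-Weighted Citation Impact for one publication: actual
    citations over the average citations of similar publications
    (same subject field, publication type and year)."""
    return citations / expected_citations

def set_fwci(publications):
    """FWCI for a publication set: the mean of the per-publication
    FWCIs, so each output counts equally regardless of field."""
    return sum(fwci(c, e) for c, e in publications) / len(publications)

# A paper cited 12 times, where similar papers average 8 citations,
# sits 50% above the world average:
print(fwci(12, 8))                   # 1.5
# Two papers, one above and one below average, balance out:
print(set_fwci([(12, 8), (4, 8)]))   # 1.0
```

Because the expected value is field-specific, a team in a low-citation field can be compared fairly against one in a high-citation field, which is why the guide recommends FWCI for benchmarking modeled teams.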

Importing researchers and groups: You can upload up to 300 Scopus Author IDs (one AU-ID per researcher) in one action, rather than creating each researcher one by one.
- Click Define a new entity, then Import a list of Researchers (text file).
- Click Create a new Group with these x researchers to save them as a group.
- Save and Finish.

Define Publication Sets and Groups
You can create publication sets to use for grant applications, performance assessment and project management.

Creating a subset of a Researcher's publications: You can select publications from your researcher's publication history to create a set.
- Click Define a new Publication Set. Note: You need to have pre-defined or self-defined researchers added to your select-entity panel to activate this menu.
- Select a Researcher from your list.
- Select your desired publications and save.

Importing publication lists: If you have a set of publications that cannot be retrieved by keyword search, you can upload them to SciVal.
- Go to My SciVal, select Publication Sets from the select-entity panel, then click Define a new entity and select Import a Publication Set.
- Select the ID format and upload a text file:
  - Scopus EID: unique identifier assigned to all Scopus records
  - PubMed ID: unique identifier assigned to PubMed records
  - DOI (Digital Object Identifier): unique identifier assigned to a digital object such as a journal article
- Confirm publications and save.
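When preparing such an import file it helps to check that the list really contains only one identifier type, since SciVal expects one format per upload. The patterns below are illustrative approximations written for this example (consult the Scopus and PubMed documentation for the authoritative formats), and the function name is invented.

```python
import re

# Approximate shapes of the three identifier types listed above;
# these regexes are illustrative, not official specifications.
PATTERNS = {
    "Scopus EID": re.compile(r"^2-s2\.0-\d+$"),     # e.g. 2-s2.0-84901234567
    "PubMed ID":  re.compile(r"^\d{1,8}$"),         # a plain integer
    "DOI":        re.compile(r"^10\.\d{4,9}/\S+$"), # e.g. 10.1000/xyz123
}

def classify(pub_id):
    """Return the likely identifier type of one string, so a mixed
    list can be split into one upload file per format."""
    for name, pattern in PATTERNS.items():
        if pattern.match(pub_id.strip()):
            return name
    return "unknown"
```

Running every line of a candidate file through such a check before uploading avoids a failed import caused by a stray DOI in a list of PubMed IDs.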

Managing My SciVal
My SciVal can be used to manage large numbers of researchers and other entity types, as well as to easily browse, filter, tag and move entities to the entity-selection panel.
- Add tags by department and project to manage researchers and research groups with ease.
- Add researchers to the select-entity panel; import researchers and publication sets using your own lists.
- View groups of countries and institutions.
- Edit Research Areas by adding more search terms or applying more filters.

SciVal is part of the Elsevier Research Intelligence portfolio of products and services, which help research institutions, government agencies, and funders plan wisely and invest strategically to maximize research performance. For more information about SciVal, please visit elsevier.com/research-intelligence/scival

Copyright 2015 Elsevier B.V. All rights reserved. SciVal is a registered trademark of Elsevier Properties S.A., used under licence.

Elsevier Research Intelligence
Usage Guidebook
Version 1.01, March 2015

Contents
1. Usage as a data source
   - The importance of research intelligence based on multiple data sources
   - Why are usage metrics valuable?
     - Optimal decisions recognise that research is multi-faceted
     - Usage metrics are complementary to other types of research metrics
     - Viewing activity can occur as soon as research is available online
     - Usage reflects engagement of the whole research community
     - Not all research is published with the expectation of being cited
   - What if different data sources give different messages? ... 7
2. Usage data and metrics
   - Usage data sources
     - Anonymized usage data
     - What do usage data mean?
     - Which usage events are included?
   - Selection of the appropriate usage source
   - Selection of appropriate metrics
     - Clarity on the question being asked
     - Factors besides performance that affect the value of a metric
     - Do I need to address these factors? (Discipline; Database coverage; Manipulation)
   - Usage metrics
     - Metric: Views Count
     - Metric: Views per Publication
     - Metric: Field-Weighted Views Impact ... 22

Foreword
Usage data are generated when requests are made for online scholarly information. These data hold intelligence about the interest in research outputs, and are an important piece of the jigsaw puzzle that builds up to a complete picture of the impact of research on academia and society. Usage data are especially exciting for other reasons as well:
- They begin to accumulate as soon as an output is available online, and are more immediate than citation activity, so that an emerging trend or research talent may be spotted more quickly than via citation activity
- They reflect the interest of the whole research community, including undergraduate and graduate students, and researchers operating in the corporate sector, who tend not to publish and cite and who are hidden from citation-based metrics
- They can help to demonstrate the impact of research that is published with the expectation of being read rather than extensively cited, such as clinical and arts and humanities research

The availability of online usage data is a relatively recent phenomenon, and research metrics derived from usage data are not yet commonplace. Usage-based insights into impact are less familiar than insights based on publication and citation data, and funding awards data, and there are questions that have not yet been answered. This Guidebook provides information about usage data and metrics to answer some of your questions, and to help you to start to include this intelligence in the picture that you build of the impact of research. But of course, using such information will stimulate more questions, and we do not have all the answers yet. We are very much looking forward to working with you to learn about the new insights you can gain from this innovative information, and to answer some of the open questions.

I hope that you find this Guidebook useful, and perhaps even interesting.

Dr Lisa Colledge
Elsevier
March 2015

1. Usage as a data source

1.1 The importance of research intelligence based on multiple data sources
Research intelligence aims to understand, as completely as possible, an entity's impact on the world. This entity may be, for example, a single publication or a set of several publications, a researcher or a team or network, a research area, an institution or a country, or the research financed by a particular funder. Whatever the entity is, its total impact is multi-dimensional [1], and is the combination of many different outputs and outcomes, such as productivity, the frequency with which it has been read and cited, the generation of intellectual property and spin-out companies that employ people in the region, and impact on society. This is illustrated in Figure 1.

Data sources alone will never be enough to tell the whole story. Human judgment is essential to supplement and interpret the intelligence that is present in the data. Understanding the total impact of research, on both the research community and on society, can be seen as a jigsaw puzzle, with quantitative inputs forming some of the pieces, and qualitative inputs forming others. All of the pieces are needed to see the complete picture, but a good impression can be gained from having several pieces in place, even if there are a few gaps. We aim to make the quantitative section of that jigsaw puzzle as complete as possible, and to expand the range of data sources that we offer.

Research encompasses many activities: publishing novel contributions, reading and citing, producing raw data and sharing it with others, collaborating within and between academia and business, building up a reputation and being considered an authority, and providing benefits to society outside the world of academia. These outputs and outcomes also act as a means of attracting talent and securing funding.

Usage data are generated by those who access electronic research publications. They visit a database of publications, and select and view information related to their question or interest. These actions are captured by the database and form a data source called usage data. Usage data are associated with different types of databases, such as commercial, institutional, and disciplinary, and these are all rich sources of intelligence about how research literature is being consumed.

[Figure 1: Research workflow. The impact of an entity, whether a single publication, a researcher, an institution, or the research financed by a particular funder, is multi-dimensional, and can best be understood by combining metrics measuring a combination of inputs, processes, and outputs and outcomes. Usage data are generated by those who view electronic publications ("Get viewed"). Quantitative input should always be interpreted using judgment, and complemented by qualitative input.]

1: J. Bollen, H. Van de Sompel, A. Hagberg, and R. Chute, "A Principal Component Analysis of 39 Scientific Impact Measures" (2009), PLoS ONE 4(6): e6022. doi:10.1371/journal.pone.0006022

1.2 Why are usage metrics valuable?

Optimal decisions recognise that research is multi-faceted
There are many ways in which research can be considered excellent: for instance, it may be well cited, but it may also receive few citations and yet be well read. Optimal decisions about research performance will likely draw on the widest base of information possible, and use both qualitative and quantitative input. Quantitative input is most reliable when it is based on metrics that draw on a combination of data types, such as usage, publication and citation, collaboration, altmetrics, funding / awards information, and so on (see Figure 1). All of these pieces of information are complementary; usage data can reveal everything that has been viewed, whereas citation data represent a selection that the author of a publication has chosen to make. Combining them tells the most complete story, and can best support decision-making processes.

Usage metrics are complementary to other types of research metrics
The initial reaction to new research, the influence it goes on to develop, and its ultimate use by others in their own research is a complex phenomenon that cannot be adequately measured by a single criterion. Metrics drawn from multiple data sources reflect different types of behavior, with different motivations underpinning them, and all may be important in their own right. Viewing activity, which produces usage data, is sometimes considered only in terms of how well it can predict citation activity. However, citations should not be seen as the leading research outcome against which all other behavior is to be compared. Viewing activity is important in its own right, and not only in relation to citation activity (see Box 1).

Viewing activity can occur as soon as research is available online
Viewing activity is typically detected before citations start to be received (Figure 2). Viewing metrics therefore provide an early indication of interest in an output, or set of outputs. An emerging trend, or hot topic in research, or a new talent, may be more quickly spotted via viewing activity, since it is more immediate than citation activity. This does not mean, however, that only research that has recently become available is viewed: an analysis of subscribed ScienceDirect full-text usage demonstrates that older publications continue to be well used.

Usage reflects engagement of the whole research community
Non-publishing, and hence non-citing or cited, users are estimated to constitute one-third of the research community [2]. This includes large numbers of undergraduate and graduate students, as well as researchers operating in the corporate sector. By incorporating demand from these users, which is hidden from citation-based approaches, usage-based metrics may provide a more representative indication of how research publications perform. This takes us a step closer to measuring scholarly influence on the entire research and student community.

Not all research is published with the expectation of being cited
Clinical research is primarily aimed at practitioners who are working with patients. These practitioners tend to read voraciously to stay up to date with new clinical advances so that they can offer their patients the best care, but they are less likely to publish original research themselves. They may eventually be cited in reviews, but this type of clinical research may be poorly cited yet very well viewed and / or read. Viewing activity may be more appropriate than citations as an indicator of impact in these disciplines. Researchers in Arts & Humanities usually do not publish frequently, tend not to include long reference lists in publications, and their output may be of local or regional interest. The volume of citations, or citation potential, is therefore low. This should not necessarily be interpreted as the research being poor, but as a reflection of the behavior inherent in this field. Citations may not be the most useful indicator of impact in these cases, but the amount of interest in these outputs could still be significant, and this may be better measured by viewing activity based on usage data.

2: D.J.S. Price and S. Gürsey, "Studies in Scientometrics I: Transience and continuance in scientific authorship" (1976), International Forum on Information and Documentation, 1(2), 17-24; and C. Tenopir and D.W. King, "Towards Electronic Journals: Realities for Scientists, Librarians, and Publishers" (2000), Washington, DC: Special Libraries Association.

Box 1: Relationship between viewing and citation activities
Viewing and citation statistics reflect different types of behavior, with different motivations underpinning them. Both are important in their own right, rather than only as predictors of each other. If there is any relationship between viewing and citation activities, it should be considered as a cycle: the influence of viewing on citation, and the influence of citation on viewing.

Table 1 shows the disciplinary correlation between full-text downloads and citations at the journal level. It is based on download counts received in the year of publication, and citation counts in the third year after publication. Correlation indicates the relationship between two sets of data, but does not necessarily mean that one causes the other; it is possible that there are one or more additional factors involved. The extent of the correlation between downloads and citations depends on the discipline, and, while the researchers did not have access to sufficient information about the user and reader populations to rigorously test the reasons for this variable correlation, the authors suggest that:
- Disciplines in which the correlation is high, such as Biochemistry and Molecular Biology, tend to be specialized, and the author and reader populations tend to coincide
- Disciplines in which the correlation is lower may have a reader population that is much broader than the publishing (cited and citing) research community. This would include readers interested in humanities and social science research from outside these disciplines, and practitioners using technical information from engineering and nursing journals

Correlation (Pearson's r) | Disciplines
Over 0.65    | Agricultural and Biological Sciences; Biochemistry and Molecular Biology; Business; Chemical Engineering; Chemistry; Decision Sciences
0.40 to 0.65 | Computer Science; Dentistry; Earth Sciences; Economics and Finance; Engineering; Immunology; Materials Science; Mathematics; Medicine; Energy; Environmental Science; Physics and Astronomy
Below 0.40   | Arts and Humanities; Health Professions; Neuroscience; Nursing; Pharmacology; Veterinary Science; Psychology; Social Sciences

Table 1: Disciplinary correlation between full-text downloads and citation data at the journal level [3].

This publication goes on to note that there is more variance in the correlation between downloads and citations at the level of individual publications. In an applied science journal, the citation counts of highly downloaded articles (>2,000 downloads) showed a strong scatter, and the journal contained highly downloaded papers which were not highly cited. Similarly, a case study of the journal Tetrahedron Letters [4] found no statistical evidence of a relationship between early full-text downloads of a publication and the citations it subsequently received. However, more of the highly cited publications than would be expected were also highly downloaded, leading to the hypothesis that a small group of publications that were both highly downloaded and cited were responsible for driving the apparent correlation. The correlation between usage and citations is also greatly reduced when only non-English language journals are investigated [5]. Citations also correlate with increased usage.
The case study of Tetrahedron Letters, for instance, found that during the 3 months after receiving a citation, the number of full-text downloads received by a publication increased by 25% compared to what would be expected if the citation had not been received (4).

3: Table 1 is based on Figure 6 in G. Halevi and H.F. Moed, "Usage patterns of scientific journals and their relationship with citations" (2014), Proceedings of the Science and Technology Indicators Conference 2014, Leiden.
4: H.F. Moed, "Statistical relationships between downloads and citations at the level of individual documents within a single journal" (2005), Journal of the American Society for Information Science and Technology, 56(10).
5: V.P. Guerrero-Bote and F. Moya-Anegón, "Downloads versus citations and the role of publication language" (2014), Research Trends 37.

Elsevier Research Intelligence
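Table 1 reports Pearson correlations between journal-level download and citation counts. For readers who want to reproduce this kind of analysis on their own data, a minimal sketch follows; the per-journal counts below are invented for illustration and are not from the study cited above:

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    # Pearson product-moment correlation between two equal-length series,
    # e.g. per-journal download counts vs. per-journal citation counts.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical per-journal data: downloads in the year of publication,
# citations in the third year after publication.
downloads = [1200, 3400, 560, 8900, 2300]
citations = [15, 40, 4, 95, 30]
print(round(pearson_r(downloads, citations), 2))
```

A value near 1 would place a discipline in the top band of Table 1; remember that, as the Guidebook notes, correlation does not establish causation in either direction.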

[Figure 2: line chart of monthly download and citation counts for a single publication (x-axis: month, with month 1 = January 2008; left y-axis: number of downloads; right y-axis: number of citations).]

Figure 2 Usage data accumulates quickly, and is a more immediate indicator of attention than citation data (6). The chart shows the example of one publication whose corrected proof appeared online on 4 March 2008 (month 3 in the figure), and whose corrected paginated proof appeared online on 22 August 2008 (month 8).

1.3 What if different data sources give different messages?

The data sources underpinning research metrics are based on the day-to-day activities of researchers, students, and readers, and therefore offer useful windows into behavior and trends. Distinct metrics often reinforce each other's message, giving a high degree of confidence in the analysis and conclusions. Extra confidence due to this reinforcing effect may be especially warranted when the metrics are calculated from distinct data sources.

There will be other situations in which metrics, whether from the same or distinct data sources, appear to give conflicting information. A common reaction is that one or other data source must be incorrect, but this can be a valuable signal that further investigation would be useful. Research metrics only reflect what is present in the data produced by the research community itself, and so a more productive approach is to try to understand the reason for the apparent discrepancy. For instance:

- If there is high usage but little or no citation activity, is this because too little time has passed since publication for citations to have accumulated in this discipline? Or is this a discipline where citations are not to be expected?
- If there is citation activity but little or no usage, is this because the usage cannot be captured in the data source that is being viewed? This would be the case when using ScienceDirect usage data for publications that are not included in a journal published by Elsevier and that are therefore not available on ScienceDirect.
This could be addressed by selecting Scopus usage instead.

Research metrics alone will never be enough to provide a complete picture, and human judgment and other sources of insight are essential to supplement and interpret the intelligence that is present in the data.

6: Figure 2 is based on work conducted for the following paper, and is reproduced with permission: G. Halevi and H.F. Moed, "Usage patterns of scientific journals and their relationship with citations" (2014), Proceedings of the Science and Technology Indicators Conference 2014, Leiden.
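The diagnostic questions above can be sketched as a simple triage helper. This is illustrative only: the thresholds, labels, and function name are assumptions, not part of any Elsevier tool:

```python
def diagnose(views, citations, years_since_online, on_sciencedirect):
    """Suggest why usage and citation metrics might disagree (illustrative)."""
    if views > 0 and citations == 0:
        if years_since_online < 2:
            return "possibly too early for citations to have accumulated"
        return "possibly a discipline where citations are not expected"
    if citations > 0 and views == 0:
        if not on_sciencedirect:
            return "usage not captured by this source; try Scopus usage instead"
        return "investigate further"
    return "usage and citations agree; no discrepancy to explain"

# A publication online for one year with views but no citations yet:
print(diagnose(views=500, citations=0, years_since_online=1, on_sciencedirect=True))
```

The point is not the code itself but the habit it encodes: treat a conflict between metrics as a prompt for investigation rather than as proof that one source is wrong.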

2. Usage data and metrics

2.1 Usage data sources

The usage metrics in Elsevier's tools draw on anonymized usage data from our commercial databases ScienceDirect (7) and Scopus (8). All ScienceDirect and Scopus usage data are COUNTER-compliant, and are audited every year: COUNTER (Counting Online Usage of Networked Electronic Resources) is an international initiative serving librarians, publishers and intermediaries by setting standards that facilitate the recording and reporting of online usage statistics in a consistent, credible and compatible way (9).

Scopus is the world's largest abstract and citation database, and delivers a comprehensive overview of global research output. Scopus indexes content from over 5,000 publishers, including Elsevier, and its usage data offer the optimal representation of what is being viewed across multiple publishers. Its content is determined by the independent and international Scopus Content Selection and Advisory Board (10). See Box 2 for more information about the content of Scopus.

ScienceDirect, Elsevier's leading information solution for researchers, contains global, peer-reviewed, full-text content published by Elsevier. Users can view Elsevier's final version of a publication. ScienceDirect usage is driven by Elsevier-published content that is being viewed globally. The Research4Life (11) program ensures that electronic usage from developing countries is reflected in the counts. The content available online via ScienceDirect also continues to be available in print, and usage of "coffee table" print journals, for instance, cannot be captured. See Figure 3 for more information about the content of ScienceDirect.

The relationship between Scopus and ScienceDirect is illustrated in Figure 4.

2.2 Anonymized usage data

The metrics that Elsevier produces are anonymized. They do not provide any information about what a particular institution's users are viewing, and it is not possible to see the usage of a particular customer.
For instance:

- SciVal (12) displays information about the total views that an institution's publications have received. This can be sliced by country and sector (academic, corporate, government, or medical). This global usage includes the institution's own views, but also the views from all other institutions in the world. You can see what is being viewed, but not who is responsible for the viewing
- My Research Dashboard (13) allows researchers to immediately see the impact of their research in terms of what is being downloaded, shared, and cited, and the country and discipline of viewers. Researchers can see which type of users are viewing their output and how often, but not the institution to which these users are affiliated

11: Elsevier was one of the six founding publishers of Research4Life, which provides more than 6,000 institutions in more than 100 developing countries with free or low-cost access to the latest peer-reviewed online content.
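The slicing described above amounts to aggregating anonymized view events by coarse attributes such as country and sector, never by individual user or customer. A minimal sketch, in which the event fields are assumptions for illustration and not the actual SciVal schema:

```python
from collections import Counter

# Hypothetical anonymized view events: no user or customer identifiers,
# only coarse attributes of the viewer.
events = [
    {"country": "AU", "sector": "academic"},
    {"country": "US", "sector": "corporate"},
    {"country": "AU", "sector": "academic"},
    {"country": "DE", "sector": "government"},
]

# Aggregate by attribute; individual viewers cannot be reconstructed.
by_country = Counter(e["country"] for e in events)
by_sector = Counter(e["sector"] for e in events)
print(by_country["AU"], by_sector["academic"])  # → 2 2
```

Because only aggregates are retained, the question "who viewed this?" is unanswerable by design, while "from where, and by what kind of organization?" remains answerable.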

Box 2: Scopus content

Scopus is the world's largest abstract and citation database, and contains information on global research across all disciplines: science, mathematics, engineering, technology, health and medicine, social sciences, and arts and humanities. Scopus content covers:

Journals
- Over 21,000 titles from more than 5,000 international publishers (see the journal title list (14))
- This includes more than 2,800 gold open access journals
- More than 365 trade publications are indexed
- Articles-in-press are available from more than 3,750 journals and publishers

Books
- Almost 70,000 stand-alone books (see the book title list (14)), with more than 75,000 expected during 2015 through the Books Expansion Project
- More than 420 book series

Conference papers
- Approximately 6.5 million conference papers from over 17,000 worldwide events, including high energy physics from the INSPIRE database, computer science conferences and workshops from the DBLP Computer Science Bibliography, and society meetings including the IEEE, American Chemical Society (ACS), Association for Computing Machinery (ACM), Society of Petroleum Engineers (SPE), The Minerals, Metals & Materials Society (TMS), American Geophysical Union (AGU), European Society of Cardiology (ESC), International Society for Chemotherapy (ISC), American Society for Information Security (ASIS), Japan Society of Mechanical Engineers (JSME), and many more

[Figure 3: bar chart showing, for each Scopus subject area (Agricultural and Biological Sciences through Veterinary, plus Multidisciplinary), ScienceDirect content as a percentage (0-100%) of Scopus content.]

Figure 3 ScienceDirect content coverage. ScienceDirect content (articles and reviews only) is shown as a percentage of Scopus content within the Scopus subject classification. Analysis was performed using data from 16 December.

14: Scopus journal and book title lists are available online.

2.3 What do usage data mean?

The common feature of all types of usage activity is that a user makes a request to a service for a particular piece of scholarly information (18). This request may be made for a variety of reasons which are unknown by the service that records the request: perhaps they are referring to a reading list, or a colleague has just published or informed them about something; perhaps they were intrigued by the title of the publication and requested it to see whether it was of interest (and it may or may not have been); or perhaps they intend to read the information and incorporate it into their own research (and may or may not eventually do so). The most that we can say is that usage reflects an interest in or need for particular information.

2.4 Which usage events are included?

SciVal aims to give the most complete picture of the viewing activity within a particular research area. Therefore, it does not attempt to distinguish between different types of requests for information, and includes the following usage events:

- Scopus: the sum of abstract views, and clicks on the link to view the full text at the publisher's website. These events cover all views from both commercial and trial customers
- ScienceDirect: the sum of abstract and full-text views. Full-text views comprise PDF downloads, HTML requests, Mobipocket downloads (19), and epub downloads (20). These events cover all views from commercial customers, trial customers, guests, learned society accounts, and Research4Life (11)
- Activity from ClinicalKey (21) and Health Advance (22) is not included
- Activity that is generated by Elsevier's internal editorial processes, such as by journal editors and reviewers using online tools, is not included

My Research Dashboard (13) aims to provide researchers with the latest usage and citation activity for their publications.
It helps researchers to showcase their output by providing early details about: how their publications are being downloaded in ScienceDirect, shared in Mendeley (23), and, according to Scopus, how they are being cited; metrics about the country and disciplinary distribution of their readers; detailed information about the search terms used in ScienceDirect to find their publications; and comparative metrics in relation to similar publications. The usage metrics draw on ScienceDirect usage data and show counts of full-text views, which comprise PDF downloads, HTML requests, Mobipocket downloads (19) and epub downloads (20). These events cover all views from commercial customers, trial customers, guests, learned society accounts, and Research4Life.

18: M.J. Kurtz & J. Bollen, "Usage Bibliometrics" (2010), Annual Review of Information Science and Technology, 44(1).
19: Mobipocket produces an e-book reader for mobile phones, personal digital devices, and desktop operating systems.
20: epub is an open distribution format standard for digital publications, developed by the International Digital Publishing Forum.
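The inclusion rules in section 2.4 can be illustrated with a small filter over hypothetical usage events. The event structure and source labels here are assumptions for illustration, not Elsevier's internal representation:

```python
# Per section 2.4: abstract views and full-text views (PDF, HTML, Mobipocket,
# epub) are counted, while ClinicalKey, Health Advance, and internal
# editorial activity are excluded.
COUNTED_TYPES = {"abstract", "pdf", "html", "mobipocket", "epub"}
EXCLUDED_SOURCES = {"clinicalkey", "healthadvance", "editorial"}

events = [
    {"type": "pdf", "source": "sciencedirect"},
    {"type": "abstract", "source": "sciencedirect"},
    {"type": "pdf", "source": "clinicalkey"},   # excluded platform
    {"type": "html", "source": "editorial"},    # excluded internal activity
]

views = sum(
    1 for e in events
    if e["type"] in COUNTED_TYPES and e["source"] not in EXCLUDED_SOURCES
)
print(views)  # → 2 (only the first two events are counted)
```

Note that the filter does not distinguish between types of counted requests, mirroring SciVal's choice not to guess at the motivation behind a view.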

3. Selection of the appropriate usage source

SciVal offers both Scopus and ScienceDirect usage data, and users can select the data source from which usage metrics are calculated. This section highlights differences between usage data from Scopus and ScienceDirect to assist this selection; the relationship between them is illustrated in Figure 4.

- Scopus indexes content from over 5,000 publishers, including Elsevier, and its usage data offer the optimal representation of what is being viewed across multiple publishers. See Box 2 for more information about the content of Scopus
- ScienceDirect usage is driven by Elsevier-published content that is being viewed globally. The content of other publishers is not available on this platform. See Figure 3 for more information about the content of ScienceDirect

[Figure 4: diagram of the relationship between Scopus (abstract and indexing database; optimal representation of what researchers are viewing across >5,000 publishers) and ScienceDirect (global views of full-text content published by Elsevier), with Scopus shown alongside other search and discovery platforms such as PubMed, EBSCO, NextBio, Inno-360, Web of Science, and Google Scholar.]

Figure 4 Relationship between Scopus and ScienceDirect. Scopus drives usage to ScienceDirect, along with many other search and discovery platforms. The platforms shown are illustrative only, and are not intended to be an exhaustive list.

Comparing the two sources:
- Content: ScienceDirect carries full-text content published by Elsevier (see Figure 3); Scopus covers journals, books and conference papers from >5,000 publishers globally (see Box 2)
- Type of usage (see section 2.4): ScienceDirect counts abstract views and full-text downloads; Scopus counts abstract views and links to the publisher to view the full text
- Organization-type of user base (see Figure 5): >90% academic for both sources

[Figure 5: bar chart of the share of usage activity (0-100%) by organization-type (academic, corporate, government, medical, other) for Scopus and ScienceDirect.]

Figure 5 Scopus and ScienceDirect usage activity by organization-type. Data are based on activity from January-September.

4. Selection of appropriate metrics

This topic has been covered in detail in section 4 of the SciVal Metrics Guidebook (24), and is not repeated in its entirety here. Key points are detailed, together with additional information relevant to usage data.

The aim of using research metrics as an input into decision making is to complement qualitative inputs and increase confidence that the judgment is the optimal one. Elsevier offers a range of research metrics from which to select, and appropriate selection depends on two important factors:

- The question that is being asked
- Awareness of other factors, beyond performance, that can influence the value of a metric. These may or may not be important in the context of the question being asked

4.1 Clarity on the question being asked

The types of questions asked typically fall into three groups:

- Evaluation of performance, such as is conducted by a national body on its research institutions for the purposes of allocating national funding, or by a line manager to provide input into career development discussions. It is important in these situations that variables besides differences in performance are accounted for to ensure that the assessment is fair. It would not be advisable to compare a research group in chemistry with one in immunology using metrics that do not take into account the tendency for higher output and citation rates in immunology, for instance
- Demonstration of excellence, such as in support of an application for competitive funding, or for promotional purposes to attract post-graduate students to a research institution. The aim in these situations is typically to find a way to showcase a particular entity, and a user may be able to benefit from the factors besides performance that affect a metric.
For instance, a large institution may choose to use one of the Power metrics that tend to increase as an entity becomes bigger, whereas a small institution may prefer to use a size-normalized metric
- Scenario modeling, such as in support of the decision of which academic to recruit to an existing research team, or of how to structure a reorganization. Factors besides performance that affect metrics may or may not be important, depending on the particular scenario that is being modeled

24: SciVal Metrics Guidebook.

4.2 Factors besides performance that affect the value of a metric

There are six factors, besides performance, that may affect the value of a metric:

- Size
- Discipline
- Publication-type
- Database coverage
- Manipulation
- Time

Discipline, database coverage and manipulation are considered further in this Guidebook. The reader is referred to the SciVal Metrics Guidebook (24) for information on the remainder.

4.2.1 Do I need to address these factors?

Sometimes these factors may not need to be addressed at all, or may be used to advantage. A large institution that is aiming to present its performance favorably in order to attract students may purposefully use a metric like Views Count that does not take size into consideration. This would not, however, be a suitable approach for the evaluation of entities of varying size.

Metrics themselves may address some of these factors, as summarized in Table 2. If metrics are being used to evaluate entities of different sizes, then using the size-normalized metric Views per Publication instead of Views Count would compensate for this difference. If these entities also have a very different disciplinary profile, then Field-Weighted Views Impact might be the preferable choice.

The tools in which metrics are embedded may provide an answer, even if the metric itself does not. Views per Publication, itself sensitive to disciplinary differences, could still be useful in evaluating institutions with distinct disciplinary profiles if functionality is used to slice and dice these institutions to a common disciplinary portion.

[Table 2 Characteristics of usage metrics: for each of Views Count, Views per Publication, and Field-Weighted Views Impact, the table indicates whether the metric is size-normalized, field-normalized, publication-type-normalized, time-independent, resistant to database coverage, and difficult to manipulate.]
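The selection logic in this subsection can be sketched as a small helper. This is illustrative only: the rules simply encode the guidance above, and the function is not an official SciVal feature:

```python
def suggest_usage_metric(sizes_differ: bool, disciplines_differ: bool) -> str:
    # Per the guidance: size differences call for a size-normalized metric;
    # disciplinary differences additionally call for field weighting.
    if disciplines_differ:
        return "Field-Weighted Views Impact"
    if sizes_differ:
        return "Views per Publication"
    return "Views Count"

# Comparing entities of different sizes but similar disciplinary profiles:
print(suggest_usage_metric(sizes_differ=True, disciplines_differ=False))
```

In practice the choice also depends on the question being asked (section 4.1); this sketch covers only the size and discipline factors.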

4.2.2 Discipline

Academics working in different disciplines display distinct characteristics in their approach to, consumption of, and communication of research findings. These behavioral differences are not better or worse than each other, but are merely a fact associated with particular fields of research.

The volume of usage per discipline, in Scopus and ScienceDirect, is shown in Figure 6a. Figure 6b shows the average Views per Publication, per discipline. Values are smaller for recent years because publications that became available online in 2013 have had less time to be viewed than those that became available in 2010, and so have accumulated fewer total counts. The difference in magnitude of the metric between the two data sources reflects the difference in the total volume of usage data available.

[Figure 6a: bar chart of each discipline's share (0-16%) of total Scopus and ScienceDirect usage, from Agricultural and Biological Sciences through Veterinary Science and Veterinary Medicine.]

Figure 6a shows the proportion of the total usage of Scopus and ScienceDirect, over the period January-September 2014, per discipline.

[Figure 6b: bar charts of Views per Publication per discipline, from Agricultural and Biological Sciences through Veterinary, including Multidisciplinary.]

Figure 6b shows Views per Publication per discipline calculated using Scopus usage data (top) and ScienceDirect usage data (bottom). Date of data cut was 16 December.

4.2.3 Database coverage

For information about the coverage of:
- Scopus, see section 2.1 and Box 2
- ScienceDirect, see section 2.1 and Figure 3

4.2.4 Manipulation

Ease of manipulation of usage data is sometimes a concern in using it as an input for decision making. It is probably true that it is easier to manipulate usage data than citation data. This is another reason for drawing on multiple research metrics as input into your questions: it is much more difficult to manipulate the data underlying multiple metrics, especially if they are drawn from distinct data sets.

Moreover, there are industry guidelines in place to limit the effectiveness of attempted manipulation of usage data. Scopus and ScienceDirect usage data are COUNTER-compliant, and are audited every year. COUNTER (Counting Online Usage of Networked Electronic Resources) is an international initiative serving librarians, publishers and intermediaries by setting standards that facilitate the recording and reporting of online usage statistics in a consistent, credible and compatible way (25). COUNTER has prepared clear guidelines in its Code of Practice (26) that address these concerns, and excerpts from this code are quoted here:

All users' double-clicks on an http-link should be counted as only 1 request. The time window for occurrence of a double-click should be set at 10 seconds between the first and second mouse-click. There are a number of options to trace that a double-click is coming from one and the same user: IP-address, session-cookie, user-cookie or user name.

The downloading and rendering of a PDF, image, video clip or audio clip may take longer than the rendering of an HTML page. Therefore requests by one and the same IP/username/session- or user cookie for one and the same PDF, image, video clip or audio clip should be counted as a single request if these multiple requests occur within a 30 seconds time window.
These multiple requests may also be triggered by pressing a refresh or back button on the desktop by the user. When two requests are made for one and the same article within the above time limits, the first request should be removed and the second retained. Additional requests for the same article within these time limits should be treated identically: always remove the first and retain the second. This means that the removal of double-clicks may occur over a longer period than the 10 or 30 seconds mentioned, since the timing of the latest click is always gauged relative to the most recent one; 25 clicks made at 5-second intervals over a period of around 2 minutes will still be counted as 1 view.

26: COUNTER Code of Practice for e-Resources: Release 4.
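The deduplication rule quoted above (always remove the earlier of two clicks inside the window, so the window effectively slides with each new click) can be sketched as follows. The function name and data shapes are illustrative, not from the COUNTER code itself:

```python
def counter_views(click_times, window=10):
    """Count COUNTER-style views from sorted request timestamps (in seconds)
    for one user and one item; window=10 for HTML, 30 for PDF-like items."""
    counted = []
    for t in click_times:
        if counted and t - counted[-1] <= window:
            counted[-1] = t  # remove the earlier click, retain the later one
        else:
            counted.append(t)
    return len(counted)

# 25 clicks at 5-second intervals (~2 minutes) collapse to a single view,
# because each click falls within 10 seconds of the previously retained one.
print(counter_views([i * 5 for i in range(25)]))  # → 1
```

Clicks spaced wider than the window each count separately, e.g. clicks at 0, 60 and 120 seconds yield 3 views.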

5. Usage metrics

This section covers the usage metrics that are currently available from Elsevier. It shares their method of calculation, situations in which they are useful, and situations in which care should be taken. It also suggests usage metrics that might be considered useful partners, either to address shortcomings of a particular metric, or to highlight information that is naturally complementary. These suggestions should not be taken as rules, but as guidelines that may sometimes be useful, and are summarized in Table 3.

[Table 3 Suggested partner metrics: a matrix pairing each of Views Count, Views per Publication, and Field-Weighted Views Impact with partner metrics that are natural complements, that communicate information about the magnitude of metric values, or that avoid displaying the viewing dip in recent years.]

5.1 Metric: Views Count

Views Count indicates the total usage impact of an entity: how many views have this entity's publications received?

Views Count is a:
- Usage Impact metric
- "Power metric": its value tends to increase as the size of the entity increases

Views Count may be displayed in a chart or table with months and/or years:
- In SciVal, the years show the date on which items became available online; this may be different to the official publication date. They do not refer to the years in which publications were viewed
- In My Research Dashboard, the months and years show when a publication was viewed

This metric is useful to:
- Benchmark the views received by entities of similar size, and that fall into similar disciplines, such as multidisciplinary institutions with a similar number of research staff, or international collaboration networks in similar disciplines
- Showcase the performance of entities that are large in comparison to a group of peers, when this metric is likely to give high numbers
- Showcase the performance of entities that have published a few noticeably highly viewed publications that will have a positive effect on the total for the entire data set
- Give an early indication of interest in output that has recently become available, for example in the very early stages of a new strategy, or of early-career researchers
- Showcase the interest of the whole research community, and not only the two-thirds who publish and therefore cite. The one-third which does not tend to publish includes large numbers of undergraduate and graduate students, as well as researchers operating in the corporate sector
- Demonstrate interest in outputs produced in disciplines with low citation potential, such as clinical research and the arts and humanities, that are generally well read but poorly cited
- Provide transparency on the underlying data to build trust in research metrics

This metric should be used with care when:
- Benchmarking the visibility of entities of obviously different sizes, when this Power metric may most closely reflect entity size rather than differences in views received. Users are advised to use the size-normalized metrics Views per Publication or Field-Weighted Views Impact to compare the visibility of entities of different sizes
- Benchmarking the usage of entities with distinct disciplinary profiles. The average usage between disciplines is variable (see Figure 6b), and it is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences. When comparing entities made up of a mixture of disciplines, such as an Institution or an interdisciplinary Research Group, it is advised to apply a Research Area filter to focus on one field that is common between all the entities, or to select Field-Weighted Views Impact, which will take this into account
- Revealing the extent to which each of an entity's outputs is viewed, since one or a few publications with a very high number of views can conceal a sizeable body of unviewed or poorly viewed material
- There may be gaps in the database coverage of the entity's output:
  - For Scopus usage, this will mainly apply when entities are small, and a single missing output may have a significant negative impact on apparent usage
  - For ScienceDirect, this will depend on the proportion of the entity's outputs that are published in Elsevier titles (see Figure 3)
  - The only way to account for this is to be vigilant; consider also limiting the use of Views Count to comparing larger data sets in the same discipline, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- The people who will use the metrics do not like to see a trend that dips in recent years. This typically happens with Views Count because the most recent outputs have had less time to receive views than older ones.
Users are advised to use Field-Weighted Views Impact to avoid this drop, if it is of concern.

Useful partner metrics are:
- Views per Publication and Field-Weighted Views Impact, which bring complementary perspectives on total views received. Both account for differences in the size of entities being compared, and Field-Weighted Views Impact also accounts for differences in viewing behavior between disciplines
- Field-Weighted Views Impact avoids the dip in recent years due to the most recent outputs having had less time to receive views than older ones

Views Count is calculated analogously to Citation Count. For a worked example of the calculation underlying this metric, please see Example 3 in the SciVal Metrics Guidebook (27).

27: SciVal Metrics Guidebook.

5.2 Metric: Views per Publication

Views per Publication indicates the average usage impact of an entity's publications: how many views have this entity's publications received on average?

Views per Publication is a:
- Usage Impact metric

Views per Publication may be displayed in a chart or table with months and/or years:
- In SciVal, the years show the date on which items became available online; this may be different to the official publication date. They do not refer to the years in which publications were viewed
- In My Research Dashboard, the months and years show when a publication was viewed

This metric is useful to:
- Benchmark the average usage impact of publications within a body of work or entity
- Compare the average visibility of publications of entities of different sizes, but in related disciplines, such as Researchers working in a similar Research Area
- Showcase the performance of entities that have published a few highly viewed papers that will have a positive effect on the average of the entire data set
- Give an early indication of interest in output that has recently become available, for example in the very early stages of a new strategy, or of early-career researchers
- Showcase the engagement of the whole research community, and not only the two-thirds who publish and therefore cite. The one-third which does not tend to publish includes large numbers of undergraduate and graduate students, as well as researchers operating in the corporate sector
- Demonstrate interest in output produced in disciplines with low citation potential, such as clinical research and the arts and humanities, that are generally well read but poorly cited
- Provide transparency on the underlying data to build trust in research metrics

This metric should be used with care when:
- Benchmarking the usage of entities with distinct disciplinary profiles. The average usage between disciplines is variable (see Figure 6b), and it is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences. When comparing entities made up of a mixture of disciplines, such as an interdisciplinary collaboration network, it is advised to apply a Research Area filter to focus on one field that is common between all the entities, or to select Field-Weighted Views Impact, which will take this into account
- Revealing the extent to which each of an entity's outputs is viewed, since one or a few publications with a very high number of views can conceal a sizeable body of unviewed or poorly viewed material
- There may be gaps in the database coverage of the entity's output:
  - For Scopus usage, this will mainly apply when entities are small, and a single missing publication may have a significant negative impact on apparent usage
  - For ScienceDirect, this will depend on the proportion of the entity's publications that are published in Elsevier titles (see Figure 3)
  - The only way to account for this is to be vigilant; consider also limiting the use of Views per Publication to comparing larger data sets in the same discipline, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- Entities are small, such that the metric may fluctuate significantly and appear unstable over time, even when there is complete database coverage. Views per Publication calculates an average value, and is strongly influenced by outlying publications in a small data set
- The people who will use the metrics do not like to see a trend that dips in recent years. This typically happens with Views per Publication because the most recent publications have had less time to receive views than older ones.
Users are advised to use Field-Weighted Views Impact to avoid this drop, if it is of concern.

Useful partner metrics are:
- Field-Weighted Views Impact, which is a natural complement to Views per Publication and takes into account behavioral differences between disciplines. Field-Weighted Views Impact avoids the dip in recent years due to the most recent publications having had less time to receive views than older ones

Views per Publication is calculated analogously to Citations per Publication. For a worked example of the calculation underlying this metric, please see Example 3 in the SciVal Metrics Guidebook.
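As an illustration of the arithmetic (this is not SciVal code; the function name and view counts are invented), Views per Publication is a simple mean over an entity's publications:

```python
# Hypothetical sketch of the Views per Publication calculation:
# a simple mean of view counts over an entity's publications.
def views_per_publication(view_counts):
    """Average number of views across a set of publications."""
    if not view_counts:          # no publications -> no meaningful average
        return None              # SciVal would show a null value here
    return sum(view_counts) / len(view_counts)

views = [120, 45, 0, 300, 35]        # made-up view counts for five publications
print(views_per_publication(views))  # -> 100.0
```

Note how strongly the single highly viewed publication (300 views) pulls up the average of the small data set, which is exactly the sensitivity to outliers the guidebook warns about.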

5.3 Metric: Field-Weighted Views Impact

Field-Weighted Views Impact indicates how the number of views received by an entity's publications compares with the average number of views received by all other similar publications in the same data universe: how do the views received by this entity's publications compare with the world average for that database?

Similar publications are those publications in the database that have the same publication year, publication type, and discipline, as represented by the Scopus classification system.

- A Field-Weighted Views Impact of 1.00 indicates that the entity's publications have been viewed exactly as would be expected based on the global average for similar publications in the same database; the Field-Weighted Views Impact of World, that is, of either the entire Scopus database or the entire ScienceDirect database, is 1.00
- A Field-Weighted Views Impact of more than 1.00 indicates that the entity's publications have been viewed more than would be expected based on the global average for similar publications in the same database; for example, 3.87 means 287% more viewed than the world average within the same database
- A Field-Weighted Views Impact of less than 1.00 indicates that the entity's publications have been viewed less than would be expected based on the global average for similar publications in the same database; for example, 0.55 means 45% less viewed than the world average within the same database

Publications can be allocated to more than one category in the Scopus classification system. When we calculate the expected views for similar publications, it is important that these multi-category publications do not exert too much weight; for example, if a publication P belongs to both parasitology and microbiology, it should not have double the influence of a publication that belongs to only one or the other of these.
This is accounted for in the metric calculation by distributing publication and views counts equally across multiple categories; publication P would be counted as 0.5 publications for each of parasitology and microbiology, and its views would also be shared equally between them.

Field-Weighted Views Impact is a Usage Impact metric.

Field-Weighted Views Impact may be displayed in a chart or table with months and/or years:
- In SciVal, the years show the date on which items became available online; this may be different to the official publication date. They do not refer to the years in which publications were viewed
- In My Research Dashboard, the months and years show when a publication was viewed

This metric is useful to:
- Benchmark entities regardless of differences in their size, disciplinary profile, age, and publication-type composition, such as an institution and departments within that institution
- Easily understand the prestige of an entity's usage performance by observing the extent to which its Field-Weighted Views Impact is above or below the world average of 1.00
- Present usage data in a way that inherently takes into account the lower number of views received by relatively recent publications, thus avoiding the dip in recent years seen with Views Count and Views per Publication

- Gain insight into the usage performance of an entity in a discipline with relatively poor database coverage, since gaps in the database will apply equally to the entity's publications and to the set of similar publications
- Use as a default to view usage data, since it takes into account multiple variables that can affect other metrics
- Give an early indication of the interest in output that has recently become available, for example in the very early stages of a new strategy, or of early-career researchers
- Showcase the engagement of the whole research community, and not only of the two-thirds who publish and therefore cite. The one-third which does not tend to publish includes large numbers of undergraduate and graduate students, as well as researchers operating in the corporate sector
- Demonstrate interest in output produced in disciplines with low citation potential, such as clinical research and the arts and humanities, which are generally well read but poorly cited

This metric should be used with care when:
- Information about the magnitude of the number of views received by an entity's publications is important. In these situations, it is advised to use Views Count or Views per Publication
- Demonstrating excellent performance to those who prefer to see high numbers; Views Count or Views per Publication would be more suitable in these circumstances
- Entities are small, such that the metric may fluctuate significantly and appear unstable over time, even when there is complete database coverage. Field-Weighted Views Impact calculates an average value, and is strongly influenced by outlying publications in a small data set
- Trust needs to be built in research metrics. The calculation involves multiple normalizations, and the generation of the average views for similar publications requires calculations on the entire database, which are difficult for a user to validate.
Users are advised to select simpler metrics, such as Views Count or Views per Publication, if trust in the accuracy of the metric calculations needs to be built
- Completely answering every question about performance from a usage perspective. Field-Weighted Views Impact is a very useful metric and accounts for several variables, but using it to the exclusion of other metrics severely restricts the richness and reliability of the information that a user can draw on

Useful partner metrics are:
- Views Count and Views per Publication. They indicate the magnitude of the number of views received, to complement the relative view offered by Field-Weighted Views Impact. They are also simple, and allow transparency on the underlying data to build trust in the accuracy of metric calculations

Field-Weighted Views Impact is calculated analogously to Field-Weighted Citation Impact. For a worked example of the calculation underlying the metric, please see Example 5 in the SciVal Metrics Guidebook. For the mathematical notation of this metric, please see page 63 of the SciVal Metrics Guidebook.
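As a rough sketch of the idea behind the calculation (not the official implementation; the category names, expected averages, and view counts are all invented, and the real expectations are computed from the entire database), the fractional-category weighting described above might look like:

```python
# Hypothetical sketch of a field-weighted impact calculation: the average,
# over an entity's publications, of actual views divided by the expected
# views for "similar" publications. A publication in n categories uses the
# mean of its categories' expectations. All numbers here are invented.
expected_views = {"parasitology": 40.0, "microbiology": 60.0}

publications = [
    {"views": 100, "categories": ["parasitology", "microbiology"]},
    {"views": 20,  "categories": ["parasitology"]},
]

def field_weighted_views_impact(pubs, expected):
    ratios = []
    for p in pubs:
        cats = p["categories"]
        # A multi-category publication contributes 1/n to each category,
        # so its expectation is the average over those categories.
        expected_for_pub = sum(expected[c] for c in cats) / len(cats)
        ratios.append(p["views"] / expected_for_pub)
    return sum(ratios) / len(ratios)

fwvi = field_weighted_views_impact(publications, expected_views)
print(fwvi)  # 100/50 = 2.0 and 20/40 = 0.5, so the mean is 1.25
```

A value of 1.25 would read as "viewed 25% more than the world average for similar publications", mirroring the interpretation given in the bullet points above.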

For more information, please visit: elsevier.com/research-intelligence

Copyright 2015 Elsevier B.V. All rights reserved.

Elsevier Research Intelligence | SciVal

SciVal Metrics Guidebook
Version 1.01, February 2014

Authored by Dr Lisa Colledge and Dr Reinder Verlinde
Version 1.01, February 2014

Table of contents

1.0 Scopus: the data source for SciVal metrics
    Scopus content
        Scopus content and SciVal
        Books and SciVal
    Data currency in Scopus and SciVal
    Author Profiles
        Author Profiles in Scopus
        Authors and Researchers in SciVal
    Affiliation Profiles
        Affiliation Profiles in Scopus
        Affiliations and Institutions in SciVal
    Do publications belong to Institutions or to Researchers?
    Organization-types
        Organization-types in Scopus
        Organization-types in SciVal
    Journal metrics in Scopus and SciVal
SciVal and Metrics
    Groups of Metrics in SciVal
    The calculation and display of metrics in SciVal
        Publications included in the calculation of a metric
        Deduplication
        Zero and null values
        The display of >current year
        Citation counts
    Calculation options
        Research Area filter
        Publication-type filter
        Self-citation exclusion
        Absolute number and percentage options
    Example 1a: Self-Citation Exclusion
    Example 1b: Self-Citation Exclusion
Selection of appropriate metrics
    Clarity on the question being asked
    Factors besides performance that affect the value of a metric
        Size
        Discipline
        Publication-type
        Database coverage
        Manipulation
        Time
SciVal metrics: methods and use
    The display of metrics in SciVal
    Metric: Scholarly Output
    Metric: Journal Count
    Metric: Journal Category Count
    Example 2: Scholarly Output, Journal Count and Category Count
    Metric: Citation Count
    Metric: Cited Publications
    Metric: Citations per Publication
    Example 3: Citation Count, Cited Publications and Citations per Publication
    Metric: Number of Citing Countries
    Example 4: Number of Citing Countries
    Metric: Field-Weighted Citation Impact
    Example 5: Field-Weighted Citation Impact
    Metric: Collaboration
    Metric: Collaboration Impact
    Metric: Academic-Corporate Collaboration
    Metric: Academic-Corporate Collaboration Impact
    Example 6: Collaboration, Collaboration Impact, Academic-Corporate Collaboration, and Academic-Corporate Collaboration Impact
    Metric: Outputs in Top Percentiles
    Example 7: Outputs in Top Percentiles
    Metric: Publications in Top Journal Percentiles
    Example 8: Publications in Top Journal Percentiles
    Metric: h-indices
    Example 9: h-indices


Foreword

The quote often attributed to Albert Einstein, but perhaps more properly attributed to William Bruce Cameron, is often referred to when writing a foreword such as this: "Not everything that counts can be counted, and not everything that can be counted counts." This is undoubtedly true, but it does not follow that nothing should be measured. There is much that can be counted that is important and provides valuable perspectives on trends in academia, and there is an increasing emphasis on this in the world of research today.

The specialized work of engineers is used every day by car owners, who know how to use the technology that has been developed by others in an appropriate and responsible way. Car owners welcome the continued contributions of engineers to their vehicles, since these make their lives easier. In the same way, the field of metrics relies on specialized academic study for its development and to capitalize on advances in technology; these scholarly outputs are increasingly being used by people involved in research, whether directly by conducting research or indirectly in a supporting or enabling role. The majority of these people would not consider themselves to be experts in metrics or their use: this Guidebook is aimed at those people, and at supporting the use of metrics in an appropriate and responsible way.

This Guidebook is not an academic tome that aims to provide up-to-the-minute information about the latest advances in metrics. It is intended to be a straightforward, practical companion to the use of SciVal, which is part of the Elsevier Research Intelligence suite; new versions of this Guidebook will be produced as SciVal continues to evolve. It provides some facts about how the data underlying the metrics are used, about how the metrics are calculated and displayed, and about variables besides performance that can affect the values of these metrics.
It then makes some suggestions about situations in which the metrics are useful, when care should be taken, and how shortcomings may be addressed.

Only one absolute in the use of metrics is highlighted in this Guidebook: always use more than one metric to give insights into a question, and always support conclusions drawn from metrics with peer review and/or expert opinion. This triangulation of approaches will increase the reliability of conclusions drawn if these distinct types of information reinforce each other, and will flag areas for further investigation if they do not. Beyond this, there are no black-and-white rules. The best approach is to use common sense.

I hope that you find this Guidebook useful, and perhaps even interesting.

Dr Lisa Colledge, Elsevier, January 2014


1. Scopus: the data source for SciVal metrics

All metrics and information displayed in SciVal at the time of writing this Guidebook are based on Scopus. This section highlights some of the approaches of Scopus that are useful in understanding the metrics in SciVal. The reader can find more extensive information about Scopus online.

1.1 Scopus content

The independent and international Scopus Content Selection and Advisory Board reviews titles for inclusion in Scopus on a continuous basis. Information about the process and the acceptance criteria is available online. The Board considers journals, conference proceedings, trade publications, book series, and stand-alone books for inclusion.

Scopus indexes about 50 million publications. Reference lists are captured for the 29 million records published from 1996 onwards. The additional 21 million pre-1996 records reach back much further.

1.1.1 Scopus content and SciVal

SciVal uses Scopus content from 1996 onwards, so that the citation counts displayed in SciVal are based on uninterrupted years of data.

1.1.2 Books and SciVal

Scopus indexes both book series and stand-alone books. Books in SciVal refers to stand-alone books only; their characteristics, and the fact that they do not have journal metrics, set them apart from the series of books, journals, conference proceedings and trade publications.

Scopus links citations to the individual chapters of edited volumes when the information provided by the author allows this, and otherwise to the edited volume itself. Scopus makes either one link or the other, but not both. SciVal credits a Researcher who is the author of a book chapter with the count of citations linked to that particular chapter; it credits a Researcher who is the editor of the entire volume with the citations linked to the volume plus those linked to all the individual chapters, to ensure that the editor is credited with the full citation impact of their scholarly contribution.

1.2 Data currency in Scopus and SciVal

The data in Scopus are updated daily. The data in SciVal are updated every week. SciVal takes a copy of the Scopus database that is then structured to optimally support its metrics and functionality. This means that SciVal may be slightly behind Scopus in its data currency.

1.3 Author Profiles

1.3.1 Author Profiles in Scopus

Scopus is the only database in the world which has invested in automatically grouping the publications it indexes into those published by a single author. Author Identifiers (Author Profiles) group together publications belonging to one author, and they have two modes of input:

- Publications are automatically grouped into Author Profiles using a matching algorithm:
  - This algorithm looks for similarities in author name, as well as affiliation, journal portfolio, and discipline, to match publications together. Users may notice that multiple name variants are grouped within one Author Profile, which indicates the value of this algorithm
  - The information provided by authors is not always consistent or complete, and even if it were, the mobility of authors means that there is always some doubt about whether some publications belong together. In situations like these, a balance needs to be struck between the precision, or accuracy, of matching, and the recall, or completeness, of the groups formed; increasing one will reduce the other
  - The Scopus algorithm favors accuracy, and only groups publications together when the confidence level that they belong together, the precision of matching, is at least 99%, such that in a group of 100 papers, 99 will be correctly assigned. This level of accuracy results in a recall of 95% across the database: if an author has published 100 papers, on average 95 of them will be grouped together by Scopus
  - These precision and recall figures hold across the entire Scopus database.
There are situations where the concentration of similar names increases the fragmentation of publications between Author Profiles, such as in the well-known example of Chinese authors. Equally, there are instances where a high level of distinctiveness in names results in a lower level of fragmentation, such as in Western countries
  - A publication that has multiple co-authors will be part of multiple Author Profiles
- Publications are manually reassigned based on feedback. The matching algorithm can never be 100% correct because the data it is using to make the assignments are not 100% complete or consistent. The algorithm is therefore supplemented by feedback received from the sector, including from the Open Researcher and Contributor ID (ORCID) initiative, and that feedback is used to enhance the profiling of authors by the Scopus Author Feedback Team.
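The precision and recall figures quoted above can be illustrated with a small, hypothetical sketch (the counts are invented for illustration; they are not Scopus data):

```python
# Hypothetical illustration of precision vs. recall in profile matching.
# Suppose an author truly wrote 100 papers, and the algorithm builds a
# profile containing 96 papers, 95 of which genuinely belong to the author.
profile_size = 96          # papers the algorithm grouped into the profile
correctly_assigned = 95    # papers in the profile that truly belong
true_output = 100          # papers the author actually published

precision = correctly_assigned / profile_size   # accuracy of the matching
recall = correctly_assigned / true_output       # completeness of the group

print(f"precision={precision:.2f}, recall={recall:.2f}")
```

This makes the trade-off concrete: admitting more borderline papers into the profile would raise recall toward 1.00 but push precision below the 99% threshold that Scopus favors.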

1.3.2 Authors and Researchers in SciVal

The presence of Author Profiles in Scopus enables enormous flexibility for SciVal users. Every Author Profile is available to SciVal users to define and view any Researcher in the world in real time, whether they are based at the user's institution, are a collaboration partner in another country, or are a Researcher who was unknown until today. Users can build on the Author Profiles to define as many Researchers as they like in their SciVal accounts, and subsequently use their Researchers as the basis to create an unlimited number of:

- Groups of Researchers. These could represent actual research teams, departments or other organizational groups, international collaboration networks, or models of research teams that are being considered
- Publication Sets. These are pieces of a Researcher's output, such as publications from a particular affiliation, or those funded by a particular award
- Groups of Publication Sets. These could represent a selection of publications being submitted to a national evaluation, such as the Research Excellence Framework in the United Kingdom, for example

SciVal distinguishes between its use of the terms "authors" and "Researchers":

- Authors are the automatically created Scopus Author Profiles. The author count in the Overview module, for example, is a count of unique Author Profiles; this may be an over-estimation of the number of Researchers, because the recall rate of 95% means that a Researcher's publication portfolio might be split over multiple Author Profiles
- Researchers are entities which have been created with the benefit of human input. SciVal users can combine Author Profiles, remove any publications that do not belong to an Author Profile, and search for individual publications that should be added.
The Profile Refinement Service that populates SciVal with Researchers and Groups of Researchers on behalf of an Institution is a similar manual process, performed by Elsevier on behalf of a customer. SciVal distinguishes these manually created Researchers, whether created by users or by Elsevier, from the automatically generated Author Profiles. All of these manual enhancements are fed back to Scopus and used to improve the quality of the source database for all users of Scopus data. This means that once the feedback has been processed by Scopus, each Researcher in SciVal is represented by a single Author Profile which can be automatically updated as new publications are indexed.

1.4 Affiliation Profiles

1.4.1 Affiliation Profiles in Scopus

Scopus is the only database in the world which has invested in automatically grouping the publications it indexes into those published by a single affiliation. These groups of publications belonging to one affiliation are called Affiliation Profiles, and they have two modes of input:

- Publications are automatically grouped into Affiliation Profiles using a matching algorithm:
  - This algorithm looks for similarities in affiliation name, as well as addresses, to match publications together. Users may notice that multiple name variants are grouped within one Affiliation Profile, which indicates the value of this algorithm. Scopus makes use of an authoritative database that contains over 70,000 manually verified institutional name variants to match publications together
  - The information provided by authors is not always consistent or complete, so there is always some doubt about whether some publications belong together; in situations like these, a balance needs to be struck between the precision, or accuracy, of matching, and the recall, or completeness, of the groups formed, and increasing one will reduce the other
  - The Scopus algorithm favors accuracy, and only groups publications together when the confidence level that they belong together, the precision of matching, is at least 99%, such that in a group of 100 papers, 99 will be correctly assigned. This results in a recall of 93% across the database, such that if an Affiliation has published 100 papers, on average 93 of them will be grouped together by Scopus, and the others will be in one or more separate groups
  - A publication that has co-authors with multiple affiliations will be part of multiple Affiliation Profiles
- Publications are manually reassigned based on feedback. The matching algorithm can never be 100% correct because the data it is using to make the assignments are not 100% complete or consistent.
The algorithm is therefore supplemented by feedback received from the official authority of the affiliation in question.

1.4.2 Affiliations and Institutions in SciVal

The presence of Affiliation Profiles in Scopus brings enormous benefits for SciVal users, with metrics being pre-calculated and available to view at the click of a mouse. SciVal users also benefit from the availability of Groups of Institutions, such as American states. SciVal distinguishes between its use of the terms "affiliations" and "Institutions":

- Affiliations are the automatically created Scopus Affiliation Profiles. A medical school will be a separate Affiliation Profile from a university, for instance
- Institutions are groupings of related Affiliation Profiles which have been manually created as a convenient starting point for SciVal users; approximately 4,500 Institutions have been pre-defined and are available in SciVal. Medical schools are always grouped together with their university in the SciVal Institutions

1.5 Do publications belong to Institutions or to Researchers?

Researchers are mobile, and tend to change affiliations during their careers. This leads to two perspectives as to whether publications belong to Institutions or to Researchers:

- The Institutional Perspective is typically that publications should remain assigned to the institution even when the Researchers that authored them have moved. In other words, the publications are not mobile, despite their authors moving around
- The Researcher Perspective is generally that publications should be just as mobile as their authors, and should move from affiliation to affiliation as their authors' careers develop

Both approaches are needed to fully support the questions that are asked in different situations. SciVal offers both of these perspectives, because publications are linked to both Affiliation and Author Profiles independently of each other:

- Institutions and Groups of Institutions in SciVal take the Institutional Perspective
- Researchers, Groups of Researchers, Publication Sets and Groups of Publication Sets in SciVal take the Researcher Perspective

1.6 Organization-types

1.6.1 Organization-types in Scopus

Organization-types are assigned to Scopus Affiliation Profiles based on their primary functions. This function is often very clear from the name of the affiliation, and the organization's website is checked for guidance if there is any doubt. Scopus assigns affiliations to the following organization-types: university, college, medical school, hospital, research institute, corporate, law firm, government, military organization, and non-governmental organization.

1.6.2 Organization-types in SciVal

The organization-types used in SciVal are based on aggregations of the Scopus organization-types, to group similar functions together and to simplify the options for the user. SciVal uses five organization-types: Academic, Corporate, Government, Medical, and Other.
These are composed of the following Scopus organization-types:

- Academic: university, college, medical school, and research institute
- Corporate: corporate and law firm
- Government: government and military organization
- Medical: hospital
- Other: non-governmental organization
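The aggregation above is a simple lookup table; a minimal sketch (the dictionary and function names are illustrative, not part of SciVal):

```python
# Mapping of Scopus organization-types to SciVal's five aggregated types,
# exactly as listed above. The names used here are illustrative only.
SCIVAL_ORG_TYPE = {
    "university": "Academic",
    "college": "Academic",
    "medical school": "Academic",
    "research institute": "Academic",
    "corporate": "Corporate",
    "law firm": "Corporate",
    "government": "Government",
    "military organization": "Government",
    "hospital": "Medical",
    "non-governmental organization": "Other",
}

def scival_org_type(scopus_type):
    """Return the SciVal organization-type for a Scopus organization-type."""
    return SCIVAL_ORG_TYPE[scopus_type.lower()]

print(scival_org_type("Law firm"))  # -> Corporate
```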

1.7 Journal metrics in Scopus and SciVal

Scopus and SciVal use two journal metrics that have been developed by academic research teams, and whose methodology has been published in peer-reviewed journals. Information about these metrics is available online, and all the metric values are also available for free. These metrics are:

- SNIP. Source-Normalized Impact per Paper is a ratio between the Raw Impact per Paper, a type of Citations per Publication calculation, actually received by the journal, and the Citation Potential, or expected Citations per Publication, of that journal's field. SNIP takes differences in disciplinary characteristics into account, and can be used to compare journals in different fields. The average SNIP value for all journals in Scopus is 1.000
- SJR. SCImago Journal Rank is a prestige metric whose methodology is similar to that of Google PageRank. It weights the value of a citation depending on the field, quality and reputation of the journal that the citation comes from, so that all citations are not equal. SJR also takes differences in the behavior of academics in different disciplines into account, and can be used to compare journals in different fields. The average SJR value for all journals in Scopus is 1.000

Journal metrics are not calculated for trade publications. Stand-alone books cannot have SNIP and SJR values, since their calculation methodologies depend on items being in a series.
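The SNIP ratio described above can be sketched as follows (the numbers are invented for illustration; real SNIP values are computed from the full Scopus database, not from two inputs supplied by hand):

```python
# Hypothetical sketch of the SNIP idea: the journal's raw impact per paper
# divided by the citation potential of its field. Numbers are invented.
def snip(raw_impact_per_paper, citation_potential):
    """Source-Normalized Impact per Paper as a simple ratio."""
    return raw_impact_per_paper / citation_potential

# A journal averaging 4.2 citations per paper in a field where the
# expected rate is 3.0 citations per paper scores above the 1.000 average:
print(round(snip(4.2, 3.0), 2))  # -> 1.4
```

Because the denominator is field-specific, a journal in a poorly cited field can score the same SNIP as a much more heavily cited journal in a citation-rich field, which is what makes cross-field comparison possible.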

2. SciVal and Metrics

2.1 Groups of Metrics in SciVal

SciVal offers a broad range of metrics to:

- Accommodate the preferences of users in approaching questions from multiple angles
- Enable users to triangulate their evidence. When seeking input into a question from an evidence base, it is preferable to use at least two different metrics to provide information; if pieces of intelligence gained from multiple metrics reinforce each other, then the user can have a higher degree of confidence that their conclusions are valid

SciVal's metrics can be classified within six groups, and a metric may be part of more than one group, as illustrated in Table 1:

- Productivity metrics give information on the volume of output of an entity
- Citation Impact metrics indicate the influence of an entity's output, as indicated by various types of citation counts
- Collaboration metrics provide information on the research partnerships of an entity
- Disciplinarity metrics give information on the spread of topics within an entity's publications
- Snowball Metrics are defined and endorsed by research-intensive universities as providing important insight into institutional strategies:
  - The agreed and tested definitions are shared free of charge with the sector in the Snowball Metrics Recipe Book, with the ambition that Snowball Metrics become global standards for the higher education sector
  - These recipes can be used by anyone for their own purposes. Elsevier supports Snowball Metrics as recognized industry standards, and is implementing these metrics in relevant systems and tools, including SciVal
  - Snowball Metrics are indicated in the SciVal interface by a snowflake symbol
- Power metrics are those whose value tends to increase as the size of an entity increases. For example, a larger institution will tend to publish more output than a smaller institution, simply because of its greater size

Table 1: Groups of metrics in SciVal. The table maps each metric (Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications, Citations per Publication, Number of Citing Countries, Field-Weighted Citation Impact, Collaboration, Collaboration Impact, Academic-Corporate Collaboration, Academic-Corporate Collaboration Impact, Outputs in Top Percentiles, Publications in Top Journal Percentiles, and h-indices) to the groups Productivity, Citation Impact, Collaboration, Disciplinarity, Snowball Metric, and Power metric. Those metrics that have half a shaded cell in the Power metric column are Power metrics when the Absolute number option is selected, but not when the Percentage option is selected.

2.2 The calculation and display of metrics in SciVal

2.2.1 Publications included in the calculation of a metric

The ideal situation would be that every publication in a data set is associated with the information needed for it to be included in the calculation of every metric. In practice this is not the case; authors do not always include complete affiliation information, and publications are not always part of items indexed in Scopus that have journal metrics values, for example. Publications that lack the necessary information are excluded from the metric calculation.

2.2.2 Deduplication

SciVal offers the user the opportunity to investigate aggregate entities formed by the combination of smaller entities; for example, US states are groups of institutions, and geographical regions are groups of countries. The same publication may be part of multiple smaller component entities, and could be added multiple times to the aggregate entity. Say, for example, that Researcher R1 has co-authored a publication P1 with Researcher R2; in that case, P1 is part of both R1 and R2. SciVal deduplicates all the publications within an aggregate entity, so that a publication is only counted once even if it is co-authored by several of the component entities.
Entities are groups of unique publications, and users can be confident when creating aggregate entities that SciVal deduplicates the data set. In the above example, P1 is counted once only in a Group of Researchers composed of R1 and R2.
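The deduplication rule above can be sketched with set union (the entity names and publication IDs are invented for illustration):

```python
# Hypothetical sketch of SciVal-style deduplication: an aggregate entity is
# the set union of its components, so a shared publication counts only once.
researcher_r1 = {"P1", "P2", "P3"}     # publication IDs of Researcher R1
researcher_r2 = {"P1", "P4"}           # P1 is co-authored with R1

group = researcher_r1 | researcher_r2  # aggregate entity: deduplicated union

print(len(researcher_r1) + len(researcher_r2))  # naive sum counts P1 twice: 5
print(len(group))                               # deduplicated count: 4
```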

2.2.3 Zero and null values

SciVal distinguishes between zero and null values (the absence of a value), which have distinct meanings for the interpretation of metrics.

Consider the metric Scholarly Output, which calculates the number of publications of an entity. If an entity has not published any outputs in a particular time frame, this is displayed as a zero value, and the lack of publication is important to understanding the entity's activity. Null values are never displayed for Scholarly Output.

This is not the case for a metric like Citation Count, which counts the number of citations that an entity's publications have received. It is not meaningful to display a value of zero citations for a period during which an entity has not published anything, because the entity simply cannot receive citations when there is an absence of publications. This is different from the situation where an entity has published items that have not received any citations. SciVal therefore distinguishes between the following two cases:

- If an entity has published output in a particular period, and that output has not received any citations, then a value of zero is displayed for Citation Count
- If an entity has not published any output in a particular time frame, then the entity cannot receive any citations during that period. In this case, a null value is plotted in SciVal; users may notice a break in a line when viewing metrics over time in the Benchmarking module, for example

This same reasoning is applied to other metrics. It is not possible to be internationally collaborative, for example, if nothing has been published.

Outputs in Top Percentiles can only be calculated for the current year from the first data snapshot on or after 1 July; it is displayed as a null value until this date is reached.
This metric depends on being able to divide the publications into 100 percentiles, and this level of division is not possible earlier in the publication year, when items just published have received very few citations.

2.2.4 The display of ">current year"

A high proportion of the items indexed in Scopus are journals. Publishers sometimes release journal issues before their official cover date; for instance, at the end of 2013, users can find items in the Scopus database that belong to the 2014 publication year, and occasionally even to later years. A great deal of importance is placed in SciVal on a high currency of data, so that all information present in the database is available to users. All publications that belong to items with a cover date later than the current year are collected together in the Publication Year labelled ">2013" in 2013, ">2014" in 2014, and so on.

elsevier.com/research-intelligence/scival
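The zero-versus-null rule for Citation Count described in section 2.2.3 can be sketched as follows (my own minimal illustration, not SciVal code; the data model is invented). A year with no publications yields None (null), so a charting library leaves a break in the line rather than plotting a misleading zero.

```python
# Sketch: Scholarly Output always returns a number, but Citation Count
# returns None (null) for a year in which the entity published nothing.

def scholarly_output(pubs_by_year, year):
    return len(pubs_by_year.get(year, []))

def citation_count(pubs_by_year, citations, year):
    pubs = pubs_by_year.get(year, [])
    if not pubs:
        return None  # null: no publications, so "zero citations" is meaningless
    return sum(citations.get(p, 0) for p in pubs)

pubs = {2013: ["P1", "P2"], 2015: ["P3"]}
cites = {"P3": 7}  # P1 and P2 are uncited

print(scholarly_output(pubs, 2014))       # 0    (a real zero)
print(citation_count(pubs, cites, 2014))  # None (a null: break in the line)
print(citation_count(pubs, cites, 2013))  # 0    (published but uncited)
```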

[Table residue. Table 2 lists, for each metric — Scholarly Output; Journal Count; Journal Category Count; Citation Count; Cited Publications; Citations per Publication; Number of Citing Countries; Field-Weighted Citation Impact; Collaboration; Collaboration Impact; Academic-Corporate Collaboration; Academic-Corporate Collaboration Impact; Outputs in Top Percentiles; Publications in Top Journal Percentiles; h-indices — which of the four calculation options apply: Research Area filter (SciVal default: no filter applied), Publication-type filter (default: no filter applied), Self-citation exclusion (default: no exclusion), and Absolute count or percentage (default: Percentage).]

Table 2: Calculation options for the metrics in SciVal

Citation counts

Citation counts in SciVal represent the total number of citations received since an item was published, up to the date of the last data cut. Citation Impact metrics are often displayed in SciVal in a chart or table with years to indicate trends; these years are always the years in which items were published, and never the years in which citations were received.

Older publications tend to have more citations than newer publications, simply because they have had longer to receive them from subsequent work. This means that for metrics like Citation Count and Citations per Publication, where this time factor is not accounted for, the user will notice a dip in the timeline in recent years. Comparisons between similar entities will still be useful and valid despite this dip, since the time factor affects all similar entities equally, but users who prefer to avoid this display can select a metric like Field-Weighted Citation Impact, which inherently accounts for it.
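The dip in the timeline for recent publication years can be illustrated with a small sketch (hypothetical numbers, my own illustration): citations are counted up to the data cut, so newer items have simply had less time to accumulate them.

```python
# Sketch: Citations per Publication grouped by publication year.
# The values fall toward recent years even if quality is unchanged,
# because recent papers have had less time to be cited.

def citations_per_publication(items, year):
    cites = [c for (y, c) in items if y == year]
    return sum(cites) / len(cites) if cites else None

# (publication year, citations received up to the last data cut)
items = [(2010, 30), (2010, 22), (2012, 15), (2014, 4), (2015, 1)]

for year in (2010, 2012, 2014, 2015):
    print(year, citations_per_publication(items, year))
```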

[Screenshot residue: additional calculation options and their SciVal defaults — Main or Sub-Category: Main; Collaboration: International collaboration; Academic-Corporate Collaboration: With collaboration; Percentile level: 10%; Journal metric: SNIP; Data universe: World.]

Calculation options

SciVal users have different needs in how they use the metrics. Some of these needs are met by how the results are visualized, but some affect the method of calculation. For instance:

- An academic who is a journal editor may well publish regular editorials, which count towards their productivity and may also increase their citation impact. In understanding the citation impact of this academic, it is very important to look at the entire body of their output, including the editorials, since being a journal editor and writing editorials is a significant part of their scholarly contribution. However, when comparing this academic to others who are not journal editors, it may sometimes be desirable to exclude the effect of the editorials to ensure a fair comparison
- Self-citations are those by which an entity refers to its previous work in new publications. Self-citing is normal and expected academic behavior, and it is an author's responsibility to make sure their readers are aware of related, relevant work. However, some users prefer to exclude self-citations from Citation Impact metrics, whereas to others this is not critical to the analysis being done

There is no universally right or wrong approach to these calculation preferences, although in some situations an option may become very important. These calculation preferences are available in SciVal as options to the user, some of which are described below. Each metric has its own set of calculation options, which are summarized in Table 2.

Research Area filter

The Research Area filter limits the metric calculations to the publications of an entity that fall within a particular discipline. Where journal classifications are used, as in the accompanying screenshot, both the Main Categories and Sub-Categories are available to use as Research Area filters. User-defined Research Areas can also be used in some modules, such as the Research Areas "Graphene" and "Malaria" in the screenshot.

The Research Area filter applies to the entity's publications only. It does not have any effect on the citation counts used to calculate the Citation Impact metrics; the citations are counted regardless of the Research Area of the citing paper.

Some metrics are not field-normalized. This means that differences in the behavior of academics in distinct disciplines that can affect metric values are not accounted for in their calculation, and it may be difficult for a user to distinguish between differences in disciplinary behavior and true differences in research activity. These non-field-normalized metrics are very useful when comparing entities that fall into the same discipline, but it is not advisable to use them to compare entities that fall into different fields. Users can address this by:

- Using the Research Area filter when using a non-field-normalized metric to compare entities made up of a mixture of disciplines, such as an Institution or a Country. This has the effect of enabling benchmarking of comparable disciplinary slices of these entities
- Using field-normalized metrics, such as Field-Weighted Citation Impact, which inherently take disciplinary differences into account

The default setting in SciVal is that no Research Area filter is applied.

Publication-type filter

The Publication-type filter limits the metric calculations to articles, reviews, and/or books, for example.
This filter can be applied when the user judges that this variable is important for their analysis, such as:

- Distinguishing between original research contributions, which are often published as articles, and expert opinion, typically communicated in reviews
- In disciplines such as Engineering and Computer Science, where it is sometimes important to focus on conference proceedings
- When comparing an academic who is a journal editor and has the opportunity to publish editorials with an academic who is not a journal editor

The Publication-type filter applies to the entity's publications only. It does not have any effect on the citation counts used to calculate the Citation Impact metrics; the citations are counted regardless of the Publication-type of the citing paper. The only exception is Outputs in Top Percentiles, where this filter also applies to the data universe used to generate the citation thresholds.

The default setting in SciVal is that no Publication-type filter is applied.
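The way both filters restrict an entity's publication set before a metric is computed can be sketched as follows (hypothetical records; the field names are my own, not SciVal's API). Note that, as described above, the filters never touch the citing papers.

```python
# Sketch: restrict an entity's publications by Research Area and/or
# Publication-type before any metric calculation.

def apply_filters(publications, research_area=None, pub_types=None):
    out = publications
    if research_area is not None:
        out = [p for p in out if research_area in p["areas"]]
    if pub_types is not None:
        out = [p for p in out if p["type"] in pub_types]
    return out

pubs = [
    {"id": "P1", "type": "Article",          "areas": {"Chemistry"}},
    {"id": "P2", "type": "Editorial",        "areas": {"Chemistry"}},
    {"id": "P3", "type": "Conference Paper", "areas": {"Computer Science"}},
]

# Compare a journal editor fairly by excluding editorials:
filtered = apply_filters(pubs, pub_types={"Article", "Conference Paper"})
print([p["id"] for p in filtered])  # ['P1', 'P3']
```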

Self-citation exclusion

Self-citations are those by which an entity refers to its previous work in new publications. Self-citations are typically thought of in the context of an individual academic, but journal self-citations, institutional self-citations, and country self-citations are examples of the same activity at different levels.

There is nothing inherently wrong with the act of self-citing. It is normal and expected academic behavior to build on work previously conducted and published, and to refer in new publications to older material to alert the reader to important and relevant information that will aid their understanding. Indeed, the most relevant work to be aware of may well be research that has previously been published by the same entity, and it is an author's responsibility to make their readers aware of it. The act of referring to previous work is, however, open to abuse, and there are rare cases of unethical self-citation practices which have led to self-citation acquiring a somewhat bad reputation. SciVal users have the option to exclude self-citations from the calculation of many of the Citation Impact metrics, so that they can make their own judgment about whether self-citation rates are within normal boundaries.

SciVal distinguishes between self-citations at the author, institutional, and country levels, and applies the appropriate exclusion depending on the type of entity being viewed, without the user needing to specify this. The default setting in SciVal is that all citations are included.
See Examples 1a and 1b for illustrations of how self-citation exclusion is performed in SciVal.

Absolute number and percentage options

Some metrics in SciVal offer the option of being viewed as "Absolute number" or as "Percentage":

- Users are advised to select the Percentage option to size-normalize when they are comparing entities of different sizes
- It is useful to select the Absolute number option when it is important to communicate the magnitude of the number of publications involved

The default setting in SciVal is Percentage.

elsevier.com/research-intelligence/scival
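The difference between the two views can be shown with a tiny sketch (invented numbers, my own illustration): the percentage view divides by Scholarly Output, so entities of different sizes become comparable.

```python
# Sketch: the same collaboration figures viewed as Absolute number
# versus Percentage of total output.

def as_percentage(count, scholarly_output):
    return 100.0 * count / scholarly_output

big_team   = {"collaborative": 300, "output": 1000}
small_team = {"collaborative": 45,  "output": 100}

# Absolute numbers favour the big team...
print(big_team["collaborative"], small_team["collaborative"])  # 300 45
# ...but the Percentage view shows the small team collaborates more often
print(as_percentage(300, 1000), as_percentage(45, 100))        # 30.0 45.0
```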

Example 1a: Self-Citation Exclusion

Scenario: The user is looking at a Country, Institution or Researcher entity that consists of 1 publication, PP. Say that this entity has received 6 citations, from P1, P2, P4, P5, P6 and P7.

Publication  Author  Institution  Country
PP           AA      II           CC

PP is cited by:

Publication  Author  Institution  Country
P1           A1      I1           C1
P2           AA      I2           C2
P4           AA      I3           CC
P5           AA      II           CC
P6           A2      II           CC
P7           A2      I3           CC

Question: What happens when the user chooses not to include self-citations when calculating a metric that offers this as an option?

Answer: If the entity you are viewing is Country CC, then the citations from P4, P5, P6 and P7 are self-citations, because the affiliation country CC is the same as the entity's country CC. These 4 citations would no longer be included in the metric calculation.

Answer: If the entity you are viewing is Institution II, then the citations from P5 and P6 are self-citations, because the affiliation institution II is the same as the entity's institution II. These 2 citations would no longer be included in the metric calculation.

Answer: If the entity you are viewing is a Researcher that includes author AA, then the citations from P2, P4 and P5 are self-citations, because the author AA is the same as the entity's author AA. These 3 citations would no longer be included in the metric calculation.

The citation from P1 is never classified as a self-citation, regardless of the type of entity.
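The logic of Example 1a can be reproduced in a short sketch (illustration only, not SciVal code): a citing paper is a self-citation at a given level when it shares an author, institution, or country with the cited entity.

```python
# Sketch: classify the citing papers of Example 1a as self-citations
# at the author, institution, or country level.

# citing publication -> (authors, institutions, countries)
citing = {
    "P1": ({"A1"}, {"I1"}, {"C1"}),
    "P2": ({"AA"}, {"I2"}, {"C2"}),
    "P4": ({"AA"}, {"I3"}, {"CC"}),
    "P5": ({"AA"}, {"II"}, {"CC"}),
    "P6": ({"A2"}, {"II"}, {"CC"}),
    "P7": ({"A2"}, {"I3"}, {"CC"}),
}

def self_citations(entity_value, level):
    idx = {"author": 0, "institution": 1, "country": 2}[level]
    return {p for p, attrs in citing.items() if entity_value in attrs[idx]}

print(sorted(self_citations("CC", "country")))      # ['P4', 'P5', 'P6', 'P7']
print(sorted(self_citations("II", "institution")))  # ['P5', 'P6']
print(sorted(self_citations("AA", "author")))       # ['P2', 'P4', 'P5']
```

P1 never appears in any of the three sets, matching the last line of the example.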

Example 1b: Self-Citation Exclusion

Scenario: The user is looking at a Country, Institution or Researcher that consists of 3 publications, P4, P5 and P6:

- The publication P4 has 2 authors, A2 and A5. They are both based at institution I2, in country C2.
- The publication P5 has 2 authors, A1 and A3. Author A1 has joint affiliations: I1 and I3, in countries C1 and C2 respectively. A3 has a single affiliation: I2 in country C2.
- The publication P6 has the same 2 authors as P5: A1 and A3. Author A1 has again listed joint affiliations: A1 has remained at I3 in country C2, but the second affiliation is I5 in C4. A3 still has a single affiliation, but it is different from that on P5: the affiliation on P6 is I5 in country C4.

Say that this entity has received 3 citations, from P1, P2 and P3.

Publication  Author  Institution (Country)
P4           A2      I2 (C2)
             A5      I2 (C2)
P5           A1      I3 (C2), I1 (C1)
             A3      I2 (C2)
P6           A3      I5 (C4)
             A1      I5 (C4), I3 (C2)

P4, P5 and P6 are cited by:

Publication  Author  Institution (Country)
P1           A1      I6 (C4)
             A3      I6 (C4)
P2           A3      I2 (C2), I3 (C2)
             A4      I3 (C2)
P3           A4      I4 (C3)
             A5      I2 (C2), I6 (C4)
             A6      I6 (C4)
             A2      I4 (C3)

Question: What happens when the user chooses not to include self-citations when calculating a metric that offers this as an option?

Answer: If the entity you are viewing is, or includes, Countries C1, C2 or C4, then:

- A citation from P1 to P6 is a self-citation, because they share the affiliation country C4
- A citation from P2 to any of P4, P5 and P6 is a self-citation: P2 shares affiliation country C2 with each of P4, P5 and P6
- A citation from P3 to any of P4, P5 and P6 is a self-citation: P3 shares affiliation country C2 with each of P4, P5 and P6, and in addition shares affiliation country C4 with P6
- A citation from P1 to P4 or P5 is not a self-citation, because P1 does not share a country with either P4 or P5

Answer: If the entity you are viewing is, or includes, Institutions I1, I2, I3 or I5, then:

- A citation from P2 to any of P4, P5 and P6 is a self-citation: P2 shares institution I2 with both P4 and P5, and institution I3 with P5 and P6
- A citation from P3 to P4 or P5 is a self-citation: P3 shares institution I2 with both P4 and P5
- A citation from P1 to any of the entity's publications is not a self-citation, because P1 does not share an institution with any of P4, P5 or P6
- A citation from P3 to P6 is not a self-citation, because these publications do not have any institution in common

Answer: If the entity you are viewing is, or includes, authors A1, A2, A3 or A5, then:

- A citation from P1 to P5 or P6 is a self-citation: P1 shares authors A1 and A3 with both P5 and P6
- A citation from P2 to P5 or P6 is a self-citation: P2 shares author A3 with both P5 and P6
- A citation from P3 to P4 is a self-citation: P3 shares authors A2 and A5 with P4
- A citation from P1 or P2 to P4 is not a self-citation, because these publications do not have any author in common
- A citation from P3 to P5 or P6 is not a self-citation, because these publications do not have any author in common


3. Selection of appropriate metrics

The ideal situation, when making research management decisions, is to have 3 types of input: peer review, expert opinion, and information from an evidence-base. When these complementary approaches triangulate to give similar messages, the user can have a higher degree of confidence that their decision is robust. Conflicting messages are a useful alert that further investigation is probably a good use of time.

It is also preferable that an evidence-base is used to illuminate a question from various angles. Multiple people are often asked to give their professional judgment about a question, and more than 1 peer review is typically sought; in just the same way, triangulating information about the same question from an evidence-base by using 2, 3 or even more different metrics will ensure that the insights gained in this corner of the triangle are the most reliable they can be.

There are not really any strict rules about which metrics to select, besides approaching a question from more than 1 direction. The most appropriate metrics will always depend on the particular question being asked. The best approach is to highlight some key points that are important to keep in mind, and for the user to apply their common sense.

SciVal offers a broad range of metrics to enable triangulation from the evidence-base, and to cater for the enormous variety of questions that users will ask. It is a rich and powerful resource of information, and can be used responsibly and appropriately, as a complement to other sources of information, by keeping a few facts in mind. These facts are the focus of this section.

3.1 Clarity on the question being asked

The aim of using data and metrics as input into decision making is that any differences observed should reflect differences in performance. This will be the case if the user selects metrics that are suitable to answer their question, which in turn relies on 2 important factors:

- The question that is being asked is clearly articulated
- The user is aware of other factors, beyond performance, that can influence the value of a metric. These may or may not be important in the context of the question being asked, but this judgment can only be made once that question has been clearly articulated

The types of questions asked typically fall into 3 groups:

- Evaluation of performance, such as is conducted by a national body on its research institutions for the purposes of allocating national funding, or by a line manager to provide input into career development discussions. It is typically very important in these situations that variables besides differences in performance have been accounted for to ensure that the assessment is fair; it would not be advisable to compare chemistry and immunology using metrics that do not take into account the tendency for higher output and citation rates in immunology, for instance
- Demonstration of excellence, such as that which may support an application for competitive funding, or that which may be used for promotional purposes to attract post-graduate students to a research institution. The aim in these situations is typically to find a way to showcase a particular entity, and a user may be able to benefit from the factors that affect a metric besides performance; for instance, a big institution may choose to use one of the "Power metrics" that tend to increase as the entity gets bigger, whereas a small institution may choose to use a size-normalized metric
- Scenario modeling, such as that which supports the decision of which academic to recruit to an existing research team, or the thinking behind reorganizing a school. Factors besides performance that affect the values of metrics may or may not be important here, depending on the particular scenario being modeled

3.2 Factors besides performance that affect the value of a metric

Six types of factors, besides performance, that may affect the value of a metric are discussed in this section:

- Size
- Discipline
- Publication-type
- Database coverage
- Manipulation
- Time

The metrics available in SciVal inherently address some or none of these factors, as summarized in Table 3. In situations where a metric does not inherently account for a variable that may be important to address particular questions, SciVal supports the user by providing options in terms of functionality and other metrics; these options are outlined in the sections of this Guidebook dedicated to each individual metric.

[Table residue. Table 3 indicates, for each metric — Scholarly Output; Journal Count; Journal Category Count; Citation Count; Cited Publications; Citations per Publication; Number of Citing Countries; Field-Weighted Citation Impact; Collaboration; Collaboration Impact; Academic-Corporate Collaboration; Academic-Corporate Collaboration Impact; Outputs in Top Percentiles; Publications in Top Journal Percentiles; h-indices — whether it is size-normalized, field-normalized, publication-type-normalized, time-independent, resistant to database coverage, and difficult to manipulate.]

Table 3: Characteristics of the metrics in SciVal. Those metrics that have half a shaded cell in the Size-normalized column are size-normalized when the Percentage option is selected, but not when the Absolute number option is selected.

3.2.1 Size

There are some metrics whose value tends to increase with the size of an entity, such as Scholarly Output, which indicates the productivity of an entity, and Citation Count, which sums the citations received by an entity over all of its publications. These metrics are referred to within this Guidebook as "Power metrics", and are summarized in Table 1.

It is often important to account for differences in the size of an entity when evaluating performance; Citations per Publication, for instance, accounts for differences in the size of an entity's Scholarly Output, and is useful to reveal the efficiency of citations received per item. When demonstrating excellence, however, Power metrics may be used to the benefit of large entities, whose performance will tend to look better; Citation Count will generally be higher for a large collaboration network than for a small research team.
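The contrast between a Power metric and its size-normalized counterpart can be sketched with made-up numbers (my own illustration): Citation Count grows with entity size, while Citations per Publication reveals citation efficiency.

```python
# Sketch: a Power metric (total citations) versus a size-normalized
# metric (citations per publication) for two entities of different size.

def total_citations(citations):
    return sum(citations)

def citations_per_publication(citations):
    return total_citations(citations) / len(citations)

large_network = [3] * 500   # 500 papers, 3 citations each
small_team    = [12] * 20   # 20 papers, 12 citations each

# The Power metric favours the large entity...
print(total_citations(large_network), total_citations(small_team))  # 1500 240
# ...but per publication the small team is cited far more efficiently
print(citations_per_publication(large_network),
      citations_per_publication(small_team))                        # 3.0 12.0
```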

3.2.2 Discipline

Academics working in different disciplines display distinct characteristics in their approach to research and in their communication about research. These behavioral differences are not better or worse than each other; they are merely a fact associated with particular fields of research.

Any Citation Count-type or Citations per Publication-type metric effectively illustrates these differences. These metrics tend to be significantly higher in neuroscience than in engineering, for example, but it is obviously not the case that neuroscience is generally "better" than engineering. These types of metric do not take into account different behaviors between fields, and it is not advisable to use them to make comparisons between fields: for this purpose, field-normalized metrics, such as Field-Weighted Citation Impact, Publications in Top Journal Percentiles, and the free journal metrics SNIP and SJR 12, are suitable.

It is not only Citation Count-type and Citations per Publication-type metrics that are affected by these behavioral differences; they affect all metrics used throughout research management, including those implemented in SciVal.

What is it that causes these differences?

- Frequency of publication. Academics in fields such as chemical engineering tend to publish more often than researchers in mathematics
- Length of reference lists. Publications in disciplines such as toxicology tend to have much longer reference lists than those in social sciences
- Number of co-authors. Research in physics tends to be much more collaborative than research in arts and humanities, resulting in a higher number of co-authors per publication

The typical distribution of these behaviors amongst all disciplines is illustrated in Figure 1.
[Figure residue: the chart ranks disciplines from high to low on frequency of publication, length of reference list, and number of co-authors — Neuroscience; Life Sciences; Pharmacology & Toxicology; Chemistry & Chemical Engineering; Physics; Environmental Sciences; Health Sciences; Earth Sciences; Biological Sciences; Social Sciences; Materials Science & Engineering; Mathematics & Computer Sciences; Arts & Humanities.]

Figure 1: The characteristic behavior of academics differs between disciplines

12:

What does this mean for metrics? The majority of metric values tend to be higher in neuroscience and life sciences than they are in materials science and computer sciences; this does not necessarily reflect performance, but merely the way that academics in these disciplines conduct research. This may be used to advantage when the aim is to demonstrate excellent performance of an academic working in neuroscience, for instance, but it is very important to be aware of this effect and to take it into account when performing an evaluation. If users need to make comparisons between disciplines, they are advised to:

- Apply the Research Area filter when comparing entities made up of a mixture of disciplines, such as an Institution or a Country, to focus on one field that is common between all the entities
- Select field-normalized metrics, such as Field-Weighted Citation Impact

It is worth noting, when considering the effect of disciplines on metric values, that there are different ways of defining disciplines:

- Journal classifications are probably the most common approach to defining disciplines. Scopus, for example, assigns the journals that it indexes into Main Categories and Sub-Categories, and the Brazilian CAPES Foundation 13, a government agency that awards scholarship grants to graduate students at universities and research centers in Brazil and abroad, also has a journal classification, as do many other organizations.
  - These classifications offer users of a tool like SciVal the significant advantage of having the same meaning regardless of the entity being investigated. The meaning of a "chemistry department" may differ between research institutions, and that is a concern when benchmarking chemistry. If a user filters institutions by a particular journal category, they can be sure that they are comparing a consistent definition of a discipline.
It is for this reason that the Research Area filter in SciVal offers journal classifications.
  - The drawback of these classifications is that they tend to be very large, so that it is difficult to keep them up to date with current developments in research, and that a journal's discipline(s) is assumed to apply equally to each individual publication that it contains

13:

- Publication-level classifications are gaining increased attention, as the pace of change in academia increases and as technological advances make it possible to handle massive data sets.
  - These classifications do not assume that a publication fits into the same discipline(s) as the journal it appears in. They are often formed by allowing publications to group themselves through the references they contain, and/or through the publications that refer to them
  - Publication-level classifications are very useful for keeping pace with the changing ways in which academics view the world of research, and can reveal nascent, emerging fields. The global research strengths displayed in the competency maps of SciVal are examples of research activity that represent the way that recent publications cite previous work. It is also for this reason that publication-level classifications are used to determine disciplines in the calculation of Field-Weighted Citation Impact in SciVal
  - The drawbacks of these classifications are:
    - There tend to be publications which cannot be easily assigned, for instance if they have a short or absent reference list and/or have not yet received citations
    - It is computationally very demanding to implement an ever-changing classification
    - Frequent changes in classification, due to changes in the pattern of citations received, reduce the transparency of the data underlying a metric calculation for users
- User-defined disciplines, for instance defined by searching for key phrases within the titles and abstracts of a publications database.
  - The advantage of user-defined disciplines is that they ensure that every user can define the world of research as it makes sense to them and their questions.
Academics do not conduct their research in a way that conveniently fits the journal classifications and the previous citing behavior of their colleagues, and even publication-level classifications are not able to detect the very first stages of Research Areas, when academics are just starting to focus on a new field.
  - The drawback of user-defined disciplines is that they are likely unique, and pose challenges to others who would like to use or validate these customized fields
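The idea behind a field-normalized metric such as Field-Weighted Citation Impact can be sketched as follows. This is a simplified illustration with invented world-average figures, not SciVal's actual FWCI algorithm (which also normalizes by publication year and publication type): each paper's citations are divided by the world-average citations of similar papers, so a value of 1.0 means cited exactly as expected for the field.

```python
# Sketch: a field-normalized citation metric. Each paper's citation count
# is divided by the world average for its field, then the ratios are
# averaged over the entity's papers.

world_average = {"Neuroscience": 20.0, "Mathematics": 4.0}  # hypothetical

def field_weighted_impact(papers):
    """papers: list of (field, citations) tuples."""
    ratios = [cites / world_average[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

neuro = [("Neuroscience", 30.0), ("Neuroscience", 10.0)]
maths = [("Mathematics", 6.0), ("Mathematics", 4.0)]

# Raw counts favour neuroscience, but after field-normalization the
# mathematics portfolio scores higher:
print(field_weighted_impact(neuro))  # 1.0  -> (1.5 + 0.5) / 2
print(field_weighted_impact(maths))  # 1.25 -> (1.5 + 1.0) / 2
```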

3.2.3 Publication-type

Different types of publications tend to be cited at different rates. The most well-known example is that reviews tend to attract more citations than original research articles, but there are variations between other publication-types as well. These are illustrated in Figure 2.

[Figure residue: a chart of Citations per Publication for the Scopus publication-types Abstract Report, Article, Book, Business Article, Book Chapter, Conference Paper, Conference Review, Dissertation, Editorial, Erratum, Article-in-Press, Letter, Note, Review and Short Survey.]

Figure 2: Citation rates for different publication-types as classified in Scopus. This chart displays citations received up to August 2013 per item published during the period

Publication-type differences may not be important where the performance of total output is of interest, or when a journal editor is showcasing their productivity and wishes to retain the positive effect of editorials in the count. There are some situations, however, when there may be a strong preference to focus on a particular type, such as on conference proceedings in engineering and computer science, or on original research articles when comparing a journal editor who has written many editorials with an academic who is not a journal editor. If this effect is important for the question being addressed, users are advised to:

- Apply the Publication-type filter
- Use Field-Weighted Citation Impact, which is normalized for Publication-type

3.2.4 Database coverage

Databases have particular guidelines for determining which content to include. Scopus has a comprehensive policy 14 to select the content which meets its aims, and the policy is to be selective and not to include every single publication globally. This means that there may be some items published by a particular entity that are not indexed in Scopus, and so cannot be part of the metric calculations in SciVal. There are 2 aspects to considerations of database coverage:

- Geographical coverage. Scopus indexes content from more than 5,000 publishers from all over the world. The geographical distribution of titles indexed in Scopus is representative of the global concentrations of publishers, with the focus of activity in the United States and the United Kingdom, as shown in Figure 3. This geographical coverage should support a thorough analysis of topics of global interest; however, for research areas of primarily local interest, such as national literature, history or culture, Scopus may not provide sufficiently complete data
- Disciplinary coverage. The ongoing expansion of the titles indexed by Scopus means that this coverage will continue to change. The disciplinary coverage of Scopus can be estimated by looking at the items that have been cited by recently published work; the extent to which these citations can be linked to items indexed within the Scopus database represents the coverage, and those citations which refer to items not indexed by Scopus are assumed to represent lack of coverage. This methodology yields a minimum coverage level, and the actual coverage is probably a few percentage points higher than shown in Figure 4

[Figure residue: a world map with a density scale from 0 to 5,000.]

Figure 3: Geographical distribution of titles indexed in Scopus, to January. The country assigned to a title is the country of the publisher imprint. The sliding scale of 0-5,000 indicates the density of indexed titles.

14:

[Figure residue: a bar chart with a 0-100% axis and a legend distinguishing "Citations linking to post-1995 items in Scopus" from "Citations to items outside of Scopus or to pre-1996 items in Scopus", covering the subject areas General; Agricultural and Biological Sciences; Arts and Humanities; Biochemistry, Genetics and Molecular Biology; Earth and Planetary Sciences; Economics, Econometrics and Finance; Business, Management and Accounting; Chemical Engineering; Chemistry; Computer Science; Decision Sciences; Energy; Engineering; Environmental Science; Immunology and Microbiology; Materials Science; Mathematics; Medicine; Neuroscience; Nursing; Pharmacology, Toxicology and Pharmaceutics; Physics and Astronomy; Psychology; Social Sciences; Veterinary; Dentistry; Health Professions.]

Figure 4: Estimation of Scopus disciplinary coverage. This estimation is made based on the extent to which citations from publications during the period can be linked to items indexed from 1996 onwards within the Scopus database; this is an under-estimation of the true coverage because citations to indexed items published before 1996 are not captured here. This analysis is based on data up to August

It is most probably acceptable to have a few publications missing from an entity's output when benchmarking large data sets against their peers, because gaps in database coverage likely affect all entities similarly and will not invalidate the comparison. Care is advised, however, when comparing small entities, for which a single missing publication may have a significant negative impact; for example, an academic's performance may suffer due to gaps in database coverage of their portfolio, as well as gaps in coverage of the publications citing those items of the portfolio that are indexed. The transition point between a small and a large entity is a matter of judgment, and will differ depending on the discipline. The only way to account for this is to be vigilant and to apply common sense when using SciVal to support decisions.
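The coverage-estimation method described above can be sketched in a few lines (my own illustration of the idea, not the actual Scopus methodology code): the share of recent citations that resolve to indexed items is taken as a lower bound on disciplinary coverage.

```python
# Sketch: estimate database coverage as the fraction of citations from
# recent publications that can be linked to indexed items.

def estimated_coverage(citation_links):
    """citation_links: list of booleans, True if the cited item is indexed
    in the database (post-1995), False otherwise. Returns a percentage,
    which is a lower bound on the true coverage."""
    return 100.0 * sum(citation_links) / len(citation_links)

# 9 of 12 references from recent papers resolve to indexed items
refs = [True] * 9 + [False] * 3
print(estimated_coverage(refs))  # 75.0
```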

The question of the effect of database coverage is most often raised in relation to Arts and Humanities and Social Sciences: are the metrics in SciVal useful in these fields? In some situations, they are:

- The type of decisions supported by SciVal should ideally be based on a combination of inputs from peer review and expert opinion, as well as from an evidence base. Publication and citation data form one part of this evidence base, with funding, innovation and societal impact, for instance, also being very important, although they are not yet addressed in SciVal. This is the case for all disciplines, and Arts and Humanities and Social Sciences are no exception.
- A further concern with these fields is that their coverage in Scopus and other commercial databases tends to be low. This is a natural consequence of the publishing behavior of academics in these fields, who favor stand-alone books, which are difficult to capture relative to serials, although Scopus is now focusing on increasing its coverage of these items. Nevertheless, valuable information about performance in these fields can be gained from SciVal if the guidelines about the size of the entity, above, are borne in mind.

3.2.5 Manipulation

Some situations lend themselves relatively easily to manipulation for the artificial inflation of metrics values. One example is the combination of research units to artificially increase size for reporting purposes, which tends to improve apparent performance when using Power metrics. Another example is self-citation. There is nothing inherently wrong with self-citing: it is normal academic behavior to build on previously published work, and it is an author's responsibility to refer a reader to older material that will support their understanding.
This practice is, however, open to abuse: unscrupulous academics could cite irrelevant past papers to boost their citation counts, and journal editors occasionally coerce authors of submitted papers to add unnecessary citations to publications within their journal. Of the metrics in SciVal, the Citation Impact metrics are the most susceptible to manipulation. Self-citation abuse is very rare, but users have the option to exclude self-citations from several of these metrics if they wish, as illustrated in Example 1. Other metrics, such as those that provide information on productivity and collaboration, are much more difficult to manipulate.

3.2.6 Time

The passage of time is critical for some metrics to yield useful information. The most obvious are the Citation Impact metrics, since time is needed for published work to receive citations. The h-indices are another example of metrics that are not very useful when applied to the output of an early-career researcher, because the need for time to accumulate citations is coupled with the need for a reasonably sized body of work to provide reliable information.

Some time-independent metrics can be used immediately upon entry of a publication into the database; examples are the metrics that look at collaboration using affiliation information. There is even a time-independent Citation Impact metric: Publications in Top Journal Percentiles uses citations received by the journals in which publications appear.
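The self-citation exclusion mentioned above can be illustrated with a small sketch. The shared-author test and the record layout here are assumptions for illustration, not SciVal's implementation:

```python
# Counting citations with and without self-citations. A citation is treated
# here as a self-citation when the citing paper shares at least one author
# with the cited publication; this definition and the data layout are
# illustrative, not SciVal's actual method.

def citation_count(pub_authors, citing_author_lists, include_self=True):
    count = 0
    for citing_authors in citing_author_lists:
        is_self = bool(set(pub_authors) & set(citing_authors))
        if include_self or not is_self:
            count += 1
    return count

citing = [["Smith", "Lee"], ["Jones"], ["Smith"]]  # author lists of citing papers
print(citation_count(["Smith"], citing))                      # 3
print(citation_count(["Smith"], citing, include_self=False))  # 1
```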

4. SciVal metrics: methods and use

This section covers the individual metrics available in SciVal. It describes each metric's method of calculation, identifies situations in which it is useful, and highlights situations in which care should be taken. It also suggests metrics that might be considered useful partners, either to address the shortcomings of a particular metric or to highlight naturally complementary information. These suggestions are summarized in Table 4; they should be taken not as rules, but as guidelines that may help the user select appropriate metrics.

Table 4: Suggested partner metrics. The table is a matrix with the metrics listed along both axes: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications, Citations per Publication, Number of Citing Countries, Field-Weighted Citation Impact, Collaboration, Collaboration Impact, Academic-Corporate Collaboration, Academic-Corporate Collaboration Impact, Outputs in Top Percentiles, Publications in Top Journal Percentiles, and the h-indices. For each metric it marks the suggested partners, together with the reason for the pairing: Power metrics; time-independent metrics; metrics that communicate information about the magnitude of metrics values; natural complementary partner metrics; metrics that rely on the actual performance of publications, not the journal average; metrics indicating the reliability with which a publication will receive at least 1 citation; and metrics that avoid displaying the citation dip in recent years. The individual pairings are repeated in the metric-by-metric sections below.

4.1 The display of metrics in SciVal

The metrics in SciVal are displayed exactly as calculated. There may be situations in which users would prefer values to be averaged over a multi-year period to smooth fluctuations in trend lines, but no such smoothing is performed in SciVal, so that the data underlying a metric remain fully transparent. This means that some trend lines in SciVal will look jumpy, especially when they are calculated on a small data set, such as a Researcher. Trend lines tend to become smoother when metrics are displayed for larger data sets.

4.2 Metric: Scholarly Output

Scholarly Output in SciVal indicates the productivity of an entity: how many publications does this entity have indexed in Scopus?

Scholarly Output is a:
- Productivity metric
- Snowball Metric
- Power metric

This metric is useful to:
- Benchmark the productivity of entities that are similar in size and that fall within similar fields of research, such as Institutions with a similar number of research staff, or Researchers in the same field of research and with similar career lengths
- Provide impressive figures to showcase the performance of entities that are large in comparison to a group of peers, when this metric is likely to give high numbers
- Provide transparency on the underlying data to build trust in the metrics in SciVal
- Look at publishing activity in a way that is difficult to manipulate
- Investigate activity during the early stages of a new strategy, when publications are just starting to appear, or of early-career researchers

This metric should be used with care when:
- Benchmarking the productivity of entities of obviously different sizes, such as Countries and Institutions, large collaboration networks and individual Researchers, or stable, established Research Areas and small, emerging Research Areas:
  - Differences in the value of this Power metric between such entities will probably reflect distinct entity sizes rather than differences in productivity
  - Users are advised to limit the use of this metric to similar entities, or to use size-normalized metrics such as Citations per Publication
- Benchmarking the productivity of entities with distinct disciplinary profiles: academics in distinct fields tend to publish with very different frequencies. For example, Institutions with a high proportion of academics in the life sciences, who tend to publish frequently, are likely to have a higher Scholarly Output than similarly sized Institutions with a high proportion of academics in the humanities, who tend to publish less frequently. Differences in Scholarly Output, in these cases, are most likely to reflect distinct disciplinary characteristics rather than to provide reliable information about differences in productivity

  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Country, users are advised to apply the Research Area filter to focus on one field common to all the entities, or to use field-normalized metrics such as Publications in Top Journal Percentiles, which take this into account
- Understanding the productivity of small entities for which there may be gaps within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent productivity
  - The only way to account for this is to be vigilant. Consider also limiting the use of Scholarly Output to comparing larger data sets, where gaps in the database coverage likely have a similar effect on all entities and do not invalidate the comparison

Useful partner metrics are:
- Citation Count, which sums the citations received over all of an entity's publications, and is a natural complement to Scholarly Output, which counts the publications of an entity
- h-indices, whose values depend on a combination of Scholarly Output and Citation Count, and which are natural partner metrics
- The set of all other Power metrics, whose values tend to increase as the entity becomes bigger: Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of all other time-independent metrics, which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 2.
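In essence, Scholarly Output is a filtered count. A minimal sketch, assuming a simple list-of-dicts publication record (illustrative, not SciVal's internal representation):

```python
# Scholarly Output: count the publications that pass the user-selected
# Publication Year range and Publication Type filters. Record fields are
# illustrative placeholders.

def scholarly_output(publications, year_range, pub_types):
    first, last = year_range
    return sum(1 for p in publications
               if first <= p["year"] <= last and p["type"] in pub_types)

pubs = [
    {"year": 2009, "type": "Article"},
    {"year": 2011, "type": "Review"},
    {"year": 2013, "type": "Article"},  # outside the selected year range
    {"year": 2010, "type": "Report"},   # not a selected publication type
]
types = {"Article", "Review", "Conference Paper"}
print(scholarly_output(pubs, (2008, 2012), types))  # 2
```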

4.3 Metric: Journal Count

Journal Count in SciVal indicates the diversity of an entity's publication portfolio: in how many distinct journals indexed in Scopus have this entity's publications appeared?

Journal Count is a:
- Disciplinarity metric
- Power metric

This metric is useful to:
- Compare smaller entities, such as Researchers and Publication Sets, where differences in the journal portfolio are most likely to be evident
- Provide impressive figures to showcase the performance of a relatively large entity, when this metric is likely to give high numbers
- Benchmark the diversity of the publication portfolios of related entities, such as scenario models of a research institute created to investigate the effect of recruiting various researchers, or collaboration networks in a given field of research
- Demonstrate the excellence of entities that work across traditional disciplines and have a broad array of journals available to which to submit
- Provide transparency on the underlying data to build trust in the metrics in SciVal
- Look at publishing activity in a way that is difficult to manipulate
- Investigate differences in activity at the early stages of a new strategy, when publications are just starting to appear, or between early-career researchers

This metric should be used with care when:
- Comparing large entities, such as Institutions and Groups of Institutions, which will likely publish in such a large and stable journal portfolio that it approaches the maximum extent of the Scopus database, so that differences may not be visible. Users are advised to narrow their focus to a slice of a large entity when using this metric, such as by applying the Research Area filter
- Benchmarking the publication portfolios of entities of obviously different sizes, such as departments of 150 academics against departments of 30 academics:
  - Differences in the value of this Power metric between such entities will probably reflect distinct entity sizes rather than differences in the diversity of their publication portfolios
  - Users are advised to limit the use of this metric to similar entities, or to use size-normalized metrics such as Collaboration

- Benchmarking entities with distinct disciplinary profiles, when academics in distinct fields may have a differently sized range of journals available for submission:
  - Differences in Journal Count, in these cases, may well reflect distinct disciplinary characteristics rather than provide reliable information about differences in publication portfolios
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Country, users are advised to apply the Research Area filter to focus on one field common to all the entities, or to use field-normalized metrics such as Field-Weighted Citation Impact, which takes this into account
- Understanding small entities for which there may be gaps within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on the apparent breadth of a publication portfolio
  - The only way to account for this is to be vigilant. Consider also limiting the use of Journal Count to comparing slightly larger data sets, where gaps in the database coverage likely have a similar effect on all entities and do not invalidate the comparison

Useful partner metrics are:
- Journal Category Count, which highlights the disciplinary portfolio of an entity, and is a natural complement to Journal Count
- The set of all other Power metrics, whose values tend to increase as the entity becomes bigger: Scholarly Output, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of all other time-independent metrics, which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 2.
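Journal Count boils down to counting distinct journal titles among the filtered publications. A sketch over an assumed record layout:

```python
# Journal Count: the number of distinct journals in which an entity's
# publications appeared. Duplicate titles are removed via a set.

def journal_count(publications):
    return len({p["journal"] for p in publications})

pubs = [
    {"journal": "Biology and Environment"},
    {"journal": "Biology and Environment"},         # duplicate title
    {"journal": "Archiv für Lebensmittelhygiene"},
]
print(journal_count(pubs))  # 2
```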

4.4 Metric: Journal Category Count

Journal Category Count in SciVal indicates the diversity of an entity's disciplinary portfolio: in how many distinct journal categories have this entity's publications appeared? Journal Category Count can be generated using either the Main Categories or the Sub-categories. The maximum value of this metric is 27 when using Scopus Main Categories, and 334 when using Scopus Sub-categories.

Publications can be assigned to a classification system in 2 ways:
- Journal-driven assignment assumes that every publication within a journal fits within the same discipline(s) as the journal's scope. Each publication automatically adopts the subject classifications assigned to the journal. This method of assignment is suitable for journals that are focused on a core field and do not tend to include publications that are also relevant to other fields
- Publication-driven assignment assumes that publications within a journal may have additional or different relevance to fields outside the core focus of the journal's scope. It offers the benefit of being able to assign individual publications from a journal separately to their relevant classifications. This is important for publications in multidisciplinary journals

Journal Category Count uses publication-driven assignment, and a publication can be allocated to more than 1 category.

Journal Category Count is a:
- Disciplinarity metric
- Power metric

This metric is useful to:
- Compare smaller entities, such as Groups of Researchers and Groups of Publication Sets, where differences in the disciplinary portfolio are most likely to be evident
- Provide impressive figures to showcase the performance of a relatively large entity, when this metric is likely to give high numbers
- Benchmark the diversity of the disciplinary portfolios of related entities, such as collaboration networks funded by the same type of grant from a given funder
- Provide evidence of cross-disciplinarity by indicating the appeal of an entity's output to readers in diverse disciplines
- Provide transparency on the underlying data to build trust in the metrics in SciVal
- Look at publishing activity in a way that is difficult to manipulate
- Investigate differences in activity at the early stages of a new strategy, when publications are just starting to appear, or between early-career researchers

This metric should be used with care when:
- Comparing large entities, such as Institutions and Groups of Countries, which will likely publish in such a large and stable disciplinary portfolio that it approaches the maximum extent of the Scopus database, so that differences may not be visible. Users are advised to narrow their focus to a slice of a large entity when using this metric, such as by applying the Research Area filter
- Benchmarking the disciplinary portfolios of entities of obviously different sizes, such as large collaboration networks against single Publication Sets:
  - Differences in the value of this Power metric between such entities will probably reflect distinct entity sizes rather than differences in the diversity of their disciplinary portfolios
  - Users are advised to limit the use of this metric to similar entities, or to use size-normalized metrics such as Academic-Corporate Collaboration

- Benchmarking entities with distinct disciplinary profiles, when academics in distinct fields may have different opportunities to work in a cross-disciplinary manner:
  - Differences in Journal Category Count, in these cases, may well reflect distinct disciplinary characteristics rather than provide reliable information about differences in disciplinary portfolios
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Country, users are advised to apply the Research Area filter to focus on one field common to all the entities, or to use field-normalized metrics such as Publications in Top Journal Percentiles, which take this into account
- Understanding small entities for which there may be gaps within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on the apparent breadth of a disciplinary portfolio
  - The only way to account for this is to be vigilant. Consider also limiting the use of Journal Category Count to comparing slightly larger data sets, where gaps in the database coverage likely have a similar effect on all entities and do not invalidate the comparison

Useful partner metrics are:
- Journal Count, which highlights the publication portfolio of an entity, and is a natural complement to Journal Category Count
- The set of all other Power metrics, whose values tend to increase as the entity becomes bigger: Scholarly Output, Journal Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of all other time-independent metrics, which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 2.

Example 2: Scholarly Output, Journal Count and Journal Category Count

Scenario: The user would like to calculate the Scholarly Output, Journal Count, or Journal Category Count of an entity that consists of 6 publications, and has selected the following viewing and calculation options:
- Selected Publication Year range: 2008 to 2012
- Selected Publication Types: Articles, reviews and conference papers
- Selected Research Area: Agricultural and Biological Sciences

Each of the 6 publications is checked against these options in 4 steps:
- Step 1: Does a Journal Category of the publication match the selected Research Area?
- Step 2: Does the publication fall in the selected Publication Year range?
- Step 3: Does the Publication Type match the selected Publication Types?
- Step 4: Does the publication pass each of steps 1, 2 and 3?

Table: the 6 publications with their years, journals (Bioscience Journal, Biology and Environment, Archiv für Lebensmittelhygiene), publication types (Article, Article in Press, Review, Report), and Main and Sub-categories, together with the outcome of each step. Publications fail, for example, for falling outside the year range, for being an unselected type such as Report, or for falling only in the Chemistry Main Category; 3 of the 6 publications pass step 4.

Question: How do I calculate Scholarly Output?
Answer: Count the publications that received a "yes" in step 4. Scholarly Output = 3

Question: How do I calculate Journal Count?
Answer: Note the Journal Titles of the publications that received a "yes" in step 4 and remove duplicates: Biology and Environment, Archiv für Lebensmittelhygiene. Count the unique Journal Titles. Journal Count = 2

Question: How do I calculate Journal Category Count (Main Categories)?
Answer: Look up the Main Categories of the publications that passed the filters and remove duplicates: Agricultural and Biological Sciences. Count the unique Main Categories. Journal Category Count (Main Categories) = 1

Question: How do I calculate Journal Category Count (Sub-categories)?
Answer: Look up the Sub-categories of the publications that passed the filters and remove duplicates: Agronomy and Crop Science, Animal Science and Zoology, Aquatic Science, Horticulture. Count the unique Sub-categories. Journal Category Count (Sub-categories) = 4
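Example 2's filter steps and calculations can be sketched in runnable form. The 6 publication records below are illustrative: they reproduce the example's answers (3, 2, 1 and 4), not the original table row by row.

```python
# Sketch of Example 2: apply the Research Area, year-range and publication-
# type filters (steps 1-3), keep publications that pass all three (step 4),
# then compute the three metrics over the survivors.

AREA = "Agricultural and Biological Sciences"
YEARS = (2008, 2012)
TYPES = {"Article", "Review", "Conference Paper"}

def passes(pub):
    return (AREA in pub["main"]                      # step 1: Research Area
            and YEARS[0] <= pub["year"] <= YEARS[1]  # step 2: year range
            and pub["type"] in TYPES)                # step 3: publication type

def calculate(pubs):
    kept = [p for p in pubs if passes(p)]            # step 4
    return {
        "Scholarly Output": len(kept),
        "Journal Count": len({p["journal"] for p in kept}),
        "Journal Category Count (Main)": len({c for p in kept for c in p["main"]}),
        "Journal Category Count (Sub)": len({c for p in kept for c in p["sub"]}),
    }

pubs = [  # illustrative records; pass/fail reasons noted per record
    {"year": 2006, "journal": "Bioscience Journal", "type": "Article",
     "main": {AREA}, "sub": {"General Agricultural and Biological Sciences"}},  # fails step 2
    {"year": 2010, "journal": "Biology and Environment", "type": "Article",
     "main": {AREA}, "sub": {"Animal Science and Zoology", "Aquatic Science"}},  # passes
    {"year": 2011, "journal": "Biology and Environment", "type": "Article in Press",
     "main": {AREA}, "sub": {"Insect Science"}},                                 # fails step 3
    {"year": 2011, "journal": "Archiv für Lebensmittelhygiene", "type": "Review",
     "main": {AREA}, "sub": {"Agronomy and Crop Science", "Horticulture"}},      # passes
    {"year": 2012, "journal": "Archiv für Lebensmittelhygiene", "type": "Article",
     "main": {AREA}, "sub": {"Agronomy and Crop Science"}},                      # passes
    {"year": 2012, "journal": "Archiv für Lebensmittelhygiene", "type": "Report",
     "main": {"Chemistry"}, "sub": {"Organic Chemistry"}},                       # fails steps 1 and 3
]
print(calculate(pubs))
```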

4.5 Metric: Citation Count

Citation Count in SciVal indicates the total citation impact of an entity: how many citations have this entity's publications received?

Citation Count is a:
- Citation Impact metric
- Snowball Metric
- Power metric

SciVal often displays Citation Count in a chart or table with years. These are always the years in which items were published, not the years in which citations were received.

This metric is useful to:
- Benchmark the visibility of entities of similar size that fall into similar disciplines, such as multidisciplinary institutions with a similar number of research staff, or international collaboration networks in similar disciplines
- Provide impressive figures to showcase the performance of entities that are large in comparison to a group of peers, when this metric is likely to give high numbers
- Showcase the performance of entities that have published a few noticeably highly cited papers
- Provide transparency on the underlying data to build trust in the metrics in SciVal

This metric should be used with care when:
- Benchmarking the visibility of entities of obviously different sizes, when this Power metric will most closely reflect entity size rather than differences in visibility; a Group of Institutions such as a US state will generally have a higher Citation Count than an individual institution, for example. Users are advised to use the size-normalized Citation Impact metrics Citations per Publication or Field-Weighted Citation Impact to compare the visibility of entities of different sizes

- Benchmarking the visibility of entities with distinct disciplinary profiles:
  - Academics working in medical sciences and in virology, for instance, tend to publish and cite with high frequency, whereas those working in business schools or in linguistics tend to publish and cite less frequently
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Country, it is advisable to apply the Research Area filter to focus on one field common to all the entities, or to select Field-Weighted Citation Impact, which takes this into account
- Investigating the reliability with which an entity's publications are cited, since one or a few publications with a very high number of citations can conceal a sizeable body of uncited material. Users are advised to select Cited Publications to give an idea of reliability
- Entities are small and there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent visibility. For example, an academic's performance may suffer due to gaps in the items they have published, as well as gaps in the publications citing the items that are indexed
  - The only way to account for this is to be vigilant. Consider also limiting the use of Citation Count to comparing larger data sets in the same discipline, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- There is a concern that excessive self-citations may be artificially inflating the number of citations. Users can judge whether the level of self-citations is higher than normal by deselecting the "Include self-citations" option
- Uncovering the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication reduces the reliability of basing decisions on citation information. Users are advised to use metrics such as Scholarly Output or Collaboration in these situations
- The person who will use the data does not wish to see a line that dips in recent years. This typically happens with Citation Count because recent publications have had little time to receive citations. Users are advised to use Field-Weighted Citation Impact or Publications in Top Journal Percentiles to avoid this drop, if it is of concern

Useful partner metrics are:
- Citations per Publication and Field-Weighted Citation Impact, which bring complementary perspectives on total visibility into view, and also account for differences in the size of the entities being compared. Field-Weighted Citation Impact also accounts for differences in publication and citation behavior between disciplines
- Field-Weighted Citation Impact, Outputs in Top Percentiles, and Publications in Top Journal Percentiles, which avoid the dip in recent years due to little time having passed since publication to receive citations
- Cited Publications, which provides a measure of the reliability with which an entity's publications are subsequently cited, and is not affected by a high number of citations received by 1 or a few publications
- h-indices, whose values depend on a combination of Citation Count and Scholarly Output, and which are natural partner metrics
- The set of all other Power metrics, whose values tend to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example.
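Citation Count, including its display against publication years rather than citation years, can be sketched as a simple sum (record layout illustrative):

```python
# Citation Count: total citations received, summed over an entity's
# publications and grouped by the year of publication (not the year in
# which the citations were received), as the display described above.
from collections import defaultdict

def citation_count_by_year(publications):
    totals = defaultdict(int)
    for p in publications:
        totals[p["year"]] += p["citations"]
    return dict(totals)

pubs = [
    {"year": 2010, "citations": 12},
    {"year": 2010, "citations": 3},
    {"year": 2011, "citations": 0},  # recent items may be uncited so far
]
print(citation_count_by_year(pubs))  # {2010: 15, 2011: 0}
```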

4.6 Metric: Cited Publications

Cited Publications in SciVal indicates the citability of a set of publications: how many of this entity's publications have received at least 1 citation?

Cited Publications is a:
- Citation Impact metric
- Power metric when the "Absolute number" option is selected, but not when the "Percentage" option is selected

SciVal often displays Cited Publications in a chart or table with years. These are always the years in which items were published, not the years in which citations were received.

This metric is useful to:
- Benchmark the extent to which publications are built on by subsequent work
- Compare the influence of publications of entities of different sizes but in related disciplines, such as large and small countries, or a research team and individual Researchers within that team. It is advisable to select the "Percentage" option when comparing entities of different sizes, to normalize for this variable
- Demonstrate the excellence of entities that produce consistent work that is reliably cited, regardless of the number of citations received
- Account for the positive impact of a few very highly cited papers on apparent performance, which affect Citation Count and Citations per Publication
- Provide transparency on the underlying data to build trust in the metrics in SciVal

This metric should be used with care when:
- Benchmarking the extent to which the publications of entities with distinct disciplinary profiles are built on by subsequent work:
  - Teams working in parasitology, for instance, may experience a relatively short delay between publishing and the receipt of citations because of the high frequency of publishing and citing, relative to teams working in mathematical modeling, where publication behavior may lead to a relatively long delay
  - It is not advisable to use this metric to compare entities in distinct disciplines without taking these differences into account
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Group of Institutions, it is advisable to apply the Research Area filter to focus on one field common to all the entities

- Understanding the magnitude of the number of citations received by an entity's publications. Users are advised to use Citation Count or Citations per Publication to communicate this information
- Entities are small and there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent visibility. For example, an academic's performance may suffer due to gaps in the items they have published, as well as gaps in the publications citing the items that are indexed
  - The only way to account for this is to be vigilant. Consider also limiting the use of Cited Publications to comparing larger data sets in the same discipline, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- There is a concern that excessive self-citations may be artificially inflating the proportion of cited publications. Users can judge whether the level of self-citations is higher than normal by deselecting the "Include self-citations" option
- Uncovering the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication reduces the reliability of basing decisions on citation information

Useful partner metrics are:
- Citation Count and Citations per Publication, which indicate the magnitude of citation impact, and which will be positively impacted by one or a few publications that have received a very high number of citations
- The set of all other Power metrics, whose values tend to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example.

4.7 Metric: Citations per Publication

Citations per Publication in SciVal indicates the average citation impact of each of an entity's publications: how many citations have this entity's publications received on average?

Citations per Publication is a:
- Citation Impact metric
- Snowball Metric

SciVal often displays Citations per Publication in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received.

This metric is useful to:
- Benchmark the average citation impact of publications within a body of work
- Compare the average influence of publications of entities of different sizes, but in related disciplines, such as research teams working in a similar field of research, or a Researcher and Publication Sets belonging to that Researcher
- Showcase the performance of entities that have published a few notably highly cited papers that will have a positive effect on the average of the entire data set
- Provide transparency on the underlying data to build trust in the metrics in SciVal

This metric should be used with care when:
- Benchmarking the average influence of the publications of entities with distinct disciplinary profiles, such as Institutions with sizeable humanities schools against Institutions without humanities schools:
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Country, it is advised to apply the Research Area filter to focus on one field that is common between all the entities, or to select Field-Weighted Citation Impact, which will account for disciplinary differences
- Investigating the reliability with which an entity's publications are cited, since one or a few publications with a very high number of citations can conceal a sizeable body of uncited material. Users are advised to use Cited Publications to investigate the proportion of publications in a data set that have been cited
- Entities are small and there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent visibility. For example, an academic's performance may suffer due to gaps in items they have published, as well as gaps in items citing the publications that are indexed
  - The only way to account for this is to be vigilant. Consider also limiting the use of Citations per Publication to comparing larger data sets in the same discipline, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- Entities are small, so that the metric may fluctuate significantly and appear unstable over time, even when there is complete Scopus coverage. Citations per Publication calculates an average value, and these types of calculations are strongly influenced by outlying publications in a small data set
- There is a concern that excessive self-citations may be artificially inflating Citations per Publication. Users can judge whether the level of self-citations is higher than normal by deselecting the "Include self-citations" option
- Uncovering the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication will reduce the reliability of basing decisions on citation information. It is advised to use metrics like Journal Category Count or Collaboration to account for this
- The person who will use the data does not like to see a line that dips in recent years. This typically happens with Citations per Publication because recent publications have had little time to receive citations. Users are advised to use Field-Weighted Citation Impact to avoid this drop, if it is of concern

Useful partner metrics are:
- Field-Weighted Citation Impact, a natural complement to Citations per Publication that takes into account behavioral differences between disciplines
- Field-Weighted Citation Impact, Outputs in Top Percentiles, and Publications in Top Journal Percentiles, which avoid displaying the dip in recent years due to little time having passed to receive citations since publication
- Cited Publications, which provides a measure of the reliability with which an entity's publications will subsequently be used to support other research, but is not affected by a high number of citations received by one or a few publications
- Collaboration Impact and Academic-Corporate Collaboration Impact, which also measure Citations per Publication and are a natural complement. They focus on subsets of publications within a data set with particular collaboration characteristics
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 3.

Example 3: Citation Count, Cited Publications and Citations per Publication

Scenario: The user would like to calculate the Citation Count, Cited Publications, or Citations per Publication of an entity that consists of 6 publications, and has selected the following viewing and calculation options:
- Selected Publication Year range: 2005 to 2013
- Selected Publication Types: Articles, reviews and editorials
- Selected Research Area: Medicine

Each publication is tested against the user-selected options in four steps:
- Step 1: Does the Journal Category of the publication match the selected Research Area?
- Step 2: Does this publication fall in the selected Publication Year range?
- Step 3: Does the Publication Type match the selected Publication Types?
- Step 4: Does the publication pass each of steps 1, 2 and 3?

Publication | Publication Type | Journal Main Category (Sub-Category) | Total citations received | Passes step 4?
1 | Review | Social Sciences (Anthropology) | 0 | No (fails step 1)
2 | Editorial | Medicine (Emergency Medicine); Decision Sciences (Management Science and Operations Research) | 4 | Yes
3 | Article | Medicine (Anatomy); Decision Sciences (Information Systems and Management) | 7 | Yes
4 | Article | Medicine (Immunology and Allergy) | not counted | No
5 | Article | Medicine (General Medicine) | 0 | Yes
6 | Review | Medicine (Emergency Medicine) | 4 | Yes

Question: How do I calculate Scholarly Output?
Answer: Count the number of publications that received a "yes" in step 4. Scholarly Output = 4

Question: How do I calculate Citation Count?
Answer: Retrieve and sum the total citations received by the publications that received a "yes" in step 4. Citation Count = 4 + 7 + 0 + 4 = 15

Question: How do I calculate Cited Publications?
Answer: Determine whether each publication that passed step 4 has received at least 1 citation. For the "Absolute number" display option, sum the publications that have received at least 1 citation: Cited Publications = 3. For the "Percentage" display option, divide the Absolute number by Scholarly Output: Cited Publications = 75.0%

Question: How do I calculate Citations per Publication?
Answer: Divide Citation Count by Scholarly Output. Citations per Publication = 15 / 4 = 3.75
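The Example 3 calculations can be sketched in Python. The publication years below are assumed (the Guidebook states only the filter outcomes, so publication 4 is given a year outside the selected range to reproduce its failure), and the record field names are illustrative:

```python
# Sketch of the Example 3 metrics: filter publications by Research Area,
# Publication Year range, and Publication Type, then aggregate.

def passes_filters(pub, years, types, area):
    return (area in pub["main_categories"]           # step 1: Research Area
            and years[0] <= pub["year"] <= years[1]  # step 2: Publication Year range
            and pub["type"] in types)                # step 3: Publication Type

pubs = [
    {"id": 1, "year": 2010, "type": "Review",    "citations": 0, "main_categories": {"Social Sciences"}},
    {"id": 2, "year": 2010, "type": "Editorial", "citations": 4, "main_categories": {"Medicine", "Decision Sciences"}},
    {"id": 3, "year": 2011, "type": "Article",   "citations": 7, "main_categories": {"Medicine", "Decision Sciences"}},
    {"id": 4, "year": 2003, "type": "Article",   "citations": 9, "main_categories": {"Medicine"}},  # assumed to fail the year filter
    {"id": 5, "year": 2012, "type": "Article",   "citations": 0, "main_categories": {"Medicine"}},
    {"id": 6, "year": 2012, "type": "Review",    "citations": 4, "main_categories": {"Medicine"}},
]

# step 4: keep only publications passing all three tests
selected = [p for p in pubs
            if passes_filters(p, (2005, 2013), {"Article", "Review", "Editorial"}, "Medicine")]

scholarly_output = len(selected)                                      # 4
citation_count = sum(p["citations"] for p in selected)                # 15
cited_publications = sum(1 for p in selected if p["citations"] >= 1)  # 3
cited_pct = 100 * cited_publications / scholarly_output               # 75.0
citations_per_publication = citation_count / scholarly_output         # 3.75
```

Note that publication 5 counts toward Scholarly Output and the Citations per Publication denominator even though it is uncited; only Cited Publications excludes it.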

4.8 Metric: Number of Citing Countries

Number of Citing Countries in SciVal indicates the geographical visibility of an entity's publications: from how many distinct countries have this entity's publications received citations?

Number of Citing Countries is a:
- Citation Impact metric
- Power metric

SciVal often displays Number of Citing Countries in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received.

This metric is useful to:
- Compare smaller entities, such as Groups of Researchers and Publication Sets, where differences in the number of citing countries are most likely to be evident
- Provide impressive figures to showcase the performance of a relatively large entity, when this metric is likely to give high numbers
- Benchmark the geographical visibility of the publication portfolios of related entities, such as:
  - Collaboration networks in a given field of research
  - Researchers in a common field of research and with similar career lengths
  - Scenario models of a research institute, created to investigate the effect of recruiting different researchers
- Provide evidence of extensive geographical appeal of a body of work by indicating the diversity of the geographical sources of citations
- Provide transparency on the underlying data to build trust in the metrics in SciVal
- Look at publishing activity in a way that is difficult to manipulate

This metric should be used with care when:
- Comparing large entities, such as Institutions and Groups of Countries, which will likely publish so many publications, receiving so many citations, that the number of citing countries will approach the maximum and differences may not be visible. Users are advised to narrow their focus to a slice of a large entity when using this metric, such as by applying the Research Area filter
- Benchmarking entities of obviously different sizes, such as institutions and small departments, when this Power metric will most closely reflect entity size rather than differences in global visibility. It is not advised to use this metric to compare entities of different sizes

- Benchmarking the collaboration of entities in different disciplines:
  - The international appeal of publications from different disciplines may be inherently different, such as between national literature and chemistry, or local history and computer science
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or a Country, it is advised to apply the Research Area filter to focus on one field that is common between all the entities
- Understanding the magnitude of the number of citations received by an entity's publications. Users are advised to use Citation Count or Citations per Publication to communicate information about magnitude
- Entities are small and there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent global visibility, whereas the effect of one or a few missing publications from a large data set may be acceptable
  - The only way to account for this is to be vigilant, particularly when looking at small data sets such as an early-career researcher. Consider also limiting the use of Number of Citing Countries to comparing somewhat larger data sets in the same discipline, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- Investigating the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication will reduce the reliability of basing decisions on citation information. It is advised to use Journal Count or other time-independent metrics in these situations

Useful partner metrics are:
- Citation Count and Citations per Publication, which communicate information about the magnitude of the number of citations received
- Collaboration, which shows the extent of international co-authorship of an entity's scholarly output and is a natural complement to Number of Citing Countries' view on geographical visibility
- The set of all other Power metrics, whose value tends to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

Example 4: Number of Citing Countries

Scenario: The user would like to calculate the Number of Citing Countries of an entity that consists of 6 publications. They have not selected any viewing or calculation options. Say that this entity has received 6 citations from publications A, B, C, D, E and F.

The citing publications A, B, C, D, E and F have the following affiliation information:

Citing Publication | Authors | Institutions | Countries
A | A2 | I4 | C2
B | A1, A3, A3 | I1, I4, I2 | C1, C2, C1
C | A1, A1, A2 | I1, I3, I2 | C1, C2, C1
D | A1, A4 | I1, I1 | C1, C1
E | A1, A3, A5 | I1, I5, I1 | C1, C3, C1
F | A1, A3, A2 | I1, I8, I1 | C1, C4, C1

Question: How do I calculate the Number of Citing Countries?
Answer: Count the number of distinct countries in the affiliations of the citing publications. The distinct countries are C1, C2, C3 and C4, so Number of Citing Countries = 4
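The Example 4 calculation amounts to a set union over the countries in the citing publications' affiliations; a minimal sketch:

```python
# Number of Citing Countries: count distinct countries across the
# affiliations of all publications that cite the entity's output.
citing_affiliation_countries = {
    "A": ["C2"],
    "B": ["C1", "C2", "C1"],
    "C": ["C1", "C2", "C1"],
    "D": ["C1", "C1"],
    "E": ["C1", "C3", "C1"],
    "F": ["C1", "C4", "C1"],
}

distinct = set()
for countries in citing_affiliation_countries.values():
    distinct.update(countries)  # duplicates within a publication collapse here

number_of_citing_countries = len(distinct)  # 4 (C1, C2, C3, C4)
```

The set makes repeated countries within and across citing publications count only once, which is why publication D, with two C1 affiliations, adds nothing new.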

4.9 Metric: Field-Weighted Citation Impact

Field-Weighted Citation Impact in SciVal indicates how the number of citations received by an entity's publications compares with the average number of citations received by all other similar publications in the data universe: how do the citations received by this entity's publications compare with the world average?
- A Field-Weighted Citation Impact of 1.00 indicates that the entity's publications have been cited exactly as would be expected based on the global average for similar publications; the Field-Weighted Citation Impact of "World", or the entire Scopus database, is 1.00
- A Field-Weighted Citation Impact of more than 1.00 indicates that the entity's publications have been cited more than would be expected based on the global average for similar publications; for example, 2.11 means 111% more cited than the world average
- A Field-Weighted Citation Impact of less than 1.00 indicates that the entity's publications have been cited less than would be expected based on the global average for similar publications; for example, 0.87 means 13% less cited than the world average

Similar publications are those publications in the Scopus database that have the same publication year, publication type, and discipline, as represented by the Scopus journal classification system:
- Publications can be assigned to a classification system in 2 ways:
  - Journal-driven assignment assumes that every publication within a journal fits within the same discipline(s) as the journal's scope. Each publication automatically adopts the subject classifications that are assigned to the journal. This method of assignment is suitable for journals that are focused on a core field and do not tend to include publications that are also relevant to other fields
  - Publication-driven assignment assumes that publications within a journal may have additional or different relevance to fields outside the core focus of the journal's scope. Publication-driven assignment offers the benefit of being able to assign individual publications from a journal separately to their relevant classifications. This is important for publications in multi-disciplinary journals
  - Field-Weighted Citation Impact uses publication-driven assignment
- Publications are allocated to the classification Sub-category level, and can be allocated to more than 1 Sub-category. When we calculate the expected citations for similar publications, it is important that these multi-category publications do not exert too much weight; for example, if a publication P belongs to both parasitology and microbiology, it should not have double the influence of a publication that belongs to only one or the other Sub-category. This is accounted for in SciVal by distributing publication and citation counts equally across multiple journal categories; publication P would be counted as 0.5 publications for each of parasitology and microbiology, and its citations would be shared equally between these Sub-categories

Field-Weighted Citation Impact is a:
- Citation Impact metric
- Snowball Metric

SciVal often displays Field-Weighted Citation Impact in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received. The citations received in the year in which an item was published, and the following 3 years, are counted for this metric.

This metric is useful to:
- Benchmark entities regardless of differences in their size, disciplinary profile, age, and publication-type composition, such as:
  - An institution and departments (Groups of Researchers) within that institution
  - A country and small research institutes within that country
  - A geographical region and countries within that region
- Easily understand the prestige of an entity's citation performance by observing the extent to which its Field-Weighted Citation Impact is above or below the world average of 1.00
- Present citation data in a way that inherently takes into account the lower number of citations received by relatively recent publications, thus avoiding the dip in recent years seen with Citation Count and Citations per Publication
- Gain insight into the relative citation performance of an entity in a discipline with relatively poor Scopus coverage, since gaps in the database will apply equally to the entity's publications and to the set of similar publications
- Use as a default to view citation data, since it takes into account multiple variables that can affect other metrics
- Look at publishing activity in a way that is difficult to manipulate

This metric should be used with care when:
- Information about the magnitude of the number of citations received by an entity's publications is important. In these situations, it is advised to use Citation Count or Citations per Publication
- Demonstrating excellent performance to those who prefer to see high numbers; Citation Count or Citations per Publication would be more suitable in these circumstances
- Entities are small, so that the metric may fluctuate significantly and appear unstable over time, even when there is complete Scopus coverage. Field-Weighted Citation Impact calculates an average value, and these types of calculations are strongly influenced by outlying publications in a small data set

- Trust needs to be built in the metrics in SciVal. This calculation involves multiple normalizations, and the generation of the average citations for similar publications requires a view of the entire Scopus database, which will be difficult for a user to validate. Users are advised to select simpler metrics, such as Citation Count or Citations per Publication, if trust in the accuracy of the SciVal calculations needs to be built
- Uncovering the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication will reduce the reliability of basing decisions on citation information. It is advised to use one of the time-independent metrics, such as Academic-Corporate Collaboration, in these cases
- Completely answering every question about performance. Field-Weighted Citation Impact is a very useful metric, but using it to the exclusion of other metrics severely restricts the richness and reliability of information that a user can draw from SciVal

Useful partner metrics are:
- Citation Count and Citations per Publication. They indicate the magnitude of the number of citations received, to complement the relative view offered by Field-Weighted Citation Impact. They are also simple and offer transparency on the underlying data to build trust in SciVal's metrics calculations
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

Mathematical notation: The Field-Weighted Citation Impact (FWCI) for a set of N publications is defined as:

    FWCI = (1/N) * sum over i = 1 to N of (c_i / e_i)

where:
- c_i = citations received by publication i
- e_i = expected number of citations received by all similar publications in the publication year plus the following 3 years

When a similar publication is allocated to more than 1 discipline, the harmonic mean is used to calculate e_i. For a publication i that is part of 2 disciplines, A and B:

    e_i = 2 / (1/e_A + 1/e_B)

e_A and e_B are based on fractional counts of publications and citations, so that publication i will be counted as 0.5 publications in each of e_A and e_B, and the citations it has received will also be shared between A and B.
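The FWCI definition and the harmonic-mean rule for multi-category publications can be sketched directly in Python. The citation and expected-citation figures below are illustrative only, not taken from the Guidebook:

```python
# Field-Weighted Citation Impact: the mean, over N publications, of actual
# citations divided by expected citations. A publication allocated to two
# Sub-categories takes the harmonic mean of the per-category expectations.

def harmonic_mean(values):
    # harmonic mean of the per-category expected citation counts e_A, e_B, ...
    return len(values) / sum(1.0 / v for v in values)

def fwci(publications):
    # publications: list of (actual_citations, [expected citations per category])
    return sum(c / harmonic_mean(e) for c, e in publications) / len(publications)

example = [
    (41, [20.0]),       # single-category publication: e_i = 20.0
    (9,  [6.0, 13.4]),  # two Sub-categories: e_i = harmonic mean of 6.0 and 13.4
    (3,  [3.0]),        # cited exactly as expected: ratio 1.0
]
result = fwci(example)
```

For a single-category publication the harmonic mean reduces to the expected value itself, so the same code handles both cases.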

Example 5: Field-Weighted Citation Impact

Scenario: The user would like to calculate the Field-Weighted Citation Impact of an entity that consists of 3 publications. They have not selected any viewing or calculation options.

Entity with 3 publications:
- Publication 1: an Article in a journal classified under Immunology
- Publication 2: a Review in a journal classified under both Immunology and Parasitology
- Publication 3: an Erratum in a journal classified under Parasitology

The calculation proceeds in five steps:
- Step 1: Compute the number of citations received by each publication in the entity, counting the actual citations received in the publication year plus the following 3 years (years beyond 2013, when the example was prepared, are not available)
- Step 2: Compute the expected number of citations received by similar publications: for each publication, divide the total citations received, in the publication year plus 3 years, by all publications in the database published in the same year, of the same type, and within the same journal category(ies), by the number of those publications
- Step 3: For publications indexed in multiple journal categories, such as Publication 2, use the harmonic mean to combine the expected citations per publication of the individual categories
- Step 4: Compute the ratio of actual (result of step 1) to expected (result of step 2 or 3) citations for each of Publications 1, 2 and 3. Publication 3, the Erratum, received no citations and none were expected
- Step 5: Take the arithmetic mean of the results of step 4, dividing their sum by 3, to calculate the Field-Weighted Citation Impact for this entity

0.9 of an article? 0.3 of a review?

Publications in Scopus can belong to more than one journal category. When calculating the expected citations per publication as input to Field-Weighted Citation Impact, we count multi-category publications partially within each of their journal categories, and distribute their citations accordingly. No weighting between journal categories is applied; publications are assumed to belong equally to each category they are indexed in.

For example, if a publication with 3 citations belongs to both the Parasitology and Microbiology journal categories, it is counted as 0.5 publications with 1.5 citations in Parasitology, and as 0.5 publications with 1.5 citations in Microbiology. This fractional distribution ensures that all publications have equal weight, regardless of the number of journal categories. The net effect of adding up these fractional publication and citation counts is 0.9 of an article, and 0.3 of a review, as seen in the statistics on the SciVal data set.

[Table: statistics on the SciVal data set, by publication year, showing fractional counts of articles and reviews, the citations received in the publication year plus 3 subsequent years, and citations per article, for Immunology (Articles and Reviews) and Parasitology (Reviews and Errata). Scopus data as of 11 November 2013.]
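The fractional distribution described above can be sketched as a small accumulation over (citations, categories) records:

```python
# Fractional counting across journal categories: a publication's counts are
# split equally over the categories it is indexed in.
from collections import defaultdict

def distribute(publications):
    # publications: iterable of (citations, [journal categories])
    pub_count = defaultdict(float)
    cite_count = defaultdict(float)
    for citations, categories in publications:
        share = 1.0 / len(categories)  # equal weight per category
        for cat in categories:
            pub_count[cat] += share
            cite_count[cat] += citations * share
    return pub_count, cite_count

# The worked example from the text: 3 citations split over two categories.
pubs, cites = distribute([(3, ["Parasitology", "Microbiology"])])
# Parasitology: 0.5 publications, 1.5 citations; Microbiology likewise
```

Summing these fractional shares over a whole data set is what yields non-integer totals such as 0.9 of an article or 0.3 of a review.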

4.10 Metric: Collaboration

Collaboration in SciVal indicates the extent to which an entity's publications have international, national, or institutional co-authorship, and single authorship. Each publication is assigned to 1 of 4 collaboration types, based on its affiliation information: international, national, institutional, or single authorship. A single publication may of course display each of international, national and institutional collaboration in its affiliation information, but a single collaboration type is assigned to ensure that the sum of an entity's publications across the 4 categories adds up to 100.0% of the publications with the necessary affiliation information.

The assignment of geographical collaboration type is performed using the following cascading decision tree:
- Multiple authors? If no: Single authorship
- If yes: Multiple countries? If yes: International collaboration
- If no: Multiple institutions? If yes: National collaboration
- If no: Institutional collaboration
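The cascading decision tree can be expressed as a small function. The record shape here, an author count plus a list of (institution, country) affiliation pairs, is a hypothetical simplification of Scopus affiliation data:

```python
# Assign one of the four geographical collaboration types using the
# cascading decision tree: authors -> countries -> institutions.

def collaboration_type(affiliations, n_authors):
    # affiliations: list of (institution, country) pairs for the publication
    if n_authors < 2:
        return "Single authorship"
    countries = {country for _, country in affiliations}
    if len(countries) > 1:
        return "International collaboration"
    institutions = {inst for inst, _ in affiliations}
    if len(institutions) > 1:
        return "National collaboration"
    return "Institutional collaboration"
```

Because the checks cascade, a publication with both multiple countries and multiple institutions is classified as international only, which is what makes the four categories sum to 100.0%.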

Collaboration is a:
- Collaboration metric
- Snowball Metric
- Power metric when the "Absolute number" option is selected, but not when the "Percentage" option is selected

This metric is useful to:
- Explore the extent of international and other types of collaboration within a data set
- Benchmark the collaboration of entities of different sizes, but in similar disciplines. It is advised to select the "Percentage" option when comparing entities of different sizes, to normalize for this variable
- Showcase extensive international collaboration that may underpin a set of scholarly output
- Look at publishing activity in a way that is difficult to manipulate
- Investigate collaborative activity very early in a new strategy, or for an early-career researcher, since the affiliation data underlying this metric do not require time to accumulate to reliable levels in the same way as citation data do
- Provide transparency on the underlying data to build trust in the metrics in SciVal

This metric should be used with care when:
- Benchmarking the collaboration of entities in different disciplines:
  - The typical collaborative behavior of academics may differ between disciplines, such as mathematics and medicine, or humanities and molecular biology
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution, or a Group of Institutions such as a US state, it is advised to apply the Research Area filter to focus on one field that is common between all the entities

- Entities are small and there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent partnerships, whereas the effect of one or a few missing publications from a large data set may be acceptable
  - The only way to account for this is to be vigilant, particularly when looking at small data sets such as an early-career researcher, or to limit the use of Collaboration to comparing larger data sets where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- Understanding activity in a discipline with an obvious national focus, such as Finnish literature or cultural studies, when institutional and national collaboration may be of more use than international collaboration

Useful partner metrics are:
- Number of Citing Countries, which indicates the diversity of countries in which Researchers have built on an entity's publications. It is likely to be higher if an entity's international collaboration activity involves several countries, rather than just 1 or 2, even though both situations can give 100.0% international collaboration
- Collaboration Impact, which calculates the average Citations per Publication for publications with different types of geographical collaboration, and indicates how beneficial these collaborations are with respect to citation impact
- The set of all other Power metrics, whose value tends to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of all other time-independent metrics, which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 6.

4.11 Metric: Collaboration Impact

Collaboration Impact in SciVal indicates the citation impact of an entity's publications with particular types of geographical collaboration: how many citations do this entity's internationally, nationally, or institutionally co-authored publications receive, as well as those with a single author?

Publications are assigned to 1 of 4 geographical collaboration types, as explained for Collaboration. This assignment applies to the entity's publications only, and the count of citations received is independent of the geographical collaboration status of the citing publications themselves; if an internationally collaborative publication is cited by another publication with single authorship, that citation is still counted. This metric calculates the Citations per Publication for each type of geographical collaboration.

Collaboration Impact is a:
- Citation Impact metric
- Collaboration metric

SciVal often displays Collaboration Impact in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received.

This metric is useful to:
- Benchmark the average influence of an entity's publications with particular types of geographical collaboration, such as:
  - The citation impact of an entity's internationally collaborative publications against that of its institutionally collaborative output
  - The citation impact of internationally collaborative publications compared to that of non-collaborative publications with a single author
- Compare the collaborative citation impact of entities of different sizes, but from related disciplines
- Demonstrate any benefit from establishing and maintaining collaborations to the citation impact of an entity's Scholarly Output
- Provide transparency on the underlying data to build trust in the metrics in SciVal

This metric should be used with care when:
- Comparing entities in different disciplines where the citing behavior of academics may differ:
  - Citation counts tend to be higher in disciplines such as virology, whose academics tend to publish frequently and include long reference lists, than in law, for example; these differences reflect the differences in the behavior of researchers in distinct subject fields, and not differences in performance
  - It is not advisable to use this metric to compare entities in distinct disciplines without taking these differences into account
  - When comparing entities made up of a mixture of disciplines, such as an Institution or Country, it is advised to apply the Research Area filter to focus on one field that is common between all the entities
- Understanding the citation impact of collaborative papers of small entities, when there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may significantly distort the apparent performance, whereas the buffering effect of a larger data set may compensate for 1 or 2 missing publications
  - The only way to account for this is to be vigilant, particularly when looking at small data sets such as an early-career researcher. It is also advisable to limit the use of these metrics to comparing larger data sets, where gaps in the database coverage likely have a similar effect on all entities being viewed and will not invalidate the comparison
- Entities are small, so that the metric may fluctuate significantly and appear unstable over time, even when there is complete Scopus coverage. Collaboration Impact calculates an average value, and these types of calculations are strongly influenced by outlying publications in a small data set
- Understanding the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication will reduce the reliability of citation information as an input into decisions. These situations can be addressed by metrics that are useful immediately upon publication, such as Collaboration or Publications in Top Journal Percentiles

Useful partner metrics are:
- Citations per Publication, which calculates the average citation impact across all publications within an entity regardless of their geographical collaboration status. This metric could be a useful benchmark against which to judge the extent of benefit of different types of geographical collaboration
- Collaboration, which indicates the extent to which an entity's publications have international, national, or institutional co-authorship, and single authorship. This is a natural partner metric to Collaboration Impact
- The set of time-independent metrics which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 6.

4.12 Metric: Academic-Corporate Collaboration

Academic-Corporate Collaboration in SciVal indicates the degree of collaboration between academic and corporate affiliations: to what extent are this entity's publications co-authored across the academic and corporate, or industrial, sectors?

A publication either exhibits academic-corporate collaboration, or it does not. This assignment is made based on the organization-type with which Scopus tags each affiliation. This metric counts the entity's publications with and without academic-corporate collaboration.

Academic-Corporate Collaboration is a:
- Collaboration metric
- Power metric when the "Absolute number" option is selected, but not when the "Percentage" option is selected

This metric is useful to:
- Investigate the degree of collaboration between the academic and corporate sectors within a data set
- Benchmark the cross-sector collaboration of entities of different sizes, but in related disciplines, such as large and small research teams, or large and small Centers of Excellence. It is advised to select the "Percentage" option when comparing entities of different sizes, to normalize for this variable
- Showcase extensive collaboration between academia and industry that may underpin a set of Scholarly Output
- Look at publishing activity in a way that is difficult to manipulate
- Investigate collaborative activity very early in a new strategy, or for an early-career researcher, since the affiliation data underlying this metric do not require time to accumulate to reliable levels in the same way as citation data do
- Provide transparency on the underlying data to build trust in the metrics in SciVal
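The counting rule above — a publication either lists both an academic and a corporate affiliation or it does not, with an optional normalization by Scholarly Output — can be sketched as follows. The representation of each publication as a set of affiliation-type tags is an assumption for illustration, not SciVal's schema.

```python
def academic_corporate_collaboration(publications, as_percentage=False):
    """Count an entity's publications that list both an academic and a
    corporate affiliation. Each publication is given as a collection of
    affiliation type tags, e.g. {"academic", "corporate"} -- an
    illustrative structure, not SciVal's schema.
    """
    collaborative = sum(
        1 for affiliation_types in publications
        if "academic" in affiliation_types
        and "corporate" in affiliation_types
    )
    if as_percentage:
        # "Percentage" display option: normalize by Scholarly Output
        return 100.0 * collaborative / len(publications)
    # "Absolute number" display option
    return collaborative
```

With the four qualifying publications of Example 6 (one of which has both an academic and a corporate institution), this yields 1, or 25.0% with the "Percentage" option, matching the example.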

This metric should be used with care when:
- Benchmarking the extent of academic-corporate collaboration of entities in different disciplines:
  - The opportunity or desire to collaborate outside the sector may differ, such as between econometrics and drug discovery, or the philosophy of science and toxicology
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or Country, it is advised to apply the Research Area filter to focus on one field that is common between all the entities
- Entities are small and there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may have a significant negative impact on apparent cross-sector partnerships, whereas the effect of 1 or a few missing publications from a large data set may be acceptable
  - The only way to account for this is to be vigilant, particularly when looking at small data sets such as an early-career researcher, or to limit the use of Academic-Corporate Collaboration to comparing larger data sets, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- Investigating activity in a discipline whose focus lies outside the interests of industry, such as history. It is not advised to use this metric in such a situation

Useful partner metrics are:
- Academic-Corporate Collaboration Impact, which calculates the Citations per Publication for publications with and without academic-corporate collaboration, and indicates how beneficial this cross-sector collaboration is with respect to citation impact
- The set of all other Power metrics whose value tends to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of all other time-independent metrics which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, and Publications in Top Journal Percentiles

See Example 6.

4.13 Metric: Academic-Corporate Collaboration Impact

Academic-Corporate Collaboration Impact in SciVal indicates the citation impact of an entity's publications with or without both academic and corporate affiliations: how many citations do this entity's publications receive when they list both academic and corporate affiliations, versus when they do not?

A publication either exhibits academic-corporate collaboration, or it does not. This assignment applies to the entity's publications only, and the count of citations received is independent of the collaboration status of the citing publications themselves; if a publication that resulted from academic-corporate collaboration is cited by another publication with only academic affiliations, that citation is still counted. This metric calculates the Citations per Publication for publications with and without academic-corporate collaboration.

Academic-Corporate Collaboration Impact is a:
- Citation Impact metric
- Collaboration metric

SciVal often displays Academic-Corporate Collaboration Impact in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received.

This metric is useful to:
- Benchmark the average influence of an entity's publications with and without academic-corporate collaboration, such as:
  - The citation impact of publications of academic research institutes that were published with industrial collaboration, compared to those that were not
  - The citation impact of publications of Researchers located in a corporate affiliation that were published with academic collaboration, compared to those that were industry-only
- Compare the citation impact of these cross-sector publications between entities of different sizes, but from related disciplines, such as large and small international networks of Researchers in neuroscience
- Demonstrate any benefit from establishing and maintaining academic-corporate collaborations to the citation impact of an entity's Scholarly Output
- Provide transparency on the underlying data to build trust in the metrics in SciVal

elsevier.com/research-intelligence/scival
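The split average described above — Citations per Publication computed separately for publications with and without academic-corporate collaboration — can be sketched like this. The input format (a flag plus a citation count per publication) is an illustrative assumption, not SciVal's data model.

```python
def academic_corporate_collaboration_impact(publications):
    """Citations per Publication for publications with and without
    academic-corporate collaboration. Each item is a
    (has_ac_collaboration, citation_count) pair -- an illustrative
    structure, not SciVal's schema.
    """
    totals = {True: [0, 0], False: [0, 0]}  # [citations, publications]
    for has_collab, citations in publications:
        totals[has_collab][0] += citations
        totals[has_collab][1] += 1
    return {
        "with collaboration":
            totals[True][0] / totals[True][1] if totals[True][1] else 0.0,
        "without collaboration":
            totals[False][0] / totals[False][1] if totals[False][1] else 0.0,
    }
```

Applied to Example 6's four qualifying publications (one cross-sector paper with 11 citations; three others with 1, 4 and 6), this reproduces the values 11.0 and 3.7 shown there.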

This metric should be used with care when:
- Comparing entities in different disciplines where the citing behavior of academics may differ:
  - Citation counts tend to be higher in disciplines such as cardiology, whose academics tend to publish frequently and include long reference lists, than in anthropology, for example; this reflects differences in the behavior of researchers in distinct subject fields, and not differences in performance
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution, a Country or a Group of Countries, it is advised to apply the Research Area filter to focus on one field that is common between all the entities
- Understanding the citation impact of collaborative papers of small entities, when there may be gaps in their output within the Scopus coverage:
  - A single missing publication from a small data set may significantly distort the apparent performance, whereas the buffering effect of a larger data set may compensate for 1 or 2 missing publications
  - The only way to account for this is to be vigilant, particularly when looking at small data sets such as an early-career researcher. It is also advisable to limit the use of these metrics to comparing larger data sets, where gaps in the database coverage likely have a similar effect on all entities being viewed and will not invalidate the comparison
- Entities are small, so that the metric may fluctuate significantly and appear unstable over time, even when there is complete Scopus coverage. Academic-Corporate Collaboration Impact calculates an average value, and these types of calculations are strongly influenced by outlying publications in a small data set
- Understanding the performance of publications in the very early stages of a new strategy, or of early-career researchers, where the short time that has passed since publication will reduce the reliability of citation information as an input into decisions. These situations can be addressed by metrics that are useful immediately upon publication, such as Academic-Corporate Collaboration or Publications in Top Journal Percentiles

Useful partner metrics are:
- Citations per Publication, which calculates the average citation impact across all publications within an entity regardless of their academic-corporate collaboration status. This metric could be a useful benchmark against which to judge the extent of benefit of different types of collaboration
- Academic-Corporate Collaboration, which indicates the degree of collaboration between academic and corporate affiliations. This is a natural partner metric to Academic-Corporate Collaboration Impact
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

See Example 6.

Example 6: Collaboration, Collaboration Impact, Academic-Corporate Collaboration, and Academic-Corporate Collaboration Impact

Scenario: The user would like to calculate the Collaboration, Collaboration Impact, Academic-Corporate Collaboration, or Academic-Corporate Collaboration Impact of an entity that consists of 6 publications.

Selected Publication Year range: 2005 to 2012
Selected Publication Types: Articles, reviews and editorials
Selected Research Area: Medicine

[Table: for each of the 6 publications, the example lists the publication year, total citations received, publication type, journal sub-category and main category, authors, institutions, countries, and whether each institution is academic or corporate. The table did not survive transcription intact.]

Each publication is checked against the user's selections:
Step 1: Does the Journal Category of the publication match the selected Research Area?
Step 2: Does this publication fall into the selected Publication Year range?
Step 3: Does the Publication Type match the selected Publication Types?
Step 4: Does the publication pass each of steps 1, 2 and 3?

Question: How do I calculate Scholarly Output?
Answer: Count the number of publications that received a "yes" in step 4.
Scholarly Output = 4

Question: How do I calculate Collaboration?
Answer: Assign a collaboration type to each publication that received a "yes" in step 4 using the decision tree:
(i) Does the publication have a single author? If yes, assign as Single Authorship.
(ii) Do the unassigned publications have more than one country in the affiliation information? If yes, assign as International Collaboration.
(iii) Do the unassigned publications have more than one institution in the affiliation information? If yes, assign as National Collaboration.
(iv) Assign all remaining publications as Institutional Collaboration.
For the "Absolute number" display option, sum the publications within each group. For the "Percentage" display option, divide the "Absolute number" within each group by the Scholarly Output of the entity.
Single Authorship = 1; International Collaboration = 2; National Collaboration = 0; Institutional Collaboration = 1
Single Authorship = (1/4)*100 = 25.0%; International Collaboration = (2/4)*100 = 50.0%; National Collaboration = (0/4)*100 = 0.0%; Institutional Collaboration = (1/4)*100 = 25.0%

Question: How do I calculate Collaboration Impact?
Answer: Divide the total citations received by the publications in each group by the absolute number of each type of collaboration.
Single Authorship Impact = 1/1 = 1.0; International Collaboration Impact = (11+6)/2 = 8.5; National Collaboration Impact = 0/0 = 0.0; Institutional Collaboration Impact = 4/1 = 4.0

Question: How do I calculate Academic-Corporate Collaboration?
Answer: Do the publications that received a "yes" in step 4 have both an academic and a corporate institution in the affiliation information? For the "Absolute number" display option, sum the publications with and without academic-corporate collaboration. For the "Percentage" display option, divide the "Absolute number" within each group by the Scholarly Output of the entity.
With collaboration = 1; Without collaboration = 3
With collaboration = (1/4)*100 = 25.0%; Without collaboration = (3/4)*100 = 75.0%

Question: How do I calculate Academic-Corporate Collaboration Impact?
Answer: Divide the total citations received by the publications in each group by the absolute number in each group.
Impact with collaboration = 11/1 = 11.0; Impact without collaboration = (1+4+6)/3 = 3.7

[Table continued: attributes and step 1-4 outcomes for Publications 3 to 6, and the collaboration types assigned to the publications that passed step 4 (Single Authorship, International Collaboration, and Institutional Collaboration). The table did not survive transcription intact.]
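The decision tree in Example 6 for assigning a geographical collaboration type can be written as a short function. The inputs (the distinct authors, institutions and countries in a publication's affiliation information) are an assumed representation for illustration.

```python
def collaboration_type(authors, institutions, countries):
    """Assign one of the 4 geographical collaboration types using the
    decision tree from Example 6. Inputs are the authors, institutions,
    and countries listed on one publication (illustrative structure).
    """
    if len(set(authors)) == 1:
        return "Single Authorship"            # step (i)
    if len(set(countries)) > 1:
        return "International Collaboration"  # step (ii)
    if len(set(institutions)) > 1:
        return "National Collaboration"       # step (iii)
    return "Institutional Collaboration"      # step (iv)
```

Note the order matters: a single-authored publication is classified as Single Authorship in step (i) even if its author lists affiliations in more than one country.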

4.14 Metric: Outputs in Top Percentiles

Outputs in Top Percentiles in SciVal indicates the extent to which an entity's publications are present in the most-cited percentiles of a data universe: how many publications are in the top 1%, 5%, 10% or 25% of the most-cited publications?

The entire Scopus database, or "World", is the default data universe used to generate this metric:
- The citation counts that represent the thresholds of the 1%, 5%, 10% and 25% most-cited papers in Scopus per Publication Year are calculated. Sometimes the number of citations received by the publication at, say, the 10% boundary has been received by more than 10% of publications; in this case, all of the publications that have received this number of citations are counted within the top 10% of the Scopus data universe, even though that represents more than 10% by volume
- SciVal uses these citation thresholds to calculate the number of an entity's publications that fall within each percentile range

SciVal users have the option to select additional data universes within which to generate this metric, such as countries. If a user selects China, or Canada, or the United Kingdom, then SciVal will use the citation count thresholds at the boundaries of the 1%, 5%, 10% and 25% most-cited papers in China, or Canada, or the United Kingdom per Publication Year in this metric calculation. This selection does not filter the publications of the entity upon which the calculation is performed; it affects only the citation thresholds.

Use of the Publication Type filter affects the publications in the data universe that are used to generate the citation thresholds, as well as the publications of the entity upon which the calculation is performed. The exclusion of self-citations affects only the entity, and not the data universe.

Outputs in Top Percentiles can only be calculated for the current year from the first data snapshot on or after 1 July. It will be displayed as a null value until this date is reached. This metric depends on being able to divide the publications into 100 percentiles, and this level of division is not possible earlier in the publication year, when items just published have received very few citations.
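A minimal sketch of the threshold logic described above: rank a data universe's publications by citations for one publication year, find the citation count at the chosen percentile boundary, then count the entity's publications at or above it. Because the comparison is "at least as many citations as the threshold", ties at the boundary are included, so the selected set can exceed the nominal percentage by volume. The exact boundary-rank arithmetic below is an assumption; the guidebook does not specify it.

```python
def top_percentile_threshold(universe_citation_counts, percentile):
    """Citation count at the boundary of the top `percentile`% of a
    data universe for one publication year. The boundary-rank
    arithmetic is an illustrative assumption, not SciVal's exact rule.
    """
    ranked = sorted(universe_citation_counts, reverse=True)
    boundary = max(1, round(len(ranked) * percentile / 100)) - 1
    return ranked[boundary]

def outputs_in_top_percentiles(entity_citation_counts, threshold,
                               as_percentage=False):
    """Count the entity's publications cited at least `threshold`
    times; ties at the boundary are included, as described above."""
    in_top = sum(1 for c in entity_citation_counts if c >= threshold)
    if as_percentage:
        return 100.0 * in_top / len(entity_citation_counts)
    return in_top
```

For a toy universe of 100 publications with citation counts 100 down to 1, the 10% threshold is 91 citations, and an entity with counts [95, 91, 50, 10] has 2 outputs (50.0%) in the top 10%.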

Outputs in Top Percentiles is a:
- Citation Impact metric
- Snowball Metric
- Power metric when the "Absolute number" option is selected, but not when the "Percentage" option is selected

SciVal often displays Outputs in Top Percentiles in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received.

This metric is useful to:
- Benchmark the contributions of entities of different sizes, but in similar disciplines, towards the most influential, highly cited publications in the world. It is advised to select the "Percentage" option when comparing entities of different sizes, to normalize for this variable
- Distinguish between entities whose performance seems similar when viewed through other metrics, such as Scholarly Output, Citations per Publication, or Collaboration
- Showcase the performance of a prestigious entity whose publications are amongst the most cited and highly visible publications of the scholarly world
- Present citation data in a way that inherently takes into account the lower number of citations received by relatively recent publications, thus avoiding the dip in recent years seen with Citation Count and Citations per Publication

This metric should be used with care when:
- Comparing entities in different disciplines:
  - Citation counts tend to be higher in disciplines such as immunology and microbiology, whose academics tend to publish frequently and include long reference lists, than in mathematics, where publishing 1 item every 5 years that refers to 1 or 2 other publications is common; these differences reflect the distinct behavior of researchers in distinct subject fields, and not differences in performance
  - It is not advisable to use this metric to compare entities in distinct disciplines without accounting for these differences
  - When comparing entities made up of a mixture of disciplines, such as an Institution or Country, it is advised to apply the Research Area filter to focus on one field that is common between all the entities
- Entities are small and there may be gaps in their output within the Scopus coverage. A single missing highly cited publication from a small data set will have a significant negative impact on apparent performance. Although it is relatively unlikely that such prominent publications are not indexed by Scopus, we advise users to be vigilant and to bear this possible limitation in mind
- There is a concern that excessive self-citations may be artificially inflating the number of publications that appear in the top percentiles. Users can judge whether the level of self-citations is higher than normal by deselecting the "Include self-citations" option
- Understanding the status of publications of an early-career researcher, or those resulting from the first stages of a new strategy, where insufficient time may have passed to ensure that presence in top citation percentiles is a reliable indicator of performance. These situations can be addressed by metrics that are useful immediately upon publication, such as Publications in Top Journal Percentiles
- Trust needs to be built in the metrics in SciVal. The citation thresholds may depend on the entire Scopus database, and it will be difficult for a user to validate these boundaries. Users are advised to select simpler metrics, such as Citations per Publication, if trust in the accuracy of the SciVal calculations needs to be built

Useful partner metrics are:
- Cited Publications, which indicates the reliability with which an entity's output is built on by subsequent research by counting publications that have received at least 1 citation. It is not affected by 1 or a few very highly cited publications
- Publications in Top Journal Percentiles, which indicates the extent to which an entity's publications are present in the most-cited journals in the data universe, and is independent of the citations received by the publications themselves. This is a natural partner metric
- The set of all other Power metrics whose value tends to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Publications in Top Journal Percentiles ("Absolute number"), and h-indices
- The set of time-independent metrics which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

Example 7: Outputs in Top Percentiles

Scenario: The user would like to calculate the Outputs in Top Percentiles of an entity that consists of 6 publications, and has selected the following viewing and calculation options:

Selected Publication Year range: 2004 to 2013
Selected Publication Types: Articles, reviews and editorials
Selected Research Area: Chemistry
Selected Percentile Level: 10%
Selected data universe: World

[Table: for each of the 6 publications, the example lists the publication year, total citations received, publication type, and journal sub-category and main category. The table did not survive transcription intact; of the first two publications, Publication 1 (an organic chemistry article) passes the checks below, while Publication 2 (a pharmaceutical science review, outside the selected Research Area) does not.]

Each publication is checked against the user's selections:
Step 1: Does the Journal Category of the publication match the selected Research Area?
Step 2: Does this publication fall in the selected Publication Year range?
Step 3: Does the Publication Type match the selected Publication Types?
Step 4: Does the publication pass each of steps 1, 2 and 3?

Question: How do I calculate Scholarly Output?
Answer: Count the number of publications that received a "yes" in step 4.
Scholarly Output = 4

Question: How do I calculate Outputs in Top Percentiles?
Answer: Look up the 10% citation threshold for the Publication Year of each publication that received a "yes" in step 4, and ask: was the publication cited at least as many times as the threshold?
For the "Absolute number" display option, count the publications that received a "yes" in the previous step.
Outputs in Top Percentiles = 3
For the "Percentage" display option, divide the "Absolute number" by the Scholarly Output of the entity.
Outputs in Top Percentiles = (3/4)*100 = 75.0%

This table shows the number of times a publication must be cited to be in the top 10% for its Publication Year. [Table: 10% World citation thresholds by year, plus the attributes and step outcomes for Publications 3 to 6; the numeric values did not survive transcription.]
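Example 7's per-publication check can be sketched as a per-year threshold lookup. The threshold values below are hypothetical, since the guidebook's threshold table did not survive transcription; only the logic (cited at least as many times as the 10% threshold for the publication year) is from the example.

```python
def outputs_in_top_percentiles_pct(publications, thresholds_by_year):
    """Percentage of publications cited at least as many times as the
    10% threshold for their publication year (Example 7 logic).
    `publications` is a list of (publication_year, citation_count)
    pairs and `thresholds_by_year` maps year -> citation threshold;
    both structures are illustrative, not SciVal's schema.
    """
    in_top = sum(1 for year, citations in publications
                 if citations >= thresholds_by_year[year])
    return 100.0 * in_top / len(publications)
```

With hypothetical thresholds of 8 citations for 2010 and 10 for 2011, an entity of four publications of which three meet their year's threshold gives 75.0%, the same percentage as Example 7.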

4.15 Metric: Publications in Top Journal Percentiles

Publications in Top Journal Percentiles in SciVal indicates the extent to which an entity's publications are present in the most-cited journals in the data universe: how many publications are in the top 1%, 5%, 10% or 25% of the most-cited journals indexed by Scopus? The most-cited journals are defined by the journal metrics SNIP (Source-Normalized Impact per Paper) or SJR (SCImago Journal Rank). This means that the data universe is the set of items indexed by Scopus that have a journal metric and so can be organized into percentiles; this excludes publications in stand-alone books and trade publications, which do not have journal metrics.

All items indexed by Scopus that have a SNIP or SJR value form the data universe used to generate this metric:
- The SNIP or SJR values at the thresholds of the top 1%, 5%, 10% and 25% most-cited journals in Scopus per Publication Year are calculated. It is possible that the journal metric value of the indexed item at, say, the 10% boundary is shared by more than 10% of indexed items; in this case, all items with this journal metric value are counted within the top 10% of the data universe, even though that represents more than 10% of items by volume. This is less likely to happen than for the Outputs in Top Percentiles thresholds, because the journal metrics are generated to 3 decimal places, which reduces the chance of items having the same value
- These thresholds are calculated separately for each journal metric, not once for a combination of both journal metrics
- SciVal uses these journal metric thresholds to calculate the number of an entity's publications within indexed items that fall within each percentile range

Indexed items have multiple journal metric values, for distinct years. Which one is used in assigning publications to journal metric thresholds?
- The journal metric value matching the publication year of an item of scholarly output is used as far as possible
- The first journal metric values are available for 1999. For scholarly output published before this, the journal metric value for 1999 is used
- Current year journal metric values are published during the course of the following year. For recent items whose journal metric values have not yet been published, the previous year's journal metric values are used until the current year's become available

A publication may be counted in the top journal percentiles without itself ever having been cited. The citations received by an individual publication are irrelevant for this metric, which is based only on citations received by a journal or conference proceedings.
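The year-selection rules above can be summarized in a small helper. The dictionary representation of a journal's metric values by year is an assumption for illustration, and the fallback for a missing year simply takes the most recent available value, which is a simplification of the rule described in the text.

```python
def journal_metric_for_year(metric_by_year, publication_year):
    """Select the SNIP or SJR value used to place a publication in the
    journal percentiles, following the rules described above.
    `metric_by_year` maps year -> journal metric value for one journal
    (an illustrative structure, not SciVal's schema).
    """
    first_year = min(metric_by_year)
    if publication_year < first_year:
        # Output older than the first available journal metric values:
        # use the earliest year's value (1999 in SciVal, per the text)
        return metric_by_year[first_year]
    if publication_year in metric_by_year:
        # Preferred case: the value matching the publication year
        return metric_by_year[publication_year]
    # Recent item whose value has not yet been published: fall back to
    # the most recent available year's value
    return metric_by_year[max(metric_by_year)]
```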

SNIP and SJR are both field-normalized journal metrics, meaning that this metric can be used to compare the presence of publications in exceptional journals for entities working in different disciplines.

Publications in Top Journal Percentiles is a:
- Citation Impact metric
- Power metric when the "Absolute number" option is selected, but not when the "Percentage" option is selected

SciVal often displays Publications in Top Journal Percentiles in a chart or table with years. These years are always the years in which items were published, and do not refer to the years in which citations were received.

This metric is useful to:
- Benchmark entities even if they have different sizes and disciplinary profiles:
  - It is advised to select the "Percentage" option when comparing entities of different sizes, to normalize for this variable
  - SNIP and SJR are field-normalized metrics; selecting one of these journal metrics will inherently account for differences in the behavior of academics between fields
- Showcase the presence of publications in journals that are likely to be perceived as the most prestigious in the world
- Incorporate peer review into a metric, since it is the judgment of experts in the field that determines whether a publication is accepted by a particular journal or not
- Avoid the dip in recent years seen with metrics like Citation Count and Citations per Publication
- Look at publishing activity in a way that is difficult to manipulate, since it relies on peer review
- Investigate performance very early in a new strategy, or for an early-career researcher, since journal data underpin this metric, rather than the citations received by individual publications themselves, which would require time to accumulate
- Provide transparency on the underlying data to build trust in the metrics in SciVal. The journal metrics for the entire database are available for download in a spreadsheet, so that a user can determine the thresholds themselves if required

This metric should be used with care when:
- The objective is to judge an entity's publications based on their actual performance, rather than on the average performance of a journal:
  - A publication may appear in a journal with a very high SNIP or SJR value and not itself receive any citations; even the most prestigious journals in the world contain publications that have never been cited
  - A publication may be very influential and have received many citations without being part of a journal with a high SNIP or SJR value; journals which do not rank very highly in the data universe may still contain very highly cited publications
- Benchmarking the performance of a Researcher who is the editor of a journal in the top percentiles, since they can publish multiple editorials which all fall into these top percentiles. In this situation, it is advised to use the Publication Type filter to exclude editorials and ensure that the types of publications in the data sets being compared are consistent
- Entities are small and there may be gaps in their output within the Scopus coverage. A single missing publication from a small data set will have a significant negative impact on apparent performance. Although it is relatively unlikely that publications in such prominent journals are not indexed by Scopus, we advise users to be vigilant and to bear this possible limitation in mind

Useful partner metrics are:
- Citation Count, Citations per Publication, and Field-Weighted Citation Impact, which rely on the citations received by an entity's publications themselves, and not on the average performance of the journal
- Outputs in Top Percentiles, which indicates the extent to which an entity's publications are present in the most-cited percentiles of the data universe, but depends on the citations received by the publications themselves. This is a natural partner metric
- The set of all other Power metrics, whose value tends to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), and h-indices
- The set of all other time-independent metrics, which provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, and Academic-Corporate Collaboration

Example 8: Publications in Top Journal Percentiles

Scenario: The user would like to calculate the Publications in Top Journal Percentiles of an entity that consists of 6 publications, and has selected the following viewing and calculation options:
- Selected Journal Metric: SNIP
- Selected Percentile Level: 10%

For each publication:
- Step 1: Retrieve the journal's SNIP value for the Publication Year.
- Step 2: Look up the 10% SNIP threshold for the Publication Year.
- Step 3: Check whether the journal's SNIP (step 1) is at least as large as the 10% SNIP threshold (step 2).

Entity with 6 publications:

Publication    Journal                                      Journal's SNIP at least the 10% threshold?
Publication 1  Perception                                   No
Publication 2  Journal of the American Medical Association  Yes
Publication 3  Vision Research                              No
Publication 4  Nature                                       Yes
Publication 5  Vision Research                              No
Publication 6  Biophysics                                   No

Question: How do I calculate Scholarly Output?
Answer: Count the number of publications in the entity. Scholarly Output = 6.

Question: How do I calculate Publications in Top Journal Percentiles?
Answer: For the "Absolute number" display option, count the publications that are published in the top 10% of journals. Publications in Top 10% Journal Percentiles = 2.
For the "Percentage" display option, divide the "Absolute number" by the Scholarly Output of the entity. Publications in Top 10% Journal Percentiles = (2/6) × 100 = 33.3%.
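The calculation in Example 8 can be sketched in a few lines of Python. The helper and the per-publication flags are illustrative; in SciVal each flag comes from comparing the journal's SNIP with the year-specific threshold, as in steps 1-3 above.

```python
def pubs_in_top_journal_percentiles(pub_in_top_journal, as_percentage=False):
    """pub_in_top_journal: one boolean per publication, True when the
    journal's SNIP meets the top-10% threshold for that publication year."""
    absolute = sum(pub_in_top_journal)
    if as_percentage:
        return round(absolute / len(pub_in_top_journal) * 100, 1)
    return absolute

# Example 8: of the six publications, only those in JAMA and Nature
# (publications 2 and 4) appear in top-10% journals.
flags = [False, True, False, True, False, False]
print(pubs_in_top_journal_percentiles(flags))        # → 2
print(pubs_in_top_journal_percentiles(flags, True))  # → 33.3
```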

4.16 Metric: h-indices

h-indices in SciVal indicate a balance between the productivity (Scholarly Output) and citation impact (Citation Count) of an entity's publications. h-indices in SciVal offer 3 variants: the h-index, the g-index, and the m-index. The g- and m-indices inherit the positive qualities of the h-index, but address aspects that are sometimes considered shortcomings; these metrics are therefore grouped together into a set collectively called h-indices:
- h-index is now recognized as an industry standard that gives information about the performance of Researchers and Research Areas that is very useful in some situations. The h-index of an entity is 9 if the entity's top 9 most-cited publications have each received at least 9 citations; it is 13 if the entity's top 13 most-cited publications have each received at least 13 citations; and so on
- g-index is a variant of the h-index that emphasizes the most highly cited papers in a data set. The h-index does not give extra weighting to the most-cited publications of a data set, which are likely the ones responsible for an entity's prestige; the g-index can be used if this feature of the h-index is seen as a weakness. The g-index is always equal to or higher than the h-index
- m-index is another variant of the h-index that displays the h-index per year since first publication. The h-index tends to increase with career length, and the m-index can be used in situations where this is a shortcoming, such as comparing researchers within a field who have very different career lengths. The m-index inherently assumes unbroken research activity since the first publication

h-indices are available in SciVal for all Researcher-based entities, and for Research Areas.

h-indices are:
- Productivity metrics
- Citation Impact metrics
- Snowball Metrics (h-index only)
- Power metrics

A Researcher-based entity, or a Research Area, has a single value for the h-, g- or m-index that is based on all publications in the data set; these indices are not calculated per year. This is represented in the Benchmarking module as the same value for all years so that the metric can be made available in this module; in the Chart view, for instance, a horizontal line will be displayed and the years have no meaning.

These metrics are useful to:
- Benchmark activity in a way that relies on the balance between 2 fundamental aspects of performance, namely productivity and citation impact:
  - The total number of publications, or Scholarly Output, of an entity sets one limit on the value of the h-index. If a researcher has 1 publication that has been cited 100 times, their h-index cannot exceed 1
  - The total number of citations received, or Citation Count, sets the other limit on the value of the h-index. If a researcher has 100 publications which have each received 0 or 1 citations, their h-index also cannot exceed 1
- Be used as a related group of metrics, each with their own strengths
- Provide transparency on the underlying data to build trust in the metrics in SciVal

These metrics should be used with care when:
- Comparing entities of significantly different sizes: the values of these metrics are limited by the Scholarly Output of an entity, and tend to increase with the size of the data set. This can be accounted for within a discipline, when the difference in size is due to different career lengths, by using the m-index; in this situation, variations revealed by the m-index are due to differences in annual productivity and citations received, which are likely the performance aspects of interest
- Benchmarking entities within different disciplines, even if these entities have similar sizes:
  - The values of h-indices are limited by the Citation Count of an entity, and tend to be highest in subject fields such as biochemistry, genetics and molecular biology; this reflects distinct publication and citation behavior between subject fields and does not necessarily indicate a difference in performance
  - It is not advisable to compare the h-indices of entities that fall entirely into distinct disciplines, such as a Researcher in genetics with a Researcher in human-computer interaction
  - When comparing entities made up of a mixture of disciplines, such as cross-disciplinary research teams, it is advised to apply the Research Area filter to focus on one field that is common to all the entities
- An indication of the magnitude of the productivity and citation impact of an entity is important. It is advised to use Scholarly Output and Citation Count when it is important to communicate scale
- Entities are small and there may be gaps in their output within the Scopus coverage: a single missing publication from a total of 3 or 4 will have a significant negative impact on apparent performance, whereas the effect of 1 missing publication from a total of 100 may be acceptable. The only way to account for this is to be vigilant, particularly when looking at small data sets such as an early-career researcher, or to limit the use of these metrics to comparing larger data sets, where gaps in the database coverage likely have a similar effect on all entities being viewed and do not invalidate the comparison
- There is a concern that excessive self-citations may be artificially inflating the values. Users can judge whether the level of self-citations is higher than normal by deselecting the "Include self-citations" option
- Investigating the research output of early-career researchers who have, as yet, few publications and have received a low number of citations due to the short time since publication. These metrics will not give reliable information in this situation, and it would be preferable to use metrics such as Collaboration or Publications in Top Journal Percentiles

Useful partner metrics are:
- Scholarly Output and Citation Count, which provide information about the magnitude of productivity and citation impact
- The set of all other Power metrics, whose value tends to increase as the entity becomes bigger: Scholarly Output, Journal Count, Journal Category Count, Citation Count, Cited Publications ("Absolute number"), Number of Citing Countries, Collaboration ("Absolute number"), Academic-Corporate Collaboration ("Absolute number"), Outputs in Top Percentiles ("Absolute number"), and Publications in Top Journal Percentiles ("Absolute number")
- The set of time-independent metrics that provide useful, reliable information immediately upon publication and do not rely on the passing of time for useful data to accumulate: Scholarly Output, Journal Count, Journal Category Count, Collaboration, Academic-Corporate Collaboration, and Publications in Top Journal Percentiles

Example 9: h-indices

Scenario: The user would like to calculate the h-indices of an entity that consists of 10 publications. They have not selected any viewing or calculation options.

Step 1: Sort the publications by their citation counts, from largest to smallest.
Step 2: Assign a rank to each publication, starting from 1.

Entity with 10 publications, ranked by total citations received:

Rank  Citations  Citations ≥ rank?  Cumulative citations  Rank squared  Cumulative ≥ rank squared?
1     12         Yes                12                    1             Yes
2     12         Yes                24                    4             Yes
3     6          Yes                30                    9             Yes
4     5          Yes                35                    16            Yes
5     5          Yes                40                    25            Yes
6     4          No                 44                    36            Yes
7     4          No                 48                    49            No
8     4          No                 52                    64            No
9     2          No                 54                    81            No
10    2          No                 56                    100           No

Question: How do I calculate the h-index?
Answer: Check whether the citation count of each publication (step 1) is at least as large as its rank (step 2), and identify the highest rank for which this holds. h-index = 5.

Question: How do I calculate the g-index?
Answer: Sum the citation counts of the publications ranked up to and including the current position, calculate the square of the rank, and identify the highest rank for which the cumulative citation count is at least as large as the square of the rank. g-index = 6.

Question: How do I calculate the m-index?
Answer: Calculate the number of years between the earliest publication year in the set and the current year, then divide the h-index by that number of years. In this example the number of years since the earliest publication is 18, so m-index = 5 / 18 ≈ 0.28.
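The three indices of Example 9 can be computed as follows. This is an illustrative sketch following the guidebook's definitions (the function names are my own); the ranked citation counts are those of the worked example.

```python
def h_index(citations):
    """Highest rank h such that the h most-cited publications
    have each received at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """Highest rank g such that the g most-cited publications
    together have received at least g squared citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def m_index(citations, years_since_first_publication):
    """h-index divided by the number of years since first publication."""
    return h_index(citations) / years_since_first_publication

# Example 9 data: 10 publications, ranked citation counts, 18 years
# since the earliest publication.
cites = [12, 12, 6, 5, 5, 4, 4, 4, 2, 2]
print(h_index(cites))                # → 5
print(g_index(cites))                # → 6
print(round(m_index(cites, 18), 2))  # → 0.28
```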

For more information about SciVal, please visit: elsevier.com/research-intelligence/scival

Copyright 2014 Elsevier B.V. All rights reserved.

Last updated on Tuesday, June 30, 2015. Elsevier B.V. All rights reserved. SciVal is a registered trademark of Elsevier Properties S.A., used under license.

SciVal Manual

Table of contents

About SciVal
  What is SciVal?
  What Spotlight and Strata users can expect
  Browser requirements
Get started
  Logging in
  Selecting a time period
  Filtering by subject area
  Selecting entities
How you can use SciVal
  How can my institution demonstrate research excellence?
  How can my institution evaluate the impact of its research portfolio?
  How can my institution attract talented researchers?
  How can my institution find collaboration partners?
  How can my institution identify its research strengths?
  What is the impact of adding a new researcher to my team?
The Overview module
  What is the Overview module?
  Working with the Overview module
    Selecting an entity
    Selecting a year range
    Filtering by journal category
  How can you use the Overview module?
    Get an overview of your institution's research performance
    Explore the publication output of your institution
    Get an overview of your institution's research strengths
    Get an overview of collaboration by your institution
    Evaluate the performance of a researcher or research team
    Investigate other institutions
    See your institution's national or global position
    Identify potential collaboration partners
    See your institution's performance in a specific research area
    Get instant performance snapshots of your Publication Sets
  Using the Competencies analysis to identify research strengths
    What are competencies?
    How can you use competencies?
    Working with competencies
    How SciVal identifies competencies
The Benchmarking module
  What is the Benchmarking module?
  Working with the Benchmarking module
    Selecting metrics
    Selecting a year range
    Filtering by journal category
    Working with the chart
    Working with the table
    View the list of journals
  How can you use the Benchmarking module?
    Compare your institution to others
    Benchmark your institution against the national average
    Analyze developments in a field over time
    Identify suitable benchmark institutions
    Compare your institution against collaborating institutions
The Collaboration module
  What is the Collaboration module?
  Working with the Collaboration module
    Selecting an institution
    Selecting a country
    Selecting a year range
    Selecting a region, country or sector
    Filtering by journal category or research area
    Working with the map
    Working with the table
  How can you use the Collaboration module?
    Identify the collaboration partners of your institution
    Evaluate a collaboration partner in detail
    Identify potential new collaboration partners of your institution
The Trends module
  What is the Trends module?
  Working with the Trends module
    Selecting a Research Area
    Selecting a Year range
    Working with the map
    Working with the table
    Working with the chart
    Selecting metrics
  How can I use the Trends module?
    Identify top performers
Working with entities in SciVal
  Types of entities
  Selecting entities
  Defining and importing new entities
  Tagging entities
  Export customization
  Sharing entities with others
  Transferring ownership
  Integration with Scopus
Working with research areas
  About research areas in SciVal
  Defining a research area
  Predefined Research Areas
  Search tips
  Analyzing a research area
    Research areas in the Overview module
    Research areas in the Benchmarking module
    Research areas in the Collaboration module
    Research areas in the Trends module
Data and metrics
  What is the source of the data in SciVal?
  How current is the data?
  What publication types can you use?
  Which metrics are available to use in SciVal?
  What are Snowball Metrics?
  What are SNIP and SJR?
  Which journal classifications are available in SciVal?
  How are keyphrases calculated?
  Author merge requests in SciVal
Index

1 About SciVal

1.1 What is SciVal?

SciVal is a set of integrated modules that enables your institution to make evidence-based strategic decisions. SciVal consists of four modules:
- Overview: Get an overview of the research performance of your institution and others based on output, impact, and collaborations. View a short video about the Overview module.
- Benchmarking: Determine your strengths and weaknesses. Compare your research institution and teams to others based on performance metrics. Model different test scenarios. View a short video about the Benchmarking module.
- Collaboration: Identify and analyze existing and potential collaboration opportunities. Identify suitable collaboration partners. See who others are collaborating with.
- Trends: Analyze Research Areas to find top performing universities, authors and journals. Spot growing and declining topics in the field.

SciVal for chancellors and deans. To make the right strategic decisions, you need actionable data. SciVal gives you insights to make evidence-based decisions. Track your research performance, identify your institution's strengths and compare your institution to peers around the world.

SciVal for senior researchers and department heads. With less funding and more competition, it's not enough to do good research. SciVal gives you the tools to evaluate and clearly demonstrate the value of your research to funding agencies and others. Analyze your performance by team or department, compare to peers and identify new collaboration partners.

SciVal for research administrators, development professionals and data experts. SciVal combines the power to perform massive calculations with the flexibility to respond to user-defined queries. You can apply 15 different metrics to any grouping of people or publications. And you can filter the data by more than 330 journal categories.

1.2 What Spotlight and Strata users can expect

SciVal is the successor to SciVal Spotlight and SciVal Strata.
SciVal has fully integrated the analytical capabilities of Spotlight and Strata, and made them more comprehensive and intuitive, with more metrics and more sophisticated ways to analyze performance.

For Strata users: In SciVal, there are more refined analysis capabilities. Instead of five traditional metrics, you can benchmark any institution, group of researchers or publications using 15 different metrics.

For Spotlight users: In addition to the innovative competencies and collaboration analysis introduced in Spotlight, SciVal offers more traditional indicators for research performance evaluation.

Benefits of the new SciVal include:
- A single integrated platform. SciVal has three modules: Overview, Benchmarking and Collaboration. They form a single integrated platform sharing the same data, entities and metrics. You can navigate from one module to the other with ease.
- Tailored to your needs. In addition to extensive predefined entities, you can also define and analyze your own entities, research teams and topics. This is helpful when tracking performance or planning strategy in a very narrow field which is not covered by existing entity definitions.
- More metrics and improved visualizations. SciVal goes beyond the basic metrics introduced in Strata. There are more metrics and more flexible ways to analyze them. In seconds, you can see an in-depth analysis.

1.3 Browser requirements

Supported browsers. We strive to fully support the latest full versions of Mozilla Firefox and Google Chrome on Microsoft Windows and Mac OS X. The following versions were tested for the current SciVal release:
- Firefox version 37.x
- Chrome version 42.x

SciVal also fully supports the following browsers running on Microsoft Windows operating systems:
- Microsoft Internet Explorer versions 9, 10 and 11

SciVal also fully supports the following browsers running on Mac OS X:
- Safari 7 and 8

Note that:
- Other operating systems and browsers may also be able to access Elsevier products; however, the Elsevier E-Helpdesk cannot provide expert advice or technical support to solve problems you may encounter when using these systems.
- Beta or test versions of browsers are not supported.
- Mobile browsers are not supported.

2 Get started

2.1 Logging in

To log in to SciVal:
1. Go to the SciVal website.
2. If you already have access to other Elsevier sites (such as ScienceDirect or Scopus), you can log in with your current user name and password.

Registering as a new user. If you do not yet have an Elsevier username and password, you will need to register as a new user:
1. Go to the SciVal website and click the "Register" link.
2. Your username is your email address. It is not case-sensitive.
3. Create a password. Your password must be 5-20 characters long, and it must contain at least:
   - 1 uppercase character
   - 1 lowercase character
   - 1 number or special character, e.g. # $ % ^ & * _ + { } : " < > ? ` - = [ ] \ ; ' , . /

Remote access. If you are a registered user, there are two ways to access SciVal remotely:
1. You can activate it yourself when you are logged in.
2. Or you can use a registration link provided by Elsevier support staff. Contact your system administrator for details.

2.2 Selecting a time period

Use the year range selector at the top of the page to select the time period for your analysis. This lets you choose the range of publication years for the publications that are included in your analysis. In the Overview, Collaboration and Trends modules, you can analyze performance for a three- or five-year period. In the Benchmarking module, you can compare performance from 1996 until the present.
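The password rules from "Registering as a new user" above can be expressed as a small validation sketch. This is illustrative only: the function name is my own, the special-character list here is partial, and Elsevier's own server-side validation is authoritative.

```python
# Partial list of accepted special characters (illustrative).
SPECIALS = set("#$%^&*_+{}:\"<>?`-=[]\\;',./")

def valid_scival_password(pw):
    """Check: 5-20 characters, at least one uppercase, one lowercase,
    and one digit or special character."""
    if not 5 <= len(pw) <= 20:
        return False
    has_upper = any(c.isupper() for c in pw)
    has_lower = any(c.islower() for c in pw)
    has_other = any(c.isdigit() or c in SPECIALS for c in pw)
    return has_upper and has_lower and has_other

print(valid_scival_password("Abc1x"))   # → True
print(valid_scival_password("abcdef"))  # → False (no uppercase, no digit/special)
```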

Optionally, you can also include the current year as well as publications in future years. However, you may want to exclude these because, by the end of the current year, Scopus has only received and indexed a certain portion of the current year's journals from publishers.

2.3 Filtering by subject area

In SciVal all data can be filtered by subject area. You can choose from 27 main categories and 334 subcategories in the Scopus journal classification, or use a different journal classification (see "Which journal classifications are available in SciVal?"). You can use any of the journal categories as a filter for further analysis. The Scopus journal category filter is available in the Overview, Benchmarking and Collaboration modules. Click the arrow next to each journal category to display the subcategories.

2.4 Selecting entities

An entity is anything that can be viewed in SciVal in terms of academic performance. An entity can be an institution, country, researcher, publication set, or research area. It can also be a grouping of these, such as a group of researchers.

Use the entity selection panel to select the entities that you want to analyze. It is on the left side of the screen in each of the three modules. Think of the entity selection panel as a workspace: all your entities of interest are in one clear and organized place. Choose from the thousands of predefined entities in the SciVal database, institutions or countries, or define your own entities; your self-defined entities can be researchers, research teams, publication sets or even research areas.

To add additional items to the entity selection panel, click the Add link at the bottom of the currently opened section.

Start typing the name of the entity you would like to add, then click on the name when it appears in the search results. You can also click on the Define links to define an entirely new entity.

You can safely remove entities from the panel. They will not be permanently deleted, and you can add them back at any time. To learn more, see "Selecting entities" in the section "Working with entities in SciVal".

3 How you can use SciVal

3.1 How can my institution demonstrate research excellence?

A number of quality metrics are available in SciVal to demonstrate research excellence at your institution.

Use highly cited publications and publications in leading journals. Two metrics often used to illustrate excellence of research are Outputs in Top Percentiles and Publications in Top Journal Percentiles. These show how much of your institution's publication output was good enough to rank among the world's top publications.
1. Go to the Overview module and select your institution.
2. Go to the Publications tab and find the Outputs in Top Percentiles section. This shows the share of your institution's publications that are within the top 1% and top 10% of the most cited publications worldwide.
3. As you can see in the chart above, some 28% of Athena University's publications were in the top 10 percentiles of the most cited publications worldwide.
4. The Publications in Top Journal Percentiles section shows how many of your institution's publications were in the top 1% and 10% of the world's journals. These top journals are selected by measuring all journals by either SNIP or SJR and selecting the top-ranking ones. You can toggle between SNIP and SJR using the dropdown menu. See "What are SNIP and SJR?" for more information on these journal metrics.
5. As you can see in the chart above, 37.6% of the publications at Athena University were published in the top 10% of journals worldwide (measured by SNIP).
6. Go to the Benchmarking module to see the Outputs in Top Percentiles and Publications in Top Journal Percentiles metrics for your institution over a longer time period (1996 to present). You also have additional metric options available here, and you can compare your institution to other institutions, or to the national or global average.

Demonstrate research strengths. The Competencies analysis in the Overview module offers another way to demonstrate excellence. The Competencies analysis identifies research strengths of your institution: granular areas of research where your institution is a global leader. A competency shows where an institution has a leading position compared to other institutions in terms of number of publications, number of highly cited publications or innovation (the recentness of cited publications).

The competency analysis uses a methodology based on citation patterns called co-citation analysis. Highly cited publications are clustered based on co-citation counts, and the clusters are grouped together into competencies. The analysis is always based on five years of data; for example, if you select 2013, the analysis is based on data from 2009 up until and including 2013. To learn more, see: What are competencies?

3.2 How can my institution evaluate the impact of its research portfolio?

To evaluate the impact of your research, you can use SciVal to analyze your institution's citation metrics. Useful metrics include Citation Count, Citations per Publication and Field-Weighted Citation Impact.
1. Go to the Overview module and select your institution from the left-hand entity selection panel (Athena University in this example). If you are interested in a particular field of research or time period, adjust the filters for year range and Scopus journal category.
2. Go to the Citations tab to see metrics for citation impact.
3. As you can see, Athena University has averaged 12.8 citations per publication over a five-year time period.
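The "share of publications in the top percentiles" figure from section 3.1 can be sketched as below. All numbers are hypothetical (chosen so the result matches the 28% in the walkthrough), and the world citation threshold is treated as a given input; SciVal derives it from the full Scopus data universe.

```python
def share_in_top_percentile(citation_counts, world_threshold):
    """Percentage of an entity's publications whose citation counts
    meet the world top-10% citation threshold."""
    in_top = sum(1 for c in citation_counts if c >= world_threshold)
    return round(in_top / len(citation_counts) * 100, 1)

# Hypothetical entity: 25 publications, assumed world top-10%
# threshold of 20 citations; 7 publications meet it.
cites = [55, 41, 33, 25, 24, 22, 21] + [8] * 18
print(share_in_top_percentile(cites, 20))  # → 28.0
```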

4. The Field-Weighted Citation Impact of your institution adjusts for the differences in citation behavior across disciplines. Athena's impact is 1.63: citations are 63% more than expected based on the global average. The Field-Weighted Citation Impact is the total number of citations received divided by the total citations expected based on the global average for the field. A value of more than 1.00 means that citations are more than expected; less than 1.00 means that citations are fewer than expected.
5. Where does your institution have the highest impact? Go to the Overview module (Publications tab) and select "by Journal Category". You can view Field-Weighted Citation Impact by journal category not just in table format, but also as a bar chart.
6. To compare the impact of your institution's publications against other institutions or the national average, go to the Benchmarking module.
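The Field-Weighted Citation Impact calculation described in step 4 can be sketched as below. The per-publication expected values are hypothetical; in SciVal they come from global averages for publications of the same field, type and year.

```python
def field_weighted_citation_impact(actual_citations, expected_citations):
    """FWCI = total citations received / total citations expected,
    where each expected value is the global average for comparable
    publications (values here are illustrative)."""
    return round(sum(actual_citations) / sum(expected_citations), 2)

# Illustrative: three publications, citations received vs. expected.
actual = [20, 15, 14]
expected = [10, 10, 10]
print(field_weighted_citation_impact(actual, expected))  # → 1.63
```

A result of 1.63 means the entity received 63% more citations than the global average would predict, matching the example above.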

You can also compare the same institutions within a particular subject area. Select your field of interest from the 27 main categories and 334 subcategories within the Scopus journal classification, or use a different journal classification.

3.3 How can my institution attract talented researchers?

Who are the most talented researchers in my field? Which institutions are they associated with? And how do I find them? The best approach in SciVal is to first determine the top institutions in your field, and then identify the top researchers at those institutions.
1. Go to the Overview module.
2. Select your country or the world in the entity selection panel on the left-hand side.
3. Using the dropdown menu at the top of the page, select your field from the 27 main categories and 334 subcategories in the Scopus journal classification (or use a different journal classification). Say you are interested in renewable energy and sustainability.
4. After selecting your field, you can now see the top institutions in that field, based on number of publications and citation impact.

5. Now take a closer look at these institutions. Go to the entity selection panel and select one of the top institutions.
6. On the Authors tab, you can see the top authors at that institution within the selected field, based on number of publications.
7. You can export the list of authors to a spreadsheet for further analysis, or click on an author's name to see their publication profile.

3.4 How can my institution find collaboration partners?

International collaborations can increase your impact and visibility, which could lead to more funding opportunities. How can you identify suitable international collaboration partners? Which countries should you focus on? And which institutions are active in which disciplines?
1. Let's say that your institution is Athena University. It is located in the U.S. and is looking for a collaboration partner in Europe for its expanding medical school.
2. Go to the Potential Collaboration tab in the Collaboration module. Select the Scopus journal category Medicine from the dropdown menu at the top of the page.
3. The analysis shows 1,156 institutions in Europe that haven't yet collaborated with Athena. In other words, Athena has not co-authored any publications with these institutions within the selected time period.

4. Click on Europe to see which European countries are active in medicine. The numbers in the white circles represent the number of institutions in each country that have not yet collaborated with Athena.
5. Let's take a closer look at Germany, which has 119 institutions that have not yet collaborated with Athena.

6. Click on Germany to see which institutions in this country are active in medicine but are not yet collaborating with Athena in this field.
7. Each orange circle in Germany represents an institution. The number inside the circle shows the publication output at that institution within the selected field. You can also switch to a different metric (Citations, Citations per Publication or Field-Weighted Citation Impact) to see the collaborating institutions by citation impact.
8. Humboldt-Universitat zu Berlin stands out. There are 974 authors within medicine at this institution, with 850 publications in this field.

9. You can also view the list of institutions in a table and sort the 100 most productive institutions by citation impact, using the metrics Citation Count, Citations per Publication and Field-Weighted Citation Impact.
10. To view only hospitals and other medical institutions in Germany, select "Medical" from the rightmost of the drop-down menus along the top of the map.
11. You can export a list of all institutions to a spreadsheet for further review.
12. For more details on this institution, click on the marker for Humboldt in the map to open the institutional details pop-up. Here, you can compare the research output of Humboldt to the output of your own institution. You can also see a list of potential co-authors at Humboldt.
13. In the institution details pop-up, select "View a high-level performance overview of Humboldt-Universitat zu Berlin" from the Shortcuts menu to view Humboldt in the Overview module and explore this institution in even more detail. In which fields of medicine are they most active? Who are the top authors at that institution? How much of their publication output is among the most cited worldwide, and how much of it is published in the top journals? How much are they collaborating internationally?
14. Go back to the Collaboration module and select Humboldt from the entity selection panel. You can now see who they are already collaborating with. Are they working mostly with other German institutions, or do they have a large international collaboration network?

15. In the Benchmarking module, select Humboldt and Athena University from the entity selection panel. Now you can compare the two institutions by various metrics. Does Humboldt have more or less citation impact than your institution?

3.5 How can my institution identify its research strengths?

Under the Competencies tab in the Overview module, you can see an analysis of your institution's research strengths, or competencies as they are known in SciVal. The Competencies analysis identifies areas of research in which your institution is a global leader in terms of publications, citations, or innovation. You can get detailed information on each of these areas, such as: How is your institution positioned in this field? Which researchers are most active in this field? What is your institution's unique contribution to this field? What are the overall trends: is this an emerging or declining field? To learn more, see What are competencies?

View the list of research strengths. To see the list of your institution's research strengths:

1. Go to the Overview module and select your institution from the left-hand entity selection panel.
2. Click on the Competencies tab and view the list of strengths in the Table view.
3. The Circle graph plots the competencies on a big wheel representing the world of science. This allows you to spot in which subject areas your institution's competencies are concentrated, and how interdisciplinary they are. The closer a competency is to the center of the wheel, the more interdisciplinary that competency is.
4. The Matrix graph plots the share of your institution within each competency against the growth of that field of research. This allows you to, for instance, spot emerging fields of research where your institution isn't yet playing a leading role.
5. Click on a competency in the Table, Circle or Matrix to open a modal window with more detailed information about that competency, such as the top institutions and researchers contributing to that field.
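The Matrix view in the steps above can be approximated with a few lines of code. This is an illustrative sketch only, not SciVal functionality: the share and growth cut-offs and the competency values are invented for the example.

```python
# Illustrative sketch only (not SciVal code): classifying competencies on a
# Matrix-style share-vs-growth grid. All names, values and cut-offs below are
# invented for the example.

def quadrant(competency, share_cut=0.25, growth_cut=0.10):
    """Label a competency by leadership (share) and momentum (growth)."""
    lead = "leading" if competency["share"] >= share_cut else "not yet leading"
    pace = "growing" if competency["growth"] >= growth_cut else "stable"
    return f"{lead}, {pace}"

competencies = [
    {"name": "Battery materials", "share": 0.45, "growth": 0.02},
    {"name": "Neuroinformatics", "share": 0.05, "growth": 0.30},
]

for c in competencies:
    print(c["name"], "->", quadrant(c))
```

A competency labelled "not yet leading, growing" corresponds to the emerging fields the Matrix graph is designed to surface.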

What is the impact of adding a new researcher to my team?

SciVal allows you to do "what if" scenario modeling. If I add researcher X to my team, how would my team perform? Let's define a research team and then compare its performance to a team made up of the current team plus a new recruit.

1. Go to the Benchmarking module.
2. In the entity selection panel (on the left side of screen), click Add Researchers and Groups.
3. Click Define new researcher.
4. Define your team member.
5. Follow this process for each researcher on your team.
6. Now go to the entity selection panel and click Define a new Group of Researchers.
7. Select your researchers from the left side of your screen and drag each one across to the right side of the screen.

8. Save the group as "My current project team".
9. Define a second group with the same researchers, plus the researcher you want to recruit. Save this group as "My current project team + new recruit".
10. Now you can compare the two groups in Benchmarking. Click on the x-axis button and select the metric Field-Weighted Citation Impact.

11. Compare the performance of the current team versus the expanded team. As you can see, the addition of the new recruit would significantly strengthen the performance of your team.
12. Try a few additional metrics. Other useful metrics for comparison include Scholarly Output, Citation Count, Citations per Publication, and Collaboration Impact.
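The what-if comparison above boils down to recomputing a few summary metrics over two publication sets. The sketch below is illustrative only: it is not SciVal code, and it simplifies the metrics; the publications and citation counts are invented for the example.

```python
# Illustrative sketch only (not SciVal code): simplified team metrics for a
# current team versus the team plus a prospective recruit. All publication
# records below are invented for the example.

def team_metrics(publications):
    """Return (Scholarly Output, Citation Count, Citations per Publication)."""
    output = len(publications)
    citations = sum(p["citations"] for p in publications)
    cpp = round(citations / output, 2) if output else 0.0
    return output, citations, cpp

current_team = [
    {"title": "Paper A", "citations": 12},
    {"title": "Paper B", "citations": 3},
    {"title": "Paper C", "citations": 25},
]
new_recruit = [
    {"title": "Paper D", "citations": 80},
    {"title": "Paper E", "citations": 40},
]

print("Current team:      ", team_metrics(current_team))                # (3, 40, 13.33)
print("Team + new recruit:", team_metrics(current_team + new_recruit))  # (5, 160, 32.0)
```

Field-Weighted Citation Impact additionally normalizes each publication's citations against the world average for its field, year and document type, which is why it cannot be reproduced from raw counts alone.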

4 The Overview module

4.1 What is the Overview module?

The Overview module provides a high-level overview of your institution's research performance based on publications, citations, and collaboration. In addition, you can review the performance of any of the 5,500+ institutions and 200+ countries in our database. You can even define your own research areas, publication sets and groups of researchers and review their performance. All data can be filtered by a specific subject area. The data can be exported, and you can review the underlying list of publications behind every publication count. View a short video about the Overview module.

4.2 Working with the Overview module

Selecting an entity. Use the entity selection panel on the left-hand side to select the entity you want to view.

1. Open the section that contains the entity you want, e.g. Institutions and Groups for an institution.
2. If the entity you want is not listed, click on the Add link and start typing the name, then click on the name when it appears in the search results.
3. You can also define your own groups of researchers, publication sets and research areas.

Selecting a year range. You can view publication data for a period of either three or five years. Use the year range selector at the top of the page to select the desired year range. Optionally, you can also include the current year and future publications. However, you may want to exclude these because, by the end of the current year, Scopus has only received and indexed a portion of the current year's journals from publishers.

Filtering by journal category. Interested in evaluating or comparing your performance within a specific discipline? Choose from 27 categories and 334 subcategories in the Scopus journal classification, or use a different journal classification.

1. Use the filter dropdown menu at the top of the page to select a specific journal category.
2. The subcategories appear when you click on the arrow in a category.
3. After you select a journal category, all data shown in SciVal will be filtered by that category. That is to say, the data will be limited to publications in journals within that category.
4. Choose "no filter selected" from the menu to remove the filter and show all data.

How can you use the Overview module?

Get an overview of your institution's research performance. You can get an overview of your institution's research performance in terms of publications and citations, and answer questions such as: Who are the most prolific or most cited authors at my institution? In which disciplines is my institution most active? In which journals is my institution publishing the most? What are the most cited publications of my institution? Who are the top collaboration partners of my institution?

To view your institution in Overview:

1. Go to the Overview module and make sure your institution is selected in the entity selection panel on the left-hand side.
2. Select the year range you want from the first dropdown menu at the top of the page.
3. Do you want to view your institution's research performance within a specific discipline (such as chemistry or engineering)? Then select a journal category from the second dropdown menu at the top of the page.
4. Click between the Summary, Publications and Citations tabs to get an overview of your institution's research performance in terms of publications and citations.

Explore the publication output of your institution. You can see the total list of publications at your institution by clicking on View list of publications on the Summary tab. The most cited publications are at the top. The filter options on the left-hand side allow you to explore your institution's publications in various ways. For example, you can see the top authors and journals, the top collaborating institutions and countries, and the top keywords. Try filtering the publications by any of the filter options. The breakdown is now recalculated to reflect the new subset.
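Conceptually, the filter panel just recomputes each breakdown over the filtered subset. A minimal sketch of that idea (illustrative only, not SciVal code; the records are invented for the example):

```python
from collections import Counter

# Illustrative sketch only (not SciVal code): filtering a publication list and
# recomputing a "top authors" breakdown, as the Overview filters do.
# The records below are invented for the example.

publications = [
    {"author": "Smith", "journal": "J. Chem.", "keywords": ["catalysis"]},
    {"author": "Jones", "journal": "J. Phys.", "keywords": ["optics"]},
    {"author": "Smith", "journal": "J. Chem.", "keywords": ["optics"]},
]

def top_authors(pubs):
    """Rank authors by publication count within the given subset."""
    return Counter(p["author"] for p in pubs).most_common()

# Breakdown over the full set, then recalculated for a keyword subset.
print(top_authors(publications))
subset = [p for p in publications if "optics" in p["keywords"]]
print(top_authors(subset))
```

The same pattern applies to any of the breakdowns (journals, collaborating institutions, keywords): filter first, then count again.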

Get an overview of your institution's research strengths. Under the Competencies tab in the Overview module, you can see an analysis of your institution's research strengths, or competencies as they are known in SciVal. The Competencies analysis identifies areas of research in which your institution is a global leader in terms of publications, citations, or innovation. You can get detailed information on each of these areas, such as: How is your institution positioned in this field? Which researchers are most active in this field? What is your institution's unique contribution to this field? What are the overall trends: is this an emerging or declining field? To learn more, see What are competencies?

View the list of research strengths. To see the list of your institution's research strengths:

1. Go to the Overview module and select your institution from the left-hand entity selection panel.
2. Click on the Competencies tab and view the list of strengths in the Table view.
3. The Circle graph plots the competencies on a big wheel representing the world of science. This allows you to spot in which subject areas your institution's competencies are concentrated, and how interdisciplinary they are. The closer a competency is to the center of the wheel, the more interdisciplinary that competency is.
4. The Matrix graph plots the share of your institution within each competency against the growth of that field of research. This allows you to, for instance, spot emerging fields of research where your institution isn't yet playing a leading role.
5. Click on a competency in the Table, Circle or Matrix to open a modal window with more detailed information about that competency, such as the top institutions and researchers contributing to that field.

Get an overview of collaboration by your institution. The Overview module also shows the top external collaboration partners of your institution, and how much your institution is collaborating (including international collaboration). Collaboration is measured in terms of co-authored publications. The filter options on the left-hand side allow you to explore your institution's publications in various ways. For example, you can see the top authors and journals, the top collaborating institutions and countries, and the top keywords. Go to the Collaboration module for a much more detailed view of external collaboration at your institution.

Evaluate the performance of a researcher or research team. SciVal lets you define and evaluate individual researchers as well as groups of researchers. Groups of researchers can be research teams at your institution, but also larger units such as institutes, departments, and faculties. You can even define fantasy researcher groups. For example, you can simulate what would happen when you add a top researcher from another institution to an existing research team at your institution.

To define and view a researcher:

1. Go to My SciVal and click on Define a new Researcher.
2. Now go to Overview and select your new researcher.
3. You can now evaluate the research performance of this researcher. You can also see the collaborating institutions and co-authors of this researcher.

To define and view a group of researchers:

1. Go to My SciVal and click on Define a new Researcher to define the individual researchers that will make up your group.

2. Click on Define a new Group of Researchers to define the group.
3. Now go to Overview and select your new group.
4. You can now evaluate the research performance and collaboration of this group. The Top Researchers section on the Summary tab shows the top 5 researchers in that group by number of publications, number of citations or h-index. Go to the Researchers tab to see the complete list of researchers that make up the group.

If different units of your institution have been predefined in SciVal as groups of researchers, select one of these groups of researchers in Overview and go to the Collaboration tab. You can now see how much internal collaboration is taking place within the group, and how much collaboration with other groups within the same parent group, for example other departments within the same faculty.

Investigate other institutions. In Overview, you are not limited to your own institution; you can view the research performance of any other institution. You can, for instance, find out: In which journals are the world's top institutions publishing the most? Would this institution be a suitable collaboration partner for my own institution? Who are the other collaboration partners of my institution's collaboration partners? Who at this institution would be good to approach for potential collaboration?

Use the entity selection panel on the left-hand side to select the institution you want to view.

1. Open the Institutions and Groups section in the entity selection panel.
2. If the institution you want is not listed, click on the Add Institutions and Groups link and start typing the name, then click on the name when it appears in the search results.

See your institution's national or global position. How is your institution positioned? What is your institution's position, nationwide or worldwide, in terms of publication output or impact?

1. Select your country from the entity selection panel on the left-hand side.
2. If the country is not listed, click on the Add Countries and Groups link and start typing the name, then click on the name when it appears in the search results.
3. The Institutions tab ranks all the institutions in your country by number of publications, citations, or authors. You can see who the top players in your country are, and how your institution ranks among them.

4. You can view and compare the citation impact of each institution by selecting Citations per Publication or Field-Weighted Citation Impact from the drop-down menu.

You can also get a picture of what's happening worldwide:

1. Select World from the Countries and Groups section of the entity selection panel.
2. The Institutions tab ranks all institutions worldwide.
3. You can also see the top journals worldwide (under Publications by journal).

Identify potential collaboration partners. The Institutions and Authors tabs for a country can also be used to identify potential collaboration partners, both institutions and individual researchers. Say you are looking for collaboration partners in the United States within the field of chemistry:

1. Select United States in the entity selection panel.
2. Select Chemistry from the dropdown menu at the top of the page.
3. Go to the Institutions and Authors tabs to find out who the key players in that country are.

4. For more details on any of the institutions in this list (such as the top authors at that institution or how much it is collaborating internationally), select it from the entity selection panel.
5. To identify key players worldwide, select the World in the entity selection panel.

You can also view the top institutions and authors in a group of countries (like the European Union, South America or BRICS) or in a group of institutions (like the Russell Group).

See your institution's performance in a specific research area. SciVal allows you to define specific fields of research. These can, for instance, represent a strategic priority of your institution or an emerging area of science. Unlike the fixed, broad categories of the Scopus journal classification, these research areas can be as granular or interdisciplinary as you like. Once you have defined a research area, you can: see how your institution is performing in that field; spot national and international trends; identify collaboration partners.

Say that you are interested in how much research is happening at your institution on neuroinformatics within the field of computer science.

1. Click on Add Research Areas, then Define a new Research Area in the entity selection panel on the left-hand side of the Overview module.
2. Define your Research Area using the search term "neuroinformatics".
3. Narrow down your definition by limiting it to publications in computer science journals only.
4. Name and save the research area.
5. You can now select and view the research area in the Overview module.
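A self-defined research area behaves like a saved query over the publication database. The sketch below illustrates the idea with the neuroinformatics example; it is not SciVal code, and the records and the simple title-matching logic are invented for the example (SciVal's own matching is more sophisticated).

```python
# Illustrative sketch only (not SciVal code): a self-defined research area as a
# saved query - the term "neuroinformatics" limited to computer science
# journals. The records below are invented for the example.

publications = [
    {"title": "Neuroinformatics pipelines", "subject": "Computer Science"},
    {"title": "Neuroinformatics in the clinic", "subject": "Medicine"},
    {"title": "Graph databases", "subject": "Computer Science"},
]

def research_area(pubs, term, subject):
    """Keep publications matching the search term within one subject area."""
    return [p for p in pubs
            if term.lower() in p["title"].lower() and p["subject"] == subject]

area = research_area(publications, "neuroinformatics", "Computer Science")
print([p["title"] for p in area])  # only "Neuroinformatics pipelines" matches both criteria
```

Narrowing by subject (step 3 above) is what separates this research area from a plain keyword search: the clinical neuroinformatics paper is excluded because it sits outside computer science journals.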

You can define a whole series of research areas, for instance a list of strategic goals of your institution, and see how your institution is performing in all of these.

1. Define your research areas.
2. Go to Overview and select your institution from the entity selection panel on the left-hand side.
3. The Publications by Research Area section under the Publications tab gives you an overview of your institution's performance in each of the research areas you have defined.

Get instant performance snapshots of your Publication Sets. SciVal gives you the flexibility to define and evaluate your own Publication Sets in the Overview module. Here you can dive deeper to see the authors, institutions and countries that have contributed to them and the impact they have. A Publication Set is a fixed set of publications which you can create either by using a subset of a researcher's career (e.g. most cited publications) or by selecting publications on a particular topic.

To define and view a Publication Set:

1. Go to the Overview module and click on Publication Sets.
2. Click on Add Publication Sets.
3. Select a researcher from the list.
4. Select the most cited publications of the researcher and save the set.
5. You can now evaluate the research performance of this Publication Set.

Similarly to other entity types, you can get an instant view of the output of the Publication Set in the Overview module. For example, you can find out to what extent the publications are present in the top 10% most cited publications worldwide. In addition, you can further analyze the Publication Set based on citations, collaboration, authors and institutions.

4.4 Using the Competencies analysis to identify research strengths

What are competencies? Under the Competencies tab in the Overview module, you can see an analysis of your institution's research strengths, or competencies as they are known in SciVal. You can use this to identify or demonstrate areas of research strength at your institution. You may even identify pockets of excellence at your institution that you were not yet aware of. This analysis can also be used to identify research strengths of any other institution, such as a potential collaboration partner. And you can see the research strengths of your country as a whole, and see what your institution contributes to each of those national strengths.

You can use the Competencies analysis to:

- identify and analyze any institution's or country's interdisciplinary areas of research excellence relative to any other institution or country worldwide
- evaluate different strengths of researchers and research teams based on publication output, impact and innovativeness
- identify institutions and individual researchers for collaboration to strengthen your leadership position

How is the Competencies analysis different from the traditional ways of evaluating research? The Competencies analysis gives you an alternative way of looking at research performance. The methodology is based on citation patterns (co-citation analysis), as opposed to the traditional publication and citation rankings, which are based on journal-based classifications. For example, traditionally, if a publication was published in the journal Water Treatment, it would be considered to be 100% about that field. The Competencies methodology might find that this same publication has 20% of its references to computer science and 10% to economics fields, giving a more accurate view of the publication and the interdisciplinary fields to which it contributes.

Using this methodology, the SciVal Competencies analysis helps you identify:

- Top researchers in granular fields of research, who may not be visible in traditional publication and citation rankings due to high thresholds.
- Emerging areas of science. Since SciVal competencies are based on 5 years of data, this helps you see current research topic trends for an institution or country. If a research topic shows significant positive growth over 5 years, it may be an area that you want to focus on.
- Multidisciplinary research areas. In a traditional journal-level classification, publications are clustered into subject categories based on the journals in which they have been published. Interdisciplinary publications could be overlooked if published in single-disciplinary journals. In SciVal, publications are clustered based on co-citation patterns. In other words, we apply an article-based classification. Since citations often cross multiple scientific disciplines, SciVal is able to map performance at an interdisciplinary level. See also How SciVal identifies competencies.

How can you use competencies? You can use the Competencies analysis to:

- identify and analyze any institution's or country's interdisciplinary areas of research excellence

- evaluate different strengths of researchers and research teams based on publication output, impact and innovativeness
- identify institutions and individual researchers for collaboration to strengthen your leadership position

Identify or demonstrate research strengths of your institution. You can use the Competencies analysis to identify or demonstrate areas of research strength at your institution. You may even identify pockets of excellence at your institution that you were not yet aware of. For each competency identified for your institution, you can examine: What is your institution's unique contribution to this field? Which researchers at your institution are most active in this field? What are the overall trends: is this an emerging or declining field? Which institutions are most active in this field, and how is your institution positioned? Who are your institution's collaboration partners in this field? Who are you not yet collaborating with?

To view the Competencies analysis for your own institution:

1. Go to the Overview module.
2. Select your institution from the left-hand entity selection panel.
3. Click on the Competencies tab.

View current collaboration partners and identify potential new partners. For each area of research identified as a competency of your institution, you can see which institutions your institution is collaborating with in that field. You can also see which institutions you are not yet collaborating with. To see the list of collaborating and not yet collaborating institutions:

1. Go to the Overview module and select your institution from the left-hand entity selection panel.
2. On the Competencies tab, click on a competency in the table (or on the Circle or Matrix graph) to open a modal window with more details about the competency.
3. In the modal window, select the Institutions tab.
4. Click on "View another set of institutions" and select either "Collaborating Institutions" or "Not yet collaborating Institutions".
5. Select the Compare tab to compare an institution's contribution to the field to your own institution's contribution. Here you can see where you overlap and also what your unique contributions to the field are.

Find potential new hires. For each area of research identified as a competency of your institution, you can see who the top authors within that field are by number of publications. This can be useful to find potential new hires to strengthen your institution's competencies. You can also identify specific researchers at other institutions for potential collaboration.

1. On the Competencies tab, click on a competency in the table (or on the Circle or Matrix graph) to open a modal window with more details about the competency.
2. In the modal window, select the Authors tab.
3. Click on "View the top authors for a different set of institutions" and select "Other contributing Institutions" to see the top authors in this field at institutions other than your own institution.

Identify research strengths of another institution. You can view not only the research strengths of your own institution, but also the strengths of any other institution. For instance, you can identify pockets of excellence of an institution that you are considering as a potential collaboration partner. To view the Competencies analysis for another institution:

1. Go to the Overview module.
2. Select an institution from the left-hand entity selection panel.
3. Click on the Competencies tab.

View nationwide research strengths. You can also use the Competencies analysis to see the research strengths of your country as a whole (or any other country). This allows you to compare the strengths of your institution to the nationwide strengths. To what extent is your institution aligned with national research priorities? To view the Competencies analysis for a country:

1. Go to the Overview module.
2. Select a country from the left-hand entity selection panel.
3. Click on the Competencies tab.

Working with competencies. To view the Competencies analysis for your institution:

1. Go to the Overview module and select your institution (or any other institution or country) from the left-hand entity selection panel.
2. Click on the Competencies tab.
3. Select the Table view to see the list of research strengths for your selected entity.

View your competencies in a graph. In addition to the table view, there are two different types of visualizations available to you: the Circle and Matrix graphs.

The Circle graph plots the competencies on a big wheel representing the world of science. This allows you to spot in which subject areas your institution's competencies are concentrated, and how interdisciplinary they are. The closer a competency is to the center of the wheel, the more interdisciplinary that competency is. The outer circle represents all the published articles in Scopus across the main Scopus journal categories. The size of each pie segment represents the relative publication output in a specific journal category. Each bubble represents a competency for the selected institution/country. The bubble size represents the volume of publications published worldwide, indicating the size of the field: the larger the bubble, the more publications are in that competency; the smaller the bubble, the more specialized the field it represents. The colored lines in the competency point to the related journal categories. Competencies positioned towards the middle indicate an interdisciplinary mix. Competencies positioned towards the edge of the circle indicate a strong distinctive competency in the color-coded subject area.

The Matrix graph plots the share of your institution within each competency against the growth of that field of research. This allows you to, for instance, spot emerging fields of research where your institution isn't yet playing a leading role. The size of a circle is based on the number of publications published by the institution/country. The larger the circle, the more publications are in that competency. The colored lines in the circles represent the subject areas and disciplines of the publication clusters in the competency. The horizontal position shows the leadership position of the institution/country within that competency. The vertical position shows how fast the number of publications in that competency is growing. The Matrix view can help you answer high-level strategic questions such as: How stable are our competencies? Which competencies require investment of our time and resources? Is our institution or country maintaining the lead in a certain research field?

Filter the list of competencies. Click on "Filter by" to filter the list of competencies by a specific journal category. This is useful if you are interested only in a single subject area (for instance chemistry or engineering). You can also filter by other criteria. For instance, you could choose to view only competencies within research areas that are growing over time in terms of publication output.

Search for competencies. You can search for a specific institution or researcher in order to find competencies where that institution or researcher is active. For instance, you can search for a specific researcher at your institution to find the competencies to which that person has contributed publications. You can also search through the competencies by journal category (subject area) or journal.

1. In the Overview module, click on the Competencies tab, then click on "Search for competencies".
2. Choose to search by author (researcher name), institution, journal category or journal.
3. Type the search string and use the dropdown menu marked "Limit search to:" to select whether you want to search only within your own institution, only within your own country or region, or worldwide.
4. Click Search.

Analyze your competencies in detail. Click on a competency in Table, Circle or Matrix to open a popup window where you can explore that competency in detail. For instance, this shows you: Which institutions are most active in this field, and how is your institution positioned? Who are your institution's collaboration partners in this field? Who are you not yet collaborating with? Which researchers at your institution are most active in this field? What is your institution's unique contribution to this field? What are the overall trends: is this an emerging or declining field? How did SciVal identify this field as a competency of your institution?

At the top of the window are the top three keywords for the field overall. Below that, on the Summary tab, you can see the top 10 keywords for the publication output of your institution in this field. The top 10 authors, journal categories and journals for your institution are also shown. You can switch to "all contributing institutions worldwide" to see the top 10 keywords, authors, journal categories and journals for the entire field.

Combine competencies. Are two or more competencies actually part of the same area of research strength at your institution? Then you can combine them into a self-defined "research area".

1. In the Overview module, go to the entity selection panel on the left-hand side of your screen.
2. In the Research Areas section, click on Add Research Areas, then Define a new Research Area.
3. A popup window now opens where you can define a research area. Select the Use competencies tab.
4. Drag the competencies that you want to combine from the left side to the right side of the screen. When you are done, click Next Step.
5. Name and save the research area.
6. The new research area will now be computed and shown in Overview.

For more information on self-defined research areas, see About research areas in SciVal.

How SciVal identifies competencies. The Competencies analysis identifies research strengths of your institution: interdisciplinary areas of research where your institution is a global leader. The methodology behind it is based on citation patterns (co-citation analysis) instead of a traditional journal classification. Highly cited publications are clustered based on co-citation counts. These publication clusters are then grouped together into competencies. Competencies are classified as either distinctive or emerging competencies. The analysis is always based on five years of data.
For example: if you select 2013, the analysis is based on data from 2009 up to and including 2013. View a presentation explaining the Competencies methodology (PowerPoint, 3MB).

For a detailed, step-by-step explanation of how a particular competency was identified:

1. Go to the Overview module and select the Competencies tab. Click on a competency in the table (or on the Circle or Matrix graph) to open a modal window with details about that competency.
2. In the modal window, select the Methodology tab for a full explanation.
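The co-citation idea behind this methodology can be sketched in a few lines: two references are co-cited when the same publication cites both, and references that are frequently co-cited form the seeds of publication clusters. This is an illustrative sketch only, not SciVal's actual clustering algorithm; the citing records are invented for the example.

```python
from collections import Counter
from itertools import combinations

# Illustrative sketch only (not SciVal code): counting co-citations. Two
# references are co-cited when one publication cites both; pairs with high
# co-citation counts approximate the "publication clusters" described above.
# The citing records below are invented for the example.

citing_publications = {
    "pub1": ["refA", "refB", "refC"],
    "pub2": ["refA", "refB"],
    "pub3": ["refB", "refC"],
}

cocitations = Counter()
for refs in citing_publications.values():
    # Every pair of references in one reference list is one co-citation.
    for pair in combinations(sorted(refs), 2):
        cocitations[pair] += 1

# (refA, refB) and (refB, refC) are each co-cited twice; (refA, refC) once.
print(cocitations.most_common())
```

In the full methodology, SciVal then groups the resulting clusters into competencies and assigns the past five years of publications to them by their references.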

Scroll down to the bottom of the page and click the "View the graph at full size" link to see a graph showing the publication clusters that make up the competency and how these clusters are connected.

How are competencies created in SciVal?

SciVal performs a co-citation analysis on all the publications published in a specific year (e.g. 2013). This citation analysis groups the references into publication clusters representing specific areas of research. These clusters are then grouped to become competencies. Competencies are created in three steps:

Step 1. Create publication clusters
Researchers organize themselves around highly specific research topics in which they cite each other's work. To identify these research topics, SciVal performs a co-citation analysis on all the publications published in a specific year (e.g. 2013). This analysis groups the references cited by those publications into "publication clusters" that represent specific areas of research. Publication clusters are groups of highly cited publications and the current publications that cite them. Once the clusters have been identified, all publications published in the past 5 years (e.g. 2009-2013)

are assigned to publication clusters based on their references.

Step 2. Determine which publication clusters are strengths
Next, SciVal determines which publication clusters represent the individual institution's or country's strengths. These are the clusters where the institution/country as a whole has a significant presence, in terms of publication contribution, compared to other institutions/countries worldwide. To determine what can be considered a significant presence, SciVal first calculates the Relative Publication Share (RPS) of each institution/country for each of the approximately 100,000 clusters. RPS is the publication output of the institution/country (over a 5-year period) divided by the publication output of the institution/country ranked #1 worldwide within a particular competency. SciVal uses a different RPS value depending on whether the competency is for an institution or a country.

Institutions. For a cluster to be selected for a competency, it must have a Relative Publication Share (RPS) greater than a specified threshold. The threshold is determined by the size of the specific institution on a sliding scale between 0.2 and 0.6. The threshold may vary from year to year, since it is based on a comparison with the 5-year output of Stanford University, which has been selected as the benchmark because it is representative of a large institution. The following is the formula for determining an institution's threshold; note that since the number must be between 0.2 and 0.6, the result is rounded up or down accordingly:

Threshold = 0.494 × (institution's 5-year publication output / Stanford's 5-year publication output)^(1/4)

Example: Chiba University's threshold (in 2009) is 0.34. This is because its 5-year output is 8,406 publications and Stanford's 5-year output is 38,733 publications:

0.34 = 0.494 × (8,406 / 38,733)^(1/4)

Countries. For countries, the U.S.
has been selected as the benchmark because it is representative of a large country. The RPS of a country is calculated as part of a formula which determines the number of clusters to be selected for competencies:

Number of clusters = (number of publications belonging to the country / number of U.S. publications) × number of U.S. publication clusters with RPS > 1.0

Example: Denmark
Number of Denmark publications: 69,124
Number of U.S. publications: 2,406,043
Number of U.S. clusters with RPS > 1.0: 64,516
Number of clusters to include in the competencies for Denmark: 1,854

1,854 = (69,124 / 2,406,043) × 64,516

Step 3. Group publication clusters into competencies
The publication clusters identified in Step 2 are then grouped together to form competencies that represent an individual institution's or country's strengths. The clusters are grouped together when they share one or more publications from the institution/country. The competencies represent research areas where the (entire!) institution or country has obtained a leading position in terms of number of publications, number of highly cited publications or innovativeness (the recentness of cited publications). These leadership criteria are described under the Methodology tab for each of the competencies.

How are competencies assigned to journal categories?
Each competency is assigned to one or more ASJC journal categories (subject areas) based on the journal categories of the publication clusters that make up the competency. Each cluster is

assigned to the dominant journal category in that cluster.

What is the difference between distinctive and emerging competencies?

Competencies are classified as either distinctive or emerging, depending on the size of the field and whether or not the leadership criteria have been met.

In order to be classified as distinctive, a competency must meet the size criterion: it is a significantly large field of research. This means that the worldwide publication output in the field over the five-year period exceeds a specified threshold. The threshold is related to the size of the institution/country. For a large institution like Harvard or Yale, the fractionalized publication count of the publication output in the field must exceed 500. Smaller institutions are given a lower threshold.

It must also meet at least one of the three leadership criteria:
1. The institution/country is ranked #1 worldwide in this field of research in terms of publication output. The Relative Publication Share of the institution/country is greater than 1, meaning that it has a larger fractionalized publication count than any other institution/country in that field.
2. The institution/country is ranked #1 worldwide in this field of research by number of highly cited publications. The Relative Reference Share of the institution/country is greater than 1, meaning that it has a larger number of "reference publications" (highly cited publications) than any other institution/country in that field.
3. The institution/country is ranked #1 worldwide in this field of research in terms of innovativeness, the recentness of cited publications. This is determined using the State of the Art value, an indicator of the recency of the work cited by the institution/country relative to the average recency of work cited in this field. To meet this criterion, the institution/country must have a larger State of the Art value than any other institution/country in that field.
In addition, it must also have a Relative Publication Share larger than 0.8, meaning that the publication output of the institution/country should be the largest in the field, or else at least 80% of the output of the institution/country ranked #1.

Competencies that do not meet the criteria for distinctive competencies are classified as emerging. An emerging competency may not meet any of the criteria, or it may meet the field size criterion but not any of the leadership criteria, or it may meet at least one of the leadership criteria but not the size criterion. A full explanation of why a particular competency is classified as distinctive or emerging is given on the Methodology tab.
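The co-citation idea underlying the clustering can be illustrated with a toy sketch: two references are "co-cited" when a later paper cites both, and pairs with high co-citation counts end up seeding the same publication cluster. The paper IDs below are hypothetical, and this is an illustration of the general technique, not SciVal's actual implementation:

```python
from collections import Counter
from itertools import combinations

# Hypothetical reference lists: each citing paper -> the papers it cites.
citing_papers = {
    "P1": ["A", "B", "C"],
    "P2": ["A", "B"],
    "P3": ["B", "C"],
}

# Count how often each pair of references is cited together.
co_citations = Counter()
for refs in citing_papers.values():
    for pair in combinations(sorted(refs), 2):
        co_citations[pair] += 1

# Pairs co-cited most often would fall into the same publication cluster.
print(co_citations.most_common())
```

Here A-B and B-C are each co-cited twice, so all three references would likely land in one cluster, while a pair that is never co-cited would not.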

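The cluster-selection arithmetic can be checked in a few lines. The fourth-root form of the institutional threshold used here is an interpretation chosen to reproduce the printed Chiba example (0.494 scaled by the fourth root of the output ratio, clamped to the 0.2-0.6 sliding scale), so treat it as an illustrative reconstruction rather than SciVal's authoritative formula:

```python
import math

def institution_threshold(pubs, stanford_pubs):
    """Sliding-scale RPS threshold, clamped to [0.2, 0.6] (reconstructed form)."""
    raw = 0.494 * (pubs / stanford_pubs) ** 0.25
    return min(0.6, max(0.2, raw))

# Chiba University example (2009): 8,406 publications vs Stanford's 38,733.
print(round(institution_threshold(8406, 38733), 2))  # 0.34

def country_cluster_count(country_pubs, us_pubs, us_clusters_rps_gt_1):
    """Number of clusters selected for a country's competencies."""
    return country_pubs / us_pubs * us_clusters_rps_gt_1

# Denmark example: 69,124 publications vs 2,406,043 for the U.S.,
# and 64,516 U.S. clusters with RPS > 1.0; rounded up to 1,854 in the example.
print(math.ceil(country_cluster_count(69124, 2406043, 64516)))  # 1854
```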
5 The Benchmarking module

5.1 What is the Benchmarking module?

The Benchmarking module lets you easily evaluate your research performance in comparison to others. How does your institution compare to others in your region, country or the world?

Choose from a broad range of metrics. You can use 15 different metrics to compare the performance of different types of entities, such as institutions, research teams and individual researchers.

View a short video about the Benchmarking module
Which metrics are available to use in SciVal?

5.2 Working with the Benchmarking module

Selecting metrics

Select the metric you want to view. By default, the metric Scholarly Output (number of publications) is shown. To view a different metric, click on the y-axis button along the top of the chart and select it from the list. Then click on the Choose as y-axis button. Which metrics are available to use in SciVal?

Choose metric options. Each metric has different options, but all let you choose the types of publications to include. For instance, you can choose to include only articles and reviews, or only conference papers. Citation metrics also let you choose whether or not you want to include self-citations.

Plot metrics against each other. You can plot two or even three different metrics against each other. Two metrics are shown as a scatter plot. Three metrics are shown as a bubble chart, where the size of the bubbles (circles) on the chart indicates the value of the third metric. Select a second metric from the x-axis button. This will replace Publication Year with that metric. If you want, you can select a third metric from the bubble size button.

Let's compare Athena, Yale and the United States on three metrics: Scholarly Output, Field-Weighted Citation Impact and International Collaboration.
1. Select Athena University, Yale and the United States from the entity selection panel on the left-hand side of the screen. 2.
Click on the y-axis button and select Scholarly Output, if this wasn't already selected.
3. Click on the x-axis button and select Field-Weighted Citation Impact.
4. Click on the Bubble size button and select Collaboration. In the options for this metric, select International collaboration.
5. The chart now shows that the Field-Weighted Citation Impact of Athena is lower than Yale's, but higher than the United States average. The amount of international collaboration is slightly lower than Yale's.

You don't necessarily have to select two different metrics. Instead, you can also select the same metric for both x-axis and y-axis, but with different options. For instance, you could compare outputs in the top 1% of percentiles to outputs in the top 10%, or you could compare international collaboration to national collaboration.

Selecting a year range

You can view publication data from 1996 until the present. Use the time period selector at the top of the page to select the start and end year. You may want to exclude the current year because, by the end of the current year, Scopus has only received and indexed a certain portion of the current year's journals from other publishers.

Filtering by journal category

Interested in evaluating or comparing your performance within a specific discipline? Choose from 27 categories and 334 subcategories in the Scopus journal classification, or use a different journal classification.
1. Use the filter dropdown at the top of the page to select a specific journal category.
2. The subcategories appear when you click on the arrow in a category.

Working with the chart

Chart legend. Each entity plotted on the chart has a different color and symbol. This is shown in the legend below the chart. You can hide a particular entity from the chart by clicking on the "eye" icon next to that item in the legend. Below the legend, you can find the Metrics detail option, which gives you further information on the metrics you selected.

Data pop-ups. Hover over a data point on the chart and a small pop-up will appear with the metrics you have selected and their values for that year.

Export the chart. You can export the chart to an image file by selecting Export the chart as an image file from the Export menu in the top right corner. This will export the chart in several different file formats at once (JPEG, PNG, SVG and PDF). You can also export the underlying data by choosing Export the data to a spreadsheet file.

Working with the table

When viewing a metric by year, you can scroll the table horizontally to see the values for all years in the selected year range.

View the overall value. When viewing metrics over a year range, the rightmost column will show the overall value for that metric. This can be the total number of publications for the selected year range (for Scholarly Output, for instance) or the overall value for the selected year range (for Field-Weighted Citation Impact, for instance).

View the underlying publications. When numbers of publications are shown in the table, you can click on any number to view the actual list of publications. Numbers of publications are shown for Scholarly Output or Cited Publications, for example.

Export the data. To export the data in the table to a spreadsheet file, choose Export the data to a spreadsheet file from the Export menu in the top right corner. You can then view and manipulate the data in an external spreadsheet application such as Microsoft Excel.

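Exported data can also be post-processed outside a spreadsheet application. The sketch below assumes a hypothetical export layout (institution, year, Scholarly Output columns); the real SciVal export format may differ. It computes the overall value per institution, as the rightmost column of the table does for Scholarly Output:

```python
import csv
import io
from collections import defaultdict

# Hypothetical exported data; the actual SciVal export layout may differ.
export = """Institution,Year,Scholarly Output
Athena University,2012,1200
Athena University,2013,1350
Yale,2012,4100
Yale,2013,4250
"""

# Sum the yearly Scholarly Output values into an overall value per entity.
overall = defaultdict(int)
for row in csv.DictReader(io.StringIO(export)):
    overall[row["Institution"]] += int(row["Scholarly Output"])

print(dict(overall))  # {'Athena University': 2550, 'Yale': 8350}
```

For a real export, replace the in-memory string with `open("export.csv")` and check the column headings against the file.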
View the list of journals

When you have selected one or more researchers or groups of researchers in Benchmarking, you can view the list of journals for those entities. Click on "View list of journals for the selected Researchers and Groups" below the chart or table in Benchmarking to open the "List of journals" window. This shows you a breakdown by academic journal of the selected researchers' and groups' publication output and impact:

As usual, the publication output is limited to the selected year range and subject area (the selected journal category filter, if any). The journals are arranged by highest SNIP or SJR value.

View citation impact. By default, publication output is shown, but you can also view citation impact. Use the drop-down menu at the top to switch from Scholarly Output (number of publications) to either Citation Count or Field-Weighted Citation Impact.

Export the list of journals. Go to the Export menu and select "Export the list of journals to a spreadsheet file". This will export the SNIP and SJR values of each journal, as well as the Scholarly Output, Citation Count and Field-Weighted Citation Impact for each researcher's or group's output in that journal.

Define a research area. You can quickly define a new research area based on this list of journals. Use the link in the Shortcuts menu to start the define process for a research area with the list of journals preselected. If desired, you can then add additional journals, remove journals, or filter the research area by a specific institution or country, for instance. This enables you to create a journal profile for a group of researchers and benchmark the group against that profile. You can use this to compare the journal profile of one group of researchers to the journal profile of another group. Or you can see how the citation impact of your researcher group compares to the impact of all researchers publishing in the same set of journals.

5.3 How can you use the Benchmarking module?

Compare your institution to others

How does your institution compare to peer institutions? Let's say that your institution, Athena University, wants to compare its research performance with SUNY Buffalo, Yale and Dartmouth.
1. Start by setting up the list of institutions, using the entity selection panel on the left side of your screen.
Make sure your institution and the peer institutions are all selected (checked off) in the entity selection panel.

2. If an institution is not listed, click Add Institutions and Groups and start typing the name of that institution. Then select the institution from the list of search results that appears below the text field.
3. By default, you will view Scholarly Output by publication year. This shows you the total research output of your selected institutions over a period of time.
4. Use the buttons along the top of the chart to select different metrics. Use the y-axis button to change from Scholarly Output to another metric. To compare two different metrics, select a second metric from the x-axis button.

Benchmark your institution against the national average

Field-Weighted Citation Impact is a good metric to use when you want to compare your institution's research performance to the national average. This is because Field-Weighted Citation Impact adjusts for differences in citing behavior across disciplines. A score of 1.00 means citations are as expected based on the global average. More than 1.00 means that citations are more than expected. Less than 1.00 means the citations are less than expected.
1. Select Field-Weighted Citation Impact from the y-axis button.
2. Make sure your institution and your country are selected in the entity selection panel on the left side.
3. If your country is not listed in the entity selection panel, click Add Countries and Groups and start typing the name of that country. Then select the country from the list of search results that appears below the text field.

Let's say that you want to compare your institution, Athena University, to the national average. The Field-Weighted Citation Impact for Athena was 1.83, which means citations were 83% higher than expected based on the global average. When plotted against the United States and Yale, we can see that Athena's Field-Weighted Citation Impact is higher than the US national average, but slightly lower than Yale's.

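The interpretation of Field-Weighted Citation Impact described above is a simple ratio: citations actually received, divided by the citations expected for publications of the same field, type, and age. A minimal sketch with illustrative numbers (this mirrors the interpretation only, not SciVal's internal normalization):

```python
def fwci(actual_citations, expected_citations):
    """Ratio of citations received to the field/age/type-expected count."""
    return actual_citations / expected_citations

# Illustrative numbers: 183 citations where 100 were expected -> FWCI 1.83,
# i.e. 83% more citations than the global expectation.
score = fwci(183, 100)
print(f"FWCI {score:.2f}: {score - 1:.0%} above expected")
```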
Other metrics that can be used to compare your institution to the national average include Outputs in Top Percentiles (by percentage), Publications in Top Journal Percentiles (by percentage), and Collaboration (by percentage).

Analyze developments in a field over time

You can view metrics for a specific field of research over time. This lets you analyze developments in that field, such as whether the field is emerging, declining or levelling off. You can use either journal categories or self-defined research areas. Journal categories are categories in the Scopus journal classification. They represent broad areas of science, such as chemistry or engineering. Self-defined research areas can be more granular or interdisciplinary.

To use a journal category:
1. Choose the journal category from the filter selector at the top of the page.
2. Select the World from the entity selection panel.
3. In addition to the World, you can also select your country or your institution. This lets you compare your national or institutional performance to the international trend.
4. Select Scholarly Output from the y-axis button and Publication Year from the x-axis button. You can now see the year-by-year publications trend in this journal category.

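Whether a field is emerging, declining or levelling off can be judged from the year-by-year Scholarly Output series. A rough sketch of one way to do this after exporting the data: fit a least-squares slope and compare it to the series mean. The 5%-of-mean cutoff and the yearly counts are arbitrary illustrations, not a SciVal rule:

```python
def trend(counts):
    """Classify a yearly publication-count series by its least-squares slope."""
    n = len(counts)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(counts) / n
    slope = (
        sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, counts))
        / sum((x - x_mean) ** 2 for x in xs)
    )
    if slope > 0.05 * y_mean:   # growing by more than 5% of the mean per year
        return "emerging"
    if slope < -0.05 * y_mean:
        return "declining"
    return "levelling off"

# Hypothetical yearly Scholarly Output for a journal category.
print(trend([120, 150, 190, 240, 310]))  # emerging
```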
Try some other metrics, like Field-Weighted Citation Impact, Collaboration or Outputs in Top Percentiles.

SciVal also allows you to define your own research areas and view these in the Benchmarking module.
1. Use the entity selection panel on the left-hand side to select a previously defined research area.
2. You can also define a new research area. In the entity selection panel, click on Add Research Areas, then Define a new Research Area.
3. Select the research area in Benchmarking to see the worldwide output in that field.
4. You can select multiple research areas to compare the output in one research area to the output in another area.

Identify suitable benchmark institutions

If you have access to the Overview module, you can use it to identify suitable benchmark institutions for your institution. These are institutions that you can compare your own institution against, in order to evaluate how well your institution has performed. Say you are looking for benchmark institutions for Athena University within the United States in the field of chemistry.
1. In the Overview module, select the United States and filter by chemistry.
2. Go to the Institutions tab. This will now list the top U.S. institutions by number of publications in chemistry journals.

3. You can now select one or more institutions from this list that you would like to match or exceed in terms of research performance.

Compare your institution against collaborating institutions

You can use the Overview or Collaboration modules to find the top collaborating institutions of your institution. You can then compare these institutions in the Benchmarking module.
1. Go to the Overview module and select your institution.
2. Go to "Top collaborating institutions" under the Collaboration tab.
3. Choose Benchmark these institutions from the Shortcuts menu.

4. You will now jump to the Benchmarking module, and your institution plus the top collaborating institutions will be selected there.
5. Note that the selected year range in Overview will also be selected in Benchmarking.

6 The Collaboration module

6.1 What is the Collaboration module?

The Collaboration module is where you can evaluate the existing research collaborations of your institution. Start with a worldwide view of your collaboration landscape. Then zoom in to individual collaborating institutions and researchers anywhere in the world. You can also use this module to identify new opportunities for collaboration in your own country or worldwide. See which institutions and researchers your institution isn't yet collaborating with.

All data can be filtered by a specific subject area. Say you are only interested in collaboration within the field of chemistry. Then you can view only institutions and researchers that have co-authored chemistry publications with your institution. The data can be exported, and you can review the underlying list of publications behind every publication count.

6.2 Working with the Collaboration module

Selecting an institution

Use the entity selection panel on the left-hand side to select the institution you want to view. If the institution you want is not listed, click on the Add link and start typing the name, then click on the name when it appears in the search results.

Selecting a country

SciVal allows you to explore international collaboration on a country level. Find out who your country collaborates with, how many co-authored publications you have, what the impact of the co-authored publications is, and much more. To use this feature, go to the Collaboration module and select the desired country from the selection panel. Select a country and use the map view to zoom in on the region you are interested in. Once you find a collaborating country, you can see the partnership in detail.

Questions you can find answers to:
1. How many co-authored publications do we have?
2. What is the citation impact of our co-authored publications?
3. Who are the current co-authors from the collaborating countries?
4. Who are the potential co-authors in the collaborating countries?

You can alternatively create a research area in SciVal and filter the collaborating countries by that research area, or by any other Scopus journal category, to find out what your common impact is within a particular field of research.

Selecting a year range

You can view publication data for a period of either three or five years. Use the year range selector at the top of the page to select the desired year range. Optionally, you can also include the current year and future years. However, you may want to exclude the current year because, by the end of the current year, Scopus has only received and indexed a certain portion of the current year's journals from other publishers.

Selecting a region, country or sector

You can limit the list of institutions shown to a specific world region or country. This will apply to both the "Current collaboration" and "Potential collaboration" views, and to both the Map and Table views. In addition, you can filter the list of institutions by sector. For instance, you can choose to view only institutions in the corporate sector, or only institutions in the medical sector. You can also combine the geographical and sector selections. For example, you could choose to view only corporations in France, or only medical institutions in North America.

Select a region. From the drop-down menu marked "Worldwide", select the region you would like to view, for instance North America, Asia Pacific or Europe. In Map view, you can also click on one of the region markers that are shown on the map when you are zoomed out to worldwide view.

Select a country. Start by selecting the region of the country you want from the leftmost drop-down menu. A second drop-down menu now appears that lets you pick the country. For example, to select the United States, select "North America" from the first menu, then select "United States" from the second menu. In Map view, you can also click on one of the blue-on-white country markers that are shown on the map after you have zoomed in to a particular region. To return to region level, select "All countries" from the country menu.

Select a sector. Use the rightmost drop-down menu to select a specific sector. SciVal uses 5 organization types: Academic, Corporate, Government, Medical, and Other. These are composed of the following Scopus organization types:
Academic: university, college, medical school, and research institute
Corporate: corporate and law firm
Government: government and military organization
Medical: hospital
Other: non-governmental organization

Filtering by journal category or research area

Interested in evaluating or comparing your performance within a specific discipline? You can choose from 27 categories and 334 subcategories in the Scopus journal classification, or use a different journal classification.
1. Use the filter dropdown menu at the top of the page to select a specific journal category.
2. The subcategories appear when you click on the arrow in a category.

Filter by research area. You can also filter by research areas that you have defined yourself. These can be as granular or interdisciplinary as you like.
1. Select your institution from the entity selection panel on the left-hand side.

2. Use the filter dropdown menu at the top of the page to select a specific research area.
3. If you have not yet defined the research area, click on Define a Research Area in the filter menu.

Working with the map

Zooming in and out. You can use the zoom control in the top left corner of the map to zoom in and out. You can also double-click on the map to zoom in further.

Zoom in on a region. At world level, the map gives you an overview of your global collaboration landscape. There are markers on the map for each of the world regions (Asia Pacific, North America, South America, Europe, Middle East, and Africa). These markers show you how many collaborating institutions there are in each region. Click on one of the region markers at world level to zoom in to that region.

Zoom in on a country. After zooming in on a region, you see a number of round markers for each of the countries in that region. These country markers display the number of collaborating institutions in each country. Click on a country marker to zoom into that country.

Zoom in on an institution. You can now click on any of the institutions. A pop-up window opens with full details on the collaboration with that institution.

View citation impact. By default, the number of co-authored publications for each institution is shown, but you can also see the citation impact of those collaborations. Use the drop-down menu in the top-right corner of the map to switch to number of citations, Citations per Publication or Field-Weighted Citation Impact.

Working with the table

Instead of the map view, you can also see your institution's (potential) collaborating institutions in a tabular list view. In Current collaboration, this table view shows the top 100 collaborating institutions by number of publications. You can use the dropdown menus at the top to view the top collaborating institutions in a specific region or country. You can change the sort order of the table by clicking on any of the column headings. You can use the drop-down menu to switch from citations to a different metric for citation impact.

Click on the name of an institution for full details of the collaboration with that institution.
Click on the number of co-authored publications to view the list of publications.
Use the Export menu to export the full list of collaborating institutions to a spreadsheet file.

6.3 How can you use the Collaboration module?

Identify the collaboration partners of your institution

Get an overview of your collaboration landscape. The map view in Current collaboration gives you a global overview of the collaboration partners of your institution. You can then zoom in to a specific country. For example, to see your collaboration partners in France:
1. Select your institution from the entity selection panel on the left-hand side.
2. Go to Current collaboration and select the Map view.

3. Click on the Europe marker to zoom in and see all the European countries where collaboration with your institution has taken place.
4. Click on the round marker shown in France to zoom in and see all collaborating institutions and researchers in France. By default, the number of co-authored publications for each institution is shown, but you can also see the citation impact of those collaborations. Use the drop-down menu in the top-right corner of the map to switch to number of citations, Citations per Publication or Field-Weighted Citation Impact.
5. Switch from map to table view to see the collaborating institutions in France in a tabular list view. Click on one of these institutions to explore the collaboration with that institution in more detail.
6. Go to the Export menu to export the full list of collaborating institutions to a spreadsheet file.
7. To view collaboration in France within a particular sector (such as corporate or medical), select the sector from the rightmost of the drop-down menus along the top of the map.

Measure the impact of your collaborations. The table view lets you compare institutions by the number of publications co-authored with each. You can also evaluate the impact of your collaborations using metrics such as number of citations or Field-Weighted Citation Impact.

View your collaborations within a specific field of research. Do you want to see the collaborations of your institution within a specific field of research? Then select a filter from the dropdown menu at the top of the page. For example, to see collaboration within chemistry only, choose Chemistry from the menu. You can choose from 27 categories and 334 subcategories in the Scopus journal classification, or use a different journal classification. You can also define your own research areas, which can be as granular or interdisciplinary as you like.

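Viewing collaborations within a specific field, as described above, amounts to restricting the co-authored publication list to one journal category and counting what remains. A sketch with hypothetical records (the titles and categories are invented for illustration):

```python
from collections import Counter

# Hypothetical co-authored publications with their journal categories.
publications = [
    {"title": "Paper 1", "category": "Chemistry"},
    {"title": "Paper 2", "category": "Chemistry"},
    {"title": "Paper 3", "category": "Materials Science"},
]

# Restrict the collaboration to one field, as the journal-category filter does.
chemistry = [p for p in publications if p["category"] == "Chemistry"]
print(len(chemistry), "co-authored chemistry publications")  # 2 ...

# Or break the whole list down by category.
breakdown = Counter(p["category"] for p in publications)
print(breakdown.most_common())
```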
Evaluate a collaboration partner in detail

Click on any institution in either map or table view. You can now zoom into that institution and look at your collaboration with that institution in much greater detail. If you've filtered the data by a journal category or research area, you will see only the collaboration with that institution in the field of research you've selected.

Compare the co-authored publications to the overall output of the institutions. At the top of the pop-up window, you can compare the co-authored publications to the total output at each institution. Which is the most active and most cited of the two institutions? Do the co-authored publications have more citation impact than the individual institutions' overall publication output?

View the list of co-authored publications. The pie chart in the pop-up window gives you a breakdown of the co-authored publications by journal category. In which disciplines did most of the collaboration occur? You can also view this as a bar chart, which lets you compare the co-authored publications by journal category to the total output of each institution by journal category. Click View list of publications for the full list of co-authored publications. The filter options will show you a breakdown of the publications by author, institution, publication

year, or keyword, for instance. Use the filter options to slice and dice the list in various ways. Explore the list of co-authors. Go to the "Current co-authors" tab to drill down to the full list of co-authors, both at your institution and at the collaborating institution. This lets you see which researchers have co-authored publications, and which of those collaborations had the most citation impact. Click on a researcher's name for more details on that researcher's publication career. Click on the arrow next to each name to see their co-authors at the other institution. Use the Export menu to export the complete list of co-authors to a spreadsheet file. Identify potential new co-authors. Go to the "Potential co-authors" tab to see which researchers at each institution are not yet collaborating with the other institution. Here, you can identify potential matches between researchers at your institution and researchers at the other institution. The "Potential co-authors" tab lists the top 100 authors at each institution, by number of publications, who are not yet collaborating with the other institution. The list is available both for your institution's current collaboration partners (institutions where at least one researcher is already collaborating with that institution), and for institutions where no one is collaborating with your institution yet. This view is particularly useful when filtering by a specific research area or journal category. Other ways to evaluate a collaboration partner. Use the Shortcuts menu in the institution details pop-up to examine a collaboration partner in even more detail. You can: get a high-level overview of the institution in the Overview module; view and compare metrics for that institution in Benchmarking;

see the collaboration partners of that institution in the Collaboration module. Identify potential new collaboration partners of your institution. The Potential collaboration tab can be used to identify potential new opportunities for collaboration. This view is similar in many ways to the Current collaboration view: You can see these institutions in either map or table view, and zoom in from the world to a particular region or country. You can also use the Export menu to export the list of institutions to a spreadsheet file. This view shows you institutions not yet collaborating with your institution. These institutions did not co-author any publications with your institution within the selected year range and subject area (journal category or research area). The potential new collaboration partners are arranged by their total publication output, but you can also see their citation impact. This allows you to quickly spot the most suitable potential new collaboration partners. View potential collaboration partners within a specific field of research. The Potential collaboration view is most useful when you filter the data by a particular field of research. Say you are only interested in collaboration within the field of chemistry. Then you can view only institutions that are active within chemistry, but have not yet co-authored any chemistry publications with your institution. You can choose from 27 categories and 334 subcategories in the Scopus journal classification. Or use a different journal classification. You can also define your own research areas, which can be as granular or interdisciplinary as you like. Evaluate a potential collaboration partner. Click on any institution in either map or table view. You can now zoom into that institution and evaluate that institution as a potential new collaboration partner in much greater detail.

208 SciVal The Collaboration module 68 At the top of the pop-up window, you can compare the total output at each institution. The pie chart and bar chart below that let you compare the total output of each institution by journal category. Where do the institutions overlap and where is their research unique? Go to the "Potential co-authors" tab to see the top researchers at each institution. Here, you can identify potential matches between researchers at your institution and researchers at the other institution. Export this list of potential co-authors to a spreadsheet file for further analysis. Use the Shortcuts menu in the institution details pop-up to examine a potential collaboration partner in even more detail. You can: get a high-level overview of the institution in the Overview module view and compare metrics for that institution in Benchmarking see the collaboration partners of that institution in the Collaboration module 68

7 The Trends module
7.1 What is the Trends module? The Trends module is where you can evaluate all aspects of Research Areas. Start with a Research Area you define yourself based on a topic or area of interest, or pick a pre-defined one provided with SciVal. Analyze developments in the Research Area, such as the contributing institutions, authors, countries and journals. The Trends module also allows you to analyze the contributors' activity within the subtopics of the Research Area through a keyphrase analysis. In addition to citation and publication data, the Trends module includes usage data from Scopus and ScienceDirect to complement the analysis. You can review the underlying list of publications behind every publication count, and you can export tables and graphs using the export feature.
7.2 Working with the Trends module Selecting a Research Area. Use the entity selection panel on the left-hand side to select an existing Research Area. If the Research Area you want is not listed, click on the Add link and start typing the name, then

click on the name when it appears in the search results. If the Research Area section is empty you can select one of the Research Areas provided by SciVal from the list, or look them up and add them to the entity selection panel. SciVal provides all the Scopus Journal Categories as Research Areas to help kickstart your analysis. If you want to define your own Research Area to use in SciVal, select the option Define a new Research Area from the Entity Selection Panel. Selecting a Year range. You can view publication and usage data for a period of either three or five years. Use the year range selector at the top of the page to select the desired year range. Optionally, you can also include the current year and future publications. However, data from the current year may not be complete as Scopus may not have received all of the publisher's journal

content. Working with the map. The map view is available for countries and institutions. It shows the location and contribution of the institution or country. Up to two variables can be plotted against one another at a time using both shape and color. Zooming in and out. You can use the zoom control in the top left corner of the map to zoom in and out. You can also zoom in by double-clicking on the map. Working with the table. To complement the map view, you also see the contributing institutions, countries, authors and journals in a tabular list view. For Institutions, Countries, Authors and Journals the table view shows the top 100 contributors by scholarly output. To refine further, use the drop-down menus at the top to view the top 100 contributors in a specific region or country. You can change the sort order of the table by clicking on any of the column headings. You can use the drop-down menus to switch to view and sort by different metrics.

Click on the name of an institution, country, author or journal for full details of their contribution to the field. Click on the number of publications to view the list of publications. Working with the chart. To complement the map and table views, you can also view the contributing institutions, countries, authors and journals plotted over time in the chart view. For Institutions, Countries, Authors and Journals the chart view lists the top 100 contributors by scholarly output. You can use the drop-down menus at the top to view the top 100 contributing institutions in a specific region or country. Click the check-boxes next to the contributors in the list to add them to the chart. You can change the metric on the y-axis by clicking the view drop-down, and by using the Metrics details option you can get more information on the metrics you selected.

Selecting metrics. Metrics in the Map view. The Map view allows you to select two metrics and plot them against one another. By default, the metric Scholarly Output (number of publications) is shown as the bubble size and Views Count is shown as the color. Metrics in the table view. The Table view allows you to view more metrics depending on the screen size. By default the list is sorted by Scholarly Output with additional columns for Views Count, Field-Weighted Citation Impact and Citation Count.

214 SciVal The Trends module 74 Changing metrics. To view a different metric, click on the button with the metric name you wish to change along the top of the map or in the column headers in the Table view. Select which metric you want to view from the list. Then click on the Choose metric button. Which metrics are available to use in SciVal? Choose total value or percentage growth or decline. Each metric has different options, for instance, you can choose to show total values or percentage growth or decline during the selected time period. If percentage growth or decline is selected as the first metric in the map view, it is displayed as upward triangles for growth and downward triangles for decline. The size of the triangle is the relative magnitude. If it is selected as the second metric, negative values are displayed as blue tones and positive values as red tones. 74
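The "percentage growth or decline" option described above is, in essence, a percent change over the selected period. As a conceptual sketch only (the function name and the simple first-to-last-value formula are our own illustration, not SciVal's documented computation):

```python
def percent_growth(start_value, end_value):
    """Percentage growth (positive) or decline (negative) between the
    first and last values of a period, e.g. publication counts."""
    if start_value == 0:
        raise ValueError("start value must be non-zero")
    return (end_value - start_value) / start_value * 100

# An entity whose Scholarly Output rose from 120 to 150 publications
# shows 25% growth; one that fell from 200 to 150 shows a 25% decline.
print(percent_growth(120, 150))  # 25.0
print(percent_growth(200, 150))  # -25.0
```

In the map view, the first value would be rendered as an upward triangle and the second as a downward triangle, sized by magnitude, as the manual describes.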

215 SciVal The Trends module Identify top performers Identify the top performers within the Research Area. In the Institutions map view you can get a global overview of the top performing institutions within the Research Area. You can zoom in to a specific region or country, for instance, to see the top performing institutions in Europe in General Neuroscience: 1. Add General Neuroscience by clicking + Add Research Areas at the bottom of the Research Area section in the entity selection panel on the left-hand side and start typing it in until it can be selected from the drop down menu. 2. Go to Institutions and select the Map view. 75

3. Choose Europe from the region selection drop down to see the contributing institutions in Europe. By default Scholarly Output is visualized as the size of the circle and Views Count is visualized by color. In this view it is easy to spot clusters of activity in a geographic region. 4. Zoom in further using the plus button or the slider to the left of the map. By hovering over the institutions of interest you get more information about their contribution to the field. You can also click on the marker to see a more detailed view of the institution.

5. To get a more detailed view, change to the Table view. Here you can see different metrics in tabular format and sort by them. By default, the table is sorted by Scholarly Output, but you can also see the citation impact instead. Click on one of the metric names and use the drop-down menu to switch to a metric of your choosing, such as number of citations, Citations per Publication or Field-Weighted Citation Impact.
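Field-Weighted Citation Impact, used throughout this section, is commonly defined as the ratio of a publication's actual citations to the citations expected for publications of the same field, type and year, so that 1.00 represents the world average. A minimal conceptual sketch with hypothetical numbers (SciVal's own computation of the expected-citations baseline is more involved than shown here):

```python
def fwci(publications):
    """Field-Weighted Citation Impact: mean ratio of each publication's
    citations to the world-average citations expected for publications
    of the same field, type and year. Conceptual sketch only."""
    ratios = [p["citations"] / p["expected_citations"] for p in publications]
    return sum(ratios) / len(ratios)

pubs = [
    {"citations": 12, "expected_citations": 6.0},  # twice the world average
    {"citations": 3,  "expected_citations": 6.0},  # half the world average
]
print(fwci(pubs))  # 1.25 -> cited 25% above the world average
```

This is why FWCI is useful for the cross-field comparisons the Trends module supports: a raw citation count of 12 means very different things in different disciplines, but a ratio of 2.0 does not.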

8 Working with entities in SciVal
8.1 Types of entities An entity is anything that can be viewed in SciVal in terms of academic performance. An entity can be an institution, country, researcher, publication set, or research area. It can also be groupings of these, such as a group of researchers. Researchers and groups of researchers. You can define researchers and groups of researchers in SciVal. A researcher is defined as someone who has authored one or more publications. Researchers are updated weekly with any new publications, but a publication set is fixed and never automatically updated with new publications. You can, however, manually add new publications to a publication set. Citation counts will always be updated. You can use groups of researchers to model your institution's department structure. You can also model different what-if scenarios. For example, you can determine what happens to a research team's performance if you add researchers X and Y. Publication sets. A publication set is a fixed set of publications. You can for example use these to create a selection of a researcher's most cited publications or a set of publications on a particular topic. You can define a publication set from the list of publications from one or more researchers defined in SciVal. You can also import a publication set from a text file containing a list of up to 20,000 publication IDs. These can be DOIs, PubMed IDs, or Scopus IDs (EIDs). To learn more, see Defining and importing new entities. Institutions and groups of institutions. An institution is any organization engaged in research activity. It can be an academic, corporate or governmental institution, for example. An Institution is a type of entity in SciVal. Technically, an institution is defined in SciVal as a collection of one or more Scopus affiliations.
Often an institution has multiple affiliations because some of its parts, like hospitals or research institutes, can be assigned their own affiliation in Scopus. Multiple institutions can be combined into another type of selectable SciVal entity: a group of institutions. A number of predefined groups of institutions are available in SciVal, including: institutional alliances such as LERU and Universitas 21 78

constituent states and provinces of various countries. These include the U.S. states, each of which is made up of all institutions in that state. Countries and groups of countries. A country is a type of entity in SciVal representing a nation state or semi-autonomous part of a state. Publications are assigned to countries by picking up the country mentioned in the publication. If not present, we take the country from the Scopus affiliation mentioned in the publication. A special type of country is the World. This entity represents the total publication output worldwide, in other words: all publications from Scopus between 1996 and now. It is particularly useful as a benchmark. Multiple countries can be combined into a new entity: a group of countries. A number of predefined groups of countries are available in SciVal. These include: world regions such as North America, Europe and Asia Pacific; international organizations such as the European Union, ASEAN and the G20; various groupings of emerging economies such as Developing-8, CIVETS and BRICS. Research areas. You can define your own research areas in SciVal, which can be as granular or interdisciplinary as you like. Research areas are not fixed, but represent a dynamic definition of a field of science. Whenever the publication data in Scopus is updated, new publications matching the definition are added to the research area. To learn more, see About research areas in SciVal in the section "Defining your own research areas".
8.2 Selecting entities Use the entity selection panel to select the entities that you want to analyze. It is on the left side of the screen in each of the three modules. Think of the entity selection panel as a workspace. All your entities of interest are in one clear and organized place. Choose from the thousands of pre-defined entities in the SciVal database: institutions or countries. Or define your own entities.
Your self-defined entities can be researchers, research teams, publication sets or even research areas. Add entities to the selection panel. To add additional items to the entity selection panel, click the Add link at the bottom of the currently opened section: Start typing the name of the entity you would like to add. Then click on the name when it appears in the search results. You can also click on the Define links to define an entirely new entity. In My SciVal, you can add an entity into the selection panel by clicking the "Add" icon for that entity. The entity is now marked with "Added" to indicate that it has been added to the panel. The "Add"

icon for that entity flips to a "Remove" icon which you can click to take it back out of the panel. Remove entities from the selection panel. Remove an entity from the panel by clicking on the "remove" (x) icon that appears when you hover over the entity in the panel. You can also use the "Remove all entities from this section" to, for example, remove all researchers and groups of researchers from the "Researchers and Groups" section. You can safely remove entities from the panel. They will not be permanently deleted. You can add them back at any time. Add sets of entities to the panel. In My SciVal, you can move an entire set of entities into the selection panel. Select the entities in My SciVal (using the checkboxes) and then click on the "Add to entity selection panel" button. This opens a modal window where you can add the entities to the current set of entities in the panel. You can also choose to replace the current set of entities in the panel with the new set. This allows you to easily move sets of entities in and out of the panel.
8.3 Defining and importing new entities In addition to the entities provided by SciVal, you can also define your own entities: researchers; groups of researchers; publication sets; research areas; groups of institutions; groups of countries. Define a new entity. Click on My SciVal in the top right corner of your screen. In My SciVal, choose a category (for example "Researchers and groups") and open "Define a new entity". Now click on one of the links in the menu to define a new entity. This will take you through a step-by-step process to define, name and save the new entity. Note that some entities with a large number of publications are not available immediately but take up to 48 hours to compute. You will receive an email as soon as the entity is ready to use. View the list of entities defined by you.
To see an overview of all the entities defined by you, choose a category (for example "Researchers and groups") and then select "Entities defined by you" from the drop-down menu. 80

Import a list of researchers. You can import a list of researchers into SciVal from a text file containing a list of Scopus author IDs. The file containing the list of Scopus author IDs should be a text file (with a .txt extension in Windows). The text file should also be ANSI format, not Unicode/UTF. The Scopus author IDs should be listed one per line (max. 300 IDs per file). During the import process, you can choose to create a new group of researchers containing the researchers you're importing. So you can for example import a research team and immediately create a group of researchers entity for that team. Tip: when you export a list of researchers from SciVal, the export will include the Scopus author ID of each person. You can paste this list of IDs into a text file and reimport the list of researchers back into SciVal. Import a publication set. You can import a publication set from a text file containing a list of publication IDs. These can be DOIs, PubMed IDs, or Scopus IDs (EIDs). You can import up to 20,000 publications in a single publication set. The file containing the list of publication IDs should be a text file (with a .txt extension in Windows). The text file should also be ANSI format, not Unicode/UTF. The publication IDs should be listed one per line (max. 20,000 IDs per file). You will be notified if any IDs in the file are not known to SciVal. This may be because the publications were published before 1996 (the cutoff point for the Scopus data used by SciVal), or because the publications are very recent and therefore not yet included in the Scopus data cut used by SciVal. When you export a list of publications from SciVal, the export will include the DOI of each publication. You can paste this list of DOIs into a text file and reimport the list of publications back into SciVal. Tip: It is possible to merge multiple Publication sets up to 100,000 documents.
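The file-format rules above (plain text, ANSI rather than Unicode, one ID per line, capped at 300 author IDs or 20,000 publication IDs) can be checked programmatically before uploading. A small sketch, assuming "ANSI" means a Windows code page such as cp1252; the helper function is our own, not a SciVal API:

```python
def write_import_file(ids, path, limit=300):
    """Write a SciVal-style import file: one ID per line, "ANSI"
    (cp1252) encoding, at most `limit` IDs -- use limit=300 for
    Scopus author IDs and limit=20000 for publication IDs."""
    if len(ids) > limit:
        raise ValueError(f"SciVal accepts at most {limit} IDs per file")
    with open(path, "w", encoding="cp1252") as f:
        for entry in ids:
            f.write(f"{entry}\n")

# Two hypothetical Scopus author IDs, one per line.
write_import_file(["7004212771", "35561479200"], "authors.txt")
```

Raising an error client-side when the cap is exceeded saves a failed upload round-trip; splitting a longer list into several files of `limit` IDs each is the obvious extension.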
8.4 Tagging entities In My SciVal, you can add tags to entities or sets of entities. For instance, you can attach keywords to entities or use tags to distinguish two researchers with the same name. Tag an entity. In My SciVal, click on the "tag" icon for an entity to add tags. You can add as many tags to a single entity as you like. 81

To tag multiple entities in one go, select them in My SciVal (using the checkboxes) and click the "Add tags" button along the top. You can also tag entities during the entity creation process. For example, when defining a new research area, you can assign tags to the new entity during the final (name and save) step of the process. Cascading tags. When you create a researcher group in SciVal, it is possible to not only tag the group itself but the individual researchers within the group as well. The feature allows you to use a group-level tag and cascade it to all researchers who belong to the group. This option can be useful during an analysis for which identification of individual researchers who belong to the same group is important. The option only appears in SciVal when a tag is added to a researcher group. How you can use tagging. You can use tagging to create various groupings or categories of entities. Use the rightmost drop-down menu (marked "All tags") to view entities by tag. You can use this to select all entities with a particular tag, then move these into the entity selection panel by selecting them and clicking on "Add to entity selection panel". Example: Say you have tagged five researchers with "possible hires". You can then filter the list of researchers by this tag, select all five researchers and move them into the entity selection panel. Now the five researchers are ready for you to compare them in the Benchmarking module. Filtering by tags. Tag filters are included in the define flows of Groups of researchers, institutions and countries. By filtering by tag you can quickly and easily find researchers, institutions and countries with the tags you've defined when creating entity groups.
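The "possible hires" example above amounts to filtering a collection by tag membership. A toy illustration of the idea (the data and function are hypothetical, not part of SciVal):

```python
# Hypothetical in-memory model of tagged entities, mirroring the
# "filter by tag" behaviour described above.
entities = [
    {"name": "Researcher A", "tags": {"possible hires"}},
    {"name": "Researcher B", "tags": {"possible hires", "chemistry"}},
    {"name": "Researcher C", "tags": {"chemistry"}},
]

def filter_by_tag(entities, tag):
    """Return the entities carrying the given tag."""
    return [e for e in entities if tag in e["tags"]]

hires = filter_by_tag(entities, "possible hires")
print([e["name"] for e in hires])  # ['Researcher A', 'Researcher B']
```

Because an entity can carry any number of tags, the same researcher can appear in several such groupings at once, which is what makes tags more flexible than a fixed folder structure.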

8.5 Export customization SciVal offers greater flexibility for further analysis through an increased number of export fields for lists of publications. To access this feature, open a list of publications in SciVal and click on Export a list of publications to a spreadsheet file. SciVal provides you with the default fields to be exported, but this can be further refined and changed in the Export publications window.
8.6 Sharing entities with others You can share entities you've defined in SciVal (such as researchers, groups of researchers and research areas) with other SciVal users at your institution. For example, you could set up part of your institution's department structure as groups of researchers in SciVal, and then share that with others at your institution. Or you could define a research area in SciVal and share that with the other members of your research team. Sharing is possible with any groups or individuals within your institution. By default when you share an entity, you remain the owner. Other users can only view the entity. You are the only user who can make changes to that entity, and the only user who can delete it from SciVal. However, you have the option to transfer the ownership of a shared entity. A shared entity is the same for all users. When it is changed or updated with new publications, these changes are immediately visible to all users. How to share an entity. To share an entity, go to My SciVal and click on the "Share" icon for that entity to bring up the sharing settings. You can invite SciVal users at your institution by entering their email addresses, separated by

commas. You can create and manage your invitation list from the sharing panel. Invited users receive an email - which can be personalized - with a link that gives them access to the entity. Use the drop-down menu to the right of the filter box in My SciVal to display only entities that have been shared with you ("Entities shared with me"). You can also choose to view only entities that you have shared with others ("Entities shared by me"). Share multiple entities with your peers. It is possible to effortlessly share multiple research entities with groups of peers in SciVal. Steps to share multiple entities: 1. Go to My SciVal 2. Select multiple entities 3. Click on the share icon. Transferring ownership. You can transfer the ownership of your research entities in SciVal. If you change positions, leave the organization or just want to transfer the administration of a shared entity to another person within your organization, the transfer ownership function allows you to do this in SciVal. You can transfer the ownership to anyone with whom you have already shared the entity. After transferring the ownership, the previous owner will have viewing rights to the shared entity, while the new owner will have full editing rights. To transfer ownership of a shared entity in SciVal: Go to My SciVal and click on the share icon for the shared entity. Click on the tab "currently invited/shared with". Next to the entity name click on "change". Set the new owner of the research entity.
8.7 Integration with Scopus Create a Publication Set in Scopus. If you have access to both Scopus and SciVal it is possible to create a Publication Set from a Scopus search query with just a few clicks and perform further in-depth analysis in the SciVal Overview and Benchmarking modules.
Enter your search in Scopus From your search results, select the desired publications up to a maximum of 2,000 and add them to your Scopus list using the Add to My list option From My list, you can now click on the option to Export your list to SciVal Click on Continue to SciVal to redirect to SciVal, where you can name your Publication Set, add 84

225 SciVal Working with entities in SciVal 85 a tag to it and analyze it further in detail 85

9 Working with research areas
9.1 About research areas in SciVal SciVal gives you the flexibility to model, evaluate and benchmark any field of research. This can be a strategic priority, an emerging area of science or any other topic of interest. Once you have defined a research area, you can: evaluate your institution's output in that field; see which other institutions and researchers are active in this field; see who the top performing and fastest growing countries, authors and institutions are in the field; compare your output in that field against that of other institutions; see which journals contain the most publications from the Research Area; see what the most important topics are within the field; identify existing and potential new collaboration partners. User-defined research areas offer an alternative to subject area classifications like the Scopus journal classification. They can be as granular or interdisciplinary as you like. Research areas are not fixed, but represent a dynamic definition of a field of science. Whenever the publication data in Scopus is updated, new publications matching the definition are added to the research area.
9.2 Defining a research area The definition of a research area can be based on either keywords or entities. If this definition is too broad, you can apply filters to narrow it down further. Let's say that your institution has made research on graphene a strategic focus. You are specifically interested in research on the thermal conductivity of graphene, and want to see how well your institution performs in this area. You can define a Research Area from the entity selection panel in the Overview, Benchmarking and Trends modules, or from My SciVal: 1. Go to the Overview, Benchmarking or Trends module. 2. Open the Research Areas section of the entity selection panel on the left-hand side of the screen. 3. Click on Add Research Areas, then Define a new Research Area.

A pop-up window will now open. Here you can define your research area using a 3-step process. Step 1. Start by defining your research area: 1. Go to the tab Use search terms. 2. Enter thermal conduction graphene in the input field. 3. Press the Search button. Step 2. You will now proceed to step 2. Here you can see how many publications worldwide (since 1996) match the definition thermal conduction graphene. Apply filters (if needed) to narrow down the definition of your research area. Let's say that you are interested only in academic publications. To filter out other organization types: 1. Click on the tab Organization types. 2. Check off Academic. 3. Click on Limit to. The filter you have just applied will now be shown on the right side of the screen.

Step 3. Click Next Step in the bottom right corner to proceed to step 3. 1. Name your research area Thermal Conductivity Graphene (Academic). 2. Click "Save and finish". Your research area is now computed, and you are returned to your previous place in SciVal. See Search tips for more help with using search terms to define a research area.
9.3 Predefined Research Areas SciVal offers predefined Research Areas for instant analysis, based on all 334 Scopus classifications, in Overview, Benchmarking and Trends. For more information about the Scopus journal classification visit the Journal Title List.
9.4 Search tips Search technology in SciVal. When you use search terms to define your research area, SciVal will search the Scopus database for publications matching your search terms. We search through the publication titles, as well as the abstracts and the keywords that Scopus assigns to each publication.

SciVal uses a search engine called Apache Solr, while Scopus uses FAST ESP, so the results returned from search queries might differ in SciVal and Scopus even though they use the same data source. Compare Bing and Google Search, for example: both search the Web, but return different results.

Key search tips for creating a Research Area

Choose search terms that are specific and closely related to your research area
- Avoid very general terms like 'cell'

Your syntax will make a difference in how SciVal interprets your search
- 'Solar flare' is interpreted as 'solar AND flare'; the words may be located next to each other or in separate sentences
- 'Solar-flare' is also interpreted as 'solar AND flare'
- Enclose the search terms in double quotes (for example "solar flare") to bring back exact matches only. This search will find publications containing 'solar flare' but not 'solar-flare'

Stop words are always ignored
- Stop words include personal pronouns (such as 'he', 'she', 'we', 'they'); most articles (such as 'the', 'an'); most forms of the verb 'to be' (such as 'be', 'is', 'was'); and some conjunctions (such as 'as', 'because', 'if', 'when')

SciVal ignores accents and upper/lower case
- The search is not case-sensitive. It will match both upper-case and lower-case text
- Terms containing accented characters will be found if you type in the unaccented version, for example u to represent ü or ú

SciVal uses a stemming algorithm that reduces words to their root form
- If you enter 'fishing', 'fished', 'fish', or 'fisher', they will all be stemmed automatically so that the search is conducted on the root word, 'fish'
- If you use the singular form of a word, your search will retrieve the singular, plural, and possessive forms of most words
- Search strings containing wild-cards are not reduced to their root form

You can find variants using wild-card searching
- ? replaces a single character. For example, 'organi?ation' will return both 'organisation' and 'organization'
- * replaces one or more characters. For example, 'cat*' will return 'catastrophe', 'catheter', 'catnip', and so on

SciVal uses the Boolean operators AND, OR, NOT
- Entering 'blood cell' will search for 'blood AND cell'
- Entering 'cat AND dog OR mouse' will search for '(cat AND dog) OR mouse'
- If you specify parentheses, they will be followed and not overridden. If you enter 'cat AND (dog OR mouse)', SciVal will search for 'cat AND (dog OR mouse)'
- If you don't use parentheses, they will be added to simulate operator precedence: AND takes precedence over OR. If you enter 'cat OR dog AND mouse', SciVal will search for 'cat OR (dog AND mouse)'
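The implicit-AND and operator-precedence rules above can be sketched in a few lines of Python. This is an illustrative simulation only, not SciVal's actual Solr query handling; it assumes uppercase operators and queries without user-supplied parentheses:

```python
# Sketch of two query rules: adjacent terms imply AND, and AND binds
# more tightly than OR when the user supplies no parentheses.
# (Illustrative only; not SciVal's real query parser.)

def normalize(query: str) -> str:
    """Insert implicit ANDs, then parenthesize so AND binds tighter than OR."""
    tokens = query.split()
    # Rule 1: two terms in a row imply AND ('blood cell' -> 'blood AND cell')
    with_and = []
    for tok in tokens:
        if with_and and tok.upper() not in ("AND", "OR") \
                and with_and[-1].upper() not in ("AND", "OR"):
            with_and.append("AND")
        with_and.append(tok)
    joined = " ".join(with_and)
    # Rule 2: OR has the lowest precedence, so each OR-separated chunk
    # that contains an AND gets its own parentheses.
    chunks = joined.split(" OR ")
    wrapped = [f"({c})" if " AND " in c and len(chunks) > 1 else c
               for c in chunks]
    return " OR ".join(wrapped)

print(normalize("blood cell"))            # blood AND cell
print(normalize("cat AND dog OR mouse"))  # (cat AND dog) OR mouse
print(normalize("cat OR dog AND mouse"))  # cat OR (dog AND mouse)
```

The three printed examples reproduce the interpretations listed in the search tips above.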

Analyzing a research area

Research areas in the Overview module

See your institution's output within a research area. In the Overview module, you can view your institution's contribution to a particular research area by number of publications and citations. How does your institution's output in this area compare to the national or worldwide output? For more in-depth information, browse the Publications and Citations tabs. For example, to see how much of your institution's output was in the top 1% and 10% most cited publications worldwide, click on the Publications tab and scroll down to the Outputs in Top Percentiles section.

See which institutions are active within a research area. The Institutions tab gives you an overview of the top contributing institutions in the research area, within your own region or country or worldwide. You can also see which institutions are collaborating with your institution within the research area.
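The "Outputs in Top Percentiles" idea can be illustrated with a toy calculation. The threshold below is invented for the example; in SciVal the cut-offs come from worldwide citation distributions:

```python
# Toy illustration of "Outputs in Top Percentiles": count how many of an
# entity's publications meet the worldwide citation threshold for a given
# percentile. The threshold value here is hypothetical.

def outputs_in_top_percentile(citation_counts, threshold):
    """Number of publications whose citations reach the percentile threshold."""
    return sum(1 for c in citation_counts if c >= threshold)

cites = [3, 40, 27, 10, 55]   # citations per publication
top10_threshold = 25          # hypothetical worldwide top-10% cut-off
print(outputs_in_top_percentile(cites, top10_threshold))  # 3 of 5 publications
```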

Research areas in the Benchmarking module

Use the Benchmarking module to explore the worldwide output in a particular research area from 1996 until the present.
1. Go to the Benchmarking module
2. Select the research area from the entity selection panel on the left-hand side

You can spot possible trends using a variety of different metrics, such as:
- Scholarly Output (number of publications)
- Field-Weighted Citation Impact (normalized citation count)
- Outputs in Top Percentiles (an indicator of research excellence)
- Journal Category Count (an indicator of multidisciplinarity)
- Collaboration (for instance international collaboration)
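As a rough sketch of what the Collaboration metric captures, consider the share of publications whose author affiliations span more than one country. This is a simplified, assumed reading of the metric; the per-publication country sets below are invented:

```python
# Simplified sketch of an international-collaboration share: the fraction of
# publications with affiliation countries from more than one country.
# (Assumed reading of the metric; data invented for illustration.)

def international_collaboration_share(pubs):
    """pubs: one set of affiliation countries per publication."""
    collab = sum(1 for countries in pubs if len(countries) > 1)
    return collab / len(pubs)

pubs = [{"AU"}, {"AU", "GB"}, {"AU", "US", "CN"}, {"AU"}]
print(international_collaboration_share(pubs))  # 0.5
```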

Research areas in the Collaboration module

See your institution's collaboration partners in a research area. You can use the Collaboration module for an in-depth view of your institution's collaboration partners in a particular research area, or to identify potential new collaboration partners in that research area.
1. Go to the Collaboration module
2. Select your home institution from the entity selection panel on the left-hand side
3. Select the research area from the filter menu at the top of the page

Switch from Map to Table view to view the full list of collaborating institutions. Which collaboration had the greatest citation impact?

Find new collaboration partners in a research area. Switch to the Potential collaboration tab to view potential new collaboration partners in this research area. These institutions are active in this research area, but are not yet collaborating with your institution in that area.

Research areas in the Trends module

The Trends module is built for you to analyze Research Areas in depth. See who the top performers and rising stars are based on both output and usage data. Which subtopics are viewed the most, and who is contributing to them?

See The Trends module for more information about how to analyze Research Areas.

Data and metrics

10.1 What is the source of the data in SciVal?

SciVal is based on output and usage data from Scopus, the world's largest abstract and citation database for peer-reviewed publications. The Scopus database covers over 30 million publications from 1996 until the present, drawn from 21,000 serials from 5,000 publishers. These include:
- 20,000 peer-reviewed journals
- 390 trade publications
- 370 book series
- 5.5 million conference papers

Additionally, SciVal uses usage data from ScienceDirect, the world's largest scientific full-text database, with more than 2,500 journals and 26,000 books.

For detailed information on the data used in SciVal, see the SciVal Metrics Guidebook.
Download the SciVal Metrics Guidebook (PDF format)

10.2 How current is the data?

SciVal regularly checks whether the researchers you've defined have any new publications in Scopus. The researchers are then automatically updated with any new publications. Publications, author and affiliation profiles in SciVal are updated approximately every two weeks, so the data is close to being in total sync with Scopus. Usage data from Scopus and ScienceDirect are updated monthly.

10.3 What publication types can you use?

SciVal includes all types of publications that are classified by Scopus:
- articles
- reviews
- conference papers
- editorials
- short surveys
- books

For each selected metric in the Benchmarking module, you can choose which types of publications to include in the analysis.

10.4 Which metrics are available to use in SciVal?

SciVal uses a broad range of metrics, including the Snowball Metrics. The metrics can be divided into five categories:
- Productivity metrics: measure research productivity
- Citation impact metrics: measure the impact of citations
- Collaboration metrics: measure the benefits of collaboration
- Disciplinary metrics: measure multidisciplinarity
- Usage metrics: measure viewing activity

The Metrics Guidebook and the Usage Guidebook discuss each SciVal metric in detail and offer suggestions on how and when to apply each metric.
Download the SciVal Metrics Guidebook (PDF format)
Download the Usage Guidebook (PDF format)

The available metrics are:
- Scholarly Output: the number of publications of a selected entity (Snowball Metric)
- Outputs in Top Percentiles: publications of a selected entity that have reached a particular threshold of citations received (Snowball Metric)
- Publications in Top Journal Percentiles: the set of an entity's publications that have been published in the world's top journals (Snowball Metric)
- Field-Weighted Outputs in Top Percentiles: share of publications that are among the most cited publications worldwide (Snowball Metric)
- Citation Count: total citations received by publications of the selected entities (Snowball Metric)
- Citations per Publication: the average number of citations received per publication (Snowball Metric)
- Cited Publications: publications that have received at least one citation
- Number of Citing Countries: the number of distinct countries represented by the publications citing a selected entity
- Field-Weighted Citation Impact: the ratio of citations received relative to the expected world average for the subject field, publication type and publication year (Snowball Metric)
- Views Count*: total views received by publications of the selected entities
- Views per Publication*: the average number of views per publication
- Field-Weighted Views Impact*: the ratio of views relative to the expected world average for the subject field, publication type and publication year
- Collaboration: the extent of international, national and institutional co-authorship (Snowball Metric)
- Field-Weighted Collaboration: the amount of international, national and institutional co-authorship (Snowball Metric)
- Collaboration Impact: the average number of citations received by publications that have international, national or institutional co-authorship (Snowball Metric)
- Academic-Corporate Collaboration: publications whose affiliation information contains both academic and corporate organization types (Snowball Metric)
- Academic-Corporate Collaboration Impact: the average number of citations received by publications that have academic-corporate collaboration (Snowball Metric)
- h-index: a measure of both the productivity and publication impact of an entity, which depends on both the number of publications and the number of citations they have received (Snowball Metric)
- Journal Count: the number of journals in which an entity's publications have appeared
- Category Count: the number of journal categories in which a selected entity's publications have appeared

* These metrics are currently only available in the SciVal Trends module.

10.5 What are Snowball Metrics?

The Snowball Metrics were initiated by eight highly successful research universities as a manageable set of metrics that capture the strategic aspects of research performance. The ambition is for the Snowball Metrics to become the global standard for the higher education sector. The agreed and tested definitions are shared free of charge with the research community. Elsevier supports Snowball Metrics as a recognized industry standard and has implemented many of the metrics in SciVal. You can recognize these metrics by the Snowball Metric icon.

More information about Snowball Metrics is available on snowballmetrics.com.
Download the Snowball Metrics Recipe Book (PDF format)

10.6 What are SNIP and SJR?

Source Normalized Impact per Paper (SNIP) and SCImago Journal Rank (SJR) are journal metrics. They are used to measure the citation impact of a journal.
- SNIP (Source-Normalized Impact per Paper) measures the citation impact of a journal, normalized for the journal's subject field: citations are weighted based on the number of expected citations in that field.
- SJR (SCImago Journal Rank) measures the prestige of citations received by a journal. The subject field, quality and reputation of the citing journal have a direct effect on the value of a citation.

See for more details on SNIP and SJR.

Publications in Top Journal Percentiles. In the Benchmarking and Overview modules, you can view the metric Publications in Top Journal Percentiles by either SNIP or SJR, so you can see how many of your institution's publications are in the top 1% and 10% journals worldwide, as measured by either SNIP or SJR.
In the Benchmarking module, you can also see the output in the top 5% and 25% of journals by SNIP or SJR.

Publications by journal. In the Overview module, you can view the top 10 journals in which your institution's publications have been published, with both the SNIP and SJR value for each journal. To see this breakdown by journal, click on the Publications tab, then select "by journal". Use the export function to view the full list of journals for the selected entity's publications, including
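The field-weighting idea behind Field-Weighted Citation Impact (and Field-Weighted Views Impact) can be sketched as follows. Each publication's citation count is divided by the world-average citations expected for its field, publication type and year, and the ratios are averaged; the expected values below are invented for illustration:

```python
# Sketch of field weighting: citations per publication are divided by the
# expected world average for that publication's field/type/year, then averaged.
# The (citations, expected) pairs are invented for this example.

def fwci(publications):
    """publications: list of (citations_received, expected_world_average) pairs."""
    ratios = [cites / expected for cites, expected in publications]
    return sum(ratios) / len(ratios)

# Two papers: one cited at twice the world average for its field and year,
# one cited at half the world average.
pubs = [(10, 5.0), (2, 4.0)]
print(fwci(pubs))  # 1.25 -> cited 25% above the world average overall
```

A value of 1.00 means citation performance exactly in line with the world average, which is why the metric is comparable across fields.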


More information

Topic 3 Calculating, Analysing & Communicating performance indicators

Topic 3 Calculating, Analysing & Communicating performance indicators National Erasmus + Office Lebanon Higher Education Reform Experts In collaboration with the Issam Fares Institute and the Office of Institutional Research and Assessment Ministry of Education and Higher

More information

Master Growth Marketing with Modern Analytics

Master Growth Marketing with Modern Analytics Master Growth Marketing with Modern Analytics Marketing Analytics in Periscope Data 201807 Content 5 7 8 10 11 12 13 14 Smarter Tools for Smarter Marketing Crunchbase Success Story Building a More Data-Driven

More information

BP(A S) Taleo Performance User Guide

BP(A S) Taleo Performance User Guide BP(A S) Taleo Performance User Guide January 2008 Confidential Information It shall be agreed by the recipient of the document (hereafter referred to as "the other party") that confidential information

More information

SAP SuccessFactors Foundation

SAP SuccessFactors Foundation SAP SuccessFactors Foundation Technical and Functional Specifications CUSTOMER TABLE OF CONTENTS KEY FEATURES AND FUNCTIONALITIES... 3 INTELLIGENT SERVICES... 4 Activities... 4 Administration... 4 INTEGRATION

More information

Mojisola O. Odewole Osun State University Library, Osogbo, P. M. B. 4494, Osogbo, Osun State, Nigeria

Mojisola O. Odewole Osun State University Library, Osogbo, P. M. B. 4494, Osogbo, Osun State, Nigeria The Role of a Librarian in Using Social Media Tools to Promote the Research Output of HIS/ HER Clienteles Mojisola O. Odewole Osun State University Library, Osogbo, P. M. B. 4494, Osogbo, Osun State, Nigeria

More information

The secret life of articles: From download metrics to downstream impact

The secret life of articles: From download metrics to downstream impact The secret life of articles: From download metrics to downstream impact Carol Tenopir University of Tennessee ctenopir@utk.edu Lorraine Estelle Project COUNTER lorraine.estelle@counterus age.org Wouter

More information

Moving Beyond Press Release Pick Up to Reporting on Real Outcomes

Moving Beyond Press Release Pick Up to Reporting on Real Outcomes Moving Beyond Press Release Pick Up to Reporting on Real Outcomes The Results You Should Be Sharing with Your Clients There are two things that are certain in life when you work for a PR agency. Number

More information

How to map excellence in research and technological development in Europe

How to map excellence in research and technological development in Europe COMMISSION OF THE EUROPEAN COMMUNITIES Brussels, 12.3.2001 SEC(2001) 434 COMMISSION STAFF WORKING PAPER How to map excellence in research and technological development in Europe TABLE OF CONTENTS 1. INTRODUCTION...

More information

Introduction to SEND Assurance Tool

Introduction to SEND Assurance Tool Introduction to SEND Assurance Tool Table of Contents QuiqSolutions Background... 2 QuiqCare... 2 Policy Manager... 2 Surveys, Audits & Requests for Information (RFI)... 2 QuiqCare SEND Assurance Tool...

More information

Graduate Medicine. Admissions Portfolio Guide INTERNATIONAL APPLICANTS Admissions Cycle

Graduate Medicine. Admissions Portfolio Guide INTERNATIONAL APPLICANTS Admissions Cycle Graduate Medicine Admissions Portfolio Guide INTERNATIONAL APPLICANTS 2018-19 Admissions Cycle GM Admissions Portfolio BACKGROUND UOW GM Admissions Portfolio allows you to identify personal experiences

More information

ARMSTRONG ATLANTIC STATE UNIVERSITY

ARMSTRONG ATLANTIC STATE UNIVERSITY PEOPLEADMIN USER GUIDE 1 TABLE OF CONTENTS OVERVIEW... 4 What is PeopleAdmin?... 4 Initial Set-up of user Accounts... 4 GETTING STARTED... 5 Logging In... 5 NAVIGATING THE HOMEPAGE... 7 1. Platforms...

More information

An Analysis of the Achievements of JST Operations through Scientific Patenting: Linkage Between Patents and Scientific Papers

An Analysis of the Achievements of JST Operations through Scientific Patenting: Linkage Between Patents and Scientific Papers An Analysis of the Achievements of JST Operations through Scientific Patenting: Linkage Between Patents and Scientific Papers Mari Jibu Abstract Scientific Patenting, the linkage between patents and scientific

More information

Oracle Knowledge Analytics User Guide

Oracle Knowledge Analytics User Guide Oracle Knowledge Analytics User Guide Working with Oracle Knowledge Analytics Reports Oracle Knowledge Version 8.4.2.2 April, 2012 Oracle, Inc. COPYRIGHT INFORMATION Copyright 2002, 2011, Oracle and/or

More information

COMPETITIVE INTELLIGENCE

COMPETITIVE INTELLIGENCE COMPETITIVE INTELLIGENCE 1 CAREER RESOURCE SERIES NEW CAREERS FOR MLIS AND LIS GRADUATES & ALUMNI Career Resource Series #1 - Competitive Intelligence 1 Foreword The role of the librarian is undergoing

More information

QS Stars Development Road Map. Universitas Lampung. QS Stars 2011 QS Intelligence Unit (a division of QS Quacquarelli Symonds Ltd)

QS Stars Development Road Map. Universitas Lampung. QS Stars 2011 QS Intelligence Unit (a division of QS Quacquarelli Symonds Ltd) QS Stars Development Road Map Universitas Lampung QS Stars 11 QS Intelligence Unit (a division of QS Quacquarelli Symonds Ltd) Contents Introduction... 2 Core Criteria... 3 Research Quality... 3 Graduate

More information

Getting Your Paper Noticed

Getting Your Paper Noticed Getting Your Paper Noticed The tools available to you Nicholas Pak, Solutions Consultant October 2015 2 You want to make sure your article gets the attention it deserves The volume of research articles

More information

Kathy O Kane Kreutzer, M.Ed., Office of Faculty Affairs, School of Medicine March, 2017

Kathy O Kane Kreutzer, M.Ed., Office of Faculty Affairs, School of Medicine March, 2017 SOM Authorship Guidelines, Recent Updates to the ICMJE Uniform Requirements for Scholarship, and the Emerging Role of Social Media in Monitoring Scholarship Kathy O Kane Kreutzer, M.Ed., Office of Faculty

More information

Make your research visible! Luleå University Library

Make your research visible! Luleå University Library Make your research visible! Luleå University Library Why this guide? Maximizing the visibility and impact of research is becoming ever more important in the academic world with tougher competition and

More information

DIRECTOR OF STUDENT RECRUITMENT

DIRECTOR OF STUDENT RECRUITMENT A formidable seat UNIVERSITY of learning OF STIRLING Appointment of Director of Student Recruitment where ability, not background, is valued DIRECTOR OF STUDENT RECRUITMENT Candidate Pack November 2018

More information

Make Data Count: April 2015 Progress Update. California Digital Library PLOS DataONE

Make Data Count: April 2015 Progress Update. California Digital Library PLOS DataONE Make Data Count: April 2015 Progress Update California Digital Library PLOS DataONE Make Data Count Overview Partners California Digital Library, PLOS, DataONE NSF Grant Record Grant No. 1448821 proposal

More information

Higher Education Funding Council for England Call for Evidence: KEF metrics

Higher Education Funding Council for England Call for Evidence: KEF metrics January 2018 Higher Education Funding Council for England Call for Evidence: KEF metrics Written submission on behalf of the Engineering Professors Council Introduction 1. The Engineering Professors Council

More information

SciVal Polska Research Assessment Academy

SciVal Polska Research Assessment Academy 1 SciVal Polska Research Assessment Academy Preparation for Certification Test Poznan, November 13 rd 2018. Peter Porosz p.porosz@elsevier.com 2 Welcome to the Polish SciVal user community! Starting January

More information

Oracle Fusion Talent Management

Oracle Fusion Talent Management Oracle Fusion Talent Management Service Oracle Fusion Talent Management IT Consultancy Service Oracle Fusion Talent Management Approach & Features Contents Oracle Fusion Talent Management... 3 A. Oracle

More information

Alternatives for STM Publishing in the Internet Age A Personal View

Alternatives for STM Publishing in the Internet Age A Personal View Alternatives for STM Publishing in the Internet Age A Personal View Dr. Antoine Bocquet E-mail: a.bocquet@naturejpn.com Asia-Pacific Publisher Nature Publishing Group Alternatives for STM Publishing in

More information

Insurance Day FAQ. June 2017

Insurance Day FAQ. June 2017 Insurance Day FAQ June 2017 Insurance Day enhancing your user experience Why has the Insurance Day website changed? Insurance Day s digital platforms have been upgraded to improve your experience when

More information

Certified Digital Marketing Specialist in Search

Certified Digital Marketing Specialist in Search Certified Digital Marketing Specialist in Search Align your skills with the needs of industry www.digitalandsocialmediaacademy.com Validated by the Syllabus Advisory Council (SAC). Including members from

More information

Comparing Journal Impact Factor and H-type Indices in Virology Journals

Comparing Journal Impact Factor and H-type Indices in Virology Journals University of Nebraska - Lincoln DigitalCommons@University of Nebraska - Lincoln Library Philosophy and Practice (e-journal) Libraries at University of Nebraska-Lincoln 2012 Comparing Journal Impact Factor

More information

Radian6 Overview What is Radian6?... 1 How Can You Use Radian6? Next Steps... 9

Radian6 Overview What is Radian6?... 1 How Can You Use Radian6? Next Steps... 9 Radian6 Overview What is Radian6?... 1 How Can You Use Radian6?... 6 Next Steps... 6 Set up Your Topic Profile Topic Profile Overview... 7 Determine Your Keywords... 8 Next Steps... 9 Getting Started Set

More information

LEARNER GUIDE. Government Regulatory Compliance Qualifications. Core Knowledge

LEARNER GUIDE. Government Regulatory Compliance Qualifications. Core Knowledge LEARNER GUIDE Government Regulatory Compliance Qualifications Core Knowledge Congratulations on signing up Central and local government regulators play a crucial role in delivering outcomes that contribute

More information

Altmetriikka & Visibility

Altmetriikka & Visibility Altmetriikka & Visibility Jukka Englund Jukka.Englund@Helsinki.f Terkko Medical Campus Library Helsinki University Library PLoS (http://www.ploscollections.org/article/browseissue.action? issue=info:doi/10.1371/issue.pcol.v02.i19)

More information

Access to professional and academic information in the UK

Access to professional and academic information in the UK 1 Access to professional and academic information in the UK A survey of SMEs, large companies, universities & colleges, hospitals & medical schools, governmental & research institutes Companion report

More information

See What's Coming in Oracle Talent Management Cloud

See What's Coming in Oracle Talent Management Cloud See What's Coming in Oracle Talent Management Cloud Release 9 Release Content Document 1 TABLE OF CONTENTS REVISION HISTORY... 3 HCM COMMON FEATURES... 4 HCM Extracts... 4 Deliver Extracts Using HCM Connect...

More information

2012 North American Clinical Laboratory Competitive Strategy Leadership Award

2012 North American Clinical Laboratory Competitive Strategy Leadership Award 2012 2012 North American Clinical Laboratory Competitive Strategy Leadership Award 2012 Frost & Sullivan 1 We Accelerate Growth Competitive Strategy Leadership Award Clinical Laboratory, North America,

More information