Bibliometric Benchmarking Analysis of Universidade Nova de Lisboa, 2006-2012/13

Bibliometric Benchmarking Analysis of Universidade Nova de Lisboa, 2006-2012/13

E. van Wijk and M.S. Visser
Center for Science and Technology Studies (CWTS)
Leiden University
PO Box 905, 2300 AX Leiden
The Netherlands

9 October 2014

Research Report to the Universidade Nova de Lisboa, Portugal
Report CWTS, Draft Version

Contents

1 Introduction 1
2 Bibliometric indicators 4
3 Data collection 11
4 Overall results for UNL and benchmarks 16
5 Highly cited papers in main fields 33
6 Trend analyses for UNL in main fields 50
7 General comments and discussion 52
8 Overview of results 54
References 58
Appendix I: Overview of main fields and subfields 61
Appendix II: List of World universities 68
Appendix III: Bibliometric indicators for main fields 71
Appendix IV: Highly cited papers for main fields 86
Appendix V: Trend Analyses 101
Appendix VI: Bibliometric statistics for (sub)fields 109
Appendix VII: Research and Impact Profile Comparison Charts 113

1 Introduction

The objective of the present study is to provide insight into important aspects of the publication output and international citation impact of the Universidade Nova de Lisboa (UNL). To this end, the publication output and impact of the Universidade Nova de Lisboa have been assessed in comparison to the research performance of 365 large research universities worldwide. A more detailed comparison has been carried out for a selection of 35 specific European benchmark universities, of which 7 are Portuguese. In addition, we compared the impact of the Universidade Nova de Lisboa to the general impact levels in Portugal and Europe (29 countries).

This report is an update of three previous studies covering earlier time periods (Visser & Nederhof, 2007; Visser, Van Raan & Nederhof, 2010; Van Wijk & Visser, 2012). In comparison to the previous report, there are a number of important changes in the current report:

- The pool of 365 world universities used to calculate ranking scores has been updated (Appendix II). The definitions of universities correspond with the most recent edition of the Leiden Ranking. This means, among other things, that the assignment procedure in the case of affiliations with university hospitals has become more strict.
- In previous studies, publications in journals covering many different scientific disciplines (e.g. Science, Nature) were assigned to a separate field labeled Multidisciplinary Sciences. In this study we re-assigned the publications in these journals to their actual fields on the basis of their reference lists. For example, if an article in Nature refers to articles published in astronomy journals, that article has been assigned to the subject category Astronomy. If this article also has references to articles from other subfields, it is assigned fractionally to each subfield it refers to, with the fraction depending on the proportion of references.
- The calculation of the percentile-based indicators has been updated. The actual/expected ratios used in the previous report are now presented as proportions or percentages of the total output. More details can be found in Waltman and Schreiber (2012).
- The distinction between normal articles and review articles for the purpose of normalization has been dropped. In the past, this distinction was based on the classification of document types in the Web of Science. Recent research from CWTS and others (Harzing, 2013) has shown that this classification has some serious limitations. As there is no alternative at hand, we have suspended the distinction for the time being.

The covered period is 2006-2012 for publications, with their citation impact measured in 2006-2013, while a trend analysis traces publications and their impact further back in time.

The study is based on a quantitative analysis of scientific articles published in journals and serials processed for the Web of Science (WoS) versions of the Science Citation Index and associated citation indices: the Science Citation Index (SCI), the Social Science Citation Index (SSCI), and the Arts & Humanities Citation Index (A&HCI); the CWTS database containing these records as well as enhanced citation data is briefly indicated here as CI. Non-serial literature has not been included in the present study. Although non-serial literature might be of some importance for UNL, on an aggregate level CI publications account for the major part of the total impact in most disciplines (see Section 3.2.2).

There are two main approaches to what research performance indicators should address. (1) The past performance approach focuses on an ex-post assessment of the past performance of a group of scientists from the perspective of accountability for research funds allocated to the research unit during a certain period. In that case, retiring scientists and those formerly working in the research unit should be included. (2) The back-to-the-future or prospective approach addresses the performance of the scientists who are still active in a particular research unit, with the objective of obtaining a view on the research performance of those who have the task of shaping the future of this research unit. This ex-ante approach has therefore been called back-to-the-future. In that case, it seems appropriate to exclude scientists no longer working in the research unit. Both approaches relate to the past performance of groups of scientists. However, the policy view underlying the latter approach is more directed to the future, while the perspective adopted in the first approach is more focused on the past. In the present study, the past performance approach has been adopted.

The data on publications of universities were collected by means of their work address(es), which should include mention of at least one affiliation with the university (or a university institute). As a result, output and citation impact indicators for Universidade Nova de Lisboa and benchmark universities can be compared. The selection process of publications entails that:

- the focus of the present study is on publications that report on research activities conducted at the universities; publications by currently employed scientists resulting from former affiliations may not have been included. On the other hand, publication output may include papers authored by temporary or retired staff and students. This means that the present publication data offer a broad view of publications produced at the universities;
- as the addresses on publications do not at present allow a satisfactory allocation of publications to organizational units below the level of universities, publications are aggregated according to broad fields of science, so-called main fields; data have also been analysed at the CI subfield level.

One should keep in mind that a CI subfield, for example Ecology, refers only to a combination of journals, and not to an institutional or departmental affiliation. As a consequence, it is not unusual that publications in one subfield have been contributed by members of several research units. For instance, a physiology department may occasionally not publish a single document in the CI subject category Physiology. This means that the results represent rough indications of stronger and weaker points of Universidade Nova de Lisboa research.

Structure of the report

The structure of this report is as follows. The bibliometric indicators applied in this study are described in Section 2, with an overview in Section 2.4. Section 3 gives the main lines of data collection, including analyses relating to CI coverage. Section 4 presents the overall results for the Universidade Nova de Lisboa and benchmarks, while results for highly cited papers are included in Section 5. Trend analyses for the Universidade Nova de Lisboa in main fields are addressed in Section 6. Section 7 provides general methodological comments. Finally, Section 8 provides an overview of the main results.

2 Bibliometric indicators

2.1 Introduction

Bibliometrics is the quantitative study of written products of research. It is assumed that scientific subjects develop at an international research frontier (Price, 1963). Research results are communicated in publications that are submitted to evaluation by professional colleagues. In the references of their papers, scientists acknowledge relevant publications by others, as they build on previous work. Therefore, the number of times a publication is referred to gives a partial indication of the impact of a publication, its reception and use by scientists at the research frontier.

In nearly all scientific fields, the scientific journal is by far the most important medium of communication. The Web of Science (WoS) claims to cover the most important leading international journals and serials (such as Annual Reviews) with a well-functioning referee system. In addition, the overall citation rate of journals is considered, as well as their timeliness of publication and adherence to international editorial conventions. Regularly, a limited number of new journals is added, while other journals are no longer covered. More peripheral journals, often national in scope, are usually not covered by the WoS. On an annual basis, the CWTS CI database includes about 8,000 leading international journals from all domains of scholarship.

The process of data collection and the methodology applied in this study are comparable to those adopted in previous studies on, for instance, physics research (Rinia et al., 2001), biology (Nederhof et al., 1999), electrical and electronic engineering (Van Leeuwen et al., 2000), chemistry (Van Leeuwen et al., 2003), the humanities (Nederhof, 2006; Tijssen et al., 2010), medicine (Tijssen et al., 2002) and psychology (Nederhof, Van Leeuwen & Visser, 2000). Publications were derived from a large bibliometric database of scientific publications. This database contains all scientific articles published in serials processed by the Institute for Scientific Information (ISI; now part of Thomson Scientific) for the Web of Science versions of the Science Citation Index (SCI), the Social Science Citation Index (SSCI), and the Arts & Humanities Citation Index (A&HCI). The database includes citation data on all journals processed for the SCI, SSCI and A&HCI, referred to as CI for short. A detailed description of the main principles behind this database is given in Moed, De Bruin & Van Leeuwen (1995) and Moed (2005). Our work is partly based upon previous work by Garfield (1979), Martin & Irvine (1983), Narin (1990), Van Raan (1997), and Schubert, Glänzel & Braun (1989).

Both statistical requirements and imperfections in the citation process (for a discussion see Nederhof, 1988) make it desirable to aggregate across individuals, publications, and citations. As scientific (sub)fields differ in publication and citation patterns (as visible in differences in, for example, the length of reference lists or the age of cited literature), it is usually not meaningful to directly compare the raw impact of publications from one (sub)field with those of a different (sub)field. Therefore, in our studies raw impact scores are compared to the impact of similar publications within the same subfield (see Section 2.2).

2.2 Output and impact indicators

We calculate several indicators for the total CI output or oeuvre of a research unit, as produced within the time frame of the study (cf. Moed, De Bruin & Van Leeuwen, 1995; Waltman et al., 2010).

Publications, citations, and citations per publication

A first statistic gives the total number of papers published by the research unit during the entire period (P). We considered only papers classified in the Web of Science as normal articles, letters and reviews. Other document types, such as meeting abstracts, book reviews, corrections, comments, editorial material and editorials, are not included. Publications classified as letters are given a weight of 0.25, while articles and reviews are given a weight of 1. The second indicator gives the total number of citations received (C), excluding self-citations. A self-citation to a paper is a citation given in a publication of which at least one author (either first author or co-author) is also an author (either first author or co-author) of the cited paper.

International reference value

For each paper an international reference value is computed. This value represents the mean citation rate of the subfield in which the paper was published. Our definition of subfields is based on a classification of scientific journals into subject categories developed by Thomson Reuters/ISI (see Section 3.3). The international reference value takes into account both the type of paper (normal articles and review articles on the one hand, and letters on the other hand) and the specific year in which the paper was published. For example, the number of citations received during the period 2007-2013 by a letter published by a research unit in 2007 in subfield X is compared to the average number of citations received during the same period (2007-2013) by all letters published in the same subfield (X) in the same year (2007).

Normalized citation scores

The international reference value explained above is used as the expected citation score for each paper published. The crown indicator, the MNCS (Mean Normalized Citation Score), relates the actual number of received citations to these expected citation scores by first calculating the ratio of actual and expected citations for each publication separately and then taking the average of the ratios. This normalization mechanism was first proposed by Lundberg (2007). As the normalization procedure takes into account the publication date, the document type, and also the differences in citation characteristics between the subfields in which the papers are published, the MNCS can be considered an appropriate indicator to compare the research performance of a research unit with other units.
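As an illustration of the international reference value described above, the following minimal sketch (Python) computes the expected citation rate for a given subfield, publication year and document type as the mean citation count of all similar papers worldwide. The world data in the sketch are hypothetical examples, not actual CI figures.

```python
# Sketch of the international reference value: the mean number of citations
# received, within the same counting period, by all papers of the same
# document type published in the same subfield and year.
# The world data below are hypothetical.
world_papers = [
    {"subfield": "X", "year": 2007, "doctype": "letter", "citations": 2},
    {"subfield": "X", "year": 2007, "doctype": "letter", "citations": 5},
    {"subfield": "X", "year": 2007, "doctype": "letter", "citations": 0},
    {"subfield": "X", "year": 2007, "doctype": "article", "citations": 14},
]

def reference_value(subfield, year, doctype):
    """Mean citation rate of all similar papers (same subfield, year, type)."""
    similar = [p["citations"] for p in world_papers
               if (p["subfield"], p["year"], p["doctype"]) == (subfield, year, doctype)]
    return sum(similar) / len(similar)

# Expected citation rate for a letter published in subfield X in 2007.
print(reference_value("X", 2007, "letter"))  # (2 + 5 + 0) / 3 = 2.33...
```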

If the MNCS indicator is above (below) 1.0, this means that the publications of the research unit are cited on average more (less) frequently than the publications in the subfield(s) in which the research unit is active. This world average is calculated for the total population of articles published in CI journals assigned to a particular subfield. As a rule, nearly 75 percent of these papers are authored by scientists from the United States, Canada, Western Europe, Australia and Japan. Therefore, this world average is dominated by the Western world.

The brute force indicator

The crown indicator represents the field-normalized impact score per paper and is in that sense size independent. The international visibility of a research unit, however, also depends to a large extent on the volume of the publication output. The brute force indicator TNCS (the Total Normalized Citation Score) combines output with impact, as each paper contributes to the indicator according to its field-normalized impact.

Reliability

Due to the presence of error (Moed et al., 1995), only the first decimal of the ratios is usually reliable, provided that it is based on a sufficient number of publications (N>50). Even for a quite large number of publications, a 5% difference or shift in the value of an indicator should not be regarded as a significant result.
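To make the normalization mechanism concrete, the following minimal sketch (Python, with hypothetical per-paper data) computes the MNCS and TNCS from actual citation counts and the expected values described above. In the actual study the expected values are the international reference values; here they are simply assumed inputs.

```python
# Minimal sketch of the MNCS and TNCS calculations described above.
# The papers list and its expected values are hypothetical examples; in the
# actual study the expected values are world subfield averages for the same
# publication year and document type.
papers = [
    # citations received (excluding self-citations) and expected citations
    {"citations": 12, "expected": 8.0},
    {"citations": 3,  "expected": 6.5},
    {"citations": 0,  "expected": 4.2},
]

# Normalized score per paper: ratio of actual to expected citations.
ratios = [p["citations"] / p["expected"] for p in papers]

# MNCS: average of the per-paper ratios (Lundberg, 2007); size independent.
mncs = sum(ratios) / len(ratios)

# TNCS: sum of the per-paper ratios (equivalently P x MNCS); size dependent.
tncs = sum(ratios)

print(f"MNCS = {mncs:.2f}, TNCS = {tncs:.2f}")
```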

Highly cited publications

An additional set of impact indicators reflects the contribution to the most frequently cited papers worldwide. To examine the distribution of highly cited papers, we have ranked each publication on the number of citations it received up to four years after publication. We marked those belonging to, for example, the 10% most frequently cited papers in a particular subfield in a given publication year. The use of the fixed-length four-year citation window implies that the analysis only involves papers published during 2006-2010. Moreover, letters were excluded because of their relatively low impact compared to articles and reviews. Thus, the P06-10 figure gives the number of review articles and normal articles published during 2006-2010. The indicator Ptop x% renders the absolute number of papers that are represented among the top 10% or 5% most frequently cited papers worldwide that are similar in publication year, document type, and subfield. The method used in the present study is advanced, as the rank of papers is calculated based on the actual impact distribution of all similar papers worldwide, and self-citations are excluded. PPtop x% relates the number of papers among the top 10% or top 5% most frequently cited publications to the total number of papers. A value above (or below) 0.1 of PPtop 10%, for example, indicates that more (or less) than 10% of the total papers belong to the 10% most frequently cited papers.

2.3 Ranking Positions

Overall ranking positions are computed within the set of 365 world universities listed in Appendix II. These universities are selected based on the size of their publication output. As the bibliometric indicators presented in the report are at least to some extent size dependent, ranking positions are only given for this group of world universities. Although the Universidade Nova de Lisboa is not among the 365 world universities based on the size of their publication output, we computed its ranking position to give an impression of where UNL would be positioned.

At the level of individual disciplines we followed a somewhat different procedure. On the one hand, universities publishing infrequently in a main field (<15% of the median number of publications of the 365 World Universities) were excluded from the rankings. This prevents universities from being ranked improperly on indicators based on relatively few publications, for instance, a technical university without a university hospital obtaining a spurious ranking in the main field Clinical Medicine. On the other hand, we included UNL and benchmark universities that did not belong to the group of 365 world universities in case their publication output in these fields was sufficiently large. Therefore, the group of universities for which ranking positions are calculated varies by discipline, as shown in the overview below.

The number of universities included in the calculation of ranking scores:

Main Field | Nr. Univs
Applied Physics and Chemistry | 365
Biological Sciences: Animals & Plants | 356
Biological Sciences: Humans | 359
Chemistry | 365
Clinical Medicine | 336
Economics | 328
Engineering | 359
Geosciences | 350
Humanities & Arts | 321
Mathematics | 363
Molecular Biology & Biochemistry | 377
Other Social Sciences | 338
Physics and Astronomy | 345
Psychology, Psychiatry and Behavioral Sciences | 334
Social Sciences related to Medicine | 341

2.4 Main fields and subfields

In order to obtain a more detailed view of the research performance of Universidade Nova de Lisboa in comparison to its benchmarks, the publication output of UNL and the benchmark universities is categorized into 15 main fields, such as Clinical Medicine, Physics and Astronomy, etc. These main fields are aggregates of ca. 250 ISI subject categories (e.g. Ecology, Cell Biology, Condensed Matter Physics). An overview of the categorization of the subject categories into the 15 main fields is given in Appendix I. In turn, these subject categories are aggregates of journals pertaining to the particular subfields. In this way, each publication is classified by means of the journal in which it appears. In addition to the separate analyses for each of the 15 main fields, bibliometric indicators are also computed at the level of subfields in Appendices VI and VII.

If a paper appears in a journal that is classified in more than one subfield, the paper (and its citations) is distributed over the subfields. Thus, a paper with 7 citations published in a journal categorized in three subfields is counted as 0.33 publication with 2.33 citations in each subfield. The same procedure was applied for the assignment to main fields (a computational sketch of this fractional assignment is given at the end of this section). Here, a paper assigned to more than one main field contributes only for a fraction to each of the main fields. For example, if a paper was published in a journal that is assigned to two subfields belonging to Clinical Medicine and to one subfield belonging to Biological Sciences: Humans, the paper contributes two thirds to Clinical Medicine and one third to Biological Sciences: Humans.

For publications in each subfield, the impact is compared to the world subfield average score, as described in Section 2.2. In particular at the subfield level, these scores need to be interpreted with caution as the numbers of publications are often relatively low. In the results presented in Appendix VII, the impact scores are grouped into three classes merely as an indication: if the MNCS value is lower than 0.8, the impact is said to be low; if the ratio is higher than 1.2, the impact is designated as high; while a ratio between 0.8 and 1.2 is called average.

The bibliometric indicators at the level of the main fields and subfields may indicate stronger and weaker points in the research profile of UNL vis-à-vis its benchmarks. However, one should refrain from equating the outcomes of these analyses with the performance of specific organizational units such as faculties and departments. A different methodology is required to assess the productivity and impact of such organizational units.
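As an illustration of the fractional assignment described above, the following sketch (Python) distributes a paper and its citations over subfields and aggregates the fractions to main fields. The journal-to-subfield and subfield-to-main-field mappings below are hypothetical examples, not the actual ISI subject category scheme.

```python
# Sketch of fractional assignment of a paper (and its citations) to
# subfields and main fields, as described above. The mappings are
# hypothetical examples.
subfields_of_journal = {
    "Journal X": ["Cardiac & Cardiovascular Systems",
                  "Peripheral Vascular Disease",
                  "Physiology"],
}
main_field_of_subfield = {
    "Cardiac & Cardiovascular Systems": "Clinical Medicine",
    "Peripheral Vascular Disease": "Clinical Medicine",
    "Physiology": "Biological Sciences: Humans",
}

def assign(journal, citations):
    """Distribute one paper and its citations equally over the journal's subfields."""
    subfields = subfields_of_journal[journal]
    share = 1.0 / len(subfields)
    per_subfield = {sf: (share, share * citations) for sf in subfields}
    # Aggregate the subfield fractions to main fields.
    per_main_field = {}
    for sf, (p_frac, c_frac) in per_subfield.items():
        mf = main_field_of_subfield[sf]
        p, c = per_main_field.get(mf, (0.0, 0.0))
        per_main_field[mf] = (p + p_frac, c + c_frac)
    return per_subfield, per_main_field

# A paper with 7 citations in a journal assigned to three subfields counts as
# 0.33 publication with 2.33 citations in each subfield; here it contributes
# 2/3 to Clinical Medicine and 1/3 to Biological Sciences: Humans.
sub, main = assign("Journal X", citations=7)
print(sub)
print(main)
```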

Overview of bibliometric indicators

P: The number of articles (normal articles, letters, notes and reviews) published in journals processed for the Web of Science (CI) versions of the Science Citation Index, the Social Science Citation Index, and the Arts and Humanities Citation Index (see Section 2.1). Letters are given a weight of 0.25.

C: The number of citations recorded in CI journals (as contained in the Web of Science CI version) to all articles involved. Self-citations are excluded.

MNCS: Mean Normalized Citation Score. Normalization is performed by first calculating the ratio of actual and expected citations for each publication separately and then taking the average of the ratios. The expected number of citations is based on the average citation score of publications of the same document type that belong to the same field and were processed for the citation index in the same year.

TNCS: Total Normalized Citation Score. The sum of the normalized citation scores of all papers by a research unit. The indicator reflects to a large extent the size of the research unit as it depends on the volume of published papers.

P06-10: Number of papers (normal articles and reviews) published in journals processed for the Web of Science version of the ISI Citation Indices (CI) in the period 2006-2010.

Ptop x%: The absolute number of papers that are among the 10% or 5% most frequently cited of all similar papers.

PPtop x%: The actual number of papers among the 10% or 5% most frequently cited compared to the expected number of papers in these top percentiles based on the total volume of papers.
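As an illustration of the percentile-based indicators Ptop x% and PPtop x% summarized above, the following sketch (Python, hypothetical citation counts) marks the papers of a unit that reach the top 10% of the world distribution for similar papers and computes the resulting proportion. It is only a simplified threshold version; the actual calculation handles ties fractionally (Waltman & Schreiber, 2012).

```python
# Sketch of the Ptop 10% / PPtop 10% indicators described above.
# world_citations maps a (subfield, year) stratum to the citation counts of
# all similar papers worldwide; both dictionaries hold hypothetical data.
world_citations = {
    ("Ecology", 2008): [41, 33, 20, 15, 12, 10, 9, 8, 7, 6,
                        5, 4, 4, 3, 3, 2, 2, 1, 1, 0],
}
unit_papers = [
    {"stratum": ("Ecology", 2008), "citations": 40},
    {"stratum": ("Ecology", 2008), "citations": 3},
]

def is_top10(stratum, citations):
    """True if the paper reaches the top 10% of the world distribution."""
    world = sorted(world_citations[stratum], reverse=True)
    threshold_rank = max(1, round(0.10 * len(world)))
    threshold = world[threshold_rank - 1]
    return citations >= threshold

ptop10 = sum(is_top10(p["stratum"], p["citations"]) for p in unit_papers)
pptop10 = ptop10 / len(unit_papers)
print(f"Ptop 10% = {ptop10}, PPtop 10% = {pptop10:.2f}")  # 1 and 0.50
```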

3 Data collection

3.1 Levels of aggregation and time periods

Indicators are computed at the following levels of aggregation of UNL scientists:

a) the total collection of all articles having at least one author affiliated with the Universidade Nova de Lisboa;

b) benchmark units:
- World Universities: the 365 universities with the largest publication output in the world during the period under study. The names of these universities are listed in Appendix II.
- Benchmark Universities: University of Algarve; University of Aveiro; University of Coimbra; University do Minho; University of Lisboa; University of Porto; University Tecnica Lisboa (1); Aarhus University; Åbo Akademi University; Dublin City University; ETH Zurich; Katholieke Universiteit Leuven; Masaryk University Brno; Norwegian University S&T Trondheim; University Autonoma Barcelona; University of Bath; University of Bergen; University of Granada; University of Helsinki; University Libre Bruxelles; University Paris V René Descartes; University of Southampton; University of Southern Denmark; University Stuttgart; University Toulouse III; University Turku; University of Twente; Universiteit Utrecht; Aristotle University of Thessaloniki; Technische Universität Dresden; Technische Universität Wien; Jagiellonian University in Krakow; University of Leeds; University of Perugia; University of Sevilla.
- Portuguese Universities: University of Algarve; University of Aveiro; University of Coimbra; University do Minho; University of Lisboa; University of Porto; University Tecnica Lisboa; and the Universidade Nova de Lisboa.
- Portugal: all papers with at least one Portuguese work address, whether or not linked to a university.
- Europe: the 28 current member states of the European Union, plus Switzerland and Norway. Throughout the report, the papers indicated with Europe relate to all papers that mention at least one work address located in one of these countries. These work addresses may include affiliations with universities, companies, government, etc.

Double occurrences of papers are excluded within each unit of analysis. So, one paper, labeled to two or more different research units, is counted only once on a higher level of aggregation. Similarly, a paper co-authored by several scientists belonging to the same unit is counted only once.

(1) Although UTL merged with the University of Lisbon in 2013, in this report, which concerns the period 2006-2012, we treated UTL and UL as separate universities.
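A minimal sketch of the counting rule described above (Python, hypothetical records): a paper is counted once for each unit it is assigned to, and only once at a higher level of aggregation even when several constituent units contributed to it.

```python
# Sketch of full counting with de-duplication across aggregation levels.
# The paper-to-unit assignments below are hypothetical.
paper_units = {  # paper id -> set of research units with a matching address
    "p1": {"Univ Nova Lisboa"},
    "p2": {"Univ Nova Lisboa", "Univ Porto"},
    "p3": {"Univ Porto"},
}
portuguese_universities = {"Univ Nova Lisboa", "Univ Porto"}

def count(unit_or_group):
    """Count each paper at most once for a single unit or for a group of units."""
    units = unit_or_group if isinstance(unit_or_group, set) else {unit_or_group}
    return sum(bool(units & members) for members in paper_units.values())

print(count("Univ Nova Lisboa"))       # 2
print(count("Univ Porto"))             # 2
print(count(portuguese_universities))  # 3, not 4: p2 is counted only once
```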

The bibliometric analysis relates to journal articles published during the period 2006-2012 (main analysis), or during a longer period reaching further back in time (trend analysis). For the main analysis, citations were counted from the year of publication up to and including 2013; for the trend analysis, citations were counted over the corresponding extended period. More recent impact data were not available during the data collection period of this study. Apart from an overall analysis of the impact data for the whole period, we also conducted an analysis of the main indicators across four-year periods at the level of main fields for UNL.

3.2.1 Data collection: CI papers

All relevant publications from the database years 2006-2012 were extracted from our CI publication database. We considered only papers classified in the CI as normal articles, letters, notes, and reviews, published in source serials processed for the CI on Web of Science. Other document types, such as meeting abstracts, editorials, editorial material, corrections, comments, and book reviews, were not included. Also, papers in non-CI source journals are not counted. A few journals are only partially processed for the CI; here, only papers processed for the CI were included.

The publication data for UNL were collected through internal procedures at UNL and provided to CWTS for the purpose of the current analysis. Articles for benchmark universities were assigned according to the institutional affiliations of authors, as included in the corporate address field. We selected all articles containing the name of a university (or its major departments) explicitly in the address. It is important to note that in the large majority of cases the data have not been verified by representatives of the institutions (Moed, 2006) (2). Based on an analysis of the names appearing in the department field, several universities were split into colleges or constituent universities (e.g., University of London, University of Wales, National University of Ireland, University of California at Los Angeles, University of Texas at Austin). Finally, from this set of universities, we selected the 365 universities in the world with the largest publication output in WoS journals during the period under study. In view of the collaboration among institutions, resulting in co-publications by scientists from two or more institutions, the publication output of a university should be conceived as the publications to which the university contributed.

3.2.2 The importance of CI publications for UNL research

There are several ways to assess the importance of CI publications for UNL researchers. One way would be to calculate the share of UNL publications that is included in the Web of Science as a percentage of the total number of publications. For such an analysis, details on the complete publication output of UNL, including all UNL publications that are not processed by the Web of Science, would be required; these were not available for the purpose of the present study.

(2) We thank W. Glänzel (Centre for R&D Monitoring (ECOOM) at K.U.Leuven) for his kind assistance.
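A minimal sketch of the address-based selection of benchmark publications described in Section 3.2.1 (Python): records whose corporate address field mentions a name variant of the university are assigned to that university. The records and name variants below are hypothetical; the actual CWTS procedure relies on extensive cleaning and unification of address variants.

```python
# Sketch of address-based assignment of publications to a university.
# Records and name variants are hypothetical examples.
records = [
    {"id": 1, "addresses": ["Univ Nova Lisboa, Fac Ciencias & Tecnol, Caparica, Portugal"]},
    {"id": 2, "addresses": ["Univ Porto, Dept Chem, Porto, Portugal",
                            "Univ Aveiro, CICECO, Aveiro, Portugal"]},
]
name_variants = {
    "Universidade Nova de Lisboa": ["univ nova lisboa", "new univ lisbon"],
    "University of Porto": ["univ porto"],
}

def publications_of(university):
    """Select records mentioning the university in at least one address."""
    variants = name_variants[university]
    return [r["id"] for r in records
            if any(v in addr.lower() for addr in r["addresses"] for v in variants)]

print(publications_of("Universidade Nova de Lisboa"))  # [1]
print(publications_of("University of Porto"))          # [2]
```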

It should also be taken into account that the Web of Science does not attempt to cover all publications in science and is in fact selective in its coverage. It aims to capture the scholarly and scientific communication in the most important international journals.

As an alternative way to gain insight into the importance of CI publications for UNL researchers, we studied the reference lists of the UNL CI papers included in the present study. These reference lists can be considered as the knowledge base on which UNL researchers build. Here, we analyze to what extent this knowledge base is covered by the Web of Science. The limitation of this approach is that only reference lists from CI publications are available, while reference lists in other sources are excluded. Therefore, only a partial view can be obtained, especially in disciplines where journals are less important channels in the scholarly communication system.

All references in the UNL CI papers (2006-2012) were matched with our extended CI publication database. In this way, we can estimate the importance of CI publications to UNL researchers by determining to what extent they themselves cite CI papers, and to what extent other, non-CI, documents. Due to the extension of our database, we could only trace references dated from 1980 onwards. Self-citations were included, as we could not exclude all self-citations for non-CI documents. Data were also collected at the level of main fields.

Table 1 presents the results of the analysis. P06-12 represents the number of CI WoS articles, letters and reviews published between 2006 and 2012. As an illustration, we discuss the overall results for the Universidade Nova de Lisboa in the last row. UNL published 5,584 CI papers in the years 2006-2012. On average, 94% of the references within these papers dated from 1980 onwards, while 6% (the figure included in Table 1 under "%Refs <1980") dated from before 1980. Finally, 81% of the UNL references from 1980 onwards could be matched to CI Web of Science papers (%Refs CI). This finding suggests that non-CI documents are of some importance to UNL researchers, as they account for 19% of the references in their papers.

However, rather large differences between main fields can be observed. In all, for seven main fields, CI coverage is high (at least 80%) for the Universidade Nova de Lisboa (Clinical Medicine; Biological Sciences: Humans; Biological Sciences: Animals & Plants; Molecular Biology & Biochemistry; Physics and Astronomy; Chemistry; Applied Physics and Chemistry). CI coverage still accounts for at least two out of three cited references in four other main fields (Mathematics; Geosciences; Psychology, Psychiatry and Behavioral Sciences; and Social Sciences related to Medicine). In Economics and Engineering, CI documents still constitute the majority of the literature contained in the reference lists of the CI documents included in the present study, but non-CI documents are of considerable importance in these disciplines. The same observation is valid for Other Social Sciences, with just less than half of the cited references contained in the CI. Finally, CI coverage is poor in Humanities & Arts, accounting for 1 out of 4 cited references.
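A sketch of how the internal coverage figures of the kind shown in Table 1 can be computed from reference lists (Python): determine the share of references dated before 1980, and the share of the references from 1980 onwards that can be matched to CI records. The reference records below are hypothetical.

```python
# Sketch of the internal-coverage calculation described above.
# The reference data are hypothetical.
references = [
    {"year": 1975, "in_ci": False},
    {"year": 1999, "in_ci": True},
    {"year": 2008, "in_ci": True},
    {"year": 2010, "in_ci": False},
    {"year": 2011, "in_ci": True},
]

pre_1980 = [r for r in references if r["year"] < 1980]
recent = [r for r in references if r["year"] >= 1980]

pct_pre_1980 = len(pre_1980) / len(references)          # share of older references
pct_ci = sum(r["in_ci"] for r in recent) / len(recent)  # CI coverage of the rest

print(f"%Refs <1980 = {pct_pre_1980:.0%}, %Refs CI = {pct_ci:.0%}")
# -> %Refs <1980 = 20%, %Refs CI = 75%
```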

TABLE 1: COVERAGE INDICATORS FOR THE UNIVERSIDADE NOVA DE LISBOA

Main Field | P | Avg Nr Refs | %Refs <1980 | Nr. Refs >1979 | %Refs CI
CLINICAL MEDICINE | | | | 32,033 | 88%
BIOL SCI: HUMANS | | | | 35,678 | 92%
BIOL SCI: ANIMALS & PLANTS | | | | 25,578 | 81%
MOLECULAR BIOLOGY & BIOCHEM | | | | 47,389 | 92%
PHYSICS AND ASTRONOMY | | | | 10,848 | 85%
CHEMISTRY | | | | 51,094 | 89%
MATHEMATICS | | | | 7,762 | 71%
GEOSCIENCES | | | | 16,398 | 68%
APPLIED PHYSICS AND CHEMISTRY | | | | 27,341 | 84%
ENGINEERING | | | | 18,940 | 57%
ECONOMICS | | | | 10,135 | 60%
PSYCHOLOGY, PSYCHIATRY & BEHAV SC | | | | 4,084 | 71%
SOCIAL SCIENCES RELATED TO MEDICINE | | | | 5,179 | 73%
OTHER SOCIAL SCIENCES | | | | 6,712 | 49%
HUMANITIES & ARTS | | | | 4,519 | 25%
ALL DISCIPLINES | 5,584 | | 6% | 210,170 | 81%

4 Overall results for UNL and benchmarks

Table 2 includes the overall results for Universidade Nova de Lisboa and its benchmarks for all disciplines combined. So-called block indicators are calculated for the period 2006-2012/13. This means that for publications from each of the publication years (2006-2012), citations are counted up to and including 2013. For example, an eight-year citation window is used for papers published in 2006, and a two-year citation window for papers published in 2012. Self-citations were excluded for all impact indicators.

Units publishing infrequently in a main field (<15% of the median number of publications of the 365 World Universities) were excluded from the rankings (see Section 2.3). This prevents universities from being ranked improperly on indicators based on relatively few publications, for instance, a technical university without a university hospital obtaining a spurious ranking in the main field Clinical Medicine. Universidade Nova de Lisboa was ranked in all main fields except Clinical Medicine and Psychology, Psychiatry and Behavioral Sciences. Figures 1-14 present the overall results of UNL and its benchmarks for the remaining fourteen main fields for several bibliometric indicators regarding publication output and field-normalized impact.

Indicators:
P: Number of CI publications.
Rank (P): Ranking position of UNL and benchmarks among the 365 World Universities based on the value of P.
C: Number of citations, excluding self-citations.
Rank (C): Ranking position of UNL and benchmarks among the 365 World Universities based on the value of C.
MNCS: Mean citation impact of the papers of UNL and benchmarks compared to the world citation average of the papers in the subfields to which they belong.
Rank (MNCS): Ranking position of UNL and benchmarks among the 365 World Universities based on the value of MNCS.
TNCS: "Brute force" indicator: P x MNCS, the total citation impact of the papers of UNL and benchmarks normalized for differences among fields, document types and publication dates.
Rank (TNCS): Ranking position of UNL and benchmarks among the 365 World Universities based on the value of TNCS.

Results for the indicators MNCS and TNCS can be directly compared across periods.
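To make the citation windows used in this report concrete, the following minimal sketch (Python, with hypothetical citing-paper years) counts, for the 2006-2012/13 block, all citations received from the publication year up to and including 2013, and contrasts this with the fixed four-year window used for the highly cited paper analysis in Section 5.

```python
# Sketch of the two citation windows used in this report (hypothetical data):
# - block window: citations from the publication year up to and including 2013;
# - fixed window: citations in the publication year and the three following years.
citing_years = {  # paper id -> years of the citing publications
    "paper_2006": [2006, 2007, 2009, 2012, 2013, 2014],
    "paper_2012": [2012, 2013, 2014],
}
publication_year = {"paper_2006": 2006, "paper_2012": 2012}

def block_window_citations(paper, end_year=2013):
    start = publication_year[paper]
    return sum(start <= y <= end_year for y in citing_years[paper])

def fixed_window_citations(paper, window=4):
    start = publication_year[paper]
    return sum(start <= y < start + window for y in citing_years[paper])

# An eight-year window for a 2006 paper, a two-year window for a 2012 paper.
print(block_window_citations("paper_2006"), block_window_citations("paper_2012"))  # 5 2
print(fixed_window_citations("paper_2006"))  # 3 (citations in 2006-2009)
```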

TABLE 2: BIBLIOMETRIC INDICATORS FOR ALL DISCIPLINES

University P Rank C Rank MNCS Rank TNCS Rank
Univ Nova Lisboa 5, , ,
Univ Algarve 2,152-13, ,925 -
Univ Aveiro 6,987-47, ,216 -
Univ Coimbra 7,734-61, ,571 -
Univ Minho 5,128-46, ,877 -
Univ Lisbon 8, , ,
Univ Porto 12, , ,
Tech Univ Lisbon 9, , ,
Vienna Univ Technol 7,074-49, ,489 -
Katholieke Univ Leuven 31, , ,
Univ Libre Bruxelles 11, , ,
ETH Zurich 26, , ,
Masaryk Univ - Brno 5,086-30, ,859 -
Tech Univ Dresden 14, , ,
Univ Stuttgart 7,253-65, ,200 -
Aarhus Univ 16, , ,
Univ Southern Denmark 7,329-88, ,869 -
Univ Autonoma Barcelona 16, , ,
Univ Granada 11, , ,
Univ Seville 8, , ,
Åbo Akademi Univ 2,871-26, ,130 -
Univ Helsinki 23, , ,
Univ Turku 8, , ,
Paris Descartes Univ 14, , ,
Paul Sabatier Univ 17, , ,
Univ Bath 6,539-65, ,406 -
Univ Leeds 16, , ,
Univ Southampton 16, , ,
Aristotle Univ Thessaloniki 12, , ,
Dublin City Univ 2,893-22, ,151 -
Univ Perugia 7,718-80, ,731 -
Univ Twente 7,635-71, ,673 -
Utrecht Univ 29, , ,
Norwegian Univ Sci & Technol 10, , ,
Univ Bergen 10, , ,
Jagiellonian Univ - Krakow 9, , ,
Portuguese Universities 43, , ,601 -
Benchmark Universities 380,649-3,927, ,095 -
Portugal 62, , ,094 -
Europe 3,137,912-28,150, ,416,035 -

Table 2 shows the scores of the Universidade Nova de Lisboa among the 365 World Universities across main fields and disciplines in 2006-2012/13. The block analysis shows that publications of UNL researchers are cited slightly above the worldwide citation average in the fields in which they are active. For Universidade Nova de Lisboa, the 5,584 publications (P) were cited 49,511 times (C) externally, i.e., by others than the (co-)authors of a publication, during 2006-2013. The brute force indicator TNCS expresses the output as the equivalent number of papers with a field-normalized citation impact of 1; here, UNL has an equivalent of 6,430 average-impact units.

As indicated before, based on publication output UNL is not among the largest 365 research universities in the world. It is therefore not surprising that UNL ranking positions on the size-dependent indicators (P, C, and TNCS) are at the bottom end of the ranking. Regarding field-normalized impact per paper (MNCS), UNL occupies the 260th position. Of the 35 benchmark universities, 4 Portuguese universities and 9 foreign universities do not have ranking scores, as they do not belong to the 365 largest research universities.

The impact score of the Universidade Nova de Lisboa is comparable to those of the other Portuguese universities, at the higher end of that distribution; most of them have field-normalized impact scores that are competitive with the world average. The outlier is the University of Algarve, which does not reach the world average. On average, the benchmark institutions score some 20% better than UNL. The impact scores of 16 benchmark universities outside Portugal are substantially higher (by nearly 20% or more), while the scores of 7 universities (Dresden, Stuttgart, Barcelona, Paris Descartes, Bath, Perugia and the Norwegian University of Science and Technology) are at least 10% higher. The impact score of UNL is substantially better than those of 7 foreign benchmark universities (Masaryk, Seville, Åbo Akademi, Turku, Thessaloniki, Dublin City and Krakow), whose impact scores are at least 10% below the world average. Below, results are described in greater detail for one of the main fields.

Main fields

We made an analysis of the relation between output (in terms of CI publications) and impact for 14 distinct main fields in which the Universidade Nova de Lisboa is active. Figures 1-14 combine output figures (P) and field-normalized impact results (MNCS). As an illustration, we describe some of the results in the main field Chemistry (see Figure 6). The Universidade Nova de Lisboa is positioned clearly above the world average, indicated in the figure by the reference line. In fact, the field-normalized impact is 26% above the world average, which corresponds to the 184th position within the pool of World Universities (see Appendix III). This is a significant improvement over the results in the previous study. Among the benchmark universities, UNL belongs to the higher-ranked universities with regard to field-normalized impact. Appendix III includes the complete set of tables for the fourteen main fields.

FIGURE 1: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, CLINICAL MEDICINE (MNCS vs. total publications)

FIGURE 2: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, BIOL SCI: HUMANS (MNCS vs. total publications)

FIGURE 3: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, BIOL SCI: ANIMALS & PLANTS (MNCS vs. total publications)

FIGURE 4: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, MOLECULAR BIOLOGY & BIOCHEM (MNCS vs. total publications)

FIGURE 5: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, PHYSICS AND ASTRONOMY (MNCS vs. total publications)

FIGURE 6: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, CHEMISTRY (MNCS vs. total publications)

FIGURE 7: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, MATHEMATICS (MNCS vs. total publications)

FIGURE 8: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, GEOSCIENCES (MNCS vs. total publications)

FIGURE 9: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, APPLIED PHYSICS AND CHEMISTRY (MNCS vs. total publications)

FIGURE 10: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, ENGINEERING (MNCS vs. total publications)

FIGURE 11: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, ECONOMICS (MNCS vs. total publications)

FIGURE 12: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, SOCIAL SCIENCES RELATED TO MEDICINE (MNCS vs. total publications)

FIGURE 13: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, OTHER SOCIAL SCIENCES (MNCS vs. total publications)

FIGURE 14: IMPACT COMPARED TO WORLD SUBFIELD AVERAGE 2006-2012/13, HUMANITIES & ARTS (MNCS vs. total publications)

5 Highly cited papers in main fields

We made an analysis of the presence of the Universidade Nova de Lisboa among the most frequently cited papers (cf. Section 2.2). For this analysis, we applied a fixed citation window of 4 years for each publication. For example, for papers published in 2006, citations are counted up to (and including) 2009; for papers published in 2007, citations are counted up to 2010, etc. Finally, for papers published in 2010, citations are counted up to 2013. As 2013 is the last complete year in our database for which we can count citations (but not publications), this analysis relates only to papers published during 2006-2010. Appendix IV presents tabular data for the different disciplines.

As an example, we discuss some of the most important results for all disciplines combined (see Table 3). The analysis of the most frequently cited papers (top 10% and top 5%) in all disciplines shows that the Universidade Nova de Lisboa published 3,639 articles and reviews during 2006-2010 (P06-10). The number of top 10% papers is 373 (was 284) and the number of top 5% papers is 184 (was 133), a considerable increase of 32% and 38%, respectively, compared to the previous report. If one relates these numbers to the total number of published papers, it can be observed that the Universidade Nova de Lisboa authored exactly the expected number of highly cited papers; in the previous report, UNL had 8% fewer highly cited top 10% papers and 14% fewer highly cited top 5% papers than expected. These ratios are comparable to those for most other Portuguese universities, with the possible exception of Algarve, which scores a fraction lower. The UNL scores are as high as the ratios for Portugal as a whole, but marginally below the European level and substantially lower than those of most benchmark universities outside Portugal. UNL scores are on the same level as those of the University of Granada, the University of Turku and Dublin City University, and substantially above those of Masaryk University and the Jagiellonian University in Krakow.

TABLE 3: HIGHLY CITED PAPERS FOR ALL DISCIPLINES

University P Ptop 10% Rank PPtop 10% Rank Ptop 5% Rank PPtop 5% Rank
Univ Nova Lisboa 3,
Univ Algarve 1,
Univ Aveiro 4,
Univ Coimbra 4,
Univ Minho 3,
Univ Lisbon 5,
Univ Porto 7,
Tech Univ Lisbon 6,
Vienna Univ Technol 4,
Katholieke Univ Leuven 21,522 3, ,
Univ Libre Bruxelles 7,350 1,
ETH Zurich 17,440 3, ,
Masaryk Univ - Brno 3,
Tech Univ Dresden 9,157 1,
Univ Stuttgart 4,
Aarhus Univ 10,668 1,
Univ Southern Denmark 4,
Univ Autonoma Barcelona 10,262 1,
Univ Granada 6,
Univ Seville 5,
Åbo Akademi Univ 1,
Univ Helsinki 15,999 2, ,
Univ Turku 5,
Paris Descartes Univ 9,305 1,
Paul Sabatier Univ 11,825 1,
Univ Bath 4,
Univ Leeds 10,880 1,
Univ Southampton 11,098 1,
Aristotle Univ Thessaloniki 8,
Dublin City Univ 1,
Univ Perugia 5,
Univ Twente 4,
Utrecht Univ 19,627 3, ,
Norwegian Univ Sci & Technol 7,
Univ Bergen 7,
Jagiellonian Univ - Krakow 6,
Portuguese Universities 27,455 2, ,
Benchmark Universities 252,118 32, ,
Portugal 39,279 3, ,
Europe 2,126, , ,