Assessment of Research Performance in Biology: How Well Do Peer Review and Bibliometry Correlate?
BARRY G. LOVEGROVE AND STEVEN D. JOHNSON

Bibliometric indices based on publishing output and citation records are increasingly being employed to measure scientific quality, supplementing and even replacing traditional alternatives such as the peer-review system. In this article we question whether peer review can predict bibliometric indices for individual researchers. We compared the ratings of scientific quality obtained using a peer-review system with the most popular bibliometric scores (h-, m-, and g-indices; total citations; and mean number of citations per publication) for 163 botanists and zoologists. Although the peer-review ratings were correlated with the bibliometric measures, they explained less than 40 percent of the variation in the scores. Most of this unexplained variation is presumably due to limitations of both the peer-review system and the bibliometric scores. We propose a synergy between peer review and bibliometric scores that can improve the assessment of scientific quality, especially by benchmarking peer-review decisions against bibliometric thresholds.

Keywords: bibliometric scores, Hirsch index, peer review, citation record, publication record

Scientists must regularly undergo evaluation of their performance for the purposes of promotion and grant awards, yet there is considerable controversy about the best means of obtaining information on the research performance of individual scientists. In the past, the peer-review system was considered a fair mechanism for measuring scientific quality (Kostoff 1997). Increasingly, though, this system is being supplemented, and even replaced, by bibliometric indices based on publishing output and citation records (Ball 2005, Hirsch 2005, Egghe 2006, Kelly and Jennions 2006, Lehmann et al. 2007). We pose several questions of interest to all researchers.
First, how well does peer review predict bibliometric indices for individual researchers? Second, does peer review offer any obvious advantages over bibliometric scores, and vice versa? The latter question is critically important to grant-awarding agencies, because bibliometric indices are obtainable almost instantly and, if screened appropriately for homonyms (different people with the same name) and synonyms (one person publishing under different names and initials), should avoid human subjectivity. Individual scientists, too, want a system that provides the most accurate, and thus fair, evaluation of their performance that is possible.

The National Research Foundation (NRF) in South Africa has long used a carefully managed system of international peer review to rate researchers according to their scientific performance. The rating evaluation is based purely on a scientist's past research performance and does not involve assessment of project proposals or grant applications. Thus, we would expect a scientist's rating to correlate closely with his or her bibliometric score, because the rating is principally an assessment of the individual's research output and standing in the international scientific community.

The NRF rating is the outcome of a two-stage process. First, applications are sent to members of subject-specific specialist committees composed mostly of invited NRF-rated peers and NRF management. For each application, the specialist committee chooses the names of at least six national or international reviewers. Half of the reviewers are selected from a list of potential reviewers supplied by the applicant, whereas the rest are selected independently by the specialist committee. Reviewers are requested to assess (a) the quality of the research-based outputs of the last seven years, as well as the impact of the applicant's work in his or her field and how it has impacted adjacent fields, and (b) the applicant's standing as a researcher from both a South African and an international perspective.

Between 2003 and 2006, for example, the NRF obtained 12,649 peer-review reports (59 percent from outside South Africa) to rate the research performance of 1554 researchers. These reviews are then assessed by panels of the specialist committee and NRF management, who agree on a rating. Scientists apply every five years to be rerated (the NRF provides additional information on the rating process). Although the NRF rating process has no bearing on the separate grant application process that typically follows a successful rating (nonrated biologists can also apply for NRF grants), the rating is the most important variable in a formula used by the NRF when allocating research funds to individual researchers. Each of the three main rating categories described below (A, B, and C) includes two to three subcategories (details of the subcategories are available from the NRF).

A rating: A-rated researchers are unequivocally recognized by their peers as leading international scholars in their field for the high quality and impact of their recent research outputs.

B rating: B-rated researchers are those who enjoy considerable international recognition by their peers for the high quality of their recent research outputs.

C rating: C-rated scientists are established researchers with a sustained recent record of productivity in the field who are recognized by their peers as having (a) produced a body of quality work, the core of which has coherence and attests to ongoing engagement with the field, and (b) demonstrated the ability to conceptualize problems and apply research methods to investigating them.

Here we compare the NRF peer-review ratings of 163 botanists and zoologists with various bibliometric measures: the h-index, the number of papers, h, that have at least h citations each (Hirsch 2005); the m-index, h divided by scientific age (years from first publication to last publication; Hirsch 2005); the g-index, similar to h but weighted by highly cited papers (Egghe 2006); the total number of citations; and the mean number of citations per paper, obtained from Thomson Scientific's ISI Web of Science. We included the mean number of citations per publication in our analyses because this measure has been proposed to be better than the Hirsch index (Lehmann et al. 2007). Given concerns that citation practices might vary among scientific disciplines (Batista et al. 2006), we restricted our analyses to a sample of 163 biologists (botanists and zoologists) who had been rated or rerated at the time of the analysis.

Barry G. Lovegrove (lovegrove@ukzn.ac.za) and Steven D. Johnson (johnsonsd@ukzn.ac.za) are with the School of Biological and Conservation Sciences at the University of KwaZulu-Natal in Scottsville, South Africa. © American Institute of Biological Sciences. 160 BioScience February 2008 / Vol. 58 No. 2

A yardstick for peer review
All five bibliometric measures varied significantly among, and showed a significant positive relationship with, peer-review rating categories (figure 1, table 1). However, the variance in these scores explained by the ratings (R²) was low, varying from 19.4 percent (m-index) to 38.5 percent (total citations; table 1).

Figure 1. Frequency distributions of Hirsch's h-index for the 8 subcategory ratings (C3 to A1) used by the National Research Foundation, South Africa, to measure scientific quality. The mean ± standard deviation of each subcategory is plotted at the top of each graph with the actual mean value. The subcategories within parentheses show the results of post hoc Tukey tests from the analysis of variance (table 1); these are the subcategories that differ significantly (p < 0.05) from the subcategory where they are listed. For example, the h-index of subcategory B2 is significantly different from C1, C2, and C3 (those in parentheses), but not from A1, A2, B1, and B3.
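The citation-based indices defined above can be computed directly from a list of per-paper citation counts. The following is a minimal sketch; the citation list and function names are invented for illustration, and the m-index follows the article's definition of scientific age (years from first to last publication).

```python
def h_index(citations):
    """h: the largest h such that h papers have at least h citations each (Hirsch 2005)."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """g: the largest g such that the g most-cited papers together hold
    at least g**2 citations (Egghe 2006)."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def m_index(citations, first_pub_year, last_pub_year):
    """m: h divided by scientific age, defined here as years from first to last publication."""
    age = max(last_pub_year - first_pub_year, 1)
    return h_index(citations) / age

# Hypothetical citation record for one researcher
cites = [48, 33, 30, 12, 7, 6, 5, 5, 2, 1, 0, 0, 0, 0]
# h = 6: six papers have at least six citations each
# g = 12: the twelve most-cited papers together hold at least 144 citations
```

Because the g-index accumulates citations across the top of the ranked list, a few very highly cited papers raise g well above h, which is the behavior the comparison in figure 2 turns on.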
Table 1. Results of univariate analysis of variance of various bibliometric measures between National Research Foundation rating subcategories.

Bibliometric measure             F(7,162)   R² (ratings), percent   p
h-index                          …          37.6                    < …
m-index                          …          19.4                    < …
g-index                          …          37.6                    < …
Total citations                  …          38.5                    < …
Mean citations per publication   …          …                       < …

Note: All data were log10-transformed to achieve homogeneity of variances.

The percentages of variation explained by the h- and g-indices were identical (37.6 percent). Thus, peer review could at best explain about 40 percent of the variation in the bibliometric measures of South African biologists. Post hoc Tukey tests showed that the mean h-index in a particular rating category was seldom significantly different from that in adjacent categories. The mean h-index of the B1 category, for example, was actually lower than that of the B2 category. Unlike the data from a recent report (Lehmann et al. 2007), our data suggest that the mean number of citations per publication is not a better estimator of scientific quality than the h- or g-indices. Put another way, our data suggest that scientists highly regarded by their peers are more consistently likely to have high h- or g-indices than high values for most alternative bibliometric measures, including the mean number of citations per paper. Hirsch (2005) argued that the h-index may provide a useful yardstick with which to compare, in an unbiased way, different individuals competing for the same resource when an important evaluation criterion is scientific achievement. The h-index is gaining popularity because it is considered one of the most robust measures of quantitative and qualitative research performance: it is insensitive both to large numbers of poorly cited papers and to small numbers of big-hit, highly cited papers. However, the latter supposed advantage may actually be a drawback of the index, because it does not recognize the value of papers with exceptional scientific impact (Egghe 2006).
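The analysis of variance summarized in table 1 can be sketched in a few lines of pure Python. The group values below are hypothetical log10-transformed h-indices for three rating categories, not the NRF data, and the function name is ours.

```python
import math

def one_way_anova(groups):
    """One-way ANOVA: F is the ratio of the between-group mean square to the
    within-group mean square; R^2 is the share of variance between groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean deviations
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: deviations from each group's own mean
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    f = (ssb / (k - 1)) / (ssw / (n - k))
    r2 = ssb / (ssb + ssw)
    return f, r2

# Hypothetical log10(h-index) values for three rating categories
groups = [
    [math.log10(h) for h in (25, 30, 22, 28)],  # A-rated
    [math.log10(h) for h in (12, 15, 18, 11)],  # B-rated
    [math.log10(h) for h in (5, 8, 6, 9)],      # C-rated
]
f_stat, r_squared = one_way_anova(groups)
```

With groups this cleanly separated, most of the variance lies between categories; the article's point is that in the real data R² stayed below 40 percent.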
Egghe (2006) claimed that the g-index "inherits all the good properties of the h-index and...yields a better distinction and order of the scientists from the point of view of visibility." Like the h-index, the g-index is computed from a researcher's publication list ranked by decreasing number of citations; however, the g-index is weighted by exceptionally highly cited articles (Egghe 2006). The g-index requires more computational time than the h-index, but can still be computed by hand in about three minutes. In our data, there was a strong correlation between the g- and h-indices (R² = 0.905, p < 0.001; figure 2), but most of the unexplained 9.5 percent of variance could be attributed to 3 of the 163 researchers, whose g-indices exceeded the upper 95 percent prediction limit of the regression. The g-indices of these researchers (rated A2, B2, and C2) were disproportionately higher than their h-indices because they were coauthors on a few papers with more than 200 citations. Given the similar variance between NRF subcategories explained by the h- and g-indices, the h-index and its variants should be the primary bibliometric yardstick, but calculating the g-index as well may help to identify outstanding individuals (sensu Hirsch 2005) in a discipline-specific context.

Figure 2. The correlation between the h- and g-indices of 163 South African biologists.

Sources of peer-review and bibliometric variance
Both peer-review and bibliometric scores have recognized sources of error and bias that can account for the large unexplained variance between NRF ratings and bibliometric scores. Although science is essentially founded on peer review, a recent study identified 25 different sources of bias in processes involving manuscript reviews and grant peer review (Daniel et al. 2007).
Citing Shashok (2005), these authors emphasized that because reviewers are humans, "their behaviour, whether performing salaried duties, enjoying their leisure time or writing reviews, is influenced by factors that cannot be predicted, controlled or standardized." Some recognized, and sometimes contradictory, sources of bias are gender, nationality, and scientific discipline (Daniel et al. 2007). Nevertheless, peer review of researchers is unlikely to be replaced entirely by bibliometric measures. The general correlation between bibliometric measures and peer-review research ratings found in this study and others (Bornmann and Daniel 2006, van Raan 2006, Daniel et al. 2007) is encouraging to advocates of either approach. Numerous refinements of the bibliometric indices have been published, and the h-index in particular is undergoing extensive scrutiny of its advantages and limitations (Batista et al. 2006, Bornmann and Daniel 2007, Jin et al. 2007). Some important limitations of the h-index are that (a) it does not control for scientific age, (b) it is sensitive to discipline-specific citation patterns, and (c) if calculated from Thomson Scientific's ISI Web of Science, it is underestimated
because books and book chapters are excluded (Bornmann and Daniel 2007). Moreover, although it has been suggested that self-citations do not markedly alter the ranking patterns of researchers (Cronin and Meho 2007), the h-indices of individuals with many cooperating coauthors can be inflated compared with those of researchers who work alone (Vinkler 2007). Nevertheless, these limitations can be overcome with additional research on, and computation of, an individual's research output. For example, the implications of the lack of time correction in the h-index are that it cannot differentiate between active and inactive researchers, and that it always puts young researchers at a disadvantage relative to older, well-established counterparts. Indeed, the h- and g-indices never decrease; in a ratchet-like fashion, they either increase or remain the same with time (Hirsch 2005), and thus they cannot identify researchers whose productivity has waned. Hirsch proposed the m-index as a time-corrected h-index (Hirsch 2005), but the m-index is valid only if the h-index increases linearly with time, which is not always the case (Burrell 2007, Egghe 2007). Burrell (2007) advocated the calculation of the h-rate, the slope of the relationship between the h-index and scientific age (equivalent to the m-index when the slope is linear), as a meaningful indication of age-related research performance, whereas Jin and colleagues (2007) proposed the AR-index, which can increase or decrease with time in response to research outputs. Because many h-index-versus-time curves are not linear, the h-rate and AR-index can be calculated separately during slow and fast phases of a researcher's career.
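The phase-by-phase h-rate described above can be estimated by ordinary least squares over any window of a career. The sketch below uses a hypothetical h-index trajectory with a fast early phase and a plateau; the function name is ours, not Burrell's.

```python
def h_rate(years, h_values):
    """Least-squares slope of the h-index against scientific age (Burrell 2007's
    h-rate); equals Hirsch's m-index when h grows linearly from zero."""
    n = len(years)
    my = sum(years) / n
    mh = sum(h_values) / n
    num = sum((y - my) * (h - mh) for y, h in zip(years, h_values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical trajectory: rapid growth early, then a plateau as output wanes
years    = [2, 4, 6, 8, 10, 12, 14]   # scientific age in years
h_values = [3, 6, 9, 11, 12, 12, 12]  # h-index at each age

early = h_rate(years[:4], h_values[:4])  # slope over the fast phase
late  = h_rate(years[3:], h_values[3:])  # slope over the plateau
```

The early-phase slope far exceeds the late-phase slope, flagging waning productivity that h itself, which never decreases, cannot reveal.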
This approach could help explain, for example, why some B-rated South African biologists have h-indices higher than those of their A-rated counterparts: the B-rated researchers may have become unproductive later in their careers and subsequently been down-rated (information that can certainly be obtained through peer review, but also through close scrutiny of variations in the h-rate over time). For young scientists who have yet to establish clear bibliometric patterns on the basis of citations, peer review will always remain an invaluable tool. Peer review is used, for example, by the NRF to rate young researchers (Y rating) and young researchers of exceptional potential (P rating). Similar systems that identify promising young scientists exist in other countries, and they require an assessment process that does not depend heavily on citation profiles.

A synergistic approach
Despite the disadvantages of the h-index, its advantages cannot be ignored by research evaluators. First, it can be obtained very rapidly from several Web-based databases. Second, it is considerably cheaper to obtain in terms of administration costs, although the Thomson Scientific ISI Web of Science is itself a very expensive service. Third, it avoids the perceived subjectivity, rivalry, nepotism, and secrecy of peer-review systems. Fourth, bibliometric measures are universally comparable within, though possibly not among, disciplines. Fifth, it provides transparency, allowing researchers to monitor progress in their careers and to identify what is needed to achieve personal targets. Clearly, there is merit in a synergistic approach whereby the rating of scientific quality through peer review can be benchmarked against bibliometric measures of performance within a discipline (Batista et al. 2006).
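Benchmarking peer-review decisions against bibliometric thresholds can be made mechanical. The sketch below encodes the rough A/B/C h-index bands the article observes for botany and zoology; the band edges and function name are illustrative assumptions, not NRF policy.

```python
# Rough discipline-specific h-index bands for botany and zoology suggested by
# the article's figure 1 (illustrative, not an official NRF rule).
H_BANDS = {"A": (20, float("inf")), "B": (10, 20), "C": (0, 10)}

def flag_for_review(rating, h, bands=H_BANDS):
    """Return True when a peer-review rating falls outside the h-index band
    expected for that category, i.e. the case deserves panel scrutiny."""
    lo, hi = bands[rating]
    return not (lo <= h < hi)

# The discrepancies noted in the text would both be flagged:
# an A-rated biologist with h = 15 and a B2-rated researcher with h = 4.
```

Such a filter does not overrule the reviewers; it merely lists the ratings whose bibliometric profiles deviate enough to warrant a second look by the specialist committees.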
It is difficult to accept that some of the glaring discrepancies evident in figure 1 can be attributed solely to the recognized limitations of the peer-review and bibliometric measures discussed above. These discrepancies include an A-rated biologist with an h of 15, lower than the mean h-index for the B1 and B2 subcategories, and a B2-rated researcher with an h of 4, lower than the mean for all lower subcategories (B3 to C3). On the other hand, the h-index of one unfortunate B2 researcher (h = 33) far exceeds the mean of both A subcategories. Hirsch (2005) advocated benchmark h-index ranges that can be used to separate successful scientists from outstanding scientists and unique individuals. In the disciplines of botany and zoology, the South African NRF rating categories of A, B, and C appear to correspond roughly to h-indices of more than 20, 10 to 20, and less than 10, respectively (figure 1). Indeed, we would argue that it is incumbent on the NRF, or any other body that evaluates researchers, to explain major deviations in rating patterns, and in this respect the h-index (and variants thereof) can prove a useful tool for identifying ratings that fall outside certain thresholds. These cases clearly deserve close scrutiny by peer evaluation panels and specialist committees.

Acknowledgments
We thank Lorne Wolfe and the National Research Foundation (NRF) in South Africa for comments on the manuscript. The NRF subcategory data were provided to the authors by the NRF under a contract of confidentiality. Three anonymous reviewers provided constructive suggestions that were implemented in the manuscript.

References cited
Ball P. 2005. Index aims for fair ranking of scientists. Nature 436: 900.
Batista PD, Campiteli MG, Kinouchi O, Martinez AS. 2006. Is it possible to compare researchers with different scientific interests? Scientometrics 68.
Bornmann L, Daniel H-D. 2006. Selecting scientific excellence through committee peer review: a citation analysis of publications previously published to approval or rejection of post-doctoral research fellowship applicants. Scientometrics 68.
Bornmann L, Daniel H-D. 2007. What do we know about the h-index? Journal of the American Society for Information Science and Technology 58.
Burrell Q. 2007. Hirsch index or Hirsch rate? Some thoughts arising from Liang's data. Scientometrics 73.
Cronin B, Meho L. 2007. Using the h-index to rank influential information scientists. Journal of the American Society for Information Science and Technology 57.
Daniel H-D, Mittag S, Bornmann L. 2007. The potential problems of peer evaluation in higher education and research. In Cavalli A, ed. Quality Assessment for Higher Education in Europe. London: Portland Press.
Egghe L. 2006. Theory and practise of the g-index. Scientometrics 69.
Egghe L. 2007. Dynamic h-index: the Hirsch index in function of time. Journal of the American Society for Information Science and Technology 58.
Hirsch JE. 2005. An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences 102.
Jin BH, Liang LM, Rousseau R, Egghe L. 2007. The R- and AR-indices: complementing the h-index. Chinese Science Bulletin 52.
Kelly CD, Jennions MD. 2006. The h index and career assessment by numbers. Trends in Ecology and Evolution 21.
Kostoff RN. 1997. The principles and practices of peer review. Science and Engineering Ethics 3.
Lehmann S, Jackson AD, Lautrup BE. 2007. Measures for measures. Nature 444.
Shashok K. 2005. Standardization versus diversity: how can we push peer review research forward? Medscape General Medicine 7: 11.
van Raan AFL. 2006. Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. Scientometrics 67.
Vinkler P. 2007. Eminence of scientists in the light of the h-index and other scientometric indicators. Journal of Information Science 33.
More informationInvestment Platforms Market Study Interim Report: Annex 8 Gap Analysis
MS17/1.2: Annex 8 Market Study Investment Platforms Market Study Interim Report: Annex 8 Gap July 2018 Annex 8: Gap Introduction 1. One measure of whether this market is working well for consumers is whether
More informationTopic 3 Calculating, Analysing & Communicating performance indicators
National Erasmus + Office Lebanon Higher Education Reform Experts In collaboration with the Issam Fares Institute and the Office of Institutional Research and Assessment Ministry of Education and Higher
More informationTHOMSON REUTERS: INCITES
THOMSON REUTERS: INCITES An objective analysis of people, programs, and peers THOMSON REUTERS: SINGULAR EXPERT IN RESEARCH DISCOVERY The global research environment is changing: it s more collaborative,
More informationAn index to quantify an individual s scientific research output that takes into account the effect of multiple coauthorship
Scientometrics (2010) 85:741 754 DOI 10.1007/s11192-010-0193-9 An index to quantify an individual s scientific research output that takes into account the effect of multiple coauthorship J. E. Hirsch Received:
More informationPERFORMANCE APPRAISAL
PERFORMANCE APPRAISAL Performance appraisal (PA) is the process of evaluating how well employees perform their jobs when compared to a set of standards, and then communicating that information to those
More informationGLOSSARY OF COMPENSATION TERMS
GLOSSARY OF COMPENSATION TERMS This compilation of terms is intended as a guide to the common words and phrases used in compensation administration. Most of these are courtesy of the American Compensation
More information2015 Research Trainee Program Competition for Post-Doctoral Fellowship Awards EVALUATION CRITERIA FOR REVIEWERS
2015 Research Trainee Program Competition for Post-Doctoral Fellowship Awards EVALUATION CRITERIA FOR REVIEWERS VERSION 1 LAST UPDATED: JANUARY 28, 2015 Note to ALL Reviewers: Read all assigned applications
More informationParticipant Guide Lesson 6 Evaluating Performance. Slide 1. DPMAP Rev.2 July Lesson 6: Evaluating Performance. DPMAP Rev.
Slide 1 Lesson 6: Evaluating Performance 1 Slide 2 (2) 2 Slide 3 Learning Objectives Upon completion of this lesson, you will be able to: Recognize important facets of the Evaluating Phase. Describe how
More informationSurvey Results: Appendix: See our Web Page: The University of Texas at Austin 2014
REPORT ID: 721 Survey Results: Survey Respondent Information... 1 Survey Constructs... 4 Survey Climate Areas... 11 Primary Items... 13 Additional Items... 32 *Additional Items are not included if none
More informationarxiv: v1 [astro-ph.im] 16 Mar 2011
The h-index in Australian Astronomy Kevin A. Pimbblet A,B A School of Physics, Monash University, Clayton, VIC 3800, Australia B Email: Kevin.Pimbblet@monash.edu arxiv:1103.3130v1 [astro-ph.im] 16 Mar
More informationRecording and Quality Monitoring Systems Making the ROI Case
Recording and Quality Monitoring Systems Making the ROI Case February 4, 2004 Jim Veilleux, VoiceLog LLC Recording-based quality monitoring systems ( RBMS ) are rapidly penetrating the contact center marketplace,
More informationBibliometric analysis of the Netherlands Research School for the Socio-Economic and Natural Sciences of the Environment
Bibliometric analysis of the Netherlands Research School for the Socio-Economic and Natural Sciences of the Environment W. Gerritsma M.B. Duizendstraal H. Fransen M.E. Loman April 2007 Wageningen UR Library
More information4 The balanced scorecard
SUPPLEMENT TO THE APRIL 2009 EDITION Three topics that appeared in the 2007 syllabus have been removed from the revised syllabus examinable from November 2009. If you have the April 2009 edition of the
More informationTracking and measuring your research impact: A guide to using metrics and altmetrics.
Tracking and measuring your research impact: A guide to using metrics and altmetrics. Jane Haigh Subject Librarian, Edinburgh Napier University. This session will provide an overview of some of the most
More informationSUPPLEMENTARY INFORMATION
Conform and be funded: Supplementary information Joshua M. Nicholson and John P. A. Ioannidis Methods Authors of extremely highly-cited papers To identify first and last authors of publications in the
More informationComparative Study on Software Firms in Bangladesh
growth [4]. Earlier report stated that earning from outsourcing Comparative Study on Software Firms in Bangladesh Sushmiata Bose, Assistant Professor, University of Liberal Arts Bangladesh, Dhaka, Bangladesh
More informationRanking Leading Econometrics Journals Using Citations Data from ISI and RePEc
Ranking Leading Econometrics Journals Using Citations Data from ISI and RePEc Chia-Lin Chang Department of Applied Economics Department of Finance National Chung Hsing University Taiwan Michael McAleer
More informationOpen Science and Changing Scholarly Communication
Global Science Forum (GSF) NESTI Workshop, April 9, 2018 Open Science and Changing Scholarly Communication Kim Holmberg Senior researcher, PhD RUSE - Research Unit for the Sociology of Education University
More informationPerformance criteria
Performance criteria Q 7-03. What is the usefulness of performance criteria? With any complex and continuing problem such as state debt issuance, it is critical to employ performance criteria - set in
More informationSPE DISTINGUISHED LECTURER SERIES is funded principally through a grant of the SPE FOUNDATION
SPE DISTINGUISHED LECTURER SERIES is funded principally through a grant of the SPE FOUNDATION The Society gratefully acknowledges those companies that support the program by allowing their professionals
More informationThe π-index: a new indicator for assessing scientific impact
The π-index: a new indicator for assessing scientific impact Peter Vinkler Chemical Research Center, Hungarian Academy of Sciences Abstract. There are several simple and sophisticated scientometric indicators
More informationApplication of h and h-type indices at meso level: A case of Malaysian engineering research
Malaysian Journal of Library & Information Science, Vol. 20, no. 3, 2015: 77-86 Application of h and h-type indices at meso level: A case of Malaysian engineering research Muzammil Tahira 1, Rose Alinda
More informationCopyright is owned by the Author of the thesis. Permission is given for a copy to be downloaded by an individual for the purpose of research and
Copyright is owned by the Author of the thesis. Permission is given for a copy to be downloaded by an individual for the purpose of research and private study only. The thesis may not be reproduced elsewhere
More informationUOA 28, Mechanical, Aeronautical and Manufacturing Engineering
UOA 28, Mechanical, Aeronautical and Manufacturing Engineering This statement should be read alongside the statement for Main Panel G and the generic statement. Panel G Absences of chair and declaration
More informationIntegrating Quantitative and Fundamental Research The Art and the Science of Investing
Integrating Quantitative and Fundamental Research Introduction Quantitative and fundamental investment processes exist across a continuum. Many fundamental approaches use quantitative screens to help focus
More informationROI: A USEFUL TOOL FOR CORPORATE LEARNING EVALUATION 1
ROI: A USEFUL TOOL FOR CORPORATE LEARNING EVALUATION 1 ROI: A Useful Tool for Corporate Learning Evaluation Cheri L. Fenton Purdue University ROI: A USEFUL TOOL FOR CORPORATE LEARNING EVALUATION 2 ROI:
More informationLesson 6: Evaluating Performance
Lesson 6: Evaluating Performance (2) Learning Objectives Upon completion of this lesson, you will be able to: Recognize important facets of the Evaluating Phase. Describe how employees inputs benefit the
More informationDiscussion of Nonfinancial Performance Measures and Promotion-Based Incentives
DOI: 10.1111/j.1475-679X.2008.00276.x Journal of Accounting Research Vol. 46 No. 2 May 2008 Printed in U.S.A. Discussion of Nonfinancial Performance Measures and Promotion-Based Incentives MICHAEL GIBBS
More informationAon Talent, Rewards & Performance. January 2018
January 2018 In most cases, you can t begin to fix a problem until you understand why it exists in the first place. This is certainly true for the complex challenge of addressing gender pay equity. In
More informationMEANINGFUL METRICS AND WHAT TO DO WITH THEM A DDS WORKSHOP
MEANINGFUL METRICS AND WHAT TO DO WITH THEM A DDS WORKSHOP WHAT ARE METRICS Tools we use to try and measure the worth or value research has by calculating its impact. Include basic measures such as numbers
More informationIntroduction to Business Research 3
Synopsis Introduction to Business Research 3 1. Orientation By the time the candidate has completed this module, he or she should understand: what has to be submitted for the viva voce examination; what
More informationThe Invariant Method is Manipulable
László Á. Kóczy, Martin Strobel The Invariant Method is Manipulable RM/08/029 Maastricht research school of Economics of TEchnology and ORganizations Universiteit Maastricht Faculty of Economics and Business
More informationTHE FELLOWS OF THE AMERICAN INSTITUTE OF CERTIFIED PLANNERS (FAICP)
THE FELLOWS OF THE AMERICAN INSTITUTE OF CERTIFIED PLANNERS (FAICP) Election to the College of Fellows is one of the highest honors that the American Institute of Certified Planners, the professional institute
More informationTowards best practices for authorship and research evaluation
Towards best practices for authorship and research evaluation Effects of performance metrics & the Leiden Manifesto Sarah de Rijcke Munin Conference on Scholarly Publishing Tromsø, 23 November 2017 1 Some
More informationTHANK YOU FOR JOINING ISMPP U TODAY!
THANK YOU FOR JOINING ISMPP U TODAY! The program will begin promptly at 11:00 am eastern November 14, 2012 ISMPP WOULD LIKE TO THANK the following Corporate Platinum Sponsors for their ongoing support
More informationMotivation at Work: Goal-setting and Expectancy Theory. Presented by Jason T Wu
Motivation at Work 1 Running head: Motivation at Work Motivation at Work: Goal-setting and Expectancy Theory Presented by Jason T Wu Management 6334 01 SPRING Dr. F. Robert Buchanan Updated: May 10 th,
More informationThe Impact of Schedule Pressure on Software Development: A Behavioral Perspective
Association for Information Systems AIS Electronic Library (AISeL) ICIS 2003 Proceedings International Conference on Information Systems (ICIS) December 2003 The Impact of Schedule Pressure on Software
More informationPerformance Appraisal: Methods
Paper: 01 Module: 20 Principal Investigator Co-Principal Investigator Paper Coordinator Content Writer Prof. S P Bansal Vice Chancellor Maharaja Agrasen University, Baddi Prof YoginderVerma Pro Vice Chancellor
More informationHigher Education Funding Council for England Call for Evidence: KEF metrics
January 2018 Higher Education Funding Council for England Call for Evidence: KEF metrics Written submission on behalf of the Engineering Professors Council Introduction 1. The Engineering Professors Council
More informationIEEE FELLOW COMMITTEE
IEEE FELLOW COMMITTEE Recommendation Guide S/TC-FEC Evaluators and IEEE Judges (September 2017) THE INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, INC. 3 Park Avenue, 17 th Floor New York, N.Y. 10016-5997,
More informationJelena Petrovic Hilary Harris Chris Brewster NEW FORMS OF INTERNATIONAL WORKING. CReME Research R e p o r t 1/00
Jelena Petrovic Hilary Harris Chris Brewster NEW FORMS OF INTERNATIONAL WORKING CReME Research R e p o r t 1/ Cranfield School of Management Cranfield, England INTRODUCTION The new forms of international
More informationThe Continuing Competence Program for Psychologists Practicing in Nova Scotia. A Guide for Participants
The Continuing Competence Program for Psychologists Practicing in Nova Scotia A Guide for Participants Guide Revised April 2017 1 Table of Contents Introduction to the Continuing Competence Program.3 1.
More informationA Scientometric Analysis of Aquaculture Literature during 1999 to 2013: Study Based on Scopus Database
Asian Journal of Information Science and Technology ISSN: 2231-6108 Vol. 6 No. 1, 2016, pp.8-14 The Research Publication, www.trp.org.in A Scientometric Analysis of Aquaculture Literature during 1999 to
More informationMake your research visible! Luleå University Library
Make your research visible! Luleå University Library Why this guide? Maximizing the visibility and impact of research is becoming ever more important in the academic world with tougher competition and
More informationAlessandra Faggian, The Ohio State University, USA
THEPUBLISHINGPROCESS Alessandra Faggian, The Ohio State University, USA *Thanks to Mark Partridge, Ohio State University and James Alm, Tulane University for sharing materials from similar presentations
More informationRecent trends in co-authorship in economics: evidence from RePEc
MPRA Munich Personal RePEc Archive Recent trends in co-authorship in economics: evidence from RePEc Katharina Rath and Klaus Wohlrabe Ifo Institute for Economic Research 17. August 2015 Online at http://mpra.ub.uni-muenchen.de/66142/
More informationRussell Group response to Lord Stern s review of the Research Excellence Framework (REF)
Russell Group response to Lord Stern s review of the Research Excellence Framework (REF) 1. Summary The REF is a fundamental part of the UK s dual support system for research funding. Whilst there is certainly
More informationYi-Jing Wu. Texas Tech University Phone:
Yi-Jing Wu Texas Tech University Phone: 806-834-2734 Rawls College of Business Email: yi-jing.wu@ttu.edu School of Accounting 703 Flint Avenue Lubbock, TX 79409 Education and Certification University of
More informationDAAD-NRF JOINT IN-COUNTRY MASTER S AND DOCTORAL SCHOLARSHIP PROGRAMME FRAMEWORK
DAAD-NRF JOINT IN-COUNTRY MASTER S AND DOCTORAL SCHOLARSHIP PROGRAMME FRAMEWORK DIRECTORATE: International Relations and Cooperation (IRC) DATE: MAY 2017 1 P a g e Table of Contents 1. Background... 3
More informationDAAD-NRF JOINT IN-COUNTRY MASTER S AND DOCTORAL SCHOLARSHIP PROGRAMME FRAMEWORK
DAAD-NRF JOINT IN-COUNTRY MASTER S AND DOCTORAL SCHOLARSHIP PROGRAMME FRAMEWORK DIRECTORATE: International Relations and Cooperation (IRC) DATE: MAY 2016 1 P a g e Table of Contents Endorsements... Error!
More informationResearch Productivity of Pakistan in Medical Sciences during the period
European Review for Medical and Pharmacological Sciences Research Productivity of Pakistan in Medical Sciences during the period 1996-2012 S.A. MEO, A.A. ALMASRI, A.M. USMANI 1 2013; 17: 2839-2846 Department
More informationEffects of time-based Biases in Review Communities Kevin Ho, Sean O Donnell CS224W Final Report
Effects of time-based Biases in Review Communities Kevin Ho, Sean O Donnell CS224W Final Report [0.0] Abstract Prior work has used social biases to inform predictive models of evaluation-driven networks,
More informationA Systematic Approach to Performance Evaluation
A Systematic Approach to Performance evaluation is the process of determining how well an existing or future computer system meets a set of alternative performance objectives. Arbitrarily selecting performance
More information