Reflections on norms for the h-index and related indices
© Copyright 2007-2016 Anne-Wil Harzing. All rights reserved.
Eighth version, 23 August 2016
Ever since Publish or Perish became available, academics have asked me for "norm scores" for the various indices. I suppose everyone likes to compare themselves with others to see how they are doing. I have always hesitated to provide these, as such scores are easily taken out of context and can take on a life of their own.
However, most publications discussing the h-index and related indices deal with academics in the (Natural) Sciences. There are large differences in research output between the different disciplines (see Reflections on the h-index and Citation Analysis Across Disciplines). H-indices in the Natural Sciences are much higher than in the Social Sciences and Humanities. Hence, I felt that in order to support academics in the Social Sciences and Humanities it would be appropriate to provide some systematic evidence that lower citation indices can be expected in these disciplines.
A calculation of citation metrics for an individual academic requires one to be at least familiar with the field in question (in order to be able to eliminate publications by authors with similar names) and preferably with the individual's work. Therefore, my "norm scores" only pertain to the two fields that I am familiar with: Management and International Business.
It is difficult to establish an appropriate comparison group. However, I think most academics will agree that fellow academics who have been elected as presidents of the peak professional organization in their field would probably constitute an appropriate (if ambitious) benchmark. Therefore, I have calculated the various citation metrics (h-index, g-index, contemporary h-index and individual h-index) for presidents of the Academy of Management (AoM) and the Academy of International Business (AIB) for the last 25 years. Since the AIB presidency does not rotate every year, the data are based on 15 individuals, whereas for AoM they are based on 25 individuals.
It is important to note that the record of specific individuals might be both underestimated and overestimated. Most of the publications of academics who were presidents in the early years will be relatively old. Google Scholar does not perform as well for older publications, because these publications and the publications that cite them have not (yet) been posted on the web. Hence their citation metrics might be understated. On the other hand, given that Google Scholar cannot go back in time, I calculated their citation metrics as of the current date, rather than the date at which they became president, hence overstating their citation metrics.
Since most presidents of AIB and AoM are senior, well-established scholars, they will have published a lot of their important work a while ago. This means that their contemporary h-indices might be lower than those of newly-minted professors. On the other hand, the overall top scorer in the combined group of AIB and AoM presidents (John Dunning) is one of the oldest academics in the group and has a high contemporary h-index.
With the aid of Publish or Perish I found the following citation metrics as of 6 March 2007:
[Table: citation metrics for AIB and AoM presidents as of 6 March 2007]
I repeated this exercise on 26 March 2013, i.e. 6 years later. As can easily be verified, all citation metrics have gone up significantly. This is partly caused by a natural increase in citation levels for this group of high-performing academics. However, it is also likely to reflect a further expansion of Google Scholar coverage over the last six years.
Hence care is required when comparing citation metrics over time. This is true for all citation databases, but probably more so for Google Scholar than for ISI Web of Science and Scopus.
[Table: citation metrics for AIB and AoM presidents as of 26 March 2013]
I repeated this exercise again on 23 August 2016, i.e. 3.5 years later. Citation metrics have gone up significantly again. This clearly indicates that whenever one uses norm scores they should ideally be less than a year old. In 2016, I added the hIa (annualized individual h-index), which represents the average number of single-author-equivalent impactful publications an academic publishes a year. I also back-dated this for the 2013 data.
As can easily be verified, this is the statistic that changes least over the years: one needs to continue producing new impactful papers to counteract the natural decline of this index as time passes. As such, it might be one of the best indices to use for comparisons over time.
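To make the hIa concrete, here is a minimal sketch of how it could be computed from a publication list. It assumes the published definition of the index (hIa = hI,norm divided by academic age in years, where hI,norm is the h-index calculated on citation counts divided by the number of authors); the function names and sample data are purely illustrative, not part of Publish or Perish itself.

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def hia(papers, career_years):
    """Annualized individual h-index (hIa).

    papers: list of (citation_count, n_authors) tuples.
    Step 1: normalize each paper's citations by its author count.
    Step 2: compute the h-index on the normalized counts (hI,norm).
    Step 3: divide by career length in years to annualize.
    """
    normalized = [citations / n_authors for citations, n_authors in papers]
    return h_index(normalized) / career_years

# Illustrative example: four papers over a ten-year career.
papers = [(100, 2), (60, 3), (40, 1), (10, 5)]
print(hia(papers, career_years=10))  # normalized counts 50, 20, 40, 2 -> hI,norm = 3 -> hIa = 0.3
```

Because the career length appears in the denominator, the hIa of a scholar who stops producing impactful work declines each year, which is why it is comparatively stable as a norm score.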
[Table: citation metrics for AIB and AoM presidents as of 23 August 2016]
It should be noted that AoM and AIB presidents are a select group of high-performing academics. Hence one cannot expect every professor in Management or International Business to display similar citation metrics. However, those professors who do display similar citation metrics (or even exceed them) could certainly be seen as high performers themselves.
I invite academics to submit norm scores for similar groups in their own subdiscipline with a short description of their search parameters. I would suggest that comparison groups need to include at least 10 academics (and preferably more) to avoid idiosyncrasies. If I am confident that the data are accurate, I will post them on my website with attribution to the person who has compiled them. Please submit your results to email@example.com.
- Research performance of marketing academics and departments: An international comparison (Open Access) by Geoff Soutar, Ian Wilkinson and Louise Young, published in Australian Journal of Marketing (2015) compares citation metrics of marketing academics in the top 500 research universities, as well as metrics for 2263 academics and all universities in Australia and New Zealand.
- Australian Marketing Scholars’ Research Impact: An Update and Extension was presented by Geoff Soutar at the 2013 ANZMAC Conference at the University of Auckland Business School, Dec 2-4 2013. At the same conference Geoff presented a paper analysing marketing journals entitled Marketing Journal and Paper Impact: Some Additional Evidence.
- Measuring the research contribution of management academics using the Hirsch-index by John Mingers, published in Journal of the Operational Research Society, applies the h-index to three groups of management scholars: BAM fellows, INFORMS fellows and members of COPIOR.
- Cumulative and career-stage citation impact of social-personality programs and their members by Brian Nosek and co-authors provides benchmarks for evaluating impact across the career span in psychology, and other disciplines with similar citation patterns. In press for Personality and Social Psychology Bulletin. Supplementary page with career-stage impact calculators: http://projectimplicit.net/nosek/papers/citations/
- In Characterizing author citation ratings of herpetologists using Harzing’s Publish or Perish Malcolm L. McCallum analyzed a random sample of herpetologists. He used linear regression to analyze the influence of career length and publication count on their h-score, g-score, e-score, and m-quotient and provides mean scores for each author metric for herpetologists at various career lengths.
- Evaluating the Productivity of Social Work Scholars Using the h-Index by Jeffrey Lacasse, David Hodge & Kristen Bean, published in Research in Social Work Practice introduces the h-index and related statistics to social work faculty. It also presents the results of a comprehensive study of 337 tenure-track faculty in top-10 universities and 215 editorial board members in the field.
- Publish or Perish
- Publish or Perish Tutorial
- The Publish or Perish Book
- Reflections on the h-index
- Google Scholar - a new data source for citation analysis
- Google Scholar: the democratization of citation analysis?
- A Google Scholar h-Index for Journals
- Working with ISI data: Beware of Categorisation Problems
- Citation analysis across disciplines
Page last modified on Sat 20 May 2017 18:52
Anne-Wil Harzing is Professor of International Management at Middlesex University, London. In addition to her academic duties, she also maintains the Journal Quality List and is the driving force behind the popular Publish or Perish software program.