The individual annualised h-index: a 10-year study

Short summary of my white paper: The individual annualised h-index: an ecologically rational heuristic?

Introduction

A single-minded focus on publication and citation metrics for the evaluation of research performance is problematic. There is no substitute for a careful, considered, unbiased, and qualitative evaluation of research quality by experts in the field.

However, time and expertise are not always available. Moreover, assessment is often – consciously or subconsciously – influenced by the journal in which an article was published, the country in which the research was conducted, and the author’s university affiliation, as well as by demographic characteristics such as gender and race. Therefore, metrics are recommended as an aid to decision-making in a range of academic contexts, such as performance evaluation and promotion.

Ecologically rational heuristics

However, these metrics need to be suited to the environment in which they are used, i.e. they need to be ecologically rational heuristics (ERH). Heuristics – rules of thumb in lay language – are decision strategies that simplify decision-making by ignoring part of the information and focusing on one or a few dimensions. They run counter to the common assumption that more knowledge or information is always better. Because they involve relatively simple calculations, they are easy to explain to lay users and easy for those users to understand.

Ecological rationality holds that the rationality of a decision is not universal; it depends on the specific circumstances in which the decision takes place. The key to effective decision-making is achieving one's goals in a specific context. The concept of ecological rationality was introduced by Gerd Gigerenzer, who argues that heuristics are not irrational or always second-best to optimisation, as long as they are fit for the decision context.

The ERH perspective was recently introduced into bibliometrics by Lutz Bornmann, who has published several papers on the topic. Here, ecological rationality means answering the question of “when the use of specific bibliometric indicators leads to good decisions on performance and when it does not” (Bornmann, Ganser & Tekles, 2022: 101237). However, to date these ideas do not seem to have gained much traction in the bibliometric community, let alone in the broader area of “citizen bibliometrics”: bibliometrics practised by research managers and scientists.

Peer review versus metrics

In this white paper I argue that the use of ecologically rational heuristics in academic decision-making is an excellent way to strike a balance between:

  • The uncritical acceptance of metrics. This appears to be the position of many – though by no means all – professional bibliometricians. Their predominant interest seems to lie in finding the “perfect metric”, with little attention to its practical application or to barriers to acceptance. However, perfect metrics do not exist; different metrics may be useful in different environments. Moreover, even metrics that are mathematically perfect might not be practically useful for “citizen bibliometrics”.
  • The blanket refusal to consider metrics, instead advocating peer review only. This is the position adopted by many academics, especially in the Social Sciences & Humanities, as well as by many key decision-makers in higher education. As I have argued elsewhere (see my white paper Research Impact 101), this position is often based on strawman comparisons and anecdata, as well as on an idealised view of peer review.

Through the free Publish or Perish software, which I have provided since 2006, and my publications in bibliometrics since 2008, I have aimed to build a bridge between professional and citizen bibliometricians. During that time, it has always struck me how little space there seems to be for a position between these two extreme camps. This vacuum may have arisen because we do not think carefully enough about the purpose of metrics and how their usefulness varies with that purpose.

In this white paper, I argue that the individual annualised h-index – hIa for short – which we introduced 10 years ago in Publish or Perish and tested with a matched sample of academics, is an ecologically rational heuristic that can be used to evaluate research performance effectively and efficiently across career stages and disciplines. As such, it can be used predictively to identify high performers at an early stage in their career and to conduct a fair and inclusive comparison of research performance across disciplines.
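
To make the metric concrete, below is a minimal sketch of how the hIa can be computed, assuming its published definition: normalise each paper's citation count by its number of authors, take the h-index of those normalised counts (hI,norm), and divide by academic age, i.e. the number of years since the first publication. The function and variable names are illustrative only and are not taken from the Publish or Perish code.

    def h_index(citation_counts):
        # Largest h such that at least h papers have at least h citations each.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, count in enumerate(counts, start=1):
            if count >= rank:
                h = rank
            else:
                break
        return h

    def hia(papers, first_pub_year, current_year):
        # papers: list of (citations, n_authors) tuples for one academic.
        # Normalise citations by author count, take the h-index of the
        # normalised counts (hI,norm), then divide by academic age.
        normalised = [citations / n_authors for citations, n_authors in papers]
        academic_age = current_year - first_pub_year + 1
        return h_index(normalised) / academic_age

    # Illustrative example: five co-authored papers over a ten-year career.
    papers = [(120, 3), (45, 2), (30, 1), (12, 4), (8, 2)]
    print(hia(papers, first_pub_year=2014, current_year=2023))  # 0.4

Roughly speaking, the hIa expresses the average number of single-author-equivalent h-index points an academic accumulates per year, which is what makes comparisons across career stages and across levels of co-authorship possible.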

Want to read the full paper?

You can read the full white paper here: The individual annualised h-index: an ecologically rational heuristic? You can skip directly to relevant sections of the white paper by using these links.

Introduction
Peer review
Heuristics
ERH metric
Methods
Sample
Metrics
Descriptives
Unbiased metrics
Disciplines
Career stage
Gender
Prediction
Promotion
Mobility
Awards
Sensitivity Checks
Conclusion
References

Find the resources on my website useful?

I cover all the expenses of operating my website privately. If you enjoyed this post and want to support me in maintaining my website, consider buying a copy of one of my books or supporting the Publish or Perish software.

