Citation analysis for the Social Sciences: metrics and data sources
In recent decades, the use of metrics for research evaluation seems to have become an integral part of the academic landscape. The adverse impact of this “audit culture” is well documented (see e.g. Adler & Harzing, 2009). However, as a reversal of this trend is unlikely, research into fairer and more inclusive ways of measuring research performance is gaining momentum.
Research into fairer and more inclusive measurement of research performance
My own bibliometric research since 2007 has been part of that movement. My latest article in this stream of research provides a longitudinal and cross-disciplinary comparison of the three major databases for citation analysis: Google Scholar, Scopus and the Web of Science. As it was presented at a workshop on research evaluation in Madrid, it comes complete with a set of slides and a YouTube video of the presentation.
- Harzing, A.W.; Alakangas, S. (2016) Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison, Scientometrics, 106(2): 787-804. Available online... - Publisher's version (read for free) - Presentation slides - Video presentation of this article - ESI top 1% most Highly Cited Paper - ESI hot paper
Peer review vs. bibliometrics
Although a range of studies has found strong correlations between peer review and bibliometric indicators, most governmental research assessment exercises rely solely on peer review for the Social Sciences and Humanities as bibliometric coverage is deemed insufficient for these disciplines.
When executed properly and without bias, peer review is no doubt preferable to the exclusive use of bibliometric indicators. On the other hand, peer review is very time-consuming and has an inherent element of subjectivity. Assessors are expected to base their assessment purely on the quality of the publications submitted. However, in practice both the perceived quality of the journal in which the article was published, and the perceived status of the university affiliation of the author(s), are likely to create positive or negative halo effects.
Comparing Google Scholar with the Web of Science
In Harzing (2013) I therefore investigated the use of Google Scholar, which has a much better coverage for the Social Sciences and Humanities than the traditional source of citation data: Thomson Reuters' Web of Science (also known as ISI). A study of 20 Nobelists in Medicine, Physics, Chemistry and Economics showed that Google Scholar displayed considerable stability over time. In addition, coverage for disciplines that had traditionally been poorly represented in Google Scholar (Chemistry and Physics) was increasing rapidly. Google Scholar's coverage was also comprehensive; all of the 800 most cited publications by our Nobelists could be located in Google Scholar.
A follow-up study (Harzing, 2014) comparing 2012 and 2013 coverage, found that - after a period of significant expansion for Chemistry and Physics - Google Scholar coverage was now increasing at a stable rate. The increased stability and coverage might thus make Google Scholar much more suitable for research evaluation and bibliometric research purposes than it has been in the past.
Use the right metrics for cross-disciplinary comparisons
With research librarian Satu Alakangas, I thus embarked on a large-scale comparative project of 146 academics across 37 disciplines in five broad disciplinary areas (Humanities, Social Sciences, Engineering, Sciences and Life Sciences). We collected quarterly publication and citation data for Google Scholar, Scopus and Web of Science for two years (July 2013-July 2015).
Our longitudinal comparison of eight data points between 2013 and 2015 showed a consistent and reasonably stable quarterly growth for both publications and citations across the three databases. This suggests that all three databases provide sufficient stability of coverage to be used for more detailed cross-disciplinary comparisons.
Our cross-disciplinary comparison of the three databases included four key research metrics: publications, citations, h-index, and hI,annual, an annualised individual h-index introduced by Harzing, Alakangas & Adams (2014). We showed that both the data source and the specific metrics used change the conclusions that can be drawn from cross-disciplinary comparisons.
More specifically, we found that when using the h-index as a metric and the Web of Science as a data source, Life Science and Science academics dramatically outperform their counterparts in Engineering, the Social Sciences and Humanities. However, when using the hI,annual and Google Scholar or Scopus as a data source, Life Science, Science, Engineering and Social Science academics all show a very similar research performance, whereas the average Humanities academic has a hI,annual that is half to two thirds that of the other disciplines.
We thus argue that a fair and inclusive cross-disciplinary comparison of research performance is possible, provided we use a data source with more comprehensive coverage, such as Google Scholar or Scopus, and the recently introduced hI,annual - an h-index corrected for career length and co-authorship patterns - as the metric of choice.
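For readers who want to see how the hI,annual differs from the classic h-index, the sketch below follows the definition given in Harzing, Alakangas & Adams (2014): first divide each paper's citation count by its number of authors, compute the h-index on those per-author shares (the individual h-index, hI,norm), and then divide by academic age (years since first publication). This is a minimal illustration, not the Publish or Perish implementation; the example citation and author counts are hypothetical.

```python
def h_index(citations):
    """Classic h-index: the largest h such that h papers each have >= h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank          # this paper still clears the h threshold
        else:
            break
    return h

def hi_annual(citations, author_counts, academic_age):
    """hIa sketch: h-index on per-author citation shares, divided by academic age."""
    shares = [c / a for c, a in zip(citations, author_counts)]
    hi_norm = h_index(shares)  # individual h-index (hI,norm)
    return hi_norm / academic_age

# Hypothetical record: five papers, their citations and author counts,
# for an academic five years past their first publication.
citations = [10, 8, 5, 4, 3]
authors = [2, 2, 1, 4, 3]
print(h_index(citations))              # classic h-index: 4
print(hi_annual(citations, authors, 5))  # hIa: 0.6
```

Because the per-author correction shrinks citation counts for multi-authored papers, and the age correction removes the cumulative advantage of long careers, the hIa is far less sensitive to the co-authorship and career-length differences that distort cross-disciplinary h-index comparisons.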
Other papers referred to in this blog
- Harzing, A.W. (2013) A preliminary test of Google Scholar as a source for citation data: A longitudinal study of Nobel Prize winners, Scientometrics, 93(3): 1057-1075. Available online... - Publisher’s version
- Harzing, A.W. (2014) A longitudinal study of Google Scholar coverage between 2012 and 2013, Scientometrics, 98(1): 565-575. Available online... - Publisher’s version
- Harzing, A.W.; Alakangas, S.; Adams, D. (2014) hIa: An individual annual h-index to accommodate disciplinary and career length differences, Scientometrics, 99(3): 811-821. Available online... - Publisher’s version
Drop me a line
Free pre-publication versions of these papers are hyperlinked. If you’d like to have an official reprint for these papers, just drop me an email.
Related blog posts
- Publish or Perish: Realising Google Scholar's potential to democratise citation analysis
- Bibliometrics in the Arts, Humanities, and Social Sciences
- To rank or not to rank
- Is Google Scholar flawless? Of course not!
- Working with ISI data: Beware of categorisation problems
- Bank error in your favour? How to gain 3,000 citations in a week
- Microsoft Academic is one year old: the Phoenix is ready to leave the nest
- Running the REF on a rainy Sunday afternoon: Do metrics match peer review?
Copyright © 2018 Anne-Wil Harzing. All rights reserved. Page last modified on Tue 7 Aug 2018 09:26
Anne-Wil Harzing is Professor of International Management at Middlesex University, London and visiting professor of International Management at Tilburg University. In addition to her academic duties, she also maintains the Journal Quality List and is the driving force behind the popular Publish or Perish software program.