Microsoft Academic: Is the Phoenix getting wings?
Compares citation coverage in the new Microsoft Academic with Google Scholar, Scopus and Web of Science
After a whole string of publications on Google Scholar as a source of citation data, I have now written up the first large-scale analysis of the new Microsoft Academic. Please note: from version 5 onwards, my free software program Publish or Perish allows searching of the Microsoft Academic data source.
- Harzing, A.W.; Alakangas, S. (2017) Microsoft Academic: Is the Phoenix getting wings?, Scientometrics, vol. 110, no. 1, pp. 371-383. Available online... - Publisher's version (read for free) - Press coverage in Scientific American and Nature
In this article, we compare publication and citation coverage of the new Microsoft Academic with all other major sources for bibliometric data: Google Scholar, Scopus, and the Web of Science, using a sample of 145 academics in five broad disciplinary areas: Life Sciences, Sciences, Engineering, Social Sciences, and Humanities.
Overall, just like my first small-scale study on this topic, our large-scale comparative study suggests that the new incarnation of Microsoft Academic presents us with an excellent alternative for citation analysis. We therefore conclude that the Microsoft Academic Phoenix is undeniably growing wings; it might be ready to fly off and start its adult life in the field of research evaluation soon.
Comparing publications, citations, h-index and hIa
The easiest way to summarise the article is probably through its five figures. Figure 1 compares the average number of papers and citations across the four databases. On average, Microsoft Academic reports more papers per academic than Scopus and Web of Science, and fewer than Google Scholar. However, in addition to covering a wider range of research outputs (such as books), both Google Scholar and Microsoft Academic also include so-called “stray” publications, i.e. publications that are duplicates of other publications, but with a slightly different title or author variant. Hence, a comparison of paper counts across databases is probably not very informative.
However, citations can be compared across databases more reliably, as stray publications typically have few citations. As Figure 1 shows, on average Microsoft Academic citation counts are very similar to Scopus and Web of Science counts, and are substantially lower only than Google Scholar counts. On average, Microsoft Academic provides 59% of the Google Scholar citations, 97% of the Scopus citations and 108% of the Web of Science citations.
The aforementioned differences in citation patterns are also reflected in the differences in the average h-index and hIa (individual annual h-index) for our sample (see Figure 2). On average, the Microsoft Academic h-index is 77% of the Google Scholar h-index, equal to the Scopus h-index, and 108% of the Web of Science h-index. The Microsoft Academic hIa-index is on average 71% of the Google Scholar index, equal to the Scopus index and 113% of the Web of Science index. Again Microsoft Academic, Scopus and Web of Science present very similar metrics.
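For readers less familiar with these metrics, the sketch below shows how an h-index and an hIa could be computed from a list of papers, following the definition in Harzing, Alakangas & Adams (2014): the hIa is the h-index of author-count-normalised citation counts (hI,norm) divided by the number of years since the academic's first publication. The paper records and field names are purely illustrative and not taken from our sample.

```python
# Illustrative sketch of the h-index and hIa calculations; the toy paper
# records below are made up and only serve to show the arithmetic.

def h_index(citation_counts):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def hIa_index(papers, current_year):
    """hIa = hI,norm / academic age (Harzing, Alakangas & Adams, 2014).

    hI,norm is the h-index over citations divided by the number of co-authors;
    academic age is the number of years since the first publication.
    """
    normalised = [p["citations"] / p["authors"] for p in papers]
    academic_age = current_year - min(p["year"] for p in papers) + 1
    return h_index(normalised) / academic_age

papers = [
    {"citations": 40, "authors": 2, "year": 2010},
    {"citations": 12, "authors": 1, "year": 2013},
    {"citations": 3,  "authors": 4, "year": 2016},
]
print(h_index([p["citations"] for p in papers]))   # -> 3
print(round(hIa_index(papers, 2017), 2))           # -> 0.25
```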
Cross-disciplinary comparisons
Microsoft Academic has fewer citations than Scopus and, marginally, than Web of Science for the Life Sciences and Sciences (see Figure 3). However, overall citation levels for the Life Sciences and Sciences are fairly similar across three of the four databases. To a lesser extent this is true for Engineering as well. For three of our five disciplines, Microsoft Academic thus differs substantially in citation counts only from Google Scholar, providing between 57% and 67% of Google Scholar citations.
Confirming our earlier study based on the same sample of academics (Harzing & Alakangas, 2016), we find that the differences between disciplines are much smaller when considering the hIa, which was specifically designed to adjust for career length and disciplinary differences (see Figure 4). Again we see that Microsoft Academic provides metrics that are very similar to Scopus and Web of Science for the Life Sciences and the Sciences.
For Engineering and the Humanities, the Microsoft Academic hIa is very similar to the Scopus hIa, whereas it is 1.2 (Engineering) to 1.5 (Humanities) times as high as the Web of Science hIa. Only for the Social Sciences is the Microsoft Academic hIa substantially higher than both the Scopus and the Web of Science hIa. The Google Scholar hIa is higher for all disciplines than the Microsoft Academic hIa, from 1.3 times as high for Engineering to 1.9 times as high for the Humanities.
Microsoft Academic estimated citation counts
Microsoft Academic only includes a citation record if it can validate both the citing and the cited paper as credible. Credibility is established through a sophisticated machine-learning-based system, and citations that are not deemed credible are dropped. The number of dropped citations, however, is used to estimate “true” citation counts. These estimated citation counts were added to the Microsoft Academic database in July/August 2016.
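The article does not spell out how this estimation works internally. As a purely hypothetical illustration of the idea — a validated (linked) count topped up by some share of the dropped citations — one could imagine something like the sketch below; the recovery_rate parameter and the example numbers are assumptions for illustration only, not Microsoft Academic's actual model.

```python
# Purely hypothetical illustration: how an "estimated" citation count could
# relate to the linked (validated) count and the citations that were dropped.
# Microsoft Academic's real estimation uses a machine learning system whose
# details are not described in the article.

def estimated_citations(linked_count, dropped_count, recovery_rate=0.5):
    """Linked citations plus an assumed share of dropped citations presumed genuine."""
    return round(linked_count + recovery_rate * dropped_count)

# Example: 100 validated citation links, 40 citations dropped as non-credible.
print(estimated_citations(100, 40))  # -> 120 under the assumed recovery rate
```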
Taking Microsoft Academic estimated citation counts rather than linked citation counts as our basis for the comparison with Scopus, Web of Science, and Google Scholar changes the comparative picture quite dramatically. Looking at our overall sample of 145 academics, Microsoft Academic's average estimated citation count (3873) is much higher than both the Scopus (2413) and the Web of Science (2168) averages. At the same time, it is very similar to Google Scholar's average count (3982), a difference of less than 3%.
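The relative coverage figures quoted above can be recomputed directly from these average citation counts; a minimal sketch, using only the averages reported in the text, is shown below.

```python
# Recomputing relative coverage from the average citation counts quoted in the
# text (sample of 145 academics); only the averages themselves come from the article.
averages = {
    "Google Scholar": 3982,
    "Scopus": 2413,
    "Web of Science": 2168,
}
ma_estimated = 3873  # Microsoft Academic, estimated citation counts

for source, count in averages.items():
    print(f"MA estimated / {source}: {ma_estimated / count:.0%}")
# MA estimated vs Google Scholar: 3873 / 3982 is roughly 97%, i.e. a difference
# of less than 3%, while it is well above both Scopus and Web of Science.
```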
With regard to disciplines, Figure 5 shows that although Microsoft Academic estimated citation counts are closer to Google Scholar citation counts for all disciplines, Microsoft Academic gets closer for some disciplines than for others. For the Life Sciences, Microsoft Academic estimated citation counts are in fact 12% higher than Google Scholar counts, whereas for the Sciences they are almost identical. For Engineering, Microsoft Academic estimated citation counts are 14% lower than Google Scholar citations, and for the Social Sciences the difference is 23%. Only for the Humanities are they substantially (69%) lower than Google Scholar citations.
Other articles on the same themes
- Harzing, A.W. (2013) A preliminary test of Google Scholar as a source for citation data: A longitudinal study of Nobel Prize winners, Scientometrics, vol. 93, no. 3, pp. 1057-1075. Available online...
- Harzing, A.W. (2013) Document categories in the ISI Web of Knowledge: Misunderstanding the Social Sciences?, Scientometrics, vol. 93, no. 1, pp. 23-34. Available online...
- Harzing, A.W. (2014) A longitudinal study of Google Scholar coverage between 2012 and 2013, Scientometrics, vol. 98, no. 1, pp. 565-575. Available online...
- Harzing, A.W.; Alakangas, S.; Adams, D. (2014) hIa: An individual annual h-index to accommodate disciplinary and career length differences, Scientometrics, vol. 99, no. 3, pp. 811-821. Available online...
- Harzing, A.W. (2015) From h-index to hIa: The ins and outs of research metrics, www.harzing.com white paper.
- Harzing, A.W. (2015) Health warning: Might contain multiple personalities. The problem of homonyms in Thomson Reuters Essential Science Indicators, Scientometrics, vol. 105, no. 3, pp. 2259-2270. Available online... - Publisher's version (read for free) - Press coverage in The Times and the Times Higher Education
- Harzing, A.W.; Mijnhardt, W. (2015) Proof over promise: Towards a more inclusive ranking of Dutch academics in Economics & Business, Scientometrics, vol. 102, no. 1, pp. 727-749. Available online... - Publisher's version (read for free)
- Harzing, A.W. (2016) Microsoft Academic (Search): a Phoenix arisen from the ashes?, Scientometrics, vol. 108, no. 3, pp. 1637-1647. Available online... - Publisher's version (read for free)
- Harzing, A.W.; Alakangas, S. (2016) Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison, Scientometrics, vol. 106, no. 2, pp. 787-804. Available online... - Publisher's version (read for free) - Presentation slides - Video presentation of this article - ESI top 1% most Highly Cited Paper - ESI hot paper
- Harzing, A.W. (2017) Running the REF on a rainy Sunday afternoon: Do metrics match peer review?, www.harzing.com white paper.
- Harzing, A.W.; Alakangas, S. (2017) Microsoft Academic is one year old: the Phoenix is ready to leave the nest, Scientometrics, vol. 112, no. 3, pp. 1887-1894. Available online... - Publisher's version (read for free)
Anne-Wil Harzing is Emerita Professor of International Management at Middlesex University, London and visiting professor of International Management at Tilburg University. She is a Fellow of the Academy of International Business, a select group of distinguished AIB members who are recognized for their outstanding contributions to the scholarly development of the field of international business. In addition to her academic duties, she also maintains the Journal Quality List and is the driving force behind the popular Publish or Perish software program.