Where to submit your paper? Compare journals by impact

Shows how to use Publish or Perish to compare journals by impact

After creating a “short-list” of journals in which you might want to publish your paper, one criterion for your final choice might be the standing or rank of the journal.

Journal rankings: stated preference vs revealed preference

In general we can distinguish two broad approaches to ranking journals: stated preference (or peer review) and revealed preference (Tahai & Meyer, 1999). Stated preference involves members of a particular academic community ranking journals on the basis of their own expert judgments. There are hundreds of such university journal rankings. Harzing’s Journal Quality List (JQL) aggregates a range of these rankings in Economics & Business. Opinions might be based on anything from a large-scale worldwide survey of academics to a small group of individuals with decision-making power, but will always contain some element of subjectivity.

Revealed preference rankings are based on actual publication behaviour and generally measure the citation rates of journals using Thomson Reuters’ [now Clarivate] Web of Knowledge. Most commonly used are the Journal Citation Reports (JCR), which provide the yearly Journal Impact Factors (JIF). However, any source of citation data can be used. Publish or Perish is ideally suited to measuring the impact of journals with Google Scholar or Microsoft Academic data.

Mingers and Harzing (2007) show that there is a high degree of correlation between journal rankings based on stated and revealed preference. However, as Tahai and Meyer (1999) point out, stated preference studies have long memories: perceptions of journals normally change only slowly. As such, revealed preference studies provide a fairer assessment of new journals or journals that have recently improved their standing, and can therefore often present a more accurate picture of journal impact.

Worked example: Accounting journals

Because of differences in accounting rules across countries, Accounting is a localized discipline. As a result, not many of its journals are listed in the Clarivate Journal Citation Reports: only 30% of the Finance & Accounting journals on the Journal Quality List are included, whereas three quarters or more of the JQL journals in Economics or Management Information Systems are. Hence, if one wants to compare the citation impact of Accounting journals, using Google Scholar and Publish or Perish is often the only alternative. The table below lists a selection of Accounting journals, including the journals generally recognized as the top-5 accounting journals. The analysis was conducted in July 2010, but the relative rankings are unlikely to have changed much. The table first lists the ISI Journal Impact Factor for 2009 (where available) and the ABDC (Australian Business Deans Council) rank, a popular journal ranking list in Australia.


It then reports the results of a Publish or Perish impact analysis for papers published in the journals between 2005 and July 2010. I report the average number of citations per paper, the Google Scholar h-index, and the Google Scholar g-index. In order to get a realistic citations-per-paper count, I merged duplicate papers and removed book reviews, commentaries, obituaries, conference announcements, calls for papers, etc., as these items rarely attract citations. Including them would distort comparisons between journals that include these items and journals that do not.

The top-5 accounting journals (all A* ranked in the ABDC ranking) stand apart in terms of their Journal Impact Factor and Google Scholar metrics. However, there is quite a difference in citations per paper among the remaining journals. European Accounting Review, Accounting and Business Research and Journal of Accounting and Public Policy have a citations-per-paper rate 2-3 times as high as the journals towards the bottom of the list, even though all of these journals were ranked A on the Australian journal ranking list.

The GS h-index and GS g-index also show similar differences. Overall though, there is a very strong correlation between the three GS-based impact measures (0.86 between GS cpp and GS h-index; 0.92 between GS cpp and GS g-index; 0.98 between the h-index and g-index). It is, however, interesting to see that some journals (e.g. Critical Perspectives on Accounting) publish a fairly large number of impactful papers, as evidenced by the relatively high h-index and g-index, even though the average number of citations per paper is not very high.
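The correlations quoted above are ordinary Pearson correlations computed across journals. A minimal sketch of that calculation (the per-journal values below are illustrative, not the original data):

```python
# Pearson correlation between two journal-level metrics,
# e.g. citations per paper vs h-index across a set of journals.
# The metric values below are hypothetical, for demonstration only.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical citations-per-paper and h-index values for six journals:
cpp      = [12.4, 9.8, 7.1, 5.5, 4.2, 3.0]
h_values = [28, 24, 19, 16, 14, 10]
print(round(pearson(cpp, h_values), 2))
```

A correlation near 1 (as between the h-index and g-index in this analysis) means the metrics rank the journals almost identically, so the choice between them matters little for comparison purposes.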


In conclusion, our example shows that journals scoring similarly in stated preference (peer review) rankings can nevertheless have very different impact scores. Given that very few Accounting journals have ISI journal impact factors, a Google Scholar based impact analysis is an excellent way to assess the impact of non-ISI-listed journals. This applies equally to journals in many other areas of the Social Sciences and Humanities. A PoP impact analysis for the journals in question thus allows you to make a more informed choice when you choose a journal to submit your paper to. Give it a try for your next paper and let me know how you fare.