Editorial: How to navigate in the ocean of indexers, metrics, and rankings in the management field

Flavio Hourneaux Junior (Administração, Faculdade de Economia Administração e Contabilidade, Universidade de São Paulo, Sao Paulo, Brazil)
Kavita Miadaira Hamza (Business Administration, Faculdade de Economia Administração e Contabilidade, Universidade de São Paulo, Sao Paulo, Brazil)
Ronaldo de Oliveira Santos Jhunior (FEA, Universidade de São Paulo, Sao Paulo, Brazil)

RAUSP Management Journal

ISSN: 2531-0488

Article publication date: 2 May 2023

Issue publication date: 5 May 2023


Citation

Hourneaux Junior, F., Hamza, K.M. and Santos Jhunior, R.d.O. (2023), "Editorial: How to navigate in the ocean of indexers, metrics, and rankings in the management field", RAUSP Management Journal, Vol. 58 No. 2, pp. 90-96. https://doi.org/10.1108/RAUSP-04-2023-272

Publisher: Emerald Publishing Limited

Copyright © 2023, Emerald Publishing Limited


Following in good journals’ footsteps, our editorials have sought to give our readers relevant information on the complex world of publishing in the management field. One aspect authors should consider in particular is journals’ quality assessments, represented by several indexers, metrics and rankings, which serve as parameters for researchers to identify the best (and most suitable) outlets for publishing their research.

As both institutions and researchers are regularly evaluated by their academic impact, which depends on those parameters (De Rond and Miller, 2005; Aguinis et al., 2014, 2020), the choice of the right outlet becomes critical (Aguinis et al., 2020). Therefore, before submitting a research paper, authors need to identify the proper outlet if they want to maximise their chances. Some constraining factors should be considered, such as adherence to the journal’s editorial line (Linton, 2012; Sun and Linton, 2014) and language barriers (Cargill and Burgess, 2017), among others unrelated to the article’s technical aspects. In addition, identifying target journals is commonly done by analysing academic metrics, and here lies a problem for some (especially early-career) researchers: understanding the different indexes, comparing them and choosing the journal that fits each researcher’s needs.

Considering this situation, this editorial aims to summarise the best-known academic metrics, including indexers and impact factors, for journals in the management field. We hope this information helps our readers navigate such a complex environment better and, consequently, achieve better results in their careers.

This paper is structured as follows: First, we describe the key databases and metrics of two leading scientific content publishing organisations, Elsevier and Clarivate. Second, we discuss some of the world’s most prestigious lists and quality guides for scientific journals, such as the Chartered Association of Business Schools’ (CABS) Academic Journal Guide (AJG) and the Financial Times List. Lastly, we present Google Scholar’s metrics, which are more comprehensive and keep growing in relevance. We end with our final comments.

1. Elsevier related

1.1 Scopus

Launched in 2004, Scopus is Elsevier’s scientific publication database. Scopus includes 34,346 peer-reviewed journals in the disciplines of biological sciences, social sciences, physical sciences and health sciences, out of a total of 36,377 titles (22,794 active and 13,583 inactive) from 11,678 publishers (Elsevier, 2023a, 2023b). Three source types are covered: book series, scientific journals and trade journals. Four numerical quality measures are calculated for each journal (h-index, CiteScore, Scimago Journal & Country Rank [SJR] and Source Normalised Impact per Paper [SNIP]) and examined annually to determine whether the journal meets the standard required to remain in the Scopus database (Elsevier, 2016, 2021). Each of these measures is detailed below.

1.2 Scopus h-index

A journal’s h-index is the largest number h such that h of its articles have each been cited at least h times (Elsevier, 2021). Beyond academic journals, the h-index is also applied to individual scientists and to countries, for instance, and it captures not only the scientific production of journals but also their scientific influence (Elsevier, 2021).
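For illustration, a minimal sketch of the computation in generic Python (not Elsevier’s implementation; the citation counts are hypothetical):

    # h-index: largest h such that h papers have at least h citations each.
    def h_index(citation_counts):
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Example: five papers with these (hypothetical) citation counts yield an
    # h-index of 3, since three papers are cited at least three times each.
    print(h_index([10, 8, 3, 2, 1]))  # -> 3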

1.3 CiteScore

CiteScore is a metric representing the annual average number of citations to recent journal publications (Elsevier, 2016). Elsevier introduced this journal rating metric in December 2016 as an alternative to the commonly used Journal Citation Reports (JCR) impact factors (calculated by Clarivate) (Elsevier, 2016, 2021, 2022).

To calculate a journal’s CiteScore, one counts the citations received over a four-year window by the articles, reviews, conference papers, book chapters and other documents the journal published in that window. This number is then divided by the number of documents of the same categories indexed in Scopus and published over the same period (Elsevier, 2021).
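Expressed as a formula (using the 2022 CiteScore as an illustrative year):

\[
\mathrm{CiteScore}_{2022} = \frac{\text{citations received in 2019--2022 by documents published in 2019--2022}}{\text{number of documents indexed in Scopus and published in 2019--2022}}
\]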

CiteScore annual calculations reveal the typical number of citations for the whole year. The measured impact, however, may shift more quickly than that. Thus, the CiteScore Tracker offers a current picture of how a journal performs throughout the year, aggregating it monthly (Elsevier, 2022). In addition, new titles will have their CiteScore metrics the year after Scopus indexes them for the first time (Elsevier, 2022).

1.4 Scimago Journal & Country Rank

The SJR is a freely accessible portal that provides journal and country scientific indicators derived from the Scopus database (Scimago, 2023). These indicators help evaluate and analyse scientific fields. Individual journals can be compared or examined separately, as can country rankings. The SJR indicator itself weights each citation by the prestige of the citing journal, so citations from highly cited sources count for more. Academic journals may be classified according to their subject area (27 major thematic areas), subject category (309 specific subject categories) or country (Scimago, 2023). Citation data are drawn from over 34,100 titles published by more than 5,000 international publishers, and country performance measures cover 239 countries worldwide (Scimago, 2023).

1.5 Source Normalised Impact per Paper

The SNIP is a statistic that intrinsically accounts for differences in citation practices between fields. This is accomplished by comparing the number of citations per publication in a journal with the citation potential of its field, which reflects how frequently papers in that field cite other papers (Elsevier, 2021). As a single citation is worth more for journals in fields where citations are less likely, and vice versa, SNIP quantifies contextual citation impact and permits direct comparison of journals across subject domains (Elsevier, 2021). SNIP is derived yearly from Scopus data.
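In simplified terms (a sketch of the underlying idea only; the official calculation involves additional normalisation steps):

\[
\mathrm{SNIP} = \frac{\text{raw impact per paper (citations per publication in the journal)}}{\text{citation potential of the journal's subject field}}
\]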

2. Clarivate related

2.1 Web of Science

A collection of bibliographic references from interdisciplinary fields, Web of Science (WoS) – previously known as Web of Knowledge – includes publications from the social, natural and behavioural sciences as well as the humanities.

The WoS Core Collection is connected to the WoS interdisciplinary platform through data and patent indexes (Clarivate, 2023). It is a comprehensive platform that enables researchers to follow concepts through time and disciplinary boundaries using data from more than 155 million records and approximately 1.7 billion cited references. Almost 34,000 journals are included in the WoS database (Clarivate, 2023).

2.2 Journal Citation Reports

Clarivate Analytics publishes the JCR every year. It is accessible through the WoS Core Collection and is connected to the WoS database (Clarivate, 2022). It provides metrics for scholarly social and natural science journals, most notably the Journal Impact Factor (JIF). The JCR was initially released as a component of the Science Citation Index. Today, reflecting the breadth of indexed journals, the JCR draws its citations from the Science Citation Index Expanded (SCIE), the Social Sciences Citation Index (SSCI), the Arts & Humanities Citation Index (AHCI) and the Emerging Sources Citation Index (ESCI) (Clarivate, 2022). Other significant Clarivate indexes used in determining the JIF and other metrics are the Conference Proceedings Citation Index (CPCI) and the Book Citation Index (BKCI) (Clarivate, 2022).

2.3 Journal Impact Factor

The JIF of an academic publication is a scientometric index calculated by Clarivate that measures the mean number of citations received in a given year by papers published in that journal during the previous two years, as indexed by the WoS (Clarivate, 2022). As a journal-level indicator, it is widely used as a stand-in for a journal’s relative significance within its field: journals with higher impact factors are regarded as more significant or prestigious within their disciplines than those with lower values (Clarivate, 2022). JIF metrics apply to journals indexed in the SCIE and SSCI (Clarivate, 2022).

The JIF is defined as the citations received in the JCR year by items the journal published in the previous two years, divided by the total number of scholarly items (articles and reviews) the journal published in those two years (Clarivate, 2022). The JCR year is the last complete year in that year’s JCR data collection; for instance, 2021 is the JCR year for the 2022 release.
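Expressed as a formula (using the 2021 JCR year as an illustration):

\[
\mathrm{JIF}_{2021} = \frac{\text{citations received in 2021 by items published in 2019--2020}}{\text{citable items (articles and reviews) published in 2019--2020}}
\]

For example, a journal that published 100 citable items in 2019-2020 and whose 2019-2020 content received 250 citations during 2021 would have a 2021 JIF of 2.5 (the figures are hypothetical).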

The numerator of the JIF consists of any citation from material published in the JCR year to material published in the journal in the previous two years, regardless of the type of item cited (Clarivate, 2022). Each cited reference in a scholarly publication is an indication of impact; as a result, regardless of the cited document type, the JCR combines all citations to a specific journal in the numerator. The citations in the JIF numerator are derived from all indexes in the WoS Core Collection: SCIE, SSCI, AHCI, ESCI, CPCI and BKCI (Clarivate, 2022). Citable items are all materials indexed as articles or reviews in the WoS; the denominator does not include items of any other document type, such as editorial content or letters (Clarivate, 2022).

2.4 Journal Citation Indicator

The Journal Citation Indicator (JCI) is a field-normalised measure calculated for all journals in the WoS Core Collection that appear in the JCR. Its value is the average category-normalised citation impact of the papers a journal published in the previous three years (Clarivate, 2021). The 2022 JCI, for example, was calculated for journals that published citeable items in 2019, 2020 and 2021, considering all citations from any document indexed between 2019 and 2022.

The JCI is the mean Category Normalized Citation Impact (CNCI) of all articles and reviews published in the previous three years (Clarivate, 2021). CNCI is an article-level statistic that normalises along three key dimensions: field (category), document type (article, review, etc.) and year of publication. CNCI measures a paper’s relative citation impact as the ratio of its citations to a worldwide baseline (Clarivate, 2021). For example, a CNCI of 1.0 reflects the global average; values above 1.0 indicate greater than average citation impact (e.g. 2.0 is twice the average), whereas values below 1.0 indicate less than average citation impact (e.g. 0.5 is half the average).
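In formula form (following the description above):

\[
\mathrm{CNCI}_i = \frac{c_i}{e_i}, \qquad \mathrm{JCI} = \frac{1}{n}\sum_{i=1}^{n}\mathrm{CNCI}_i
\]

where c_i is the number of citations received by paper i, e_i is the expected (average) number of citations for papers of the same category, document type and publication year, and n is the number of articles and reviews the journal published in the three-year window.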

3. National references and lists

3.1 Chartered Association of Business Schools Academic Journal Guide

The AJG’s objective is to assist scholars in making informed decisions about the journals in which they may want to publish. It offers information about a wide range of journals covering the main topics of business and management studies (CABS, 2021).

The AJG’s ratings are determined through peer review and editorial and expert judgement, following the evaluation of numerous publications, and are supplemented by citation statistics (CABS, 2021). The guide is not based on a weighted average of journal metrics; metrics inform, rather than determine, the ratings. The position of journals rests on discussions held by the Scientific Committee’s subject specialists with professional peers and academic associations about the relative standing of journals in each subject area (CABS, 2021). The guide uses a grading scale of 1 to 4*, with 4* being the highest score, to represent the quality of business and management journals (CABS, 2021).

3.2 Australian Business Deans Council Journal Quality List

The Australian Business Deans Council (ABDC) is responsible for creating an index of journals known as the ABDC Journal Quality List. Authors may use this list to evaluate the quality of business journals. The ABDC list uses the following rating scale: A*, A, B and C, with A* and A denoting the highest-ranked publications in the field of business (ABDC, 2019).

3.3 Financial Times List

This is a list of the journals considered of major value in business and related fields. Numerous business schools worldwide use the Financial Times List to assess the quality and impact of the research published by their academic staff.

3.4 Comité National de la Recherche Scientifique – categorisation of journals in economics and management

The journal list of the National Committee for Scientific Research (CNRS) is an essential and widely recognised reference tool in France and internationally (CNRS, 2020). The list is intended to include journals that publish mainly articles in economics or management. The journals are categorised into four levels, as in previous editions of the list.

  • Category 1 brings together the journals that structure the fields of economics and management and that regularly publish particularly innovative articles (CNRS, 2020). Their refereeing process is demanding and transparent, and most apply strict ethical rules, such as banning submissions by members of the editorial committee and keeping self-citation low.

  • Category 2 brings together highly selective journals with a demanding and transparent refereeing process (CNRS, 2020). These journals regularly welcome significant and occasionally very innovative contributions and can play a structuring role in certain scientific fields or schools of thought. The visibility of published works is therefore substantial.

  • Category 3 comprises selective journals with a demanding and transparent refereeing process (CNRS, 2020). These journals can accommodate important contributions, and the visibility of the published work remains broad.

  • Category 4 brings together journals whose refereeing process respects international standards but which are less selective (CNRS, 2020). These journals welcome original contributions aimed at a relatively small community, particularly on national issues or topics of specific interest.

The journals are also classified by thematic field; these fields may evolve as the disciplines themselves evolve.

3.5 Brazilian lists

3.5.1 Qualis, by the Coordination for the Improvement of Higher Education Personnel (CAPES).

Qualis serves as Brazil’s official classification of scientific production. It is maintained by the Coordination for the Improvement of Higher Education Personnel (CAPES), an agency linked to the Brazilian Ministry of Education. Qualis aims to categorise and assess the outlets used to disseminate scientific publications (ABCD USP, 2023). The classification is based on a system of grades reflecting the journal’s quality and its level of dissemination in national or international contexts (ABCD USP, 2023). It is organised into the following levels: A1, A2, A3, A4, B1, B2, B3 and B4, where A1 is the highest and B4 the lowest.

3.5.2 Scientific Periodicals Electronic Library (SPELL).

Launched in 2012, SPELL is a platform that collects the scientific output of Brazilian publications made available online in management and related fields. It brings together a variety of scientific publications that are all freely available and searchable. Since 2015, SPELL has used the following indices to assess the impact of its indexed journals:

  • the average number of references per article;

  • the two-year and five-year impact factors;

  • the immediacy index;

  • the self-citation rate;

  • the two-year and five-year impact factors excluding self-citations;

  • the citation half-life; and

  • the h-index (Rafael, 2023).

4. Google Scholar

Google Scholar Metrics allows authors to quickly assess the visibility and influence of articles in academic journals. To assist authors in choosing where to publish their research, Google Scholar Metrics compiles citations to a wide range of publications (Google Scholar, 2022).

4.1 Google Scholar Metrics

A journal’s h-index in Google Scholar, as in Scopus, is the largest number h such that at least h articles in that publication were cited at least h times each (Google Scholar, 2022). The h-core of a journal is the set of its h most-cited articles; these are the articles upon which the h-index is based (Google Scholar, 2022).

A publication’s h-median is the median of the citation counts in its h-core (Google Scholar, 2022). The h-median measures the distribution of citations to the h-core articles.

A publication’s h5-index, h5-core and h5-median are the h-index, h-core and h-median of only those articles released during the previous five full calendar years (Google Scholar, 2022).
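Putting these definitions together, a minimal sketch in generic Python (not Google Scholar’s implementation; the article data are hypothetical):

    # h5-index, h5-core and h5-median from (publication_year, citations) pairs,
    # keeping only articles from the previous five full calendar years.
    from statistics import median

    def h5_metrics(articles, current_year=2023):
        recent = sorted((c for year, c in articles
                         if current_year - 5 <= year <= current_year - 1),
                        reverse=True)
        h5 = 0
        for rank, cites in enumerate(recent, start=1):
            if cites >= rank:
                h5 = rank
            else:
                break
        h5_core = recent[:h5]                      # the h5 most-cited articles
        h5_median = median(h5_core) if h5_core else 0
        return h5, h5_core, h5_median

    papers = [(2019, 12), (2020, 9), (2021, 7), (2021, 3), (2022, 2), (2017, 50)]
    # The 2017 paper falls outside the window; h5-index = 3, h5-core = [12, 9, 7],
    # h5-median = 9.
    print(h5_metrics(papers))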

5. Final comments

As we have mentioned, we hope this editorial helps our readers better understand how to deal with so many indexers, metrics and rankings, increasing their chances of publication and, consequently, achieving better results in their careers.

The list of metrics presented above is not exhaustive. However, it covers some of the best-known and most widely used criteria for choosing where to submit a paper. Depending on the authors’ context, such as their country and their institutions’ publication strategies, other factors may also be worth considering to make a proper choice.

References

ABDC. (2019). ABDC Journal Quality List released. Retrieved from https://abdc.edu.au/2019-abdc-journal-quality-list-released/ (accessed 24 February 2023).

ABCD USP. (2023). Periódicos Qualis CAPES. Agência de Bibliotecas e Coleções Digitais da Universidade de São Paulo. Retrieved from www.abcd.usp.br/apoio-pesquisador/escrita-publicacao-cientifica/selecao-revistas-publicacao/qualis-periodicos/ (accessed 21 March 2023).

Aguinis, H., Cummings, C., Ramani, R. S., & Cummings, T. G. (2020). An A is an A: the new bottom line for valuing academic research. Academy of Management Perspectives, 34(1), 135-154, doi: 10.5465/amp.2017.0193.

Aguinis, H., Shapiro, D. L., Antonacopoulou, E., & Cummings, T. G. (2014). Scholarly impact: a pluralist conceptualization. Academy of Management Learning & Education, 13(4), 623-639, doi: 10.5465/amle.2014.0121.

CABS. (2021). Academic Journal Guide 2021. Chartered Association of Business Schools. Retrieved from https://charteredabs.org/academic-journal-guide-2021/ (accessed 24 February 2023).

Cargill, M., & Burgess, S. (Eds) (2017). Publishing research in English as an additional language: Practices, pathways and potentials, Adelaide: University of Adelaide Press, doi: 10.20851/english-pathways.

Clarivate. (2021). Introducing the journal citation indicator. Retrieved from https://clarivate.com/wp-content/uploads/dlm_uploads/2021/05/Journal-Citation-Indicator-discussion-paper.pdf (accessed 24 February 2023).

Clarivate. (2022). Journal Citation Reports reference guide. Retrieved from https://clarivate.com/wp-content/uploads/dlm_uploads/2022/06/JCR-2022-Reference-Guide.pdf (accessed 24 February 2023).

Clarivate. (2023). Web of Science coverage details. Resources for Librarians – LibGuides at Clarivate Analytics, Retrieved from https://clarivate.libguides.com/librarianresources/coverage#:~:text=The%20Web%20of%20Science%20provides,research%20results%20and%20measure%20impact (accessed 24 February 2023).

CNRS. (2020). Catégorisation des revues en économie et en gestion. Retrieved from https://ftp.gate.cnrs.fr/IMG/pdf/categorisation37_liste_juin_2020-2.pdf (accessed 24 February 2023).

De Rond, M., & Miller, A. N. (2005). Publish or perish. Journal of Management Inquiry, 14(4), 321-329, doi: 10.1177/1056492605276850.

Elsevier. (2016). CiteScore: a new metric to help you track journal performance and make decisions. Elsevier.com, Retrieved from www.elsevier.com/connect/editors-update/citescore-a-new-metric-to-help-you-choose-the-right-journal (accessed 24 February 2023).

Elsevier. (2021). Measuring a journal’s impact. Elsevier.com, Retrieved from www.elsevier.com/authors/tools-and-resources/measuring-a-journals-impact#:~:text=Source%20Normalized%20Impact%20per%20Paper%20(SNIP)%20is%20a%20sophisticated%20metric,of%20publications%20citing%20that%20journal (accessed 24 February 2023).

Elsevier. (2023a). What is Scopus about? Scopus: Access and use Support Center, Retrieved from https://service.elsevier.com/app/answers/detail/a_id/15100/supporthub/scopus/related/1/ (accessed 24 February 2023).

Elsevier. (2023b). The CiteScore metrics advantage. CiteScore metrics are freely available on Scopus | Elsevier solutions, Retrieved from www.elsevier.com/solutions/scopus/how-scopus-works/metrics/citescore (accessed 24 February 2023).

Google Scholar. (2022). Google Scholar Metrics. Retrieved from https://scholar.google.com/intl/en/scholar/metrics.html#overview (accessed 24 February 2023).

Linton, J. (2012). How to get your papers rejected (or not). Technovation, 32(1), 6-8, doi: 10.1016/j.technovation.2011.09.006.

Rafael, S. (2023). Spell: ten years of contribution to science. ANPAD, Retrieved from https://anpad.org.br/en/newsletter-news/january-march-2023-edition-volume-3-issue-1/news/spell-ten-years-of-contribution-to-science/ (accessed 21 March 2023).

Scimago. (2023). SJR – Scimago Journal & Country Rank. SJR – About Us, Retrieved from www.scimagojr.com/aboutus.php (accessed 24 February 2023).

Sun, H., & Linton, J. D. (2014). Structuring papers for success: making your paper more like a high impact publication than a desk reject. Technovation, 34(10), 571-573, doi: 10.1016/j.technovation.2014.07.008.

Further reading

Stolowy, H. (2017). Letter from the editor: why are papers desk rejected at European Accounting Review? European Accounting Review, 26(3), 411-418, doi: 10.1080/09638180.2017.1347360.
