Insights into the accuracy of the World’s Top 2% of Scientists list by Stanford University
Description
The tools and methods commonly used to measure the impact of an individual's research are the h-index, citation counts, altmetrics, and the impact factor of the journals in which the work is published. The databases widely used for the h-index and citation counts are Web of Science, Scopus, and Google Scholar. Web of Science indexes over 10,000 journals, while Scopus indexes over 15,000. Google Scholar indexes more journals and more publication types than the other two databases, but it is neither comprehensive nor accurate, because its profiles are maintained by the individual researchers themselves rather than curated centrally. The h-index is considered a measure of both the scientific productivity and the scientific impact of a scientist. In citation analysis, the number of times an article is cited by other works is used to measure the impact of a publication or author. Altmetrics are based on the attention a published work receives through social media, citations, and article downloads. The impact factor (IF) is a measure of the importance or rank of a journal; it is based on the citations received in a given year by the papers the journal published in the previous two years.

A few years ago, John Ioannidis and co-authors at Stanford University created a unique, publicly available database of the world's top-cited scientists using Scopus data from Elsevier [1-3]. Touted as a new approach to eliminate the misuse of citation metrics, this database has enthralled the scientific community, institutions, and the media. Many institutions have used it as a yardstick to assess the quality of researchers; some have widely publicized the list and rewarded the researchers named on it. At the same time, some researchers view the list with skepticism, citing problems with the methodology behind the c-score-based ranking.

Since my name figures in both the career-long and single-year lists, I have carried out a detailed analysis of the metric parameters to evaluate the accuracy of the c-score-based ranking. In total, 2,578 and 4,635 Indian authors appear in the career-long and single-year categories, respectively. For comparison of the rankings, the Nobel laureates in physics, chemistry, and medicine of the last 25 years (1998-2022) and the top 100 rank holders in the list were chosen. The latest career-long and single-year databases (2022) were used for this analysis, and the details are presented here.

Although the accompanying article states that selection is based on the top 100,000 scientists by c-score (with and without self-citations) or a percentile rank of 2% or above in a sub-field, the actual career-based ranking list contains 204,644 names [1], and the single-year database contains 210,199 names. If 100,000 scientists correspond to the top 2%, the underlying population is about 5 million, so a list of over 200,000 names actually covers roughly the top 4%! The presence of many authors with a single-digit h-index and a very meagre total number of citations indicates serious shortcomings in the c-score-based ranking methodology.
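For concreteness, the following minimal Python sketch shows how the two metrics defined above, the h-index and the two-year impact factor, are conventionally computed. All input values are invented for illustration and are not taken from the Scopus database.

```python
def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def impact_factor(citations_this_year, papers_prev_two_years):
    """Two-year impact factor: citations received this year by items the
    journal published in the previous two years, divided by the number of
    citable items it published in those two years."""
    return citations_this_year / papers_prev_two_years

# Hypothetical example values, for illustration only.
print(h_index([25, 8, 5, 4, 3, 1, 0]))  # -> 4 (four papers have >= 4 citations)
print(impact_factor(450, 150))          # -> 3.0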
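The c-score itself is a composite indicator. As described in [1,2], it combines six citation metrics: total citations (NC), the h-index (H), the co-authorship-adjusted hm-index (Hm), and citations to papers as single author (NCS), as single or first author (NCSF), and as single, first, or last author (NCSFL). The sketch below is my reading of that description, not the authors' own code: each indicator is log-transformed, rescaled by the maximum observed value in the database, and the six terms are summed.

```python
import math

# The six indicators entering the composite c-score, per [1,2].
FIELDS = ["NC", "H", "Hm", "NCS", "NCSF", "NCSFL"]

def composite_score(author, maxima):
    """Sum of log-rescaled indicators: each contributes
    ln(1 + value) / ln(1 + max value over all authors),
    so the c-score lies between 0 and 6."""
    return sum(math.log1p(author[f]) / math.log1p(maxima[f]) for f in FIELDS)

# Hypothetical values for one author and for the database-wide maxima.
author = {"NC": 12000, "H": 45, "Hm": 28.5, "NCS": 900, "NCSF": 4200, "NCSFL": 7800}
maxima = {"NC": 5.3e5, "H": 264, "Hm": 96.1, "NCS": 4.5e4, "NCSF": 2.1e5, "NCSFL": 3.4e5}
print(round(composite_score(author, maxima), 3))
```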
Files
Steps to reproduce
Although the c-score-based world ranking captures the overall impact of a scientist's research papers, it does not give an accurate account of the impact of original research papers. The c-score-based ranking also suffers from the same problems as citation- or h-index-based rankings (e.g. Research.com, AD Scientific Index, IRINS). The lack of normalization of citations with respect to the total number of publications, co-authors, and review papers (a minimal sketch of one such normalization is given after the reference list); the non-inclusion of publications in journals not indexed by Scopus; the failure to map some author names in the Scopus database; the excessive citation of review papers; the extra benefit given to first, last, and single authors; the lack of credit for corresponding authors; and anomalies in subject classification are some of the shortcomings of the methodology used in the c-score-based ranking.

Finally, although citations and the h-index are, respectively, influence and productivity indicators of scholarly work, they need not be the ultimate parameters revealing the true impact of discoveries. Furthermore, coercive citations and unethical publishing practices are hampering the true assessment of scholarly publications.

References:
[1] Ioannidis JPA, et al. (2019) A standardized citation metrics author database annotated for scientific field. PLoS Biol 17(8): e3000384. pmid:31404057
[2] Ioannidis JPA, et al. (2020) Updated science-wide author databases of standardized citation indicators. PLoS Biol 18(10): e3000918. https://doi.org/10.1371/journal.pbio.3000918
[3] Ioannidis JPA (2023) Updated science-wide author databases of standardized citation indicators, 4 October 2023 (Version 6). DOI: 10.17632/btchxktzyw.6
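To illustrate the normalization issue raised above, here is a minimal sketch of co-author-fractional citation counting, one common way to normalize citations by the number of co-authors. It is not part of the published c-score methodology, and the numbers are invented.

```python
def fractional_citations(papers):
    """Each paper's citations are divided by its co-author count,
    so heavily multi-authored papers contribute proportionally less."""
    return sum(cites / n_authors for cites, n_authors in papers)

# Hypothetical (citations, co-author count) pairs for one author.
papers = [(120, 2), (60, 30), (15, 1)]
print(fractional_citations(papers))  # 60.0 + 2.0 + 15.0 = 77.0
```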