1. Correlating article citedness and journal impact: an empirical investigation by field on a large-scale dataset. Scientometrics 2023. DOI: 10.1007/s11192-022-04622-0
Abstract
In spite of previous research demonstrating the risks involved, and counsel against the practice as early as 1997, some research evaluations continue to use journal impact alone as a surrogate for the citation impact of the articles a journal hosts. Such usage has also been taken up by research administrators and policy-makers, with very serious implications. The aim of this work is to investigate the correlation between the citedness of a publication and the impact of the host journal. We extend the analyses of previous literature to all STEM fields. We also assess whether this correlation varies across fields and is stronger for highly cited authors than for lowly cited ones. Our dataset consists of almost one million authorships of 2010–2019 publications authored by about 28,000 professors in 230 research fields. Results show a low correlation between the two indicators, more so for lowly cited authors than for highly cited ones, although differences occur across fields.
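Studies like the one above typically report a rank correlation between per-article citation counts and the impact indicator of the host journal. As a rough, self-contained sketch of how such a correlation is computed, here is a Spearman rank correlation on synthetic data (the citation counts and journal impact scores below are invented for illustration, not the study's dataset):

```python
# Illustrative only: Spearman rank correlation between article citation
# counts and the impact indicator of each article's host journal.

def ranks(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Synthetic example: citation counts vs. host-journal impact scores.
citations = [0, 2, 5, 1, 40, 3, 8, 0, 12, 6]
journal_impact = [1.2, 3.5, 2.1, 4.0, 2.8, 1.9, 3.1, 2.5, 5.2, 1.4]
rho = spearman(citations, journal_impact)
print(round(rho, 3))
```

A weak rho on real data is what undermines using journal impact as a stand-in for article citedness.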
2. Superior identification index: Quantifying the capability of academic journals to recognize good research. Scientometrics 2022. DOI: 10.1007/s11192-022-04372-z
3. Zhang L, Wei Y, Sivertsen G. The motivations and criteria behind China's list of questionable journals. Learned Publishing 2022. DOI: 10.1002/leap.1456
Affiliation(s)
- Lin Zhang
  - Center for Science, Technology & Education Assessment (CSTEA), School of Information Management, Wuhan University, Wuhan, Hubei Province, China
  - Center for Studies of Information Resources, School of Information Management, Wuhan University, Wuhan, Hubei Province, China
  - Centre for R&D Monitoring (ECOOM) and Department MSI, Leuven, Belgium
- Yahui Wei
  - Center for Science, Technology & Education Assessment (CSTEA), School of Information Management, Wuhan University, Wuhan, Hubei Province, China
  - Center for Studies of Information Resources, School of Information Management, Wuhan University, Wuhan, Hubei Province, China
- Gunnar Sivertsen
  - Nordic Institute for Studies in Innovation, Research and Education, Oslo, Norway
- Ying Huang
  - Center for Science, Technology & Education Assessment (CSTEA), School of Information Management, Wuhan University, Wuhan, Hubei Province, China
  - Center for Studies of Information Resources, School of Information Management, Wuhan University, Wuhan, Hubei Province, China
  - Centre for R&D Monitoring (ECOOM) and Department MSI, Leuven, Belgium
4.
Affiliation(s)
- Mark S. Allen
  - School of Psychology, University of Wollongong, NSW, Australia
- Dragos Iliescu
  - Faculty of Psychology and Educational Sciences, University of Bucharest, Romania
5. Huang Y, Li R, Zhang L, Sivertsen G. A comprehensive analysis of the journal evaluation system in China. Quantitative Science Studies 2021. DOI: 10.1162/qss_a_00103. Open Access.
Abstract
Journal evaluation systems reflect how new insights are critically reviewed and published, and the prestige and impact of a discipline’s journals are a key metric in many research assessment, performance evaluation, and funding systems. With the expansion of China’s research and innovation systems and its rise as a major contributor to global innovation, journal evaluation has become an especially important issue. In this paper, we first describe the history and background of journal evaluation in China and then systematically introduce and compare the currently most influential journal lists and indexing services. These are the Chinese Science Citation Database (CSCD), the Journal Partition Table (JPT), the AMI Comprehensive Evaluation Report (AMI), the Chinese S&T Journal Citation Report (CJCR), “A Guide to the Core Journals of China” (GCJC), the Chinese Social Sciences Citation Index (CSSCI), and the World Academic Journal Clout Index (WAJCI). Some other influential lists produced by government agencies, professional associations, and universities are also briefly introduced. Through the lens of these systems, we provide comprehensive coverage of the tradition and landscape of journal evaluation in China, its methods and practices, and some comparisons with how other countries assess and rank journals.
Affiliation(s)
- Ying Huang
  - School of Information Management, Wuhan University, China
  - Centre for R&D Monitoring (ECOOM) and Dept. MSI, KU Leuven, Belgium
- Ruinan Li
  - School of Information Management, Wuhan University, China
- Lin Zhang
  - School of Information Management, Wuhan University, China
  - Centre for R&D Monitoring (ECOOM) and Dept. MSI, KU Leuven, Belgium
- Gunnar Sivertsen
  - Nordic Institute for Studies in Innovation, Research and Education, Tøyen, Oslo, Norway
7. Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World. Publications 2021. DOI: 10.3390/publications9010012. Open Access.
Abstract
Nowadays, the importance of bibliographic databases (DBs) has increased enormously, as they are the main providers of publication metadata and bibliometric indicators universally used both for research assessment practices and for performing daily tasks. Because the reliability of these tasks depends first of all on the data source, all users of DBs should be able to choose the one most suitable for their needs. Web of Science (WoS) and Scopus are the two main bibliographic DBs. A comprehensive evaluation of the DBs’ coverage is practically impossible without extensive bibliometric analyses or literature reviews, but most DB users do not have bibliometric competence and/or are not willing to invest additional time in such evaluations. Apart from that, the convenience of a DB’s interface, its performance, the impact indicators it provides, and its additional tools may also influence the users’ choice. The main goal of this work is to provide all potential users with an all-inclusive description of the two main bibliographic DBs by gathering in one place the findings presented in the most recent literature and the information provided by the owners of the DBs. This overview should aid all stakeholders employing publication and citation data in selecting the most suitable DB.
9. Pech G, Delgado C. Assessing the publication impact using citation data from both Scopus and WoS databases: an approach validated in 15 research fields. Scientometrics 2020. DOI: 10.1007/s11192-020-03660-w
10. Towards a More Realistic Citation Model: The Key Role of Research Team Sizes. Entropy 2020; 22:875. PMID: 33286646; PMCID: PMC7517479; DOI: 10.3390/e22080875
Abstract
We propose a new citation model which builds on existing models that explicitly or implicitly include “direct” and “indirect” (learning about a cited paper’s existence from references in another paper) citation mechanisms. Our model departs from the usual, unrealistic assumption of a uniform probability of direct citation, in which initial differences in citation arise purely randomly. Instead, we demonstrate that a two-mechanism model in which the probability of direct citation is proportional to the number of authors on a paper (team size) is able to reproduce the empirical citation distributions of articles published in the field of astronomy remarkably well, and at different points in time. The interpretation of our model is that the intrinsic citation capacity, and hence the initial visibility of a paper, is enhanced when more people are intimately familiar with the work, favoring papers from larger teams. While the intrinsic citation capacity cannot depend only on team size, our model demonstrates that it must be correlated with it to some degree, and distributed in a similar way, i.e., have a power-law tail. Consequently, our team-size model qualitatively explains the existence of a correlation between the number of citations and the number of authors on a paper.
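A minimal toy sketch of the two-mechanism idea described above: direct citations are drawn with probability proportional to team size, and indirect citations are approximated here by preferential attachment to citations already received. The mixing probability, team sizes, and citation volume below are invented for illustration; the paper's actual model specification and fitting procedure differ.

```python
import random

def simulate_citations(team_sizes, n_citations, p_direct=0.7, seed=42):
    """Toy two-mechanism citation model.

    Direct citations pick a paper with probability proportional to its
    team size; indirect citations pick a paper with probability
    proportional to the citations it has already received.
    """
    rng = random.Random(seed)
    cites = [0] * len(team_sizes)
    total_team = sum(team_sizes)
    for _ in range(n_citations):
        if rng.random() < p_direct or sum(cites) == 0:
            # Direct mechanism: weight by team size.
            r = rng.uniform(0, total_team)
            weights = team_sizes
        else:
            # Indirect mechanism: weight by current citation counts.
            r = rng.uniform(0, sum(cites))
            weights = cites
        acc = 0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                cites[i] += 1
                break
    return cites

# 50 single-author papers vs. 5 papers with 20-person teams.
cites = simulate_citations([1] * 50 + [20] * 5, n_citations=5000)
print(sum(cites[:50]) / 50, sum(cites[50:]) / 5)
```

Papers from larger teams accumulate more citations on average, and the indirect mechanism then amplifies early differences, qualitatively matching the correlation the abstract describes.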
12. Antonoyiannakis M. Impact factor volatility due to a single paper: A comprehensive analysis. Quantitative Science Studies 2020. DOI: 10.1162/qss_a_00037. Open Access.
Abstract
We study how a single paper affects the impact factor (IF) of a journal by analyzing data from 3,088,511 papers published in 11,639 journals in the 2017 Journal Citation Reports of Clarivate Analytics. We find that IFs are highly volatile. For example, the top-cited paper of 381 journals caused their IF to increase by more than 0.5 points, while for 818 journals the relative increase exceeded 25%. One in 10 journals had their IF boosted by more than 50% by their top three cited papers. Because the single-paper effect on the IF is inversely proportional to journal size, small journals are rewarded much more strongly than large journals for a highly cited paper, while they are penalized more for a low-cited paper, especially if their IF is high. This skewed reward mechanism incentivizes high-IF journals to stay small in order to remain competitive in rankings. We discuss the implications for breakthrough papers appearing in prestigious journals. We question the reliability of IF rankings, given that the IFs of thousands of journals are highly sensitive to a few papers.
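The inverse proportionality to journal size noted above follows from simple arithmetic: adding one paper with c citations to a journal of N papers and impact factor IF shifts the IF by (c - IF) / (N + 1). A quick sketch, with invented journal sizes and citation counts (this illustrates the arithmetic only, not the paper's full analysis):

```python
def if_shift(current_if, n_papers, new_paper_citations):
    """Change in impact factor from adding one paper with the given
    citation count to a journal of n_papers papers: (c - IF) / (N + 1).

    Derivation: IF0 = C/N, IF1 = (C + c)/(N + 1), so
    IF1 - IF0 = (N*c - C) / (N*(N + 1)) = (c - IF0) / (N + 1).
    """
    return (new_paper_citations - current_if) / (n_papers + 1)

# The same highly cited paper (200 citations) added to two journals
# with identical IF = 2.0 but very different sizes:
small_boost = if_shift(2.0, 50, 200)    # 50-paper journal: large jump
large_boost = if_shift(2.0, 5000, 200)  # 5000-paper journal: tiny jump
print(small_boost, large_boost)
```

The shift is negative when c < IF, so a low-cited paper drags a high-IF journal down by the same 1/(N + 1) scaling, which is the skewed reward mechanism the abstract describes.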
Affiliation(s)
- Manolis Antonoyiannakis
  - Department of Applied Physics & Applied Mathematics, Columbia University, 500 W. 120th St., Mudd 200, New York, NY 10027
  - American Physical Society, Editorial Office, 1 Research Road, Ridge, NY 11961-2701
13. Percentile and stochastic-based approach to the comparison of the number of citations of articles indexed in different bibliographic databases. Scientometrics 2020. DOI: 10.1007/s11192-020-03386-9
15. Evaluating research and researchers by the journal impact factor: Is it better than coin flipping? Journal of Informetrics 2019. DOI: 10.1016/j.joi.2019.01.009
16. Lognormal distribution of citation counts is the reason for the relation between Impact Factors and Citation Success Index. Journal of Informetrics 2018. DOI: 10.1016/j.joi.2017.12.007
17. Katchanov YL, Markova YV. The “space of physics journals”: topological structure and the Journal Impact Factor. Scientometrics 2017. DOI: 10.1007/s11192-017-2471-2
18. Zhang L, Rousseau R, Sivertsen G. Science deserves to be judged by its contents, not by its wrapping: Revisiting Seglen's work on journal impact and research evaluation. PLoS One 2017; 12:e0174205. PMID: 28350849; PMCID: PMC5369779; DOI: 10.1371/journal.pone.0174205. Open Access.
Abstract
The scientific foundation for the criticism of the use of the Journal Impact Factor (JIF) in evaluations of individual researchers and their publications was laid between 1989 and 1997 in a series of articles by Per O. Seglen. His basic work has since influenced initiatives such as the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto for research metrics, and The Metric Tide review on the role of metrics in research assessment and management. Seglen studied the publications of only 16 senior biomedical scientists. We investigate whether Seglen’s main findings still hold when the same methods are applied to a much larger group of Norwegian biomedical scientists with more than 18,000 publications. Our results support and add new insights to Seglen’s basic work.
Affiliation(s)
- Lin Zhang
  - Dept. Management and Economics, North China University of Water Resources and Electric Power, Zhengzhou, China
  - Centre for R&D Monitoring (ECOOM) and Dept. MSI, KU Leuven, Belgium
- Ronald Rousseau
  - Dept. Mathematics, KU Leuven & Fac. of Social Sciences, University of Antwerp, Belgium
- Gunnar Sivertsen
  - Nordic Institute for Studies in Innovation, Research and Education, Oslo, Norway