1. Boric S, Reichmann G, Schlögl C. Possibilities for ranking business schools and considerations concerning the stability of such rankings. PLoS One 2024; 19:e0295334. [PMID: 38358966] [PMCID: PMC10868868] [DOI: 10.1371/journal.pone.0295334] [Received: 04/11/2023] [Accepted: 11/21/2023]
Abstract
In this article, we discuss possibilities for ranking business schools and analyse the stability of research rankings under different ranking methods. One focus is a comparison of publication-based rankings with citation-based rankings. Our considerations and discussions are based on a small case study in which we examined all six business schools at public universities in Austria. The innovative aspect of our article is the chosen mix of methods and the explicit comparison of the results of a publication analysis with those of a citation analysis. In addition, we developed a new indicator to check the stability of the obtained ranking results for the individual business schools. The results show that the ranks of the individual business schools are quite stable. Nevertheless, we found some differences between publication-based and citation-based rankings. In both cases, however, the choice of data source as well as switching from full to adjusted counting have only a small impact on the ranking results. The main contribution of our approach to research on university rankings is that it shows that focusing on a single overall indicator should be avoided, as this can easily lead to bias. Instead, different partial indicators should be calculated side by side to provide a more complete picture.
Affiliation(s)
- Sandra Boric
- Department of Journals, Databases, and License Management, University Library Graz, University of Graz, Graz, Austria
- Gerhard Reichmann
- Institute of Operations and Information Systems, University of Graz, Graz, Austria
- Christian Schlögl
- Institute of Operations and Information Systems, University of Graz, Graz, Austria

2. Szluka P, Csajbók E, Győrffy B. Relationship between bibliometric indicators and university ranking positions. Sci Rep 2023; 13:14193. [PMID: 37648684] [PMCID: PMC10468493] [DOI: 10.1038/s41598-023-35306-1] [Received: 12/16/2022] [Accepted: 05/16/2023]
Abstract
A growing interest in demonstrating the prestige and status of higher education institutions has spurred the establishment of several international ranking systems, a large share of which include parameters related to scientific productivity. Here, we examined the differences between diverse rankings as well as their correlation with bibliometric parameters and disciplines for the top universities. We investigated the top 300 universities from four international rankings: the Times Higher Education World University Ranking (THE), the QS World University Rankings (QS), the ShanghaiRanking Academic Ranking of World Universities (ARWU) and the U.S. News Best Global Universities Ranking (USNews). The assessed parameters include ranking positions and the size-related and bibliometrics-related indicators of each selected ranking. The weight of scientometric parameters ranges between 20% (QS) and 75% (USNews). The most important parameters defining ranking positions include citations, international reputation, and the number of researchers, but the correlation strength varies among ranking systems. The absolute numbers of publications and citations are particularly important in the ARWU and USNews rankings, and scientific-category-normalized (field-weighted) citation impact is central in the THE and USNews rankings. Our results confirm that universities with outstanding results in rankings that use size-independent indicators (QS and THE) have significantly fewer students than others. High-impact research can improve position in the ARWU and USNews ranking lists. Regarding disciplines, the main results show that outstanding universities in the THE ranking have higher publication activity in the social sciences, while universities that perform better in the USNews and QS rankings have more publications in science, technology, and medicine and lower scores in the social sciences.
In brief, we present a comprehensive analysis of the correlation between scientometric parameters and university ranking positions, as well as the performance of outstanding universities across disciplines, to help decision makers select parameters to strengthen and to attract the interest of prospective students and their parents through a better understanding of the different rankings.
Affiliation(s)
- Péter Szluka
- Central Library, Semmelweis University, 1088, Budapest, Hungary
- Edit Csajbók
- Central Library, Semmelweis University, 1088, Budapest, Hungary
- Research Center for Natural Sciences, Institute of Enzymology, Magyar Tudósok Körútja 2, 1117, Budapest, Hungary
- Balázs Győrffy
- Department of Bioinformatics, Semmelweis University, Tűzoltó Utca 7-9, 1094, Budapest, Hungary
- 2nd Department of Pediatrics, Semmelweis University, 1094, Budapest, Hungary

3. The moderating effect of altmetrics on the correlations between single and multi-faceted university ranking systems: the case of THE and QS vs. Nature Index and Leiden. Scientometrics 2022. [DOI: 10.1007/s11192-022-04548-7]

4. When a journal is both at the ‘top’ and the ‘bottom’: the illogicality of conflating citation-based metrics with quality. Scientometrics 2022. [DOI: 10.1007/s11192-022-04402-w]

5. On the possibilities of presenting the research performance of an institute over a long period of time: the case of the Institute of Information Science at the University of Graz in Austria. Scientometrics 2022. [DOI: 10.1007/s11192-022-04377-8]
Abstract
In this paper, we demonstrate how the research performance of a university institute (department) over a long period of time can be presented and evaluated. Using the example of an information science institute at a German-speaking university, namely the (former) Institute of Information Science at the University of Graz in Austria, we present the research performance of this institute over the entire duration of its existence (33 years) in different ways. In order to contextualize its performance, we compare it with that of some related institutions from all over the world. Due to the high effort involved in collecting data and the lack of data availability, the comparison must be limited to a period of a few years and, with regard to the institutions from non-German-speaking countries, to the Web of Science as data source. In this international comparison, the institute in the focus of the study shows relatively poor results. As can be seen, the choice of data source has a major influence on the evaluation results. Especially for institutes from non-English-speaking countries with publications in their respective national languages, exclusive use of international databases such as Web of Science or Scopus cannot fully capture the whole research performance. The use of personal publication lists or local research databases seems almost indispensable in these cases. A major novelty of this article is the handling of a very long evaluation period and the discussion of different ways of subdividing it. With regard to the presentation of the results, in the case of a long observation period, not only annual and overall results should be presented, but multi-year comparisons should also be performed. In this way, year-by-year fluctuations can be smoothed out and longer-term developments can be well represented.
6. Pinar M, Horne TJ. Assessing research excellence: Evaluating the Research Excellence Framework. Research Evaluation 2021. [DOI: 10.1093/reseval/rvab042]
Abstract
Performance-based research funding systems have been used extensively around the globe to allocate funds across higher education institutions (HEIs), which has led to a growing body of literature examining their use. The UK’s Research Excellence Framework (REF) uses a peer-review process to evaluate the research environment, research outputs and non-academic impact of research produced by HEIs, in order to produce a more accountable distribution of public funds. However, carrying out such a research evaluation is costly. Given this cost, and given suggestions that the evaluation of each component is subject to bias, among other criticisms, this article uses correlation and principal component analysis to evaluate the REF’s usefulness as a composite evaluation index. As the three elements of the evaluation (environment, impact and output) are highly and positively correlated, removing an element from the evaluation leads to relatively small shifts in the allocation of funds and in the rankings of HEIs. As a result, future evaluations may consider removing some elements of the REF, or reconsider how the different elements are evaluated so as to capture organizational rather than individual achievement.
Affiliation(s)
- Mehmet Pinar
- Business School, Edge Hill University, St Helens Road, Ormskirk, Lancashire L39 4QP, UK
- Timothy J Horne
- Cadman Building, Staffordshire University, College Road, Stoke-on-Trent ST4 2DE, UK

7. Fine-grained academic rankings: mapping affiliation of the influential researchers with the top ranked HEIs. Scientometrics 2021. [DOI: 10.1007/s11192-021-04138-z]

8. University rankings and institutional affiliations: Role of academic librarians. Journal of Academic Librarianship 2021. [DOI: 10.1016/j.acalib.2021.102387]

10. Selten F, Neylon C, Huang CK, Groth P. A longitudinal analysis of university rankings. Quantitative Science Studies 2020. [DOI: 10.1162/qss_a_00052]
Abstract
Pressured by globalization and by demands for public organizations to be accountable, efficient, and transparent, university rankings have become an important tool for assessing the quality of higher education institutions. It is therefore important to assess exactly what these rankings measure. Here, the three major global university rankings are studied: the Academic Ranking of World Universities, the Times Higher Education ranking and the Quacquarelli Symonds World University Rankings. After a description of the ranking methodologies, it is shown that university rankings are stable over time but that there is variation between the three rankings. Furthermore, using principal component analysis and exploratory factor analysis, we demonstrate that the variables used to construct the rankings primarily measure two underlying factors: a university’s reputation and its research performance. By correlating these factors and plotting regional aggregates of universities on them, differences between the rankings are made visible. Finally, we elaborate on how the results of these analyses can be viewed in light of often-voiced critiques of the ranking process. They indicate that the variables used by the rankings might not capture the concepts they claim to measure. The study thus provides evidence of the ambiguous nature of university rankings’ quantification of university performance.
Affiliation(s)
- Friso Selten
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands
- Cameron Neylon
- Centre for Culture and Technology, Curtin University, Perth, Australia
- Chun-Kai Huang
- Centre for Culture and Technology, Curtin University, Perth, Australia
- Paul Groth
- Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands

11. Valderrama P, Escabias M, Valderrama MJ, Jiménez-Contreras E, Baca P. Influential variables in the Journal Impact Factor of Dentistry journals. Heliyon 2020; 6:e03575. [PMID: 32211547] [PMCID: PMC7082530] [DOI: 10.1016/j.heliyon.2020.e03575] [Received: 08/19/2019] [Revised: 01/10/2020] [Accepted: 03/09/2020]
Abstract
Objective: To determine which variables influence a journal's position, by quartile of the Journal Impact Factor, as a quality indicator in the field of Dentistry.
Methods: Twenty-four journals included in Journal Citation Reports, six from each quartile, were selected by stratified sampling, and an ordinal regression model was then estimated stepwise with the journal impact factor quartile as the response variable.
Results: The estimation procedure concluded that the average number of papers published yearly by a journal and the percentage of systematic reviews are the most significant variables, along with a factor representing the journal's degree of adherence to the recommendations of the International Committee of Medical Journal Editors (ICMJE).
Conclusions/Clinical significance: Systematic reviews and adherence to the ICMJE recommendations have a significant effect on a journal's Impact Factor position, while papers publishing clinical trials have no influence on it. A greater yearly average of published papers means a higher impact factor.
Affiliation(s)
- Pilar Valderrama
- Vice Rectorate for Research and Transfer, University of Granada, 18071, Granada, Spain
- Manuel Escabias
- Department of Statistics and Operations Research, University of Granada, 18071, Granada, Spain
- Mariano J Valderrama
- Department of Statistics and Operations Research, University of Granada, 18071, Granada, Spain
- Pilar Baca
- Department of Dentistry, University of Granada, 18071, Granada, Spain

12. Bornmann L. Bibliometrics-based decision trees (BBDTs) based on bibliometrics-based heuristics (BBHs): Visualized guidelines for the use of bibliometrics in research evaluation. Quantitative Science Studies 2020. [DOI: 10.1162/qss_a_00012]
Abstract
Fast-and-frugal heuristics are simple strategies that base decisions on only a few predictor variables. In so doing, heuristics may not only reduce complexity but also boost the accuracy, speed, and transparency of decisions. In this paper, bibliometrics-based decision trees (BBDTs) are introduced for research evaluation purposes. BBDTs visualize bibliometrics-based heuristics (BBHs), which are judgment strategies using only publication and citation data. The BBDT exemplar presented in this paper can be used as guidance on the question of when simple indicators such as mean citation rates are reasonable and when more elaborate indicators (i.e., (sub-)field-normalized indicators) should be applied.
Affiliation(s)
- Lutz Bornmann
- Administrative Headquarters of the Max Planck Society, Division for Science and Innovation Studies, Hofgartenstraße 8, 80539 Munich, Germany