Article

Novel World University Rankings Combining Academic, Environmental and Resource Indicators

1 Department of Information Management, Chang Gung University, Taoyuan 33302, Taiwan
2 Center for Institutional Research, Chang Gung University, Taoyuan 33302, Taiwan
3 Department of Thoracic Surgery, Chang Gung Memorial Hospital at Linkou, Taoyuan 333423, Taiwan
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(24), 13873; https://doi.org/10.3390/su132413873
Submission received: 30 October 2021 / Revised: 5 December 2021 / Accepted: 13 December 2021 / Published: 15 December 2021
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

World university rankings are regarded as an important tool to assess higher education quality. Several media sources publish world university rankings every year. These ranking results are mainly based on academic indicators, covering research and teaching, with different weightings. However, some of these academic indicators are questionable, which affects the objectivity of the ranking results. In addition, conducting more medical-related studies can enhance research impact scores. Some universities that devote themselves to enhancing these academic indicators lose sight of their original development goals and directions. To make the rankings more comprehensive, it is necessary to consider different viewpoints in the assessment. In other words, the research question of this paper is whether considering different kinds of indicators can provide better ranking results. Therefore, in this paper, we introduce a novel ranking approach that combines academic, environmental, and resource indicators based on the Borda count method. The top 100 world universities from the Academic Ranking of World Universities, QS World University Rankings, Times Higher Education World University Rankings, and U.S. News & World Report are chosen for the analysis. Comparisons between the original and new rankings show that many universities improve in the rankings, while some universities from particular countries drop in the rankings due to the scores obtained from the environmental and resource indicators.

1. Introduction

Higher education is very important to national economies. Knowledge-intensive industries have continued to increase in developing and developed countries [1]. Moreover, the most common core values from university mission statements are research, education, and students [2]. According to Marope et al. [3], since many countries have placed great emphasis on knowledge transfer and management, international talent cultivation has become a major education policy. As a result, global universities have to show their international competitiveness to attract potential students. This makes rankings necessary for higher education evaluation and assessment [4,5,6]. Currently, several well-known media sources, such as Times Higher Education World University Rankings, QS World University Rankings, Academic Ranking of World Universities, and Best Global Universities Rankings, publish world university rankings based on academic indicators every year.
Since obtaining high rankings can enhance universities’ reputations, many universities focus on their teaching and research quality to obtain better assessment results. In other words, university rankings strongly influence universities’ pursuit of excellence in teaching and research, which not only responds to students’ expectations but also attracts more local and international students. Moreover, university rankings affect how governments and industries view universities and influence related policies.
In general, university rankings can help improve higher education courses, evaluate the global market of higher education, enhance learning quality, etc. [7,8]. In particular, the quality of higher education is related to students’ job opportunities [9]. Moreover, students who graduate from highly ranked universities are likely to have higher incomes, better health, and higher quality of life [10]. These findings demonstrate the importance of university rankings for universities and students.
However, some of the existing indicators for higher education assessment, although important, heavily affect many universities’ operations, leading universities to focus on these indicators in order to obtain better ranking results. For example, one research quality indicator in the QS World University Rankings is based on peer reviews. In addition, the ranking results can differ significantly according to the nature of the chosen indicators. That is, some universities can enhance their rankings through good resources and environments, specific research topics, etc. In particular, several objective factors are not considered in the existing indicators of world university rankings: the allocation of privileges and resources to top world universities, which can be harmful to other universities [11,12,13]; the great advantages gained by conducting medical research and collaborating with well-known researchers [14]; and the specific funding amounts obtained from other countries [15].
Moreover, existing indicators do not consider differences between universities regarding the environment, resource allocation, and factors such as demographics, gross domestic product (GDP), education funding, and school asset structure. That is, if related environmental and resource indicators are omitted, the ranking results may be incomplete to a certain extent. In other words, using only the existing indicators for higher education evaluation and assessment may cause some universities to have relatively low rankings and limit their future development.
Therefore, this study introduces related environmental and resource indicators to complement the existing academic indicators, combining the two to produce more comprehensive university ranking results. Moreover, the effect of the new rankings on universities in different regions is examined.
The rest of this paper is organized as follows. In Section 2, the major media sources for world university rankings are reviewed, and the assessment indicators, which are based on open data and questionnaires, are discussed. Section 3 describes the research methodology of this paper. Section 4 presents the research results, and Section 5 concludes the paper.

2. Literature Review

2.1. World University Rankings

University rankings are currently a key tool used globally to assess the quality of higher education. Such evaluation systems have been in operation for more than twenty-five years [16]. Currently, several well-known media sources publish world university rankings, including the Academic Ranking of World Universities (ARWU), Times Higher Education (THE), QS World University Rankings (QS), and U.S. News & World Report (USNWR). The following provides an overview of these sources.

2.1.1. Academic Ranking of World Universities (ARWU)

ARWU is conducted by the Center for World-Class Universities (CWCU) of Shanghai Jiao Tong University. It is claimed to be one of the three most impactful university rankings and to be based on consistent and rigorous indicators, with the advantage of being objective, stable, and transparent. The assessment indicators cover four criteria: quality of education (10%), quality of faculty (40%), research output (40%), and per capita performance (10%).
Among these indicators, the number of alumni of an institution who have won Nobel Prizes and Fields Medals (part of the quality of education criterion) is questionable, since the assessment period spans 100 years [17,18,19]. Universities with a long historical standing can take advantage of this indicator [20,21]. On the other hand, the weighting of research output, which counts papers published in Nature and Science and papers indexed in SCIE and SSCI, was originally 70% and was reduced to 40% in 2014. This reveals that weightings should be defined carefully, since the ranking results can be strongly affected by them.

2.1.2. QS World University Rankings (QS)

QS World University Rankings, published by Quacquarelli Symonds (QS), is also recognized as one of the three most impactful university rankings. Moreover, it is the only university ranking approved by the International Ranking Expert Group (IREG), which can provide consistent and complete ranking results [22].
Universities are evaluated by six metrics: academic reputation (40%), employer reputation (10%), faculty/student ratio (20%), citations per faculty (20%), and international faculty ratio and international student ratio (10% combined). Among these metrics, the ranking results are highly influenced by peer reviews and/or education expert opinions, i.e., academic reputation. For example, the response rate for the academic reputation survey was estimated to be less than 1% in 2006, with approximately 1600 responses received out of 190,000 requests [23,24]. In 2007, 2008, and 2009, the numbers of responses received were 3069, 6354, and 9386, respectively, and to date approximately 70,000 responses have accumulated. That is, although the response rate has increased rapidly, the responses can significantly affect the ranking results. Since anyone can provide a response on the Internet, the qualification of the reviewers under this method is questionable [21,25,26,27].

2.1.3. Times Higher Education World University Rankings (THE)

Times Higher Education World University Rankings have been published by Thomson Reuters since 2010 and are recognized as among the three most impactful university rankings. The ranking is based on 13 separate indicators grouped under five categories: teaching, i.e., the learning environment (30%); research, i.e., volume, income, and reputation (30%); citations, i.e., research influence (32.5%); international diversity (5%); and industry income, i.e., innovation (2.5%).
Although THE considers a variety of indicators, several criticisms appear in the related literature. In particular, as citation impact is the most important indicator used to generate the rankings, it is problematic for universities that do not use English as their primary language, especially in the disciplines of the social sciences and humanities [28].

2.1.4. U.S. News & World Report (USNWR)

The USNWR is an American media company that has published an annual set of rankings of American colleges and universities since 1983. The rankings are based on related information collected from educational institutions via an annual survey, government and third-party data sources, and school websites. In addition, opinion surveys of university faculty and administrators outside the school are considered.
The USNWR ranks universities by a number of indicators, including peer assessment (20%), retention (22%), social mobility (5%), faculty resources (20%), student excellence (8%), financial resources (10%), graduation rate performance (10%), and alumni giving rate (5%). In particular, the relative weights of these indicators are adjusted over time. Moreover, the first four of the listed indicators account for the majority of the ranking weight (62.5% in the 2017 methodology), and the reputational measure by peer assessment accounted for 22.5% in the 2017 methodology, making it especially important to the final ranking.
However, many higher education experts argue that the ranking methodology ignores individual fit by comparing institutions with widely diverging missions on the same scale [29,30].

2.2. Open Data and Questionnaires

Since most indicators for university rankings are based on open data and/or questionnaires, some critical issues need to be considered. Open data are freely available to everyone, without restrictions from copyright, patents, or other mechanisms of control. In addition, the use of open data can provide transparent information to achieve specific analysis or decision-making goals [31]. Open government data, for example, can be used for political, social, or economic purposes. However, when users lack data analysis skills, using open data is problematic [32,33].
On the other hand, bias is a common problem associated with questionnaires. Bias can be defined as a deviation in the accuracy of results that may arise from the design process. Therefore, investigators should seek to prevent, or at least minimize, bias in questionnaires. Moreover, during questionnaire design, multiple potential prejudices should be considered to obtain objective results. For instance, a questionnaire with too many questions may contain many designer-centered questions [34].
However, the existing university rankings based on open data and questionnaires cannot cover all aspects of university performance [35]. In other words, more related assessment indicators should be considered to make university rankings more comprehensive. Moreover, although an important part of the ranking results published by these media is based on research publications, existing indicators of research performance have difficulty objectively reflecting research quality and quantity simultaneously [36]. Therefore, it is preferable to consider both subjective and objective viewpoints, which helps avoid poor-quality conclusions or decisions. That is, the use of a wide variety of transparent information, i.e., open data, makes it much easier to support the hypotheses and research goals [37].

2.3. Other Types of Factors

In addition to the academic indicators used by the well-known media, other types of indicators are also important, as they can influence a university’s development plan and future outcomes. For example, a good university environment, including environmental infrastructure and services, can support students’ academic performance [38]. Moreover, economic, social, cultural, and political factors can also affect university excellence [39]. More specifically, GDP per capita partially affects university growth [40].
On the other hand, complete resource support also plays an important role in fulfilling a university’s missions. For example, the quality of higher education depends on funding sources [41] and resource allocation [42]. Moreover, funding systems can influence the strategies and core tasks of universities [43].

3. Research Methodology

3.1. A Novel Approach to World University Rankings

Given the limitations of existing world university rankings discussed in Section 2, related literature has argued that the chosen indicators cannot comprehensively assess the academic contributions of universities in an objective manner [44,45,46]. Therefore, further improvements in university rankings are necessary [47].
To address this issue, in this paper, a novel approach to world university rankings is introduced. The research framework is shown in Figure 1. It extends the four sources of world university rankings, considering academic, environmental, and resource indicators. Note that the academic indicators are based on a combination of the four sources, which are ARWU, QS, THE, and USNWR, while the environmental and resource indicators are mostly based on the World Bank Open Data.
This approach assesses higher education from different viewpoints in addition to the traditional academic perspective, which allows world universities to be ranked more completely. This is because the ranking results can differ based on the characteristics of the chosen indicators. More specifically, some universities can obtain better rankings through good environments, sufficient related resources, specific research topics, citation metrics, etc. For example, the allocation of privileges and resources to top world universities [11,12,13] and the neglect of local resource allocation between different countries and universities (including GDP, governmental policy funding, the regional economy, etc.) can influence the ranking results.
In particular, regarding environmental indicators, the learning and social environments of universities are considered. On the other hand, for the resource indicators, related resources made available to students by universities, industries, and governments as well as the regional economy, which affects the resources available, are also important to consider when ranking world universities. Table 1 lists the related indicators.

3.2. Ranking Score Calculation

Once the three types of indicators are collected, the Borda count method [48,49] is used to combine them to generate the final ranking score for each university. The Borda count is a single-winner election method based on voters ranking options or candidates in order of preference. For instance, with three candidates, a voter gives a score of 1 to the least preferred candidate and 3 to the most preferred candidate.
In this study, the ranking scores by the Borda count for each type of indicator are generated. Next, to combine the scores generated from the three types of indicators for the final ranking results, the following ranking function is used:
RankN = (IncA + IncE + IncR) / 3
where RankN is a specific university’s ranking score, and IncA, IncE, and IncR refer to the standardized sums of the percentage scores of the academic, environmental, and resource indicators, respectively.
For example, suppose one would like to re-rank five universities chosen from the top 100, denoted A, B, C, D, and E. Note that the highest and lowest possible scores are 100 and 1, respectively. Suppose the scores of the three types of indicators for each university are A (academic: 12, environmental: 15, resources: 30), B (academic: 1, environmental: 16, resources: 20), C (academic: 50, environmental: 8, resources: 10), D (academic: 87, environmental: 81, resources: 40), and E (academic: 100, environmental: 100, resources: 100). Then, the sums of the scores for the three types of indicators are:
  • Sum of the academic indicator scores = 12 + 1 + 50 + 87 + 100 = 250
  • Sum of the environmental indicator scores = 15 + 16 + 8 + 81 + 100 = 220
  • Sum of the resource indicator scores = 30 + 20 + 10 + 40 + 100 = 200
As a result, the final scores for the five universities are as follows:
  • A: (12/250 + 15/220 + 30/200)/3 = (0.048 + 0.068 + 0.150)/3 = 0.0887
  • B: (1/250 + 16/220 + 20/200)/3 = (0.004 + 0.073 + 0.100)/3 = 0.059
  • C: (50/250 + 8/220 + 10/200)/3 = (0.200 + 0.036 + 0.050)/3 = 0.095
  • D: (87/250 + 81/220 + 40/200)/3 = (0.348 + 0.368 + 0.200)/3 = 0.305
  • E: (100/250 + 100/220 + 100/200)/3 = (0.400 + 0.455 + 0.500)/3 = 0.452
Therefore, the ranking result of these five universities is E > D > C > A > B. Note that in this paper, each indicator is weighted equally to produce the reranking result by the Borda count.
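To make the calculation concrete, the following Python sketch reproduces the worked example above. It illustrates the ranking function only and is not the implementation used in this study; the function and variable names are our own.

```python
# A minimal sketch of the combined ranking calculation described above
# (our own illustration; names are hypothetical, not from the paper).

def combined_ranking(scores):
    """scores: dict mapping each university to a tuple of
    (academic, environmental, resource) Borda-count scores."""
    # Column sums for the three indicator types.
    totals = [sum(s[i] for s in scores.values()) for i in range(3)]
    # A university's final score is the mean of its three normalized shares.
    final = {
        u: sum(s[i] / totals[i] for i in range(3)) / 3
        for u, s in scores.items()
    }
    # A higher final score means a better rank.
    return sorted(final.items(), key=lambda kv: kv[1], reverse=True)

# The five-university example from the text:
example = {
    "A": (12, 15, 30),
    "B": (1, 16, 20),
    "C": (50, 8, 10),
    "D": (87, 81, 40),
    "E": (100, 100, 100),
}
for university, score in combined_ranking(example):
    print(f"{university}: {score:.4f}")
```

Running this sketch yields the same ordering as above, i.e., E > D > C > A > B.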

3.3. Research Design

In this paper, the top 100 world universities according to the four chosen media sources are investigated. Taking Sweden as an example region, three of its universities are ranked among the top 100 universities by the ARWU. Therefore, the topmost university receives a score of 3, while the lowest-ranked university receives a score of 1. Note that this study was conducted between 5 January 2021 and 30 June 2021. Table 2 shows the ranking scores of the Swedish universities.
For the environmental and resource indicators, 127 and 328 indicators are collected for each region, respectively. For each indicator, the chosen universities are ranked based on the Borda count method, and the final scores for the environmental and resource types of indicators are calculated.
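The sketch below illustrates our reading of this per-indicator procedure: for a single indicator, regions are ranked by their values and assigned Borda points, and a region’s total score is the sum of its points over all indicators of that type. The region names and indicator values are invented for illustration, and we assume higher values rank better; for some indicators, such as the old-age dependency ratio mentioned in Section 4.1.1, the ordering would be reversed.

```python
# Hypothetical illustration of per-indicator Borda scoring; the data are
# invented, and only the general procedure follows the paper.
from collections import defaultdict

def borda_points(values):
    """values: dict mapping region -> indicator value. The best region
    receives points equal to the number of regions; the worst receives 1.
    Assumes higher indicator values are better."""
    ordered = sorted(values, key=values.get)  # ascending: worst first
    return {region: pts for pts, region in enumerate(ordered, start=1)}

def total_scores(indicators):
    """indicators: list of dicts, one per indicator (e.g., the 127
    environmental or 328 resource indicators). Sums the Borda points."""
    totals = defaultdict(int)
    for values in indicators:
        for region, pts in borda_points(values).items():
            totals[region] += pts
    return dict(totals)

# Two made-up indicators over three regions:
env_indicators = [
    {"Region X": 0.8, "Region Y": 0.5, "Region Z": 0.2},
    {"Region X": 1.2, "Region Y": 3.4, "Region Z": 2.0},
]
print(total_scores(env_indicators))  # {'Region Z': 3, 'Region Y': 5, 'Region X': 4}
```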

4. Results

4.1. Ranking Scores

4.1.1. By Environmental Indicators

Figure 2 shows the ranking scores of different regions by the environmental indicators. As shown, the top 100 world universities in Argentina obtain the highest score (i.e., 24), while those in Finland obtain the lowest score. It should be noted that since the related environmental indicators contain some information about the population structure, e.g., the old age dependency ratio, the regions having more young populations, such as China, are likely to obtain higher scores. In contrast, some regions that have a longer average lifespan, such as Japan, obtain relatively lower scores.

4.1.2. By Resource Indicators

Figure 3 shows the ranking scores of different regions based on the resource indicators. Since most resource indicators are economically related, such as import and export trade and GDP, the ranking result is reasonable, with the top 100 world universities in Singapore obtaining the highest scores and those in Argentina having the lowest scores. It should be noted that there are 20 universities in the UK that are ranked as the top 100 world universities, and their ranking scores based on the resource indicators are relatively lower, which can significantly affect their final rankings when all of the academic, environmental, and resource indicators are considered.

4.2. New World University Rankings by Combining Academic, Environmental, and Resource Indicators

The following ranking results are based on combining academic, environmental, and resource indicators, in which the top 100 world universities ranked by the four media sources are considered individually. Table 3 shows an example of the final rankings by using the ARWU for the academic indicators, combined with the ranking results using environmental and resource indicators. Note that ‘rankings by ARWU’ refers to the original rankings by the ARWU, while ‘new rankings’ represents the differences between the original rankings and the new ones. For instance, the Massachusetts Institute of Technology is ranked third by the ARWU. After the three types of indicators are considered together, it is ranked in the second position, thus improving by one position. In contrast, Tsinghua University was ranked 54th by the ARWU, but when the three types of indicators are considered together, its rank drops six positions, i.e., to the 60th position.
As shown, for most universities, except for the top two, i.e., Harvard University and Stanford University, the rankings change notably when the ranking results include the environmental and resource indicators.
Figure 4 shows the changes from the original rankings to the new rankings according to the four media. It should be noted that in the THE, some universities have no exact ranking position in the original rankings; for example, some universities are ranked in the intervals 51–60 and 61–70. Therefore, when a university’s original ranking interval is the same as the new ranking interval, it is classified as experiencing no change.
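A possible encoding of this classification rule is sketched below. The exact interval boundaries used by THE are our assumption here (exact positions up to 50, then bands of ten); only the rule that a move within the same interval counts as no change comes from the text.

```python
# Hypothetical sketch of the change classification behind Figure 4.
# Interval boundaries are assumed: exact ranks up to 50, then bands of ten
# (e.g., 51-60, 61-70), matching the examples mentioned in the text.

def interval_of(rank):
    """Map a rank to its assumed THE-style reporting interval."""
    if rank <= 50:
        return (rank, rank)               # exact positions
    lower = (rank - 1) // 10 * 10 + 1
    return (lower, lower + 9)             # e.g., 57 -> (51, 60)

def classify_change(old_rank, new_rank):
    if interval_of(old_rank) == interval_of(new_rank):
        return "no change"
    return "increase" if new_rank < old_rank else "decrease"

print(classify_change(57, 53))  # no change: both fall in the 51-60 interval
print(classify_change(57, 45))  # increase
```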
According to Figure 4, most universities have improved their rankings. In particular, the number of universities experiencing an increase is larger than the number of universities experiencing a decrease in the ranking. This means that the additional consideration of environmental and resource indicators is very useful for improving most universities’ rankings.
On the other hand, Figure 5 shows the number of top-ranked universities in different regions (i.e., countries) according to the four sources. Most of the top-ranked universities are from the USA. This is because many American universities originally had higher scores on academic indicators than universities from other regions. Moreover, the scores on environmental indicators for American universities are relatively higher than the scores of universities in other regions. This can even improve the rankings of some American universities that obtain relatively low scores on the resource indicators.
For the universities whose rankings drop, many are from the UK, Canada, France, the Netherlands, Denmark, Finland, and Japan. More specifically, UK universities drop the most in the rankings. For example, the University of Cambridge is ranked 7th by the USNWR, but in the new ranking it drops to the 76th position. One main reason is that its score on the resource indicators is relatively low, as shown in Figure 3. Compared with Singapore, the highest-ranking region in terms of the resource indicators, there is no great difference between the UK and Singapore for the indicators scored 1 to 5 (cf. Figure 6). Although UK universities obtain higher scores than Singaporean universities on some resource indicators, they do so only for the lower scores, i.e., 6 and 10. In contrast, Singaporean universities have many more resource indicators with scores of 21, 22, 23, and 24 than UK universities. This leads to the large difference between the UK and Singapore in the final scores.
This indicates that most universities can improve their rankings when the three types of indicators are combined, but the approach is disadvantageous for universities in some regions. Therefore, universities whose rankings drop, such as those in the UK, should improve the resource indicators on which they score below 21 to 24 in order to enhance their final rankings. Table 4 lists the resource indicators on which UK universities score 21 to 24.

5. Conclusions

World university rankings are critical to assess the quality of higher education across different universities and countries. Several well-known media sources have published rankings every year since the early 2000s, for example, the Academic Ranking of World Universities (ARWU), QS World University Rankings (QS), Times Higher Education World University Rankings (THE), and U.S. News & World Report (USNWR). Although different media consider different indicators with different weightings to rank world universities, most of them are academically oriented, including research- and teaching-related indicators.
However, since researchers have pointed out several limitations of academic indicators, which make current ranking results incomplete and subjective, it is necessary to consider more aspects in higher education assessments. Therefore, in this paper, we attempt to add two more types of indicators, environmental and resource indicators, to the academic indicators when ranking world universities. These data can be collected from World Bank Open Data and could be more objective than the data from questionnaires used in some of the academic indicators. Moreover, these new indicators cover additional issues that may affect the scale and development direction of a university.
The new ranking approach combining the academic, environmental, and resource indicators leads to ranking results that differ from those published by the four media. Many universities take advantage of the two additional types of indicators to increase their rankings, while universities in certain countries, such as Japan and the UK, find that their rankings drop due to low scores on the environmental or resource indicators.
In short, the new ranking approach is not proposed to replace the current rankings that focus on academic indicators. However, the new ranking results allow us to consider the possibility of using more indicators to more broadly and completely assess world universities. In other words, we can understand world universities from other viewpoints. Moreover, based on the comparison between the original rankings and the new rankings, universities can determine possible directions for improvement and future development.
Despite the advantages of the proposed approach in producing novel world university ranking results, this study has some limitations. First, as a pilot study, the environmental and resource indicators were chosen to demonstrate the usefulness of considering other types of indicators for world university rankings; we do not propose a strategy to define or choose more specific, appropriate, or important environmental and resource indicators for different ranking purposes. Second, we do not consider other related indicators that may lead to better ranking results, such as regional cultures, special features, and public reviews from social networks. Third, we simply weight each indicator equally to produce the ranking result; using different weightings to rank world universities by the Borda count could produce different ranking results. Fourth, this study only captures differences between countries rather than differences in the quality of the institutions.

Author Contributions

Conceptualization, W.-C.L.; methodology, W.-C.L.; software, C.C.; validation, C.C.; formal analysis, W.-C.L. and C.C.; resources, C.C.; data curation, C.C.; writing—original draft preparation, W.-C.L. and C.C.; writing—review and editing, W.-C.L.; visualization, W.-C.L. and C.C.; supervision, W.-C.L.; project administration, W.-C.L.; funding acquisition, W.-C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported in part by the Ministry of Science and Technology of Taiwan under Grant MOST 110-2410-H-182-002, in part by the Center for Institutional Research, Chang Gung University from the Framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan under Grant UZRPD1H0031 and UZRPD3J0031, and in part by Chang Gung Memorial Hospital at Linkou, under Grant BMRPH13 and CMRPG3J0732.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cooke, P.; Leydesdorff, L. Regional development in the knowledge-based economy: The construction of advantage. J. Technol. Transf. 2006, 31, 5–15.
  2. Breznik, K.; Law, K.M.Y. What do mission statements reveal about the values of top universities in the world? Int. J. Organ. Anal. 2019, 27, 1362–1375.
  3. Marope, P.T.M.; Wells, P.J.; Hazelkorn, E. Rankings and Accountability in Higher Education: Uses and Misuses; United Nations Educational, Scientific and Cultural Organization: Paris, France, 2013.
  4. Daraio, C.; Bonaccorsi, A.; Simar, L. Rankings and university performance: A conditional multidimensional approach. Eur. J. Oper. Res. 2015, 244, 918–930.
  5. Dobrota, M.; Bulajic, M.; Bornmann, L.; Jeremic, V. A new approach to the QS university ranking using the composite I-distance indicator: Uncertainty and sensitivity analyses. J. Assoc. Inf. Sci. Technol. 2016, 67, 200–211.
  6. Merisotis, J.; Sadlak, J. Higher education rankings: Evolution, acceptance, and dialogue. High. Educ. Eur. 2005, 30, 97–101.
  7. Deem, R.; Mok, K.H.; Lucas, L. Transforming higher education in whose image? Exploring the concept of the ‘world-class’ university in Europe and Asia. High. Educ. 2008, 21, 83–97.
  8. Dill, D.; Soo, M.J. Academic quality, league tables, and public policy: A cross-national analysis of university ranking systems. High. Educ. 2005, 49, 495–533.
  9. Lievens, F.; Sackett, P.R. The validity of interpersonal skills assessment via situational judgment tests for predicting academic success and job performance. J. Appl. Psychol. 2012, 97, 460–468.
  10. Hazelkorn, E. Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence; Palgrave Macmillan: Houndmills, UK, 2011.
  11. Altbach, P.G. The Imperial tongue: English as the dominating academic language. Econ. Political Wkly. 2007, 42, 3608–3611.
  12. Altbach, P.G.; Salmi, J. The Road to Academic Excellence: The Making of World-Class Research Universities; World Bank Publications: Washington, DC, USA, 2011.
  13. Kwoka, J.E., Jr.; White, L.J. The Antitrust Revolution: Economics, Competition, and Policy, 6th ed.; Oxford University Press: New York, NY, USA, 2013.
  14. Thelwall, M.; Kousha, K. ResearchGate: Disseminating, communicating, and measuring scholarship? J. Assoc. Inf. Sci. Technol. 2015, 66, 876–889.
  15. Marginson, S. Dynamics of national and global competition in higher education. High. Educ. 2006, 52, 1–39.
  16. Lukman, R.; Krajnc, D.; Glavic, P. University ranking using research, educational and environmental indicators. J. Clean. Prod. 2010, 18, 619–628.
  17. Docampo, D.; Cram, L. On the internal dynamics of the Shanghai ranking. Scientometrics 2014, 98, 1347–1366.
  18. Dobrota, M.; Dobrota, M. ARWU ranking uncertainty and sensitivity: What if the award factor was excluded? J. Assoc. Inf. Sci. Technol. 2016, 67, 480–482.
  19. Zornic, N.; Markovic, A.; Jeremic, V. How the top 500 ARWU can provide a misleading rank. J. Assoc. Inf. Sci. Technol. 2014, 65, 1303–1304.
  20. Bouyssou, D.; Marchant, T. Consistent bibliometric rankings of authors and of journals. J. Informetr. 2010, 4, 365–378.
  21. Safon, V. What do global university rankings really measure? The search for the X factor and the X entity. Scientometrics 2013, 97, 223–244.
  22. Samarasekera, I.; Amrhein, C. Top Schools Don’t Always Get Top Marks. Edmont. J. 2010. Available online: https://web.archive.org/web/20101003203348/http://www.edmontonjournal.com/news/schools+always+marks/3560240/story.html (accessed on 15 December 2021).
  23. Federkeil, G. Rankings and quality assurance in higher education. High. Educ. Eur. 2008, 33, 219–231.
  24. Ioannidis, J.P.A.; Patsopoulos, N.A.; Kawoura, F.K.; Tatsioni, A.; Evangelou, E.; Kouri, I.; Contopoulos-Ioannidis, D.G.; Liberopoulos, G. International ranking systems for universities and institutions: A critical appraisal. BMC Med. 2007, 5, 30.
  25. Bastedo, M.N.; Bowman, N.A. US News & World Report college rankings: Modeling institutional effects on organizational reputation. Am. J. Educ. 2010, 116, 163–183.
  26. Bowman, N.A.; Bastedo, M.N. Anchoring effects in world university rankings: Exploring biases in reputation scores. High. Educ. 2011, 61, 431–444.
  27. Huang, M.-H. Opening the black box of QS World University Rankings. Res. Eval. 2012, 21, 71–78.
  28. Engels, T.C.E.; Ossenblok, T.L.B.; Spruyt, E.H.J. Changing publication patterns in the social sciences and humanities, 2000–2009. Scientometrics 2012, 93, 373–390.
  29. Bougnol, M.-L.; Dula, J. Technical pitfalls in university rankings. High. Educ. 2015, 69, 859–866.
  30. Ehrenberg, R.G. Method or madness? Inside the U.S. News & World Report College Rankings. J. Coll. Admiss. 2005, 189, 19–35.
  31. Kitchin, R. The Data Revolution; SAGE: London, UK, 2014.
  32. Gasco-Hernandez, M.; Martin, E.G.; Reggi, L.; Po, S.; Luna-Reyes, L.F. Promoting the use of open government data: Cases of training and engagement. Gov. Inf. Q. 2018, 35, 233–242.
  33. Janssen, M.; Charalabidis, Y.; Zuiderwijk, A. Benefits, adoption barriers and myths of open data and open government. Inf. Syst. Manag. 2012, 29, 258–268.
  34. Choi, B.C.K.; Pak, A.W.P. A catalog of biases in questionnaires. Prev. Chronic Dis. 2005, 2, A13.
  35. Waltman, L.; Wouters, P.; van Eck, N.J. Ten Principles for the Responsible Use of University Rankings. 15 December 2017. Available online: https://www.cwts.nl/blog?article=n-r2q274&title=ten-principles-for-the-responsible-use-of-university-rankings (accessed on 15 December 2021).
  36. Ventura, O.N.; Mombru, A.W. Use of bibliometric information to assist research policy making. A comparison of publication and citation profiles of Full and Associate Professors at a School of Chemistry in Uruguay. Scientometrics 2006, 69, 287–313.
  37. Gelman, A.; Hennig, C. Beyond subjective and objective in statistics. J. R. Stat. Soc. Ser. A (Stat. Soc.) 2017, 180, 967–1033.
  38. Ramli, A.; Zain, R.M.; Zain, M.Z.M.; Rahman, A.A. Environmental factors and academic performance: The mediating effect of quality of life. In International Conference on Business and Technology; Springer: Berlin/Heidelberg, Germany, 2021; pp. 2082–2105.
  39. Jabnoun, N. Economic and cultural factors affecting university excellence. Qual. Assur. Educ. 2009, 17, 416–429.
  40. Valero, A.; Van Reenen, J. The economic impact of universities: Evidence from across the globe. Econ. Educ. Rev. 2019, 68, 53–67.
  41. Brown, W.O. Sources of funds and quality effects in higher education. Econ. Educ. Rev. 2001, 20, 289–295.
  42. Zhang, Q.; Ning, K.; Barnes, R. A systematic literature review of funding for higher education institutions in developed countries. Front. Educ. China 2016, 11, 519–542.
  43. Frolich, N.; Schmidt, E.K.; Rosa, M.J. Funding systems for higher education and their impacts on institutional strategies and academia: A comparative perspective. Int. J. Educ. Manag. 2010, 24, 7–21.
  44. Amsler, S.S.; Bolsmann, C. University ranking as social exclusion. Br. J. Sociol. Educ. 2012, 33, 283–301.
  45. Frey, B.S.; Rost, K. Do rankings reflect research quality? J. Appl. Econ. 2010, 13, 1–38.
  46. Ishikawa, M. University rankings, global models, and emerging hegemony: Critical analysis from Japan. J. Stud. Int. Educ. 2009, 13, 159–173.
  47. Osterloh, M.; Frey, B.S. Ranking games. Eval. Rev. 2015, 39, 102–129.
  48. Emerson, P. The original Borda count and partial voting. Soc. Choice Welf. 2013, 40, 353–358.
  49. Fraenkel, J.; Grofman, B. The Borda count and its real-world alternatives: Comparing scoring rules in Nauru and Slovenia. Aust. J. Political Sci. 2014, 49, 186–205.
Figure 1. The research framework.
Figure 2. Ranking scores of different regions by the environmental indicators.
Figure 3. Ranking scores of different regions by resource indicators.
Figure 4. The ranking changes according to the four sources.
Figure 5. The number of top-ranked universities in different regions.
Figure 6. The ranking scores obtained according to the resource indicators: UK vs. Singapore.
Table 1. Related indicators used in this paper.

Types of Indicators | Descriptions
Academic
  • ARWU: quality of education, quality of faculty, research output, per capita performance;
  • THE: teaching–the learning environment, research–volume, income and reputation, citations–research influence, international diversity;
  • QS: academic reputation, employer reputation, faculty/student ratio, citations per faculty, international faculty ratio and international student ratio;
  • USNWR: peer assessment, retention, social mobility, faculty resources, student excellence, financial resources, graduation rate performance, alumni giving rate.
Environmental
  • THE: industry income–innovation;
  • World Bank Open Data: population structure, urban scale, land area, development level, livability level, tax, unemployment and employment rates.
Resources
  • USNWR: financial resources;
  • World Bank Open Data: corporate sponsorship, governmental policy funding, tuition fees, loans, GDP, regional economy.
Note: The complete environmental indicators are available at: https://drive.google.com/file/d/1AgpWvSXzhLDnurVebdUWLmMjrIxdysNg/view (accessed on 15 December 2021). The complete resources indicators are available at: https://drive.google.com/file/d/1qDxeP6Qw6tI3KeZ9cJZObt6PSD3KNyMm/view (accessed on 15 December 2021).
Table 2. Ranking scores of Swedish universities based on academic indicators.

Media | Universities | Scores
ARWU | Karolinska Institute | 3
ARWU | Uppsala University | 2
ARWU | Stockholm University | 1
QS | Lund University | 3
QS | KTH Royal Institute of Technology | 2
QS | Uppsala University | 1
THE | Karolinska Institute | 2
THE | Lund University | 1
USNWR | Karolinska Institute | 2
USNWR | Lund University | 1
Table 3. New world university rankings based on the ARWU.

Rankings by ARWU | Institutions | Countries | New Rankings
1 | Harvard University | USA | -
2 | Stanford University | USA | -
3 | Massachusetts Institute of Technology | USA | ↑1
4 | University of California, Berkeley | USA | ↑1
5 | Princeton University | USA | ↑1
6 | Columbia University | USA | ↑2
7 | California Institute of Technology | USA | ↑2
8 | University of Chicago | USA | ↑2
9 | Yale University | USA | ↑2
10 | University of California, Los Angeles | USA | ↑2
11 | University of Washington | USA | ↑2
12 | Cornell University | USA | ↑2
13 | University of California, San Diego | USA | ↑2
14 | University of Pennsylvania | USA | ↑3
15 | Johns Hopkins University | USA | ↑3
16 | Washington University in St Louis | USA | ↑4
17 | University of California, San Francisco | USA | ↑4
18 | Northwestern University | USA | ↑4
19 | University of Michigan | USA | ↑5
20 | Duke University | USA | ↑6
21 | University of Wisconsin-Madison | USA | ↑7
22 | New York University | USA | ↑7
23 | University of North Carolina at Chapel Hill | USA | ↑10
24 | University of Minnesota, Twin Cities | USA | ↑10
25 | The Rockefeller University | USA | ↑11
26 | University of Illinois at Urbana-Champaign | USA | ↑11
27 | University of Colorado Boulder | USA | ↑16
28 | University of California, Santa Barbara | USA | ↑17
29 | University of Texas Southwestern Medical Center at Dallas | USA | ↑19
30 | University of Texas at Austin | USA | ↑21
31 | Vanderbilt University | USA | ↑21
32 | University of Maryland, College Park | USA | ↑21
33 | University of Melbourne | Australia | ↑6
34 | University of Southern California | USA | ↑20
35 | University of Queensland | Australia | ↑20
35 | University of California, Irvine | USA | ↑29
37 | University of Pittsburgh | USA | ↑31
37 | Monash University | Australia | ↑41
39 | Mayo Medical School | USA | ↑32
39 | University of Sydney | Australia | ↑44
41 | Rice University | USA | ↑33
41 | University of Western Australia | Australia | ↑50
43 | Purdue University | USA | ↑34
43 | National University of Singapore | Singapore | ↑48
43 | Australian National University | Australia | ↑54
46 | Rutgers, The State University of New Jersey | USA | ↑33
47 | Boston University | USA | ↑33
48 | ETH Zurich—Swiss Federal Institute of Technology Zurich | Switzerland | ↓29
48 | Carnegie Mellon University | USA | ↑32
50 | University of Zurich | Switzerland | ↑8
50 | Ohio State University | USA | ↑30
52 | University of Geneva | Switzerland | ↑8
52 | Georgia Institute of Technology | USA | ↑33
54 | Tsinghua University | China | ↓6
54 | École Polytechnique Fédérale de Lausanne | Switzerland | ↑22
54 | Pennsylvania State University | USA | ↑31
57 | Peking University | China | ↑14
57 | University of California, Davis | USA | ↑28
57 | University of Basel | Switzerland | ↑38
60 | University of British Columbia | Canada | ↓29
60 | University of Florida | USA | ↑28
62 | University of Cambridge | UK | ↓59
62 | McMaster University | Canada | ↑4
62 | University of California, Santa Cruz | USA | ↑36
65 | University of Oxford | UK | ↓58
65 | University of Toronto | Canada | ↓42
65 | McGill University | Canada | ↑2
65 | University of Arizona | USA | ↑34
69 | University College London | UK | ↓53
69 | Heidelberg University | Germany | ↓27
71 | Imperial College London | UK | ↓44
71 | Technical University of Munich | Germany | ↓21
73 | University of Edinburgh | UK | ↓41
73 | Utrecht University | Netherlands | ↓26
73 | LMU Munich | Germany | ↓16
73 | Technion Israel Institute of Technology | Israel | ↑20
77 | University of Copenhagen | Denmark | ↓47
77 | University of Manchester | UK | ↓39
77 | University of Groningen | Netherlands | ↓18
77 | University of Oslo | Norway | ↓15
77 | University of Göttingen | Germany | ↑18
82 | Karolinska Institute | Sweden | ↓38
82 | King’s College London | UK | ↓36
82 | Aarhus University | Denmark | ↓17
82 | Erasmus University Rotterdam | Netherlands | ↓9
86 | Pierre and Marie Curie University | France | ↓26
86 | University of Bristol | UK | ↓25
86 | Uppsala University | Sweden | ↓23
86 | Leiden University | Netherlands | ↑2
90 | University of Tokyo | Japan | ↓56
90 | Paris-Sud University | France | ↓49
90 | Stockholm University | Sweden | ↓16
90 | Cardiff University | UK | ↑9
94 | Kyoto University | Japan | ↓59
95 | Nagoya University | Japan | ↓11
96 | Lomonosov Moscow State University | Russia | ↓3
97 | Ghent University | Belgium | ↓28
98 | KU Leuven | Belgium | ↓8
99 | École Normale Supérieure | France | ↓30
100 | University of Helsinki | Finland | ↓44
Table 4. The scores of 21 to 24 for the resource indicators of UK universities.

Resource Indicators | Scores
Merchandise exports by the reporting economy, residual (% of total merchandise exports) | 21
Merchandise exports to economies in the Arab world (% of total merchandise exports) | 21
International tourism, expenditures (current US$) | 21
International tourism, expenditures for travel items (current US$) | 21
International tourism, receipts for travel items (current US$) | 21
Labor force, female (% of total labor force) | 21
Wage and salaried workers, total (% of total employment) | 21
Wage and salaried workers, male (% of male employment) | 21
Forest rents (% of GDP) | 21
Foreign direct investment, net inflows (BoP, current US$) | 21
Imports of goods, services and primary income (BoP, current US$) | 21
Merchandise exports to high-income economies (% of total merchandise exports) | 22
Portfolio equity, net inflows (BoP, current US$) | 22
Transport services (% of service imports, BoP) | 22
Commercial service exports (current US$) | 23
Adjusted savings: consumption of fixed capital (% of GNI) | 23
Service exports (BoP, current US$) | 23
Insurance and financial services (% of commercial service exports) | 24
Insurance and financial services (% of service imports, BoP) | 24
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
