Article

Sustainability Ranking of Turkish Universities with Different Weighting Approaches and the TOPSIS Method

by
Kübra Akyol Özcan
Department of Business, Bayburt University, 69000 Bayburt, Turkey
Sustainability 2023, 15(16), 12234; https://doi.org/10.3390/su151612234
Submission received: 10 July 2023 / Revised: 3 August 2023 / Accepted: 8 August 2023 / Published: 10 August 2023
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

The concept of sustainability has become increasingly important, especially as a result of the depletion of energy resources and growing environmental concerns. UI GreenMetric ranks universities based on sustainability, environmental, and energy concerns, addressing issues of environmental pollution, food and water scarcity, and energy supply. By prioritizing sustainability on their campuses, universities are working to ensure a more sustainable future for humanity. This study evaluates university sustainability in energy and climate change using the UI GreenMetric ranking, focusing on the sustainability ranking of Turkish universities. It incorporates variables such as infrastructure, energy and climate change, waste, water, public transportation, and educational research, using weighting approaches to reveal the most important variables for the country’s universities. The rankings of 83 Turkish universities were analyzed using the TOPSIS method together with four weighting techniques: CRITIC, entropy, standard deviation-based, and equal weighting. The rankings obtained with the entropy and equal weighting methods were found to be closest to the UI GreenMetric rankings. For Turkish universities, the CRITIC method assigned the highest weight to the energy and climate change variable, while installation infrastructure was the most important variable under the entropy method and water under the standard deviation-based method.

1. Introduction

In today’s world, people care about global as well as local issues. In light of increasing concerns about energy resources and environmental protection, sustainability is now a vital issue for humanity. Across the world, the uncontrolled growth and consumption ambitions of countries, companies, and individuals pose a major risk to sustainability. Climate change, caused mostly by pollution, poses increasing challenges for modern societies as industrial activities, internal combustion engines, and energy-consuming buildings further degrade the environment. Recent research has focused less on internal combustion engines than on building operations and energy use in residential, office, and university settings, where energy performance, energy consumption, and associated activities are the main determinants of pollution [1]. Since the 1970s, increasing scientific evidence has suggested that these human behaviors and activities have had significant negative impacts on the environment and, according to some indicators, these effects are so far-reaching that they threaten the sustainability of life on Earth. In societies around the world, this growing concern has pressured governments, businesses, and industries to implement environmentally sound and sustainable management policies, practices, and operations [2]. Population growth and economic development have accelerated since the Industrial Revolution, and the amount of waste and pollutants released into the environment continues to rise. These changes pose a grave risk to human existence; the escalating problems include food shortages, energy threats, environmental degradation, ecological disasters, sluggish economic growth, and local social unrest. In these circumstances, mankind has been compelled to reevaluate its place in the ecosystem and to seek new pathways for long-term survival and development. This in turn has foregrounded sustainable development as a key strategy for socioeconomic transformation [3] but, despite an emerging consensus, interpretation of this term has proved controversial, and terms like “sustainable” and “sustainability” are often misunderstood or overused [4]. Organizations and individuals increasingly use expressions like “sustainable development”, “sustainable use of the biosphere”, and “ecological sustainability” to refer to the relationship between people and the global environment [5]. However, although the idea of sustainable development is almost universally accepted, little progress has been made in translating the concept into practice. The need for sustainable development was unanimously endorsed by the United Nations General Assembly in September 2015, leading to the formulation of the 17 Sustainable Development Goals (SDGs), which prioritize education as a key strategy to promote sustainability [6]. The associated moral imperatives include meeting needs, ensuring equity, respecting environmental limits (as constraints on human activities), and maximizing economic value. Sustainable development is now recognized as a collective institutional response in pursuit of efficiency gains and social responsibility [7]. Although there is as yet no precise definition of sustainable development, the generally agreed definition encompasses the simultaneous satisfaction of economic, environmental, and social objectives.
While corporate executives and politicians address this in terms of economic growth, sustainable development also entails the idea of an equitable society served by an ecologically sustainable economy [8]. Overuse and abuse of the concept by different interest groups have spawned multiple definitions, but sustainability can be broadly said to involve economic, environmental, and/or sociocultural responsibilities to promote and support equitable resource use in both the natural and human worlds, also known as the biosphere [9]. The most widely used definition is the one proposed by the Brundtland Commission [10], which states that sustainable development allows for economic development while taking into account environmental limits and equity by “meeting the needs of the present without compromising the ability of future generations to meet their own needs”.
The notion of sustainable development enjoys widespread acceptance, yet there remains a notable dearth of advancements in effectively operationalizing this concept. The endorsement of sustainable development by the United Nations General Assembly prompted the establishment of the 17 Sustainable Development Goals (SDGs), which emphasize the significance of education as a primary approach for advancing sustainability [11]. Sustainable development encompasses the concurrent fulfillment of economic, environmental, and social goals. According to Guerrieri et al. [12], the concept encompasses the promotion of economic growth, the establishment of an equitable society, and the development of an ecologically sustainable economy. The importance of sustainability in universities has grown significantly, as the goals of sustainable development are crucial for the formulation of policies, guidelines, and indicators that facilitate the adoption of sustainable practices by educational institutions [13]. Within the realm of university sustainability, this particular investigation will incorporate the variables employed in the calculation of UI GreenMetric 2022. These variables encompass infrastructure installation, energy and climate change, waste, water, public transportation, and educational research. The research gap addressed by this study is the evaluation of university sustainability, specifically in terms of energy and climate change, using the variables employed in the UI GreenMetric ranking. While there have been previous studies on sustainability in universities, this study focuses on the specific variables used in the UI GreenMetric ranking and applies different reweighting approaches to assess their impact on rankings. The study will reweight these variables using different approaches, such as CRITIC, entropy (ENT), standard deviation-based (SDD), and equal weighting (EW), and apply the TOPSIS ranking technique to compare the outcomes with the UI GreenMetric ranking. The study aims to assess the sustainability rankings of Turkish universities by incorporating various variables such as infrastructure installation, energy and climate change, waste, water, public transportation, and educational research. With the help of weighting approaches, this study also reveals which of the variables used in the UI GreenMetric calculation are most important for the universities of the country under consideration. The research questions addressed in this study are as follows:
1. What are the variables employed in the calculation of UI GreenMetric 2022?
2. How can the variables used in UI GreenMetric 2022 be reweighted using different approaches?
3. How do different weighting approaches (CRITIC, entropy, standard deviation-based, and equal weighting) affect the ranking of universities in terms of sustainability?
4. What are the outcomes of reweighting the variables, and how do they compare with the UI GreenMetric ranking?
5. What are the implications of the study findings for the sustainability of universities and their energy and climate change practices?
The study consists of several sections. This introductory section has provided a comprehensive explanation of the concept of sustainability, including its three dimensions: environmental, social, and economic. Section 2 emphasizes the significance of sustainability within the context of university environments, highlighting the importance of universities’ contributions to addressing environmental challenges and promoting sustainable practices. Section 3 critically examines relevant research in the existing body of literature, including previous studies on university sustainability and the UI GreenMetric ranking. Sections 4–6 present the materials and methods, the data and findings derived from the study, and their discussion, including the application of the reweighting approaches and the TOPSIS ranking technique to compare the outcomes with the UI GreenMetric ranking. Finally, Section 7 provides a concise overview of the main findings and the potential implications arising from the study.

2. UI GreenMetric and Sustainability at Universities

2.1. Sustainability at Universities and Its Importance

In 1972, concerns raised in the Club of Rome report regarding the impact of human activities on the world’s ecosystems began to attract greater attention [14]. As organizations such as companies and universities bear some of the responsibility for severe environmental degradation [15,16], these entities must participate actively in the effort to save our endangered planet [17]. Education is also an important tool for guiding society toward sustainability when prevailing approaches to economic growth are not sustainable [18]. A further link between sustainability and education is the need to ensure that training courses and programs meet labor market needs in this regard [19]. Throughout history, institutions of higher education have played a significant role in both establishing and challenging societal norms, as well as preparing future generations for positions of authority, innovation, and influence [20]. In this way, universities make significant social, economic, academic, scientific, and technological contributions to their local and national environments as the foremost producers of knowledge, which is considered a key driver of economic growth [21]. This mission requires universities to take the lead in pursuing sustainability [4], both in their internal affairs (organizational goals, education, research, management) and in their dealings with external stakeholders (within the context of their regional mission) [22]. As organizations, universities contribute significantly to the personal identity, worldview, and values of their students. Through appropriate curricula and lesson plans, universities can shape their students’ commitment to sustainability and set an example for other institutions, informing future transformation and the creation of a more sustainable society [23]. Universities have a significant impact on future leaders, teachers, and parents, not only directly through education, research, and knowledge transfer but also through the example they set in being accountable for sustainability performance. Universities that take these responsibilities seriously are expected to be leaders in sustainability reporting practices [24]. Rooted in the late twentieth century, the Sustainability in Higher Education (SHE) initiative emerged following the Stockholm Declaration of 1972. Since 1990, more than 350 universities worldwide have signed the Talloires Declaration, which included “a ten-point action plan to integrate sustainability and environmental literacy in university teaching, research, operations and outreach”. In the last half-century, more than 30 PDS declarations have been signed by more than 1400 universities across the world, affirming the incorporation of sustainability principles into education and research policies [25], and reflecting the issue’s increasing importance for higher education. In so doing, leading higher education institutions now acknowledge the links between sustainable development and environmental protection, social justice, economic development, governance policies, and student learning experiences, as well as innovation and competitiveness [26]. University policymakers and planners have increasingly prioritized campus sustainability as a key issue in response to the environmental effects of university activities and operations.
The matter has garnered increased impetus due to the influence of state agencies responsible for environmental protection, movements promoting sustainability, university stakeholders, non-governmental organizations, and student advocates [27]. Because a significant proportion of university facilities are situated in historic buildings, it can be difficult to reduce energy costs. Over the last two decades, as university missions began to incorporate a sustainable development component, many such buildings have been adapted or repurposed. Other elements of university sustainability strategies include data reporting, relevant programs and events, changing curricula, and new research programs. The primary approaches towards sustainability pertain to three key areas: infrastructure, community, and learning. Infrastructure encompasses buildings, morphology, design, energy, food, materials, and waste management. Community pertains to governance, leadership, mission, investments, capital, health, wellness, and services. Learning encompasses curriculum planning and related aspects [28]. As a result of this growing awareness, universities now play a key role in environmental sustainability through research, updated curricula, and a more environmentally friendly campus infrastructure [29]. The role of universities in promoting sustainable development has been demonstrated in many academic studies [30,31,32,33,34,35]. Because of their size and the impact of campus activities on the environment and society at large, universities are often referred to as “small cities” in the context of sustainability. Universities are also seen to have the capacity to anticipate change and to be proactive [36]. The issues of energy use and waste generation are highlighted by the large population of associate, undergraduate, and graduate students at universities in Turkey, including Bursa Uludağ University (67,173), Marmara University (66,747), Akdeniz University (64,037), Selçuk University (63,221), Ankara University (61,318), Dokuz Eylül University (61,073), Kocaeli University (60,777), Istanbul University (59,900), Atatürk University (58,122), and Ege University (55,158). These numbers refer only to those in face-to-face education and do not include distance and open learning students; according to YÖK statistics, the total number of students in formal education in Turkey is 3,752,475 [37]. When teaching staff and other personnel are also included, some universities can be likened to small cities and should be managed accordingly. To be recognized as a sustainability-oriented institution, a university must address the challenge of holistically integrating the social, environmental, and economic dimensions of organizational operations and structures, as well as education delivery, research, and outreach. For that reason, fragmented initiatives involving a few departments or faculties will not suffice, and sustainability must be addressed at all institutional levels [38]. Sustainability reporting is recognized as a useful tool for improving universities’ accountability and socio-environmental performance, and more should be done to assist universities to cope with twenty-first century problems that include environmental and socioeconomic disasters, pay disparities across nations, and political instability.
To that end, universities must incorporate the concept of sustainable development into future organizational arrangements, research, and education by equipping professionals with the necessary knowledge, competences, and skills to resolve ecological, social, and economic challenges across communities [39].

2.2. UI GreenMetric: A Ranking System for Campus Sustainability

In recent years, economic globalization has had a significant impact on communities and organizations, and growing international competitiveness and social scrutiny have also impacted higher education institutions. In this context, stakeholders need more information to evaluate and compare universities’ performance worldwide, and university rankings are widely used for simple and rapid comparison based on a selected set of characteristics [40] ranging from research and academic reputation to education and environmental performance. Of these three perspectives, most designers of university rankings generally prioritize research and academic reputation, followed by education-related indicators. Institutions and stakeholders are therefore increasingly concerned with university rankings, which enable comparisons at the global, regional, and national levels and influence both institutions and their environments [41,42]. Although these ranking tables have been in use for at least forty years, environmental issues have only recently been included [43]. Since 1983, when US News & World Report published its first annual review of America’s best colleges, professional organizations and governments, as well as private media organizations, have produced higher education rankings [32]. Each stakeholder makes use of these rankings for their own purposes. Prospective students and their families use rankings to select a college based on education quality, future career prospects, and cost; academics use rankings to advance their own career goals; and policymakers and university administrators use rankings to assess their institution’s current status, both nationally and internationally. Finally, these rankings also attract media attention, reflecting the growing societal interest in higher education. Media institutions use university rankings to inform the public about the status of higher education institutions both at home and across the world [44]. Hazelkorn et al. [45] found empirical support for the claim that performance indicators have a strong influence on university rankings. A survey of 171 managers from 39 countries found that 87% of respondents closely monitored their ranking and that 61% set ranking targets. Across a range of metrics, global rankings compare institutions, fields, subjects, and scientists around the globe. These metrics are based mostly on bibliometric measures from databases that include Web of Science, Scopus, and Google Scholar. While the Academic Ranking of World Universities (ARWU) prioritizes academic accomplishment, the Quacquarelli Symonds World University Rankings (QS) and the Times Higher Education World University Rankings emphasize reputation and internationalization. Both University Rankings by Academic Performance (URAP) and the National Taiwan University (NTU) Rankings focus exclusively on scientific research performance. US News & World Report’s (USNWR) Top Global University Rankings emphasize academic research and overall recognition [46]. By aligning sustainability practices with institutional goals, sustainability-related university rankings can help university administrators focus on sustainable development actions. The traditional emphasis on education and research typically excludes social and environmental issues [38]. However, since 2000, university sustainability assessment tools created by external organizations have become more central than campus environmental audits and other assessments previously performed by students.
These new tools commonly facilitate comparison of different universities in terms of their sustainability practices, and this increased visibility incentivizes such practices and shifts the emphasis to off-campus and out-of-classroom indicators. The use of standardized frameworks to assess sustainability encourages universities to collect and report relevant data [47]. In 2010, the University of Indonesia created the UI GreenMetric system to address campus sustainability issues. Using this approach, higher education institutions around the world are now ranked according to their implementation of campus sustainability measures. Their performance is measured against well-defined and globally accepted indicators in six broad categories [48]. Based on dimensions of environment, economy, and equity, 95 universities from 35 countries participated in the ranking in 2010; by 2021, the number of participants had risen to 956 universities from 80 countries.
The UN Environment Department’s 2030 Agenda proposes integrated approaches to sustainable development that demonstrate the social and economic benefits of a healthy environment. To reduce environmental risks and make societies and the environment more resilient, the UN goals promote environmental action for sustainable social and economic development. The UI GreenMetric criteria and indicators address the 17 SDGs, assigning weightings of 21% for energy and climate change, 18% for waste, 18% for transportation, 18% for education and research, 15% for facilities and infrastructure, and 10% for water [49].
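As an illustration of how such category weightings combine into a single score, the short sketch below computes a weighted composite for a hypothetical university. The category scores are invented, and the calculation is a generic weighted sum on a 0–100 scale rather than the official UI GreenMetric point scheme.

```python
# Illustrative only: hypothetical category scores on a 0-100 scale,
# combined with the published UI GreenMetric category weightings.
weights = {
    "energy_climate_change": 0.21,
    "waste": 0.18,
    "transportation": 0.18,
    "education_research": 0.18,
    "setting_infrastructure": 0.15,
    "water": 0.10,
}

scores = {  # hypothetical example university
    "energy_climate_change": 72,
    "waste": 65,
    "transportation": 80,
    "education_research": 58,
    "setting_infrastructure": 70,
    "water": 45,
}

# Weighted sum of the six categories.
composite = sum(weights[k] * scores[k] for k in weights)
print(f"Composite score: {composite:.1f} / 100")
```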

3. Literature

3.1. Evaluation Approaches in University Rankings

University rankings are lists or tables that rank universities based on certain criteria or indicators. These rankings provide a comparative assessment of universities’ performance and reputations. They are often used by students to make decisions about which universities to attend, by policymakers to evaluate the effectiveness of higher education systems, and by institutions to benchmark their performance against their peers [50,51,52]. The CRITIC approach has been applied in various fields, including university rankings, to assess the performance of universities based on multiple criteria [53,54,55]. The entropy approach has been used in university rankings to determine the importance of different criteria in evaluating the performance of universities [56,57,58]. The SDD approach is a statistical measure that quantifies the dispersion or variability of a set of values. In the context of university rankings, the SDD approach can be used to assess the variability or diversity of performance among universities. It provides insights into the distribution of rankings and helps identify universities that deviate significantly from average performance [50,59]. The EW approach is a simple ranking method that assigns equal weights to all criteria or indicators used in the evaluation process. The EW approach has been criticized for oversimplifying the evaluation process and not capturing the nuances of performance differences among universities [59,60]. TOPSIS is a multi-criteria decision-making method used to rank alternatives based on their similarity to the ideal solution. The TOPSIS technique considers both the positive and negative distances of alternatives from the ideal solution and ranks them accordingly. It has been widely used in university rankings to assess the performance of universities based on multiple criteria [61,62,63]. The studies in Section 3.2 provide information on different approaches to university ranking reorganization.

3.2. Case Studies Pertaining to University Rankings

Bougnol and Dulá [64] compared two schemes for university classification and ranking: the Best American Research Universities report published by The Center at the University of Florida (UF) and Data Envelopment Analysis (DEA). Although the results obtained using DEA were largely consistent with those in The Center’s report, DEA was found to be completely objective once the model was built, unlike the method used by UF.
Based on the performance of 50 Turkish universities in 2016, Ömürbek and Karataş [65] calculated criteria weights using entropy, a multi-criteria decision-making (MCDM) tool, and then evaluated performance using MAUT (Multi-Attribute Utility Theory) and SAW (Simple Additive Weighting). Among the selected variables (scientific and technological research competence, intellectual property rights, collaboration and interaction, entrepreneurial and innovation culture, economic contribution, and commercialization), the highest weight was assigned to intellectual property. According to the MAUT and SAW rankings, Sabancı University ranked first and METU ranked second. İhsan Doğramacı Bilkent University ranked third according to the MAUT method, while Istanbul Technical University ranked third according to the SAW method. A comparative evaluation revealed that the two methods yielded similar results.
In their study, Wu et al. [66] assessed and ranked 12 private universities that were included as case studies by the Ministry of Education. The authors utilized the official performance evaluation structure developed by the Taiwan Assessment and Evaluation Association (TWAEA) to weigh various performance evaluation indices for higher education. The study employed a hybrid MCDM model, specifically the Analytic Hierarchy Process (AHP) for weighting and the VIKOR method for ranking. The comprehensive performance evaluation encompassed various variables, including management, teaching resources, internationalization, supplemental education service, discipline and guidance, general education, and professional and administrative support. Additionally, the evaluation considered the scope of social sciences, including education. According to recent rankings, TKU has secured the top position among the 12 privately funded academic institutions that specialize in the fields of literature, law, and business.
Using the Fuzzy Analytic Hierarchy Process (FAHP), Aliyev et al. [67] compared and ranked the performance of five UK universities on four criteria (teaching, research, citations, and international outlook). The FAHP approach proved to make the system consistent and facilitated ranking by priority by calculating the coefficient of variation for all alternatives. After checking the consistency of the binary matrices for all criteria and alternatives, their eigenvectors were calculated and then used to rank the five universities. The universities were ranked from best to worst as E > B > D > A > C.
Using the AHP method, Güneri Tosunoğlu [68] determined the order of importance of the URAP variables (number of articles, number of citations, total number of scientific documents, ratio of doctoral students, and number of students per faculty member). A fuzzy AHP analysis of the results of a survey of faculty members designated as decision makers identified the total number of scientific documents as the most important variable.
Using objective weighting and MCDM methods, Parlar and Palancı [69] revisited the 2018 World University Rankings for universities from 81 countries. They used the CRITIC and entropy methods for weighting and TOPSIS, MAUT, SAW, and ARAS for ranking; the BORDA counting method was used to convert these into a single ranking of universities and countries. Based on their results, Singapore ranked first, and Turkey ranked 54th according to the CRITIC method and 46th according to the entropy method.
Atici, Yasayacak, Yildiz, and Ulucan [29] used the UI GreenMetric, ARWU, QS Ranking, The Ranking, and NTU Ranking to determine the relationship between environmentalism and academic performance. Although no causality could be inferred, they confirmed that academic rankings reflect green university practices and that higher scores for environmental performance affect the institution’s academic performance. In addition, higher country-level environmental performance reinforced the positive impact of sustainability on university academic performance.
Gorgulu et al. [70] reranked 56 universities in Turkey using GreenMetric 2020 data and MCDM methods and made recommendations based on a comparison with current rankings. Entropy was used for weighting, while COPRAS and TOPSIS were used for ranking. Based on the findings, it can be inferred that water is the most significant factor for Turkey, while facilities and infrastructure are the least significant criteria. The COPRAS and TOPSIS rankings agreed with the UI GreenMetric ranking on the top two institutions, identifying ITU and METU as the leading universities.
Yadegaridehkordi and Nilashi [71] sought to identify and prioritize criteria, sub-criteria, and related indicators in order of importance for green building assessments of Malaysian universities. Based on the Green Building Index (GBI), the best-known sustainability rating system for buildings in Malaysia, they identified criteria and related indicators. Using the AHP, a panel of experts on green buildings in Malaysia ranked these criteria and indicators by level of importance. The study concluded that indoor environmental quality and energy efficiency were the most important criteria for evaluating green university buildings, while innovation was the least important criterion.
Uluskan et al. [72] assessed 72 private universities listed in the Higher Education Council of Turkey (YÖK) Private Universities 2020 report, utilizing MCDM techniques. The study employed a set of criteria to evaluate the institutions, which encompassed the date of establishment, the number of academic units, the total number of students, the number of full-time faculty members, the total indoor area per student, the library area, the number of printed books and e-books in the library, and the expenditures on research and development as well as the library. The aforementioned items were subjected to weightage through the AHP and subsequently evaluated using the COPRAS, SAW, and TOPSIS methods for ranking purposes. The Borda counting method was utilized to consolidate the outcomes of the three techniques and to generate a unified ranking. The findings indicate that İhsan Doğramacı Bilkent University secured the top position, while Faruk Saraç Design Vocational School attained the lowest rank.
Karasan et al. [73] employed a novel fuzzy ranking methodology that relied on a multi-stage decision-making framework integrating DEMATEL, Cognitive Maps, VIKOR, and Fuzzy Inference Systems. The study utilized 18 distinct criteria derived from the UI GreenMetric World University ranking methodology. The findings validated the efficacy and precision of the proposed methodology in computing the ecological index of universities through the application of fuzzy linguistic expressions.
Gul and Yucesan [74] attempted to develop a university ranking model based on performance criteria in the University Monitoring and Evaluation Reports—2019 (published by the Council of Higher Education in Turkey). These performance criteria were weighted on 34 sub-criteria under five main criteria using the Bayesian Best–Worst Method (BWM). Using the TOPSIS method to rank 189 universities, seven public universities and four foundation universities were found to perform well. In additional NUTS-2 region-based and geographical region-based rankings, the Mediterranean and Aegean geographical regions and the NUTS-2 regions TR33, TR61, TR62, TR71, TR72, TRA2, and TRC1 returned high-performing universities. In this section, we briefly review the literature on university rankings. Table 1 briefly summarizes the studies conducted in this area, their methodologies, and the results obtained.

4. Materials and Methods

University rankings play a significant role in influencing the decisions of various stakeholders in the higher education sector. These rankings affect university leaders, faculty, prospective students and their families, policymakers, regulators, industry, and philanthropic investors [75]. Decision-making is a complex, multifaceted process, and MCDM is a highly accurate technique for identifying desirable outcomes in the field [76]. Multi-Criteria Decision Making (MCDM) is a rapidly expanding field within the domain of operations research that has gained considerable prominence due to its wide-ranging applications [77]. Ranking is a prevalent practice in the literature on MCDM, particularly when there is a need to delineate collections of elements or alternatives based on one or more criteria for the purpose of evaluation, comparison, or selection [78]. In recent years, there has been a significant rise in the utilization of MCDM techniques to assess the circular economy perspective and sustainability concerns. It can be asserted that these methodologies can be effectively employed to address these emerging concepts in decision-making problems. Sousa et al. (2021) conducted an analysis of 143 articles and found that MCDM techniques were utilized in research conducted between 2016 and 2020, specifically in relation to SDGs [79]. As a result, there is growing interest in developing effective decision-making methods for university rankings. One approach to decision making in university rankings is the use of MCDM methods. These methods aim to integrate various criteria and provide a comprehensive evaluation of universities. These MCDM methods allow decision makers to consider multiple factors and make informed decisions. Another approach is the use of data-driven decision-making systems. Data-driven decision-making systems, such as the Entropy Weight Gray Relation Model, help optimize curricula in higher education [80]. Integrating university rankings with other decision-making frameworks, such as the Berlin Principles on Ranking Higher Education Institutions, enhances transparency and reliability [81]. However, rankings should not be the sole basis for decision-making; decision makers should consider the specific context and goals of their institutions when using rankings as a tool for decision-making [82]. Overall, judicious use of rankings and consideration of other factors is crucial for informed decisions in the higher education sector.
The present study was based on data from 83 Turkish universities included in the UI GreenMetric Report for 2022. University rankings were analyzed using the TOPSIS method and four different weighting techniques: CRITIC, entropy (ENT), standard deviation-based (SDD), and equal weighting (EW). The applications were performed on a normalized decision matrix transformed into the interval [0, 1]. The ranking of 83 universities was based on six performance indicators: infrastructure installation (x1), energy and climate change (x2), waste (x3), water (x4), public transportation (x5), and educational research (x6). TOPSIS scores, rankings, and correlations between international rankings and TOPSIS rankings were obtained for each weighting technique. Spearman correlation coefficients were used for the correlation analysis. All analyses were performed using R-Project software, v. 2023 and the TOPSIS package, v.1.0 [83].
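A minimal sketch of this data-preparation step is shown below. The analyses themselves were run in R with the TOPSIS package, so this Python outline is purely illustrative, and the file name and column names are hypothetical placeholders rather than the study's actual data file.

```python
import numpy as np
import pandas as pd

# Hypothetical input: one row per university with the six UI GreenMetric indicators.
df = pd.read_csv("greenmetric_turkey_2022.csv")      # placeholder file name
indicators = ["x1", "x2", "x3", "x4", "x5", "x6"]    # infrastructure ... educational research
X = df[indicators].to_numpy(dtype=float)             # 83 x 6 decision matrix

# Min-max normalization of each benefit-type indicator into [0, 1].
R = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# R then feeds the CRITIC, entropy, SDD, and equal weighting procedures (Sections 4.1-4.4)
# and the TOPSIS ranking (Section 4.5); Spearman correlations (scipy.stats.spearmanr)
# compare the resulting rankings with the UI GreenMetric ranking.
```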

4.1. CRITIC

The CRITIC method was first applied by [84] to calculate decision matrix attribute weights. The CRITIC method, which is mainly used to determine attribute weights, eliminates the need for attribute independence and converts qualitative attributes into quantitative attributes [85]. This approach also determines objective weights for relative importance in MCDM problems [86]. CRITIC determines the weight and ranking of attributes in four stages and uses a correlation coefficient to determine the relationships between attributes. The stages of the CRITIC method were summarized by [85]. The process begins by creating a decision matrix and then ranks qualifications using the following processes:
  • Normalized decision matrix;
  • Correlation coefficient;
  • Index (C);
  • Weighting of attributes;
  • Final ranking of attributes.
The method developed here analyzes the evaluation matrix to extract all the information contained in the evaluation criteria. Many studies have shown that the CRITIC method obtains better objective weights [87] than entropy or standard deviation-based methods. The CRITIC method provides a comprehensive measure of the objective weights of indices based on the contrast intensity of the evaluation indices and their contradictory character. Taking account of the magnitude of variability of the indices and the correlation between them, comprehensive evaluation ensures that the objective characteristics of the data are strictly scientific. For that reason, it is more appropriate to use the CRITIC method for comprehensive analysis of the weights of indices like delays, travel time, number of stops, number of vehicles, CO2 emissions, and fuel consumption [88], assuming n possible alternatives Ai (i = 1, 2,…, n) and m evaluation criteria Cj (j = 1, 2,…, m).
Step 1: The decision matrix X is constructed as follows:
$$ X = \left[ x_{ij} \right] = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1m} \\ x_{21} & x_{22} & \cdots & x_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{nm} \end{bmatrix}, \qquad i = 1, 2, \ldots, n; \; j = 1, 2, \ldots, m $$
where $x_{ij}$, the element of the decision matrix X in row i and column j, represents the performance value of the i th alternative on the j th criterion.
Step 2: For utility criteria
$$ r_{ij} = \frac{x_{ij} - \min_i x_{ij}}{\max_i x_{ij} - \min_i x_{ij}} $$
and for cost criteria
$$ r_{ij} = \frac{\max_i x_{ij} - x_{ij}}{\max_i x_{ij} - \min_i x_{ij}} $$
The original decision matrix is normalized using the equations.
Step 3: The symmetric linear correlation matrix $m_{ij}$ is calculated.
Step 4: The objective weight of a criterion is determined using the following equation.
$$ W_j = \frac{C_j}{\sum_{j=1}^{n} C_j} $$
where C j is the amount of information contained in criterion j and is determined as follows.
$$ C_j = \sigma_j \sum_{i=1}^{n} \left( 1 - m_{ij} \right) $$
where $\sigma_j$ is the standard deviation of the j th criterion and $m_{ij}$ is the correlation coefficient between criteria i and j [89].
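A compact illustration of this procedure is sketched below. It assumes a decision matrix that has already been min–max normalized with all criteria treated as benefit criteria, matching the setup described above; it is an illustrative implementation, not the R code used in the study.

```python
import numpy as np

def critic_weights(R: np.ndarray) -> np.ndarray:
    """CRITIC weights from a normalized decision matrix R (rows: alternatives, cols: criteria)."""
    sigma = R.std(axis=0, ddof=1)          # contrast intensity (standard deviation) of each criterion
    corr = np.corrcoef(R, rowvar=False)    # symmetric linear correlation matrix between criteria
    conflict = (1.0 - corr).sum(axis=0)    # sum of (1 - correlation): conflict with the other criteria
    C = sigma * conflict                   # information content C_j
    return C / C.sum()                     # objective weights W_j

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((83, 6)) * 1000                              # hypothetical raw data
    R = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))   # min-max normalization to [0, 1]
    print(critic_weights(R).round(3))
```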

4.2. Entropy

The concept of entropy is widely used in physics, information theory, mathematics, engineering, and other sciences. First defined by Rudolph Clausius in 1865, thermodynamic entropy is a measure of the lack of work energy in a system. It is also a measure of disorder: the higher the entropy, the greater the disorder [90]. The concept of entropy is frequently referenced in the fields of economics and finance, and is prominently featured in Shannon’s (1948) [91] information theory. The theory of entropy provides a systematic approach for determining weights in a manner that is unbiased [92].
According to Liu et al. (2016, p. 5), there are six distinct steps involved in the determination of entropy. The methodology comprises several steps, namely: standardizing raw data to eliminate the influence of quantities and dimensions, calculating correlation coefficients of evaluation indices, extracting feature vectors from a matrix, identifying principal components based on contribution rate and accumulated contribution rate, calculating a comprehensive index according to the weights of the principal component factors, and classifying evaluation types using Jenks’ natural break optimization method. The steps followed in the entropy method are described in detail below [93]:
Step 1: Standardization of the original value of the indicators:
Positive indicator:
$$ X'_{ij} = \frac{X_{ij} - \min X_j}{\max X_j - \min X_j} $$
Negative indicator:
$$ X'_{ij} = \frac{\max X_j - X_{ij}}{\max X_j - \min X_j} $$
Here $X'_{ij}$ is the standardized value of the i th evaluation object on the j th indicator, $X_{ij}$ is the original value, and $\max X_j$ and $\min X_j$ are the maximum and minimum values of the j th indicator, respectively.
Step 2: The proportion of the i th evaluation object on the j th indicator:
$$ Y_{ij} = \frac{X'_{ij}}{\sum_{i=1}^{m} X'_{ij}} $$
is calculated.
Step 3: Entropy of each evaluation indicator:
$$ e_j = -k \sum_{i=1}^{m} Y_{ij} \ln Y_{ij}, \qquad Y_{ij} > 0 $$
can be defined, where $k = 1/\ln m$ and m is the number of evaluation objects. If $Y_{ij} = 0$, evaluation object i in indicator j is excluded.
Step 4: The excess entropy:
$$ d_j = 1 - e_j $$
is calculated.
Step 5: The entropy weight of each evaluation indicator can be expressed as:
$$ w_j = \frac{d_j}{\sum_{j=1}^{n} d_j} $$
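The sketch below follows these steps for benefit-type indicators. It is an illustrative implementation rather than the study's own code; zero proportions are simply dropped from the entropy sum, which corresponds to the exclusion rule in Step 3.

```python
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """Entropy weights for benefit-type indicators (rows: evaluation objects, cols: indicators)."""
    m = X.shape[0]
    # Step 1: min-max standardization of each indicator.
    Xs = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    # Step 2: proportion of each object on each indicator.
    Y = Xs / Xs.sum(axis=0)
    # Step 3: entropy of each indicator, treating 0 * ln(0) as 0 (zero entries excluded).
    k = 1.0 / np.log(m)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(Y > 0, Y * np.log(Y), 0.0)
    e = -k * terms.sum(axis=0)
    # Steps 4-5: excess entropy and normalized weights.
    d = 1.0 - e
    return d / d.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((83, 6)) * 1000    # hypothetical data: 83 universities, 6 indicators
    print(entropy_weights(X).round(3))
```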

4.3. Standard Deviation-Based

The normalized values of criteria are measured using the standard deviation-based method, which measures how far apart the values are. In the standard deviation-based method, there are no criterion restrictions and the calculation of criterion weights involves simple mathematical operations. This method reduces the effect of decision-maker subjectivity and ensures the adequate use of decision information [94]. In the standard deviation-based method, which is similar to the entropy approach, a small weight is assigned to an attribute with similar quality values among alternatives [95].
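The text does not state the formula explicitly; a common formulation, assumed here, sets each weight proportional to the standard deviation of the normalized criterion values, i.e., $w_j = \sigma_j / \sum_k \sigma_k$. The sketch below follows that assumption and is illustrative only.

```python
import numpy as np

def sdd_weights(X: np.ndarray) -> np.ndarray:
    """Standard deviation-based weights: criteria with more spread receive larger weights."""
    R = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))   # normalize to [0, 1]
    sigma = R.std(axis=0, ddof=1)                               # spread of each criterion
    return sigma / sigma.sum()                                  # weights proportional to spread

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = rng.random((83, 6)) * 1000    # hypothetical data: 83 universities, 6 criteria
    print(sdd_weights(X).round(3))
```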

4.4. Equal Weighting

The equal weighting method assigns objective weights as $w_j = 1/m$ (where m is the number of criteria), based on the assumption that all criteria are of equal importance [96]. The average weight (equal importance) should be used when no information is available from the decision maker or when there is insufficient information to distinguish the relative importance of criteria [95].

4.5. TOPSIS

For MCDM problems involving more than one criterion, TOPSIS helps the decision maker(s) to analyze, compare, and rank the available alternatives. The basic idea of this technique is that the distance function is based on decision-maker preferences or utility, and the ranking of alternatives or variants is based on combinations of distances [97]. According to Kim et al. [98], TOPSIS exhibits certain advantages compared to other MCDM techniques.
  • It arrives at a definitive result efficiently.
  • The approach employs a rational thought process that mirrors the decision-making logic of individuals.
  • The concept possesses a quantifiable magnitude that pertains to optimal and suboptimal options concurrently.
  • The calculation process is simple and can be conveniently programmed through the use of a spreadsheet.
  • The performance measures of all alternatives on the attributes can be visualized, at least for any two dimensions.
  • The outcomes of this approach are readily explicable and readily embraced by individuals responsible for making decisions (see also [99]).
The TOPSIS technique has also attracted some criticism, including the lack of any method for determining weights (although this problem is often solved using other techniques such as AHP). Additionally, the normalization methods used in the TOPSIS decision matrix may affect the results. Finally, the measure of relative distances for ranking alternatives remains open to question [99]. According to [100], the TOPSIS method has four major disadvantages. (1) The normalized scale for each criterion usually works with a normalized decision matrix derived from a narrow gap between the measurements performed; this narrow gap is not good for ranking purposes and cannot reflect the true dominance of alternatives. (2) The TOPSIS method does not take account of the decision maker’s risk assessment. (3) Like the AHP method, the TOPSIS technique is prone to ranking reversal; that is, the final ranking may change when new alternatives are included in the model. (4) Contradictions may arise when a suboptimal alternative is replaced by a worse alternative or when a suboptimal alternative is added (as in the AHP and WPM methods).
The TOPSIS technique involves the following steps [101].
Step 1: Obtain performance data for n alternatives on k criteria. Raw measures are usually standardized by converting the raw measures $x_{ij}$ into standardized measures $s_{ij}$.
Step 2: Create a set of importance weights $w_k$ for each criterion. These weights can be based on anything but usually reflect provisional relative importance. If standardization was performed in Step 1, the scale is not a problem.
Step 3: Identify the ideal alternative $s^+$ (the best performance on each criterion).
Step 4: Identify the least suitable (nadir) alternative $s^-$ (the worst performance on each criterion).
Step 5: Determine a distance measure for each alternative to both the ideal point ($s^+$) and the nadir point ($s^-$).
Step 6: For each alternative, calculate the ratio R equal to the distance to the nadir point divided by the sum of the distance to the nadir point and the distance to the ideal point, where
$$ R = \frac{D^-}{D^- + D^+} $$
Step 7: Rank alternatives by maximizing the ratio R in Step 6.
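A self-contained sketch of these steps is given below, assuming benefit criteria and vector normalization of the decision matrix. The weight vector is a placeholder; in the study, the weights came from the CRITIC, entropy, SDD, and EW procedures described above.

```python
import numpy as np

def topsis_scores(X: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Closeness coefficients R = D- / (D- + D+) for benefit criteria."""
    # Step 1: vector-normalize the raw decision matrix.
    S = X / np.sqrt((X ** 2).sum(axis=0))
    # Step 2: apply the criterion weights.
    V = S * w
    # Steps 3-4: ideal (s+) and nadir (s-) alternatives.
    s_plus, s_minus = V.max(axis=0), V.min(axis=0)
    # Step 5: Euclidean distances to the ideal and nadir points.
    d_plus = np.sqrt(((V - s_plus) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - s_minus) ** 2).sum(axis=1))
    # Step 6: relative closeness; Step 7 ranks alternatives by this ratio.
    return d_minus / (d_minus + d_plus)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    X = rng.random((10, 6)) * 1000        # hypothetical decision matrix: 10 alternatives, 6 criteria
    w = np.full(6, 1 / 6)                 # placeholder: equal weights
    scores = topsis_scores(X, w)
    order = np.argsort(-scores)           # alternatives from best to worst
    print("best alternative index:", order[0], "score:", round(scores[order[0]], 3))
```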
In this paper, firstly, the concepts of university rankings, universities, and sustainability are introduced through a comprehensive literature review. Then, the variables used in the UI GreenMetric ranking are weighted with different weighting methods, and the TOPSIS method is applied in a case study of Turkish universities. The CRITIC, entropy, SDD, and EW methods are used for weighting, and the TOPSIS method is used for ranking.

5. Results

Descriptive statistics of the variables in the study are given in Table 2.
According to the statistics, the average installation infrastructure value of universities is 987.710, energy and climate change value is 1041.080, waste value is 1084.340, water value is 486.630, public transportation value is 1175.720, and educational research value is 1165.900. Mean, standard deviation, minimum, and maximum values were calculated for all performance indicators of the universities. The correlation matrix obtained for the research variables is shown in Table 3.
The correlation matrix shows that all of the performance indicators of universities have statistically significant and positive relationships. The correlation values between variables are in the range [0.442, 0.698]. The weight values calculated for the research variables are given in Table 4.
Looking at the weighting techniques, the variable with the highest weight for CRITIC is energy and climate change, while the variable with the highest weight for the entropy technique is the installation infrastructure variable. According to the standard deviation-based weighting technique, the most important factor is the water variable. The TOPSIS scores calculated for 83 universities according to the weighting techniques are given in Table 5.
According to TOPSIS scores, rankings of performance levels among universities can be made. The ranking levels for TOPSIS scores of 83 universities according to weighting techniques are given in Table 6.
The rankings were found to be the same using the entropy and equal weighting techniques. According to the rankings, the most successful university according to the entropy-based, standard deviation-based, and equal weighting methods is Istanbul Technical University. This result is in line with the international ranking. The CRITIC method found Özyeğin University to be the most successful university. Kilis 7 Aralık University is the lowest in terms of success according to the CRITIC, entropy, and equal weighting approaches. According to the international ranking, Konya Technical University has the lowest success rate. According to the standard deviation-based weighting, Ankara University has the lowest achievement.
Table 7 shows the results of an analysis of the relationship between the TOPSIS score rankings based on weighting techniques and international ranking levels.
The correlations between the ranking levels of university performances were statistically significant (p < 0.05). According to the results, the weighting approaches whose rankings correlate most closely with the international ranking are the entropy and equal weighting techniques. The rankings obtained with the CRITIC method have the weakest relationship with the international ranking. Since the entropy and equal weighting approaches produce the same ranking, the correlation between them is perfect.
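For illustration, the snippet below computes a Spearman rank correlation between two hypothetical ranking vectors with scipy; the numbers are invented and do not reproduce Table 7.

```python
from scipy.stats import spearmanr

# Hypothetical example: UI GreenMetric ranks vs. TOPSIS-based ranks for eight universities.
greenmetric_rank = [1, 2, 3, 4, 5, 6, 7, 8]
topsis_rank      = [1, 3, 2, 4, 6, 5, 7, 8]

rho, p_value = spearmanr(greenmetric_rank, topsis_rank)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")  # rho closer to 1 means closer agreement
```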

6. Discussion

Today, making decisions is getting harder because there are more options and things change quickly. Therefore, it is necessary to make the right decisions to make life easier and to be successful in many areas. Decision-making can be defined as the selection of the most optimal among many alternatives. Multi-Criteria Decision Making, one decision-making method, has been used since the 1960s in a wide range of areas, including individual, financial, economic, industrial, educational, and political decision problems. The main goal of decision-making methods is to bring problems with a large number of alternatives and criteria to the final decision in the easiest and most successful way possible [102].
In MCDM problems, the weights of the criteria are decided by both objective and subjective means. This study aims to reveal whether a change in weighting method has an effect on the rankings. Many studies have examined the relationship between the change in weighting method and the rankings obtained by MCDM methods [101,103,104,105,106,107,108,109,110]. In this study, the CRITIC, entropy, standard deviation-based, and equal weighting approaches were used as criteria weighting techniques. The universities were then ranked with the TOPSIS method using the weights obtained from each technique. The dataset for the study consists of 83 universities in Turkey included in the UI GreenMetric 2022 report.
MCDM techniques play a crucial role in university rankings. These techniques, such as CRITIC, entropy, standard deviation-based, and equal weighting approaches, and TOPSIS, allow for a comprehensive evaluation of universities based on multiple criteria. The use of these techniques ensures a more objective and systematic approach to ranking universities, taking into account various factors that contribute to their overall performance and quality. One of the main advantages of using MCDM techniques is that they provide a structured framework for evaluating universities based on multiple dimensions. This allows for a more holistic assessment of universities, considering factors such as academic reputation, research output, faculty quality, student satisfaction, infrastructure, and financial resources [111]. The use of MCDM techniques in university rankings has several benefits. Firstly, these techniques provide a more comprehensive and balanced assessment of universities, considering multiple dimensions of performance and quality. This allows for a more accurate representation of universities’ strengths and weaknesses, enabling stakeholders to make informed decisions based on a more complete picture [112]. Secondly, these techniques reduce the subjectivity in the ranking process by providing a structured framework for evaluating universities based on objective criteria [113]. This increases the transparency and credibility of the ranking results, enhancing their usefulness and acceptance among stakeholders [75]. Furthermore, MCDM techniques allow for flexibility in the ranking process, as decision makers can customize the criteria and weights based on their specific needs and preferences [67]. This ensures that the ranking reflects the priorities and values of the stakeholders involved, making it more relevant and meaningful. Additionally, these techniques enable decision makers to conduct sensitivity analyses and explore different scenarios, assessing the impact of changes in criteria or weights on the ranking results [114]. This helps in understanding the robustness of the rankings and identifying areas for improvement.
Based on the study’s analysis, the rankings based on entropy and equal weighting came out to be the most similar to the UI GreenMetric rankings. Istanbul Technical University, which ranked first in the UI GreenMetric ranking, also ranked first according to the entropy, standard deviation-based, and equal weighting methods, while it ranked eighth according to the CRITIC method. Cyprus International University, which ranked second, ranked the same according to the entropy, standard deviation-based, and equal weighting methods, but ranked fifth according to the CRITIC method. The CRITIC method found Özyeğin University to be the most successful university. According to the CRITIC, entropy, and equal weighting approaches, Kilis 7 Aralık University is at the lowest level in terms of success. According to the international ranking, Konya Technical University has the lowest success rate. According to the standard deviation-based weighting, Ankara University has the lowest achievement.
To improve their ranking in the UI GreenMetric rankings, universities should focus on several key areas, including environmental policies and practices, waste management, energy consumption, infrastructure, sustainable transportation, and community engagement. For Turkish universities, the energy and climate change variable received the highest weight under the CRITIC method, while installation infrastructure was the most important factor under the entropy method and water under the standard deviation-based weighting technique. These areas should be integrated into the university’s mission, vision, and strategic plans, aiming to reduce its environmental impact and promote sustainable practices across campus. Waste management practices, such as recycling, composting, and proper disposal of hazardous materials, are crucial for sustainability. Energy consumption should be reduced through energy-efficient technologies, such as LED lighting systems, insulation, and renewable energy sources. Infrastructure should be prioritized, including green buildings and sustainable transportation systems. Community engagement is essential for a university’s sustainability efforts, as it involves organizing awareness campaigns, hosting events, and collaborating with local organizations and businesses. By addressing these areas, universities can demonstrate their commitment to sustainability and improve their ranking in the UI GreenMetric. Consequently, the achievement of sustainability in the context of higher education necessitates the combined efforts of individuals as well as the systematic integration of sustainable practices [115].

7. Conclusions

Universities with global rankings aim for excellence in research, teaching, and community engagement, implementing strategic plans to achieve these objectives through effective teaching and research [116]. Universities play a crucial role in achieving a sustainable future through their teaching, research, and campus operations [117]. It is important for universities to integrate sustainability principles within their core activities, including learning and teaching, research, and engagement [118]. This can be achieved by incorporating sustainability into the curriculum, promoting interdisciplinary research on sustainability issues, and implementing sustainable practices on campus [119]. Effective leadership is essential for driving sustainability initiatives in universities [120]. Universities should appoint leaders who are knowledgeable and committed to sustainability [121]. Universities should adopt sustainability reporting and performance management practices to enhance accountability, improve performance, and foster innovation in sustainability [24]. Universities should prioritize the training of future graduates in sustainable development [122]. This can be achieved by incorporating sustainability principles and practices into the curriculum across various disciplines [119]. Universities should promote sustainability in campus operations by implementing energy-efficient practices, reducing waste and carbon emissions, and promoting sustainable transportation options [123]. Sustainability in universities requires government support, regulatory frameworks, and collaboration between policymakers and stakeholders to promote sustainable transformation and funding initiatives [124,125]. Universities should develop communication strategies to effectively communicate their sustainability initiatives and engage stakeholders in decision-making processes [126]. Policies related to sustainability in higher education should be coherent and aligned with broader sustainability goals and frameworks, such as the SDGs [127]. Universities should align their sustainability strategies and initiatives with the SDGs to ensure a comprehensive and integrated approach to sustainability [128]. Adequate funding is essential to support sustainability initiatives in higher education institutions. Governments and funding agencies should allocate resources specifically for sustainability projects and programs in universities [129]. In conclusion, the policy implications for sustainability in universities include integrating sustainability into higher education, promoting leadership for sustainability, adopting sustainability reporting and performance management practices, training future graduates in sustainable development, improving campus operations and community engagement, providing policy support for sustainability transformation, raising awareness and engaging stakeholders, fostering collaboration and partnerships, ensuring policy coherence and alignment with broader sustainability goals, and allocating funding for sustainability initiatives. These policy implications can contribute to the achievement of SDGs and the promotion of sustainability in higher education institutions.
One limitation of this study is the determination of criterion weights in MCDM problems. The weighting process can rely on objective or subjective methods, either of which may introduce bias, and the choice of weighting method can significantly affect the rankings; there is no universally accepted method for determining weights. This study also focuses on a specific dataset of 83 Turkish universities included in the UI GreenMetric 2022 report. This limited sample may not be representative of universities globally, and the rankings may not generalize to other contexts; future studies should use larger and more diverse datasets to ensure the robustness and generalizability of the findings. Additionally, the study considers only a specific set of MCDM techniques, namely CRITIC, entropy, standard deviation-based weighting, equal weighting, and TOPSIS. Numerous other MCDM methods are available, and the choice of method can affect the rankings; future studies should explore different MCDM methods to compare their effectiveness and robustness in university rankings. Moreover, this study focuses on the UI GreenMetric rankings, which specifically evaluate universities’ sustainability performance. While sustainability is an important aspect, it is not the sole determinant of a university’s overall performance and quality, and future studies should incorporate a broader range of criteria to provide a more comprehensive evaluation of universities. In summary, the determination of weights, the reliance on a specific set of criteria, the limited sample size, the choice of MCDM methods, and the focus on sustainability are limitations of this study. Future research should address them by considering different weighting methods, incorporating a broader range of criteria, using larger and more diverse datasets, exploring alternative MCDM methods, and considering multiple dimensions of university performance and quality. Longitudinal research would also be valuable for examining changes in university performance over time.
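To make the sensitivity of the rankings to the weighting scheme concrete, the following minimal Python sketch applies TOPSIS to a small invented decision matrix under two weight vectors: an equal-weight vector and a highly concentrated one, loosely echoing the CRITIC weights in Table 4. It is not the code used in this study; the matrix and the skewed weights are illustrative assumptions, and all criteria are treated as benefit-type, as with the GreenMetric category scores.

```python
import numpy as np

def topsis(matrix, weights):
    """Closeness scores and ranks for a benefit-only decision matrix (rows = alternatives)."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    v = w * (m / np.linalg.norm(m, axis=0))           # vector normalization, then weighting
    ideal, anti = v.max(axis=0), v.min(axis=0)        # all criteria treated as benefit-type
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)
    ranks = closeness.argsort()[::-1].argsort() + 1   # 1 = best
    return closeness, ranks

# Illustrative data only (4 hypothetical universities x 6 criteria), not the study's figures
scores = np.array([
    [900, 1200, 1000, 600, 1100, 1300],
    [700, 1500,  800, 400, 1200,  900],
    [500,  900, 1200, 800,  700, 1000],
    [800,  600,  900, 300,  900,  700],
])

equal_w        = np.full(6, 1 / 6)
concentrated_w = np.array([0.03, 0.84, 0.02, 0.05, 0.04, 0.02])  # CRITIC-like concentration (assumed)

print("equal weights  :", topsis(scores, equal_w)[1])
print("skewed weights :", topsis(scores, concentrated_w)[1])
```

Running such a sketch shows that the induced orderings can differ across weight vectors, which is precisely why the correlation analysis in Table 7 compares the rankings produced by the four weighting schemes.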

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data used are the 2022 UI GreenMetric overall rankings, obtained from https://greenmetric.ui.ac.id/rankings/overall-rankings-2022 (accessed on 10 February 2023).

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Kifor, C.V.; Olteanu, A.; Zerbes, M. Key Performance Indicators for Smart Energy Systems in Sustainable Universities. Energies 2023, 16, 1246. [Google Scholar] [CrossRef]
  2. Thomas, I.; Nicita, J. Sustainability Education and Australian Universities. Environ. Educ. Res. 2002, 8, 475–492. [Google Scholar] [CrossRef]
  3. Shi, L.; Han, L.; Yang, F.; Gao, L. The Evolution of Sustainable Development Theory: Types, Goals, and Research Prospects. Sustainability 2019, 11, 7158. [Google Scholar] [CrossRef] [Green Version]
  4. Yuan, X.; Zuo, J.; Huisingh, D. Green Universities in China—What matters? J. Clean. Prod. 2013, 61, 36–45. [Google Scholar] [CrossRef]
  5. Brown, B.J.; Hanson, M.E.; Liverman, D.M.; Merideth, R.W. Global sustainability: Toward definition. Environ. Manag. 1987, 11, 713–719. [Google Scholar] [CrossRef]
  6. Rout, P.R.; Verma, A.K.; Bhunia, P.; Surampalli, R.Y.; Zhang, T.C.; Tyagi, R.; Brar, S.; Goyal, M. Introduction to Sustainability and Sustainable Development. In Sustainability: Fundamentals and Applications; Surampalli, R., Zhang, T., Goyal, M.K., Brar, S., Tyagi, R., Eds.; Wiley: Hoboken, NJ, USA, 2020; pp. 1–19. [Google Scholar] [CrossRef]
  7. Mawonde, A.; Togo, M. The role of SDGs in advancing implementation of sustainable development. In Higher Education and Sustainability: Opportunities and Challenges for Achieving Sustainable Development Goals; CRC Press: Abingdon, UK, 2019; p. 1. [Google Scholar]
  8. Paton, J. What’s “Left” of Sustainable Development? J. Aust. Political Econ. 2008, 62, 94–119. [Google Scholar]
  9. Long, J.; Vogelaar, A.; Hale, B.W. Toward sustainable educational travel. J. Sustain. Tour. 2014, 22, 421–439. [Google Scholar] [CrossRef]
  10. Brundtland Commission; U.N. Our Common Future; United Nations: New York, NY, USA, 1987. [Google Scholar]
  11. Tian, J.; Li, J. Analysis and treatment of the conflict between sustainable development and environmental protection based on the ecotourism concept. Front. Environ. Sci. 2022, 10, 1056643. [Google Scholar] [CrossRef]
  12. Guerrieri, R.; Vanguelova, E.; Pitman, R.; Benham, S.; Perks, M.; Morison, J.I.L.; Mencuccini, M. Climate and atmospheric deposition effects on forest water-use efficiency and nitrogen availability across Britain. Sci. Rep. 2020, 10, 12418. [Google Scholar] [CrossRef]
  13. Gutiérrez-Mijares, M.E.; Josa, I.; Casanovas-Rubio, M.D.M.; Aguado, A. Methods for assessing sustainability performance at higher education institutions: A review. Stud. High. Educ. 2023, 48, 1137–1158. [Google Scholar] [CrossRef]
  14. Daub, C.-H. Assessing the quality of sustainability reporting: An alternative methodological approach. J. Clean. Prod. 2007, 15, 75–85. [Google Scholar] [CrossRef]
  15. Jabbour, C.J.C. Greening of business schools: A systemic view. Int. J. Sustain. High. Educ. 2010, 11, 49–60. [Google Scholar] [CrossRef]
  16. Bashir, H.; Araci, Z.C.; Obaideen, K.; Alsyouf, I. An approach for analyzing and visualizing the relationships among key performance indicators for creating sustainable campuses in higher education institutions. Environ. Sustain. Indic. 2023, 19, 100267. [Google Scholar] [CrossRef]
  17. Haden, S.S.P.; Oyler, J.D.; Humphreys, J.H. Historical, practical, and theoretical perspectives on green management. Manag. Decis. 2009, 47, 1041–1055. [Google Scholar] [CrossRef]
  18. UNESCO. Education for Sustainable Development Toolkit; UNESCO: Paris, France, 2006. [Google Scholar]
  19. Martin, S.; Jucker, R. Educating Earth-literate Leaders. J. Geogr. High. Educ. 2005, 29, 19–29. [Google Scholar] [CrossRef]
  20. Lozano, R.; Lukman, R.; Lozano, F.J.; Huisingh, D.; Lambrechts, W. Declarations for sustainability in higher education: Becoming better leaders, through addressing the university system. J. Clean. Prod. 2013, 48, 10–19. [Google Scholar] [CrossRef]
  21. De Filippo, D.; Sandoval-Hamón, L.A.; Casani, F.; Sanz-Casado, E. Spanish Universities’ Sustainability Performance and Sustainability-Related R&D+I. Sustainability 2019, 11, 5570. [Google Scholar] [CrossRef] [Green Version]
  22. Dagiliūtė, R.; Liobikienė, G. University contributions to environmental sustainability: Challenges and opportunities from the Lithuanian case. J. Clean. Prod. 2015, 108, 891–899. [Google Scholar] [CrossRef]
  23. Dagiliūtė, R.; Liobikienė, G.; Minelgaitė, A. Sustainability at universities: Students’ perceptions from Green and Non-Green universities. J. Clean. Prod. 2018, 181, 473–482. [Google Scholar] [CrossRef]
  24. Adams, C.A. Sustainability reporting and performance management in universities. Sustain. Account. Manag. Policy J. 2013, 4, 384–392. [Google Scholar] [CrossRef]
  25. Ragazzi, M.; Ghidini, F. Environmental sustainability of universities: Critical analysis of a green ranking. Energy Procedia 2017, 119, 111–120. [Google Scholar] [CrossRef]
  26. Tilbury, D. Environmental Education for Sustainability: A Force for Change in Higher Education. In Higher Education and the Challenge of Sustainability: Problematics, Promise, and Practice; Corcoran, P.B., Wals, A.E.J., Eds.; Springer: Dordrecht, The Netherlands, 2004; pp. 97–112. [Google Scholar]
  27. Alshuwaikhat, H.M.; Abubakar, I. An integrated approach to achieving campus sustainability: Assessment of the current campus environmental management practices. J. Clean. Prod. 2008, 16, 1777–1785. [Google Scholar] [CrossRef] [Green Version]
  28. Marrone, P.; Orsini, F.; Asdrubali, F.; Guattari, C. Environmental performance of universities: Proposal for implementing campus urban morphology as an evaluation parameter in Green Metric. Sustain. Cities Soc. 2018, 42, 226–239. [Google Scholar] [CrossRef]
  29. Atici, K.B.; Yasayacak, G.; Yildiz, Y.; Ulucan, A. Green University and academic performance: An empirical study on UI GreenMetric and World University Rankings. J. Clean. Prod. 2021, 291, 125289. [Google Scholar] [CrossRef]
  30. Corcoran, P.B.; Wals, A.E.J. The problematics of sustainability in higher education: A synthesis. In Higher Education and the Challenge of Sustainability: Problematics, Promise, and Practice; Springer: Dordrecht, The Netherlands, 2004; pp. 3–6. [Google Scholar]
  31. Lukman, R.; Glavic, P. What are the key elements of a sustainable university? Clean Technol. Environ. Policy 2007, 9, 103–114. [Google Scholar] [CrossRef]
  32. Lukman, R.; Krajnc, D.; Glavič, P. University ranking using research, educational and environmental indicators. J. Clean. Prod. 2010, 18, 619–628. [Google Scholar] [CrossRef]
  33. Disterheft, A.; Caeiro, S.; Azeiteiro, U.M.; Leal Filho, W. Sustainability Science and Education for Sustainable Development in Universities: A Way for Transition. In Sustainability Assessment Tools in Higher Education Institutions: Mapping Trends and Good Practices Around the World; Caeiro, S., Filho, W.L., Jabbour, C., Azeiteiro, U.M., Eds.; Springer International Publishing: Cham, Switzerland, 2013; pp. 3–27. [Google Scholar]
  34. Thomashow, M. The Nine Elements of a Sustainable Campus; MIT Press: Cambridge, MA, USA, 2014. [Google Scholar]
  35. Perchinunno, P.; Cazzolle, M. A clustering approach for classifying universities in a world sustainability ranking. Environ. Impact Assess. Rev. 2020, 85, 106471. [Google Scholar] [CrossRef]
  36. Lauder, A.; Sari, R.F.; Suwartha, N.; Tjahjono, G. Critical review of a global campus sustainability ranking: GreenMetric. J. Clean. Prod. 2015, 108, 852–863. [Google Scholar] [CrossRef]
  37. YÖK. Yükseköğretim Bilgi Yönetim Sistemi. Available online: https://istatistik.yok.gov.tr/ (accessed on 1 December 2022).
  38. Galleli, B.; Teles, N.E.B.; dos Santos, J.A.R.; Freitas-Martins, M.S.; Junior, F.H. Sustainability university rankings: A comparative analysis of UI green metric and the times higher education world university rankings. Int. J. Sustain. High. Educ. 2022, 23, 404–425. [Google Scholar] [CrossRef]
  39. Muñoz-Suárez, M.; Guadalajara, N.; Osca, J.M. A Comparative Analysis between Global University Rankings and Environmental Sustainability of Universities. Sustainability 2020, 12, 5759. [Google Scholar] [CrossRef]
  40. Burmann, C.; García, F.; Guijarro, F.; Oliver, J. Ranking the Performance of Universities: The Role of Sustainability. Sustainability 2021, 13, 13286. [Google Scholar] [CrossRef]
  41. Osareh, F.; Parsaei-Mohammadi, P.; Farajpahlou, A.; Rahimi, F.A. A Comparative Study of Criteria and Indicators of Local, Regional, and National University Ranking Systems. J. Sci. Res. 2023, 12, 54–67. [Google Scholar] [CrossRef]
  42. Ayyildiz, E.; Murat, M.; Imamoglu, G.; Kose, Y. A novel hybrid MCDM approach to evaluate universities based on student perspective. Scientometrics 2023, 128, 55–86. [Google Scholar] [CrossRef] [PubMed]
  43. Suwartha, N.; Sari, R.F. Evaluating UI GreenMetric as a tool to support green universities development: Assessment of the year 2011 ranking. J. Clean. Prod. 2013, 61, 46–53. [Google Scholar] [CrossRef]
  44. Aydın, O.T. A Review on the Major Global University Ranking Systems and the Turkish Universities’ Overall Position in Rankings. Educ. Adm. Theory Pract. 2017, 23, 305–330. [Google Scholar] [CrossRef] [Green Version]
  45. Hazelkorn, E.; Loukkola, T.; Zhang, T. Rankings in Institutional Strategies and Processes: Impact or Illusion; European University Association: Brussels, Belgium, 2014. [Google Scholar]
  46. Shehatta, I.; Mahmood, K. Correlation among top 100 universities in the major six global rankings: Policy implications. Scientometrics 2016, 109, 1231–1254. [Google Scholar] [CrossRef]
  47. Davey, E. Recapturing the learning opportunities of university sustainability indicators. J. Environ. Stud. Sci. 2017, 7, 540–549. [Google Scholar] [CrossRef]
  48. Ali, E.B.; Anufriev, V.P. UI greenmetric and campus sustainability: A review of the role of african universities. Int. J. Energy Prod. Manag. 2020, 5, 1–13. [Google Scholar] [CrossRef] [Green Version]
  49. Greenmetric. UI GreenMetric Guidelines 2022. 2022. Available online: https://greenmetric.ui.ac.id/publications/guidelines (accessed on 10 February 2023).
  50. Gibbons, S.; Neumayer, E.; Perkins, R. Student satisfaction, league tables and university applications: Evidence from Britain. Econ. Educ. Rev. 2015, 48, 148–164. [Google Scholar] [CrossRef] [Green Version]
  51. Brusca, I.; Cohen, S.; Manes-Rossi, F.; Nicolò, G. Intellectual capital disclosure and academic rankings in European universities. Meditari Account. Res. 2019, 28, 51–71. [Google Scholar] [CrossRef]
  52. Li, F.; Yi, Y.; Guo, X.; Qi, W. Performance evaluation of research universities in Mainland China, Hong Kong and Taiwan: Based on a two-dimensional approach. Scientometrics 2012, 90, 531–542. [Google Scholar] [CrossRef]
  53. Yüksel, M. PISA 2018 Araştırma Sonuçlarına Göre Ülkelerin Bileşik PISA Performans Sıralaması. Muğla Sıtkı Koçman Üniv. Eğit. Fak. Derg. 2022, 9, 788–821. [Google Scholar] [CrossRef]
  54. Karaveg, C.; Thawesaengskulthai, N.; Chandrachai, A. A combined technique using SEM and TOPSIS for the commercialization capability of R&D project evaluation. Decis. Sci. Lett. 2015, 4, 379–396. [Google Scholar] [CrossRef]
  55. Jati, H.; Nurkhamid; Wardani, R. Visibility Ranking of University E-Learning Websites-CRITIC Method Approach. J. Phys. Conf. Ser. 2021, 1737, 012030. [Google Scholar] [CrossRef]
  56. Jessop, A. Entropy in multiattribute problems. J. Multi-Criteria Decis. Anal. 1999, 8, 61–70. [Google Scholar] [CrossRef]
  57. Jati, H.; Dominic, D.D. A New Approach of Indonesian University Webometrics Ranking Using Entropy and PROMETHEE II. Procedia Comput. Sci. 2017, 124, 444–451. [Google Scholar] [CrossRef]
  58. Xiang, Y.; Wang, T.; Zhang, J.; Zhang, Q. Quality Evaluation of University Maritime Education Based on Entropy Method—Taking Wuhan University of Technology as an Example. In Advances in Intelligent Systems, Computer Science and Digital Economics IV; Springer: Cham, Switzerland, 2023; pp. 857–865. [Google Scholar]
  59. Tofallis, C. A different approach to university rankings. High. Educ. 2012, 63, 1–18. [Google Scholar] [CrossRef]
  60. Berbegal-Mirabent, J.; Ribeiro-Soriano, D.E. Behind league tables and ranking systems. J. Serv. Theory Pract. 2015, 25, 242–266. [Google Scholar] [CrossRef]
  61. Nanayakkara, C.; Yeoh, W.; Lee, A.; Moayedikia, A. Deciding discipline, course and university through TOPSIS. Stud. High. Educ. 2020, 45, 2497–2512. [Google Scholar] [CrossRef]
  62. Falcón, V.V.; Martínez, B.S.; Ricardo, J.E.; Vázquez, M.Y.L. Análisis del Ranking 2021 de universidades ecuatorianas del Times Higher Education con el Método Topsis. Rev. Conrado 2021, 17, 70–78. [Google Scholar]
  63. Chen, J.-K.; Chen, I.-S. Using a novel conjunctive MCDM approach based on DEMATEL, fuzzy ANP, and TOPSIS as an innovation support system for Taiwanese higher education. Expert Syst. Appl. 2010, 37, 1981–1990. [Google Scholar] [CrossRef]
  64. Bougnol, M.-L.; Dulá, J.H. Validating DEA as a ranking tool: An application of DEA to assess performance in higher education. Ann. Oper. Res. 2006, 145, 339–365. [Google Scholar] [CrossRef]
  65. Ömürbek, N.; Karataş, T. Girişimci ve Yenilikçi Üniversitelerin Performanslarının Çok Kriterli Karar Verme Teknikleri İle Değerlendirilmesi. Mehmet Akif Ersoy Üniv. Sos. Bilim. Enst. Derg. 2018, 10, 176–198. [Google Scholar] [CrossRef] [Green Version]
  66. Wu, H.-Y.; Chen, J.-K.; Chen, I.-S.; Zhuo, H.-H. Ranking universities based on performance evaluation by a hybrid MCDM model. Measurement 2012, 45, 856–880. [Google Scholar] [CrossRef]
  67. Aliyev, R.; Temizkan, H.; Aliyev, R. Fuzzy Analytic Hierarchy Process-Based Multi-Criteria Decision Making for Universities Ranking. Symmetry 2020, 12, 1351. [Google Scholar] [CrossRef]
  68. Güneri Tosunoğlu, N. Üniversite Sıralama Göstergelerinin Bulanık Analitik Hiyerarşi Prosesi (AHP) ile Sıralanması. Yükseköğretim Bilim Derg. 2020, 10, 451–460. [Google Scholar]
  69. Parlar, G.; Palancı, O. Çok Kriterli Karar Verme Yöntemleri İle Dünya Üniversitelerinin Performanslarının Değerlendirilmesi. Süleyman Demirel Üniv. Vizyoner Derg. 2020, 11, 203–227. [Google Scholar] [CrossRef]
  70. Gorgulu, Y.; Ozceylan, E.; Ozkan, B. UI GreenMetric ranking of Turkish universities using entropy weight and COPRAS methods. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Bangalore, India, 16–18 August 2021; pp. 16–18. [Google Scholar]
  71. Yadegaridehkordi, E.; Nilashi, M. Moving towards green university: A method of analysis based on multi-criteria decision-making approach to assess sustainability indicators. Int. J. Environ. Sci. Technol. 2022, 19, 8207–8230. [Google Scholar] [CrossRef]
  72. Uluskan, M.; Akpolat, G.; Şimşek, D. Vakıf Üniversitelerinin AHP, COPRAS, SAW, TOPSIS Yöntemleriyle Değerlendirilmesi ve Borda Sayım Yöntemi İle Bütünleşik Bir Sıra Elde Edilmesi. End. Mühendisliği 2022, 33, 22–61. [Google Scholar] [CrossRef]
  73. Karasan, A.; Gündoǧdu, F.K.; Aydın, S. Decision-making methodology by using multi-expert knowledge for uncertain environments: Green metric assessment of universities. Environ. Dev. Sustain. 2022, 25, 7393–7422. [Google Scholar] [CrossRef]
  74. Gul, M.; Yucesan, M. Performance evaluation of Turkish Universities by an integrated Bayesian BWM-TOPSIS model. Socio-Econ. Plan. Sci. 2022, 80, 101173. [Google Scholar] [CrossRef]
  75. Marginson, S. University Rankings and Social Science. Eur. J. Educ. 2014, 49, 45–59. [Google Scholar] [CrossRef]
  76. Taherdoost, H.; Madanchian, M. Multi-Criteria Decision Making (MCDM) Methods and Concepts. Encyclopedia 2023, 3, 77–87. [Google Scholar] [CrossRef]
  77. Gelmez, E.; Özceylan, E. Evaluation of the Smart Cities Listed in Smart City Index 2021 by Using Entropy Based Copras and Aras Methodology. Found. Comput. Decis. Sci. 2023, 48, 153–180. [Google Scholar] [CrossRef]
  78. Rouyendegh, B.D.; Erol, S. The DEA–FUZZY ANP department ranking model applied in Iran Amirkabir University. Acta Polytech. Hung. 2010, 7, 2010–2103. [Google Scholar]
  79. Sousa, M.; Almeida, M.F.; Calili, R. Multiple criteria decision making for the achievement of the UN sustainable development goals: A systematic literature. Review and a Research Agenda. Sustainability 2021, 13, 4129. [Google Scholar] [CrossRef]
  80. Shi, Y.; Yang, X. Influencing Factors of University Core Competence: An Empirical Study Based on the Entropy Weight Gray Relation Model. Discret. Dyn. Nat. Soc. 2021, 2021, 8724591. [Google Scholar] [CrossRef]
  81. Barron, G.R.S. The Berlin Principles on Ranking Higher Education Institutions: Limitations, legitimacy, and value conflict. High. Educ. 2017, 73, 317–333. [Google Scholar] [CrossRef]
  82. Robinson-Garcia, N.; Torres-Salinas, D.; Herrera-Viedma, E.; Docampo, D. Mining university rankings: Publication output and citation impact as their basis. Res. Eval. 2019, 28, 232–240. [Google Scholar] [CrossRef]
  83. Yazdi, M.M.M. Package ‘Topsis’. CRAN. Elérhető. Available online: https://cran.rproject.org/package=topsis (accessed on 1 February 2023).
  84. Diakoulaki, D.; Zopounidis, C.; Mavrotas, G.; Doumpos, M. The use of a preference disaggregation method in energy analysis and policy making. Energy 1999, 24, 157–166. [Google Scholar] [CrossRef]
  85. Alinezhad, A.; Khalili, J. CRITIC Method. In New Methods and Applications in Multiple Attribute Decision Making (MADM); Alinezhad, A., Khalili, J., Eds.; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar] [CrossRef]
  86. Zhao, Q.-H.; Zhou, X.; Xie, R.-F.; Li, Z.-C. Comparison Of Three Weighing Methods For Evaluation of The Hplc Fingerprints Of Cortex Fraxini. J. Liq. Chromatogr. Relat. Technol. 2011, 34, 2008–2019. [Google Scholar] [CrossRef]
  87. Wang, D.; Zhao, J. Design optimization of mechanical properties of ceramic tool material during turning of ultra-high-strength steel 300M with AHP and CRITIC method. Int. J. Adv. Manuf. Technol. 2016, 84, 2381–2390. [Google Scholar] [CrossRef]
  88. Pan, B.; Liu, S.; Xie, Z.; Shao, Y.; Li, X.; Ge, R. Evaluating Operational Features of Three Unconventional Intersections under Heavy Traffic Based on CRITIC Method. Sustainability 2021, 13, 4098. [Google Scholar] [CrossRef]
  89. Marković, V.; Stajić, L.; Stević, Ž.; Mitrović, G.; Novarlić, B.; Radojičić, Z. A Novel Integrated Subjective-Objective MCDM Model for Alternative Ranking in Order to Achieve Business Excellence and Sustainability. Symmetry 2020, 12, 164. [Google Scholar] [CrossRef] [Green Version]
  90. Zhang, H.; Gu, C.-L.; Gu, L.-W.; Zhang, Y. The evaluation of tourism destination competitiveness by TOPSIS & information entropy—A case in the Yangtze River Delta of China. Tour. Manag. 2011, 32, 443–451. [Google Scholar] [CrossRef]
  91. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  92. Zou, Z.-H.; Yun, Y.; Sun, J.-N. Entropy method for determination of weight of evaluating indicators in fuzzy synthetic evaluation for water quality assessment. J. Environ. Sci. 2006, 18, 1020–1023. [Google Scholar] [CrossRef]
  93. Zhao, J.; Ji, G.; Tian, Y.; Chen, Y.; Wang, Z. Environmental vulnerability assessment for mainland China based on entropy method. Ecol. Indic. 2018, 91, 410–422. [Google Scholar] [CrossRef]
  94. Altıntaş, F.F. Karadeniz’e Kıyısı Olan Ülkelerin Deniz Sağlığı Performanslarının Analizi: SD Tabanlı Edas Yöntemi İle Bir Uygulama. Karadeniz Araştırmaları 2022, 19, 347–362. [Google Scholar]
  95. Jahan, A.; Mustapha, F.; Sapuan, S.M.; Ismail, M.Y.; Bahraminasab, M. A framework for weighting of criteria in ranking stage of material selection process. Int. J. Adv. Manuf. Technol. 2012, 58, 411–420. [Google Scholar] [CrossRef]
  96. Deng, H.; Yeh, C.-H.; Willis, R.J. Inter-company comparison using modified TOPSIS with objective weights. Comput. Oper. Res. 2000, 27, 963–973. [Google Scholar] [CrossRef]
  97. Shih, H.-S. TOPSIS Basics. In TOPSIS and Its Extensions: A Distance-Based MCDM Approach; Shih, H.-S., Olson, D.L., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 17–31. [Google Scholar]
  98. Kim, G.; Park, C.S.; Yoon, K. Identifying investment opportunities for advanced manufacturing systems with comparative-integrated performance measurement. Int. J. Prod. Econ. 1997, 50, 23–33. [Google Scholar] [CrossRef]
  99. Huang, Y.-S.; Li, W.-H. A Study on Aggregation of TOPSIS Ideal Solutions for Group Decision-Making. Group Decis. Negot. 2012, 21, 461–473. [Google Scholar] [CrossRef]
  100. Zavadskas, E.K.; Mardani, A.; Turskis, Z.; Jusoh, A.; Nor, K.M. Development of TOPSIS Method to Solve Complicated Decision-Making Problems—An Overview on Developments from 2000 to 2015. Int. J. Inf. Technol. Decis. Mak. 2016, 15, 645–682. [Google Scholar] [CrossRef]
  101. Olson, D. Comparison of weights in TOPSIS models. Math. Comput. Model. 2004, 40, 721–727. [Google Scholar] [CrossRef]
  102. Erpolat Taşabat, S.; Cinemre, N.; Serkan, Ş. Farklı ağırlıklandırma tekniklerinin denendiği çok kriterli karar verme yöntemleri ile Türkiye’deki mevduat bankalarının mali performanslarının değerlendirilmesi. Sos. Bilim. Araşt. Derg. 2015, 4, 96–110. [Google Scholar]
  103. Vinogradova, I.; Podvezko, V.; Zavadskas, E.K. The Recalculation of the Weights of Criteria in MCDM Methods Using the Bayes Approach. Symmetry 2018, 10, 205. [Google Scholar] [CrossRef] [Green Version]
  104. Kornyshova, E.; Salinesi, C. MCDM Techniques Selection Approaches: State of the Art. In Proceedings of the 2007 IEEE Symposium on Computational Intelligence in Multi-Criteria Decision-Making, Honolulu, HI, USA, 1–5 April 2007; pp. 22–29. [Google Scholar]
  105. Lee, H.C.; Chang, C.-T. Comparative analysis of MCDM methods for ranking renewable energy sources in Taiwan. Renew. Sustain. Energy Rev. 2018, 92, 883–896. [Google Scholar] [CrossRef]
  106. Zardari, N.H.; Ahmed, K.; Shirazi, S.M.; Bin Yusop, Z. Literature Review. In Weighting Methods and their Effects on Multi-Criteria Decision Making Model Outcomes in Water Resources Management; Zardari, N.H., Ahmed, K., Shirazi, S.M., Yusop, Z.B., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 7–67. [Google Scholar]
  107. Singh, M.; Pant, M. A review of selected weighing methods in MCDM with a case study. Int. J. Syst. Assur. Eng. Manag. 2021, 12, 126–144. [Google Scholar] [CrossRef]
  108. Zavadskas, E.K.; Podvezko, V. Integrated Determination of Objective Criteria Weights in MCDM. Int. J. Inf. Technol. Decis. Mak. 2016, 15, 267–283. [Google Scholar] [CrossRef]
  109. Sałabun, W.; Wątróbski, J.; Shekhovtsov, A. Are MCDA Methods Benchmarkable? A Comparative Study of TOPSIS, VIKOR, COPRAS, and PROMETHEE II Methods. Symmetry 2020, 12, 1549. [Google Scholar] [CrossRef]
  110. Odu, G. Weighting methods for multi-criteria decision making technique. J. Appl. Sci. Environ. Manag. 2019, 23, 1449–1457. [Google Scholar] [CrossRef] [Green Version]
  111. Ioannidis, J.P.; Patsopoulos, N.; Kavvoura, F.K.; Tatsioni, A.; Evangelou, E.; Kouri, I.; Contopoulos-Ioannidis, D.G.; Liberopoulos, G. International ranking systems for universities and institutions: A critical appraisal. BMC Med. 2007, 5, 30. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  112. Oladipupo, O.; Amoo, T.; Daramola, O. A Decision-Making Approach for Ranking Tertiary Institutions’ Service Quality Using Fuzzy MCDM and Extended HiEdQUAL Model. Appl. Comput. Intell. Soft Comput. 2021, 2021, 4163906. [Google Scholar] [CrossRef]
  113. Zhang, J.Z.; Srivastava, P.R.; Eachempati, P. Evaluating the effectiveness of drones in emergency situations: A hybrid multi-criteria approach. Ind. Manag. Data Syst. 2023, 123, 302–323. [Google Scholar] [CrossRef]
  114. Shahmardan, A.; Zadeh, M.H. An integrated approach for solving a MCDM problem, Combination of Entropy Fuzzy and F-PROMETHEE techniques. J. Ind. Eng. Manag. 2013, 6, 1124–1138. [Google Scholar] [CrossRef] [Green Version]
  115. Bor, Ö.; Tosun, B.; Eler, S.; Eler, N. Sport Academics’ Awareness and Knowledge of Sustainability in Higher Education in Türkiye. Sustainability 2023, 15, 6527. [Google Scholar] [CrossRef]
  116. Makki, A.A.; Alqahtani, A.Y.; Abdulaal, R.M.S.; Madbouly, A.I. A Novel Strategic Approach to Evaluating Higher Education Quality Standards in University Colleges Using Multi-Criteria Decision-Making. Educ. Sci. 2023, 13, 577. [Google Scholar] [CrossRef]
  117. Sen, G.; Chau, H.-W.; Tariq, M.A.U.R.; Muttil, N.; Ng, A.W.M. Achieving Sustainability and Carbon Neutrality in Higher Education Institutions: A Review. Sustainability 2022, 14, 222. [Google Scholar] [CrossRef]
  118. Lambrechts, W. The contribution of sustainability assessment to policy development in higher education. Assess. Eval. High. Educ. 2015, 40, 801–816. [Google Scholar] [CrossRef]
  119. Menon, S.; Suresh, M. Synergizing education, research, campus operations, and community engagements towards sustainability in higher education: A literature review. Int. J. Sustain. High. Educ. 2020, 21, 1015–1051. [Google Scholar] [CrossRef]
  120. Azizi, L. Which leadership processes encourage sustainable transitions within universities? Int. J. Sustain. High. Educ. 2022, 24, 46–68. [Google Scholar] [CrossRef]
  121. Goodall, A.H. Highly cited leaders and the performance of research universities. Res. Policy 2009, 38, 1079–1092. [Google Scholar] [CrossRef] [Green Version]
  122. Albareda-Tiana, S.; Vidal-Raméntol, S.; Fernández-Morilla, M. Implementing the sustainable development goals at University level. Int. J. Sustain. High. Educ. 2018, 19, 473–497. [Google Scholar] [CrossRef]
  123. Paradowska, M. Rivalry, excludability and positive transport externalities—Case study of a private university in Poland. Int. J. Sustain. High. Educ. 2019, 20, 1290–1312. [Google Scholar] [CrossRef]
  124. Mader, C.; Scott, G.; Razak, D.A. Effective change management, governance and policy for sustainability transformation in higher education. Sustain. Account. Manag. Policy J. 2013, 4, 264–284. [Google Scholar] [CrossRef]
  125. Vaughter, P.; McKenzie, M.; Lidstone, L.; Wright, T. Campus sustainability governance in Canada. Int. J. Sustain. High. Educ. 2016, 17, 16–39. [Google Scholar] [CrossRef]
  126. Raji, A.; Hassan, A. Sustainability and Stakeholder Awareness: A Case Study of a Scottish University. Sustainability 2021, 13, 4186. [Google Scholar] [CrossRef]
  127. Molokova, E. Higher education as a sustainable development tool. E3S Web Conf. 2021, 291, 05040. [Google Scholar] [CrossRef]
  128. Martínez-Virto, L.; Pérez-Eransus, B. The Role of the Public University of Navarre in Achieving the 1st SDG for the End of Poverty. Sustainability 2021, 13, 9795. [Google Scholar] [CrossRef]
  129. de Lima, C.R.M.; Soares, T.C.; de Lima, M.A.; Veras, M.O.; Guerra, J.B.S.O.D.A. Sustainability funding in higher education: A literature-based review. Int. J. Sustain. High. Educ. 2020, 21, 441–464. [Google Scholar] [CrossRef]
Table 1. Literature summary on university rankings.

Author(s) | Methodology | Results
Bougnol and Dulá [64] | DEA | The DEA model was objective, unlike UF’s method.
Ömürbek and Karataş [65] | MAUT and SAW | Comparative evaluation shows similar results between the methods.
Wu et al. [66] | AHP and VIKOR | TKU attained the highest rank among the 12 privately funded institutions specializing in literature, law, and business.
Aliyev et al. [67] | FAHP | The FAHP approach ensures consistency and prioritization in system ranking.
Güneri Tosunoğlu [68] | AHP | Fuzzy AHP analysis reveals the total number of scientific documents as a crucial variable for faculty decision-making.
Parlar and Palancı [69] | CRITIC, entropy, TOPSIS, MAUT, SAW, ARAS, and BORDA | Singapore tops the CRITIC and entropy rankings; Turkey ranks 54th and 46th, respectively.
Gorgulu et al. [70] | Entropy, COPRAS, and TOPSIS | In Turkey, water is the most significant factor, while facilities and infrastructure matter less; using COPRAS and TOPSIS, TU and METU were determined to be the top two institutions.
Yadegaridehkordi and Nilashi [71] | AHP | Indoor environmental quality and energy efficiency are crucial for green university building evaluation.
Uluskan et al. [72] | AHP, COPRAS, SAW, and TOPSIS | İhsan Doğramacı Bilkent University ranked highest, while Faruk Saraç Design Vocational School attained the lowest score.
Karasan et al. [73] | DEMATEL, cognitive maps, VIKOR, and fuzzy inference systems | The methodology for computing a university ecological index using fuzzy linguistic expressions was validated.
Gul and Yucesan [74] | BWM and TOPSIS | Seven public universities and four foundation universities were found to perform well.
Table 2. Descriptive statistics of research variables.

Variable | Mean | SD | Min | Max
X1 | 987.710 | 207.550 | 455 | 1400
X2 | 1041.080 | 378.090 | 150 | 1675
X3 | 1084.340 | 383.730 | 75 | 1800
X4 | 486.630 | 232.980 | 10 | 1000
X5 | 1175.720 | 297.950 | 385 | 1625
X6 | 1165.900 | 378.260 | 235 | 1800
SD: Standard Deviation, Min: Minimum, Max: Maximum.
Table 3. Spearman correlation matrix for research variables.

 | X1 | X2 | X3 | X4 | X5 | X6
X1 | 1 | | | | |
X2 | 0.489 * | 1 | | | |
X3 | 0.442 * | 0.563 * | 1 | | |
X4 | 0.551 * | 0.685 * | 0.571 * | 1 | |
X5 | 0.561 * | 0.599 * | 0.548 * | 0.647 * | 1 |
X6 | 0.698 * | 0.502 * | 0.475 * | 0.547 * | 0.660 * | 1
* p < 0.05.
Table 4. Variable weights.

Variable | CRITIC | ENT | SDD | EW
X1 | 0.027 | 0.169 | 0.106 | 0.167
X2 | 0.841 | 0.166 | 0.183 | 0.167
X3 | 0.016 | 0.166 | 0.178 | 0.167
X4 | 0.049 | 0.164 | 0.241 | 0.167
X5 | 0.048 | 0.168 | 0.128 | 0.167
X6 | 0.019 | 0.167 | 0.164 | 0.167
ENT: entropy-based weighting, SDD: standard deviation-based weighting, EW: equal weighting.
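As a complement to Table 4, the following minimal Python sketch illustrates how entropy-based, CRITIC, and standard deviation-based weights are typically computed from a decision matrix with alternatives in rows and criteria in columns (equal weighting simply assigns 1/n to each criterion). It reflects the standard textbook formulations rather than the code used in this study, and the small matrix X is an invented example, not the GreenMetric data.

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy weights: columns with more dispersed values carry more information."""
    P = X / X.sum(axis=0)                                   # column-wise proportions
    m = X.shape[0]
    e = -(P * np.log(P, where=P > 0, out=np.zeros_like(P))).sum(axis=0) / np.log(m)
    d = 1.0 - e                                             # degree of diversification
    return d / d.sum()

def critic_weights(X):
    """CRITIC weights: column standard deviation combined with conflict (1 - correlation) with other criteria."""
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))   # min-max normalization
    sigma = Z.std(axis=0, ddof=1)
    conflict = (1.0 - np.corrcoef(Z, rowvar=False)).sum(axis=0)
    c = sigma * conflict
    return c / c.sum()

def sd_weights(X):
    """Standard deviation-based weights on the min-max normalized matrix."""
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    s = Z.std(axis=0, ddof=1)
    return s / s.sum()

# Invented decision matrix (alternatives x criteria); real inputs would be the GreenMetric category scores
X = np.array([[900.0, 1200.0, 1000.0, 600.0],
              [700.0, 1500.0,  800.0, 400.0],
              [500.0,  900.0, 1200.0, 800.0],
              [800.0,  600.0,  900.0, 300.0]])

print("entropy :", entropy_weights(X).round(3))
print("CRITIC  :", critic_weights(X).round(3))
print("std-dev :", sd_weights(X).round(3))
print("equal   :", np.full(X.shape[1], 1 / X.shape[1]).round(3))
```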
Table 5. TOPSIS scores according to weighting methods.

University | CRITIC | ENT | SDD | EW
Istanbul Technical University | 0.908 | 0.901 | 0.900 | 0.901
Cyprus International University | 0.934 | 0.879 | 0.897 | 0.879
Erciyes University | 0.907 | 0.839 | 0.823 | 0.838
Ozyegin University | 0.981 | 0.824 | 0.831 | 0.824
Yildiz Technical University | 0.965 | 0.845 | 0.857 | 0.846
Yeditepe University | 0.900 | 0.845 | 0.836 | 0.844
Ege University | 0.917 | 0.843 | 0.832 | 0.842
Middle East Technical University | 0.705 | 0.770 | 0.740 | 0.769
Bartin University | 0.794 | 0.798 | 0.796 | 0.798
Aksaray University | 0.738 | 0.779 | 0.753 | 0.778
Tokat Gaziosmanpasa University | 0.961 | 0.759 | 0.753 | 0.759
Sakarya University | 0.901 | 0.784 | 0.794 | 0.784
Izmir Institute of Technology | 0.738 | 0.773 | 0.767 | 0.772
Baskent University | 0.956 | 0.748 | 0.721 | 0.747
Dokuz Eylul University | 0.754 | 0.756 | 0.744 | 0.756
Inonu University Malatya | 0.931 | 0.748 | 0.743 | 0.748
Afyon Kocatepe University | 0.844 | 0.731 | 0.718 | 0.730
Trakya University | 0.613 | 0.691 | 0.645 | 0.690
Bilecik Şeyh Edebali University | 0.720 | 0.686 | 0.648 | 0.685
Kutahya Dumlupinar University | 0.818 | 0.690 | 0.671 | 0.690
Kutahya Health Sciences University | 0.679 | 0.705 | 0.698 | 0.705
Hasan Kalyoncu University | 0.835 | 0.712 | 0.723 | 0.713
Mugla Sitki Kocman University | 0.799 | 0.623 | 0.572 | 0.621
Ataturk University | 0.647 | 0.675 | 0.668 | 0.675
Hitit University | 0.896 | 0.660 | 0.655 | 0.659
Sabanci University | 0.753 | 0.698 | 0.718 | 0.699
Istanbul Sabahattin Zaim University | 0.670 | 0.608 | 0.565 | 0.606
Kastamonu University | 0.515 | 0.623 | 0.580 | 0.622
Firat University | 0.850 | 0.666 | 0.661 | 0.666
Cappadocia University | 0.729 | 0.633 | 0.605 | 0.632
Bursa Uludag University | 0.492 | 0.578 | 0.504 | 0.576
Düzce University | 0.708 | 0.582 | 0.519 | 0.580
Mersin University | 0.467 | 0.627 | 0.601 | 0.626
Süleyman Demirel University | 0.733 | 0.568 | 0.513 | 0.566
Cukurova University | 0.607 | 0.584 | 0.553 | 0.583
Selcuk University | 0.735 | 0.603 | 0.580 | 0.602
Zonguldak Bulent Ecevit University | 0.558 | 0.594 | 0.566 | 0.593
Niğde Ömer Halisdemir University | 0.768 | 0.616 | 0.612 | 0.616
Gaziantep University | 0.584 | 0.602 | 0.595 | 0.602
Osmaniye Korkut Ata University | 0.687 | 0.599 | 0.591 | 0.598
Ondokuz Mayis University | 0.420 | 0.615 | 0.627 | 0.616
Hacettepe University | 0.475 | 0.504 | 0.420 | 0.501
Igdir Universitesi | 0.536 | 0.599 | 0.639 | 0.600
Akdeniz University | 0.281 | 0.505 | 0.445 | 0.503
Bilkent University | 0.224 | 0.530 | 0.498 | 0.529
Mardin Artuklu Üniversitesi | 0.482 | 0.524 | 0.486 | 0.523
Bursa Technical University | 0.687 | 0.556 | 0.566 | 0.556
Atilim University | 0.622 | 0.549 | 0.544 | 0.549
Piri Reis University | 0.701 | 0.573 | 0.625 | 0.575
Izmir Bakircay University | 0.824 | 0.474 | 0.430 | 0.472
TOBB University of Economy and Technology | 0.288 | 0.505 | 0.498 | 0.505
Antalya Bilim Üniversitesi | 0.434 | 0.506 | 0.509 | 0.506
Kadir Has University | 0.361 | 0.432 | 0.365 | 0.430
Eskisehir Technical University | 0.256 | 0.515 | 0.531 | 0.515
Bayburt University | 0.598 | 0.460 | 0.429 | 0.459
Gazi University | 0.588 | 0.449 | 0.408 | 0.447
Manisa Celal Bayar University | 0.579 | 0.482 | 0.480 | 0.482
Van Yuzuncu Yil University | 0.572 | 0.449 | 0.411 | 0.448
Artvin Çoruh University | 0.453 | 0.424 | 0.364 | 0.422
Karamanoğlu Mehmetbey Üniversity | 0.468 | 0.446 | 0.414 | 0.445
Adiyaman University | 0.626 | 0.430 | 0.388 | 0.429
Dicle University | 0.266 | 0.468 | 0.466 | 0.468
Bolu Abant Izzet Baysal University (BAIBU) | 0.281 | 0.463 | 0.470 | 0.463
Bezmialem Vakıf University | 0.247 | 0.381 | 0.310 | 0.379
Istanbul Gelisim University | 0.653 | 0.437 | 0.448 | 0.438
Galatasaray University | 0.497 | 0.410 | 0.375 | 0.409
Karadeniz Technical University | 0.449 | 0.423 | 0.431 | 0.423
Marmara University | 0.334 | 0.362 | 0.303 | 0.360
Sivas Cumhuriyet University | 0.183 | 0.410 | 0.400 | 0.410
Ağrı İbrahim Çeçen Üniversitesi | 0.371 | 0.394 | 0.380 | 0.393
Anadolu University | 0.455 | 0.407 | 0.407 | 0.407
Usak University | 0.345 | 0.380 | 0.377 | 0.380
Istanbul Atlas University | 0.497 | 0.393 | 0.407 | 0.393
Cag University | 0.064 | 0.331 | 0.268 | 0.329
Erzurum Technical University | 0.447 | 0.300 | 0.254 | 0.298
Eskisehir Osmangazi University | 0.220 | 0.283 | 0.242 | 0.281
Karabuk University | 0.777 | 0.347 | 0.348 | 0.347
Bingöl University | 0.424 | 0.247 | 0.211 | 0.246
Cankaya University | 0.068 | 0.250 | 0.220 | 0.249
Kirikkale University | 0.118 | 0.258 | 0.256 | 0.258
Ankara University | 0.261 | 0.215 | 0.164 | 0.213
Kilis 7 Aralık University | 0.022 | 0.186 | 0.191 | 0.186
Konya Technical University | 0.278 | 0.187 | 0.226 | 0.189
Table 6. TOPSIS rankings according to weighting methods.

University | UI | CRITIC | ENT | SDD | EW
Istanbul Technical University18111
Cyprus International University25222
Erciyes University39676
Ozyegin University41767
Yildiz Technical University52333
Yeditepe University611444
Ege University77555
Middle East Technical University831121512
Bartin University919888
Aksaray University1025101210
Tokat Gaziosmanpasa University113131113
Sakarya University1210999
Izmir Institute of Technology1324111011
Baskent University144161716
Dokuz Eylul University1522141314
Inonu University Malatya166151415
Afyon Kocatepe University1714171817
Trakya University1841212622
Bilecik Şeyh Edebali University1929232523
Kutahya Dumlupinar University2017222121
Kutahya Health Sciences University2135192019
Hasan Kalyoncu University2215181618
Mugla Sitki Kocman University2318303730
Ataturk University2438242224
Hitit University2512262426
Sabanci University2623201920
Istanbul Sabahattin Zaim University2736334033
Kastamonu University2850293529
Firat University2913252325
Cappadocia University3028273127
Bursa Uludag University3153414741
Düzce University3230404440
Mersin University3357283228
Süleyman Demirel University3427434543
Cukurova University3542394139
Selcuk University3626343634
Zonguldak Bulent Ecevit University3748383938
Niğde Ömer Halisdemir University3821313031
Gaziantep University3945353335
Osmaniye Korkut Ata University4033373437
Ondokuz Mayis University4164322832
Hacettepe University4255525952
Igdir Universitesi4349362736
Akdeniz University4471515551
Bilkent University4577464946
Mardin Artuklu Üniversitesi4654475047
Bursa Technical University4734443844
Atilim University4840454245
Piri Reis University4932422942
Izmir Bakircay University5016545754
TOBB University of Economy and Technology5169504850
Antalya Bilim Üniversitesi5262494649
Kadir Has University5366627062
Eskisehir Technical University5475484348
Bayburt University5543575857
Gazi University5644596259
Manisa Celal Bayar University5746535153
Van Yuzuncu Yil University5847586158
Artvin Çoruh University5959647165
Karamanoğlu Mehmetbey Üniversity6056606060
Adiyaman University6139636663
Dicle University6273555355
Bolu Abant Izzet Baysal University (BAIBU)6370565256
Bezmialem Vakıf University6476717372
Istanbul Gelisim University6537615461
Galatasaray University6652676967
Karadeniz Technical University6760655664
Marmara University6868737473
Sivas Cumhuriyet University6979666566
Ağrı İbrahim Çeçen Üniversitesi7065696770
Anadolu University7158686368
Usak University7267726871
Istanbul Atlas University7351706469
Cag University7482757575
Erzurum Technical University7561767776
Eskisehir Osmangazi University7678777877
Karabuk University7720747274
Bingöl University7863808180
Cankaya University7981798079
Kirikkale University8080787678
Ankara University8174818381
Kilis 7 Aralk University8283838283
Konya Technical University8372827982
UI: UI GreenMetrics Ranking.
Table 7. Correlation matrix for ranking levels.

 | UI | CRITIC | ENT | SDD | EW
UI | 1 | | | |
CRITIC | 0.815 * | 1 | | |
ENT | 0.985 * | 0.816 * | 1 | |
SDD | 0.953 * | 0.815 * | 0.988 * | 1 |
EW | 0.985 * | 0.817 * | 1 | 0.988 * | 1
UI: UI GreenMetrics Ranking, * p < 0.05.
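The rank agreement reported in Table 7 corresponds to Spearman correlations between the ranking columns. The short Python sketch below shows how such coefficients can be computed with SciPy; the rank vectors here are small invented examples rather than the 83-element columns of Table 6, so the printed values will not match the table.

```python
import numpy as np
from scipy.stats import spearmanr

# Invented rank vectors (1 = best); the study's inputs would be the ranking columns of Table 6
ui_rank      = np.array([1, 2, 3, 4, 5, 6, 7, 8])
entropy_rank = np.array([1, 2, 4, 3, 5, 6, 8, 7])
critic_rank  = np.array([2, 1, 6, 3, 8, 4, 5, 7])

rho_ent, p_ent = spearmanr(ui_rank, entropy_rank)
rho_cri, p_cri = spearmanr(ui_rank, critic_rank)
print(f"UI vs entropy-based ranking: rho = {rho_ent:.3f} (p = {p_ent:.4f})")
print(f"UI vs CRITIC-based ranking : rho = {rho_cri:.3f} (p = {p_cri:.4f})")
```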
