Article

Using SNAP to Analyze Policy Measures in e-Learning Roadmaps

by Nikola Kadoić 1,*, Nina Begičević Ređep 1 and Dragana Kupres 2
1 University of Zagreb Faculty of Organization and Informatics, University of Zagreb, Pavlinska 2, HR-42000 Varaždin, Croatia
2 CARNET—Croatian Academic and Research Network, Josipa Marohnića 5, HR-10000 Zagreb, Croatia
* Author to whom correspondence should be addressed.
Axioms 2023, 12(12), 1110; https://doi.org/10.3390/axioms12121110
Submission received: 23 October 2023 / Revised: 20 November 2023 / Accepted: 27 November 2023 / Published: 11 December 2023
(This article belongs to the Special Issue Applied Optimization and Decision Analysis on Interdisciplinary Areas)

Abstract:
Creating policy measures is the final step in the process of e-learning roadmap development. Policy measures can be seen as long-term activities that need to be implemented and constantly upgraded to achieve strategic goals. For resource allocation, it is useful to prioritize policy measures. Prioritization can be implemented using multi-criteria decision-making methods. This paper analyzes policy measures in the Maldives National University’s e-learning roadmap using the social network analysis process (SNAP), which includes the analytic hierarchy process (AHP), the decision-making trial and evaluation laboratory (DEMATEL), and the PageRank centrality. In policy measure evaluation, there were more than 20 participants: persons with managerial functions at the Maldives National University (MNU) (deans, heads of departments) and persons in lecturer and researcher positions. By using the AHP, participants prioritized policy measures with respect to their importance to them. By using the DEMATEL, participants identified and prioritized policy measures with respect to their effect on other measures. Finally, by using the SNAP, it was possible to determine the prioritization list for resource allocation since it aggregates the aspects of the policy measures, their importance, and their effect on other measures.
MSC:
90B50; 91B06; 91B10

1. Introduction

Policy measures can be seen as long-term activities that need to be implemented and constantly upgraded to achieve strategic goals. Creating policy measures is the final step in the process of e-learning roadmap development. When implementing policy measures, resources for their implementation must be allocated, and it is essential to make a prioritized list of measures, because resources are not unlimited. There are many possible ways to carry out this prioritization; most of them involve the application of different multi-criteria decision-making methods. This paper analyzes policy measures in the Maldives National University’s e-learning roadmap using the analytic hierarchy process (AHP), the decision-making trial and evaluation laboratory (DEMATEL), and the social network analysis process (SNAP).
The policy measures defined in the roadmap are the following: developing e-learning resources for study programs and setting university-wide guidelines, establishing a legal framework for e-learning, establishing university advisory services for lecturers via the University Centre for Educational Technology and Excellence (CETE), implementing research projects in the area of e-learning, establishing a quality assurance framework for e-learning, identifying the necessary infrastructure for e-learning, and improving student support services for e-learning.
In policy measure evaluation, there were more than 20 participants: persons with managerial functions at the Maldives National University (MNU) (deans, heads of departments) and persons in lecturer and researcher positions.
There are two main research questions in this study:
  • Which multi-criteria decision-making method is most suitable for the resource allocation of the MNU e-learning policy measures?
  • What are the MNU e-learning policy measure priorities for resource allocation?
    a
    What are the MNU e-learning policy measure priorities in terms of the importance of policy measures for MNU?
    b
    What are the MNU e-learning policy measure priorities in terms of their importance in affecting other policy measures?
The main contribution of this paper is the identification of the most suitable multi-criteria decision-making method for resource allocation purposes, considering both the characteristics of the decision-making methods and the decision makers’ choices and their ability to provide the necessary inputs for resource allocation. This is the reason why we approached the resource allocation of e-learning roadmap policy measure prioritization using three different methods. The AHP is a multi-criteria method which does not support modeling the dependencies (influences, effects on other elements) between the elements that are to be prioritized [1]. However, the decision-making problem of e-learning roadmap policy measures is characterized by the existence of influences between the elements, since the implementation of some measures directly affects the implementation of other measures. This is the reason for using a networked prioritization method. The most often used networked prioritization method is the analytic network process (ANP). However, the ANP does not integrate the strength of the element in the network with respect to the goal; it captures only the effect of the dependencies between the criteria and the alternatives [2]. In addition, several characteristics of the ANP could be a significant challenge for implementation in the case of MNU (a low understanding of some ANP steps, and the problem has no separate criteria or alternatives). Thus, the SNAP method was developed to incorporate both the strength of the elements and their effects in the network. Additionally, the steps of the SNAP, in terms of providing inputs, are easier to understand. Consequently, the AHP is applied to identify the importance (strength) of each policy for the decision-making experts, the DEMATEL is applied to construct the network of influences between the policies, and, finally, the SNAP is applied to obtain the resource allocation priorities.
This paper is organized as follows: In Section 2, the e-learning roadmap policy measures are presented. In Section 3, we provide the literature review results related to policy measures, roadmaps, and the application of multi-criteria decision-making methods in general, but also in the area of e-learning and resource allocation. Section 4 includes a description of the SNAP method. Section 5 presents the methodology of the research. Section 6 provides the results. Section 7 presents a discussion of the results, the summarized answers to both research questions, and future implications. Finally, Section 8 concludes the paper.

2. The MNU e-Learning Roadmap Policy Measures

The MNU e-Learning Roadmap (eLR) is part of the AMED project [3]. The e-learning roadmap contributes to the structuring of an institutional framework for the comprehensive development of e-learning at MNU. The main purpose of the MNU e-LR is to support the further quality development of e-learning at MNU by identifying and proposing policy measures. The e-LR is a powerful communication tool that shows how MNU will achieve the university’s strategic goals in e-learning. The coordinator of the development process was the Croatian Academic and Research Network—CARNET—with the support of an external consultant, while the main beneficiary was MNU and especially the university’s Centre for Educational Technology and Excellence (CETE).
The e-Learning roadmap was developed using the following:
  • The MNU’s needs analysis, previously developed within the AMED project;
  • The MNU’s e-learning policy analysis, previously developed within the AMED project;
  • MNU’s strategic plans for 2013–2017, 2018–2022, and 2020–2025 [4,5,6];
  • The participatory process and active dialogue with project partners, especially the lecturers, MNU management, and the staff at the Center for Educational Technology and Excellence in which the Theory of Change was used as a methodology framework.
The starting point of developing the e-learning roadmap was the needs analysis and the MNU e-learning policy analysis conducted in the first phase of the AMED project. The analyses revealed a positive disposition among university staff towards e-learning, but also some concerns about the quality of e-learning (as opposed to f2f instruction) and the response to the COVID-19 pandemic. The policy analysis identified existing policies, which should be built upon and strengthened wherever possible. The data were collected from senior decision makers and faculty members, as well as the CETE members, using focus groups and interviews (face-to-face (f2f) and online), surveys, online meetings, and e-mail communication.
The main findings of the AMED needs analysis are:
  • e-Learning training needs of lecturers and senior-level staff were detected;
  • Institutional readiness for online learning was identified and evaluated as low;
  • The inventory of e-learning infrastructure acknowledged existing computer laboratories, multimedia facilities, and dedicated open study areas available to students in faculties and campuses, as well as Moodle, TurnItIn, the Self-Service portal, and a 21-member IT department (IT support, web and application development);
  • Connectivity issues were detected, especially low bandwidth and Wi-Fi semi-coverage of campus facilities;
  • The need to upgrade the existing hardware was also identified since the hardware configurations were weak;
  • The lack of a written plan for the systematic upgrading and improvement of IT infrastructure (a low level of the strategic planning of e-learning implementation) was identified;
  • In Moodle, a large amount of “latent” data kept in the system was identified, sometimes with empty courses or items that were hidden from students.
The main findings of the MNU e-learning policy analysis proposed several recommendations:
  • to strengthen the existing policies that support e-learning at MNU by using clear communication and open support from the MNU decision makers;
  • to create new policies related to the ICT infrastructure development (including high bandwidth and computer equipment for teachers and students) and legal framework;
  • to invest in the MNU’s capacity for innovation and research in e-learning to enhance the existing competencies;
  • to establish advisory support services for teachers and students.
The findings from both analyses and MNU’s strategic plans were used to lead the participatory process based on the Theory of Change (TOC) methodology, specifically (1) by identifying long-term goals and (2) by the backward mapping of the activities to reach the desired long-term goals using the IOOI approach (INPUT–OUTPUT–OUTCOME–IMPACT) [7,8].
The participatory consultation process included a face-to-face workshop based on the Art of Hosting World Café methodology during the project meeting in Barcelona in November 2019 with 20 MNU teachers, decision makers, the CETE staff, and the whole project team. During the workshop, the group identified long-term goals as a first step in the TOC methodology. Instead of devising new goals, which could be time-consuming and difficult to complete and have accredited by MNU, the group used several already accepted MNU strategic plans (2013–2017, 2018–2022, and 2020–2025) and identified five university strategic goals to which e-learning should contribute:
  • Academic Excellence;
  • Exemplary Research and Innovation;
  • Quality People;
  • Conducive Working Environment; and
  • Student Empowerment and Success.
The group identified strategic goals (I) and desired outcomes (II) from the university’s strategic goals documents and proposed outputs (III) that contribute to the defined outcomes (II). The workshop therefore resulted in an Input–Output–Outcome–Impact proposal, specifically policy outputs (III) which were connected to strategic goals (I) and outcomes (II). The consultation process continued online throughout 2020 because of the COVID-19 measures. It included meetings, interviews, and a questionnaire distributed to MNU staff. The policy inputs (activities needed to be introduced by MNU to produce outputs (III)) were proposed during the consultation process between the subcontracted external consultant and MNU staff. The policy measures are presented in Table 1 [3].
The Maldives is an archipelago consisting of 1190 low-lying coral islands spread over around 90,000 square kilometers, of which only 187 islands are inhabited, by fewer than half a million people [9]. Tourism, fishing, and shipping are the main economic activities, with tourism being the one the economy depends on most heavily. At present, the quality of education is the top priority on the Education Agenda. This was envisioned by the Maldives’ Education Strategic Action Plan (ESAP) for 2014–2018. Higher education is predominantly provided by two public universities and nine private colleges. The Maldives National University (MNU) was established as the first public university in 2011. With five main campuses and 20 outreach centers spread across different Maldives islands, MNU offers degrees in management, engineering, education, law, health sciences, and tourism, among others. The Centre for Educational Technology and Excellence (formerly known as the Centre for Open Learning) offers blended learning, where students attend both face-to-face and online sessions.

3. Literature Review

3.1. Policy Measures and Roadmap Analysis

The impact of technology on education has been widely discussed for several decades, and remote teaching after the COVID-19 emergency recently became mainstream [10,11]. National and transnational policies are adopting the view that technology in education is important. The EU Digital Education Action Plan for 2021–27 acknowledges the need for education to become a part of a wider digital transformation [12]. However, if not reflected in wider global and local contexts, rushing to technologize education has the risky potential of redefining and reducing education to business models of commercial solutions [11] without taking into account the needs of the main education stakeholders: students and teachers.
How do we address these different contexts and needs, and at the same time, harness the potential of technology for improving students’ outcomes? The idea of digitally mature education takes into account wider global, environmental, and digital changes and invites us to critically assess and plan technology investments in education as meaningful digital transformation, keeping in mind the needs of students, the context, and the key sustainability issues [13]. Research suggests that the successful deployment of technology in education depends on the quality and maturity of its plan and the planning process that should, among others, include analyzing needs, challenges, and opportunities, as well as evaluating results and monitoring progress [14]. How technology is designed and implemented depends on the context in which it is to be deployed. Therefore, any planning process must consider the context in which it is operating: national, institutional, cultural, and social.
The Theory of Change methodology can be used to facilitate the planning of technology investments at the organizational level. It is “a helpful tool for developing solutions to complex social problems” [15]. The Theory of Change was used in the AMED project to navigate the Maldives National University towards a more digitally mature university, aware of its strong position and vulnerabilities, that is well grounded in the wider global, ecological, and digital challenges that the Republic of Maldives is facing. For that reason, the university e-learning roadmap was developed, using participatory practices with the MNU stakeholders (university decision makers, teachers, and IT and support staff). The main focus of the method was to take into account the context of the Maldives National University in the moment of its digital transformation [16].

3.2. Multi-Criteria Analysis of e-Learning Problems

There are numerous research studies that focus on the application of multi-criteria decision-making (MCDM) methods in decision making in various fields. The objective of using MCDM methods is to make strategic decisions and solve complex problems. The application of MCDM methods supports decision makers in ranking and assessment in order to determine priorities and allocate resources based on the defined priorities. There are several in-depth studies of MCDM methods cited in the literature. In paper [17], the MCDM methods were categorized into pairwise comparisons and outranking and distance-based approaches. The authors presented the summary of previous work on some well-known MCDM methods including the Analytical Hierarchy Process (AHP), Analytical Network Process (ANP), Elimination et Choix Traduisant la Realité (ELECTRE), Preference Ranking Organization METHod for Enrichment of Evaluations (PROMETHEE), Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), and VIseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). These methods were applied in various areas for supporting the decision-making process and solving problems in energy, transportation, sustainability, manufacturing, and production.
One of the examples of the research in manufacturing can be found in paper [18]. This research emphasized the need for decision-making frameworks in manufacturing facilities and the importance of machine selection strategies in the context. The authors used a dual-MCDM approach that includes DEX—decision experts—and the EDAS method to solve decision-making problems in both academia and practical industries. Additionally, the same authors discussed the DEA multi-criteria analysis [19].
The review of the literature in STEM education from the perspective of the MCDM methods application is given in [20]. The motivation for preparing this paper was the fact that MCDM methods are widely applied in various fields but less so in education. The literature review related to the application of MCDM in education was presented and appropriate MCDM models for decision making in education were suggested. A more detailed review of the literature on the application of the AHP, the DEMATEL, and the TOPSIS in education was based on the following characteristics: objectives, criteria, methods, and the areas of application. Based on the conducted literature review, e-learning is the most common decision problem in education, and it is followed by a decision about the learning tools and learning skills.
In paper [21], the analytical approach was used for ranking distance education platforms based on human–computer interaction criteria and for identifying the most appropriate distance learning platform for teaching and learning activities by using multi-criteria decision-making approaches. Ref. [22] presented a comprehensive decision framework with interval-valued type-2 fuzzy AHP for evaluating all critical success factors of e-learning platforms. The objective was to determine the weights of all of the critical success criteria and offer a reliable method for evaluating e-learning platforms.
The literature review papers focusing on the use of MCDM methods in other areas are briefly described as follows.
A systematic literature review of MCDM methods in research on corporate sustainability was presented in paper [23]. The authors observed that most of the reviewed articles used only a single MCDM method to analyze the data, and that there was a lack of research using integrated MCDM methods.
Ref. [24] presented a broad classification of the MCDM-based R&D project selection, categorized by the nature of alternatives, the types of integration approach, the MCDM method itself, and the types of uncertainty, by reviewing 66 studies and using the systematic literature review approach.
The literature review of the performance evaluation of logistics and the use of multi-criteria decision-making methods (MCDM) in this area covered more than 120 research articles published from 2010 to 2019 [25].
A plethora of research focused on the application of MCDM methods in strategic planning. Makki et al. [26] presented a novel strategic approach to evaluate higher education quality standards in university colleges using multi-criteria decision-making.
MCDM methods were also used to solve a multi-criteria decision problem and investigate the evaluation of six strategic planning models in the context of smaller and medium-sized manufacturing companies [27]. Lee [28] presented the case study that provided management with valuable insights for planning and controlling health-care activities and services by using the AHP and goal programming (GP) to derive satisfying solutions for designing, evaluating, and implementing an Enterprise Resource Program (ERP). Kangas et al. [29] applied MCDM methods in the strategic planning of forestry on the state-owned lands in Finland.
The AHP was applied in e-learning in order to prioritize the factors that were affecting the academic integrity in e-learning in Saudi Arabian Universities [30]. The AHP was applied to evaluate the critical success factors of e-learning platforms [22,31,32]. Selecting the best way of learning was implemented by using the AHP, and the results showed that the undergraduate students mostly preferred online learning [33]. Sustainable e-learning is a goal of many universities. The barriers that often hinder sustainable e-learning were evaluated by using the AHP [34]. Different types of e-learning tools and software were evaluated using the AHP [35,36] as well as e-learning platforms [37,38,39,40].
Similarly to the AHP, the DEMATEL and the SNAP were also applied in the context of e-learning. A hybrid DEMATEL-ANP model was used to prioritize the components of an e-learning system with the goal of allocating resources to the appropriate components, which was found to be important for the cost–benefit analysis [41]. This is an example of conducting resource allocation using only the influence component, since neither the DEMATEL nor the ANP explicitly incorporates the elements’ importance with respect to the goal into the final priorities. A similar approach was used to assess the key barriers to course comprehension of Chinese students in French language courses in higher education [42]. The DEMATEL was also applied with the Simple Additive Weighting (SAW) approach to analyze the criteria for sustainable mathematics education [43]. In addition, the DEMATEL-ANP together with the TOPSIS were used for teaching sustainable development [44]. The DEMATEL was applied to evaluate critical success factors for the adoption of e-learning facilities [45,46].
On the other hand, the SNAP method does not yet have many applications in e-learning, since it is a relatively new method. By using the SNAP, the domains of the digital maturity of higher education institutions were evaluated [47,48], the strategic goals in a balanced scorecard strategic map were prioritized [49], the criteria for selecting the best scientists were evaluated [47,50], and the criteria in the evaluation of critical IT systems were evaluated [51]. The SNAP method was developed to overcome some shortcomings of the ANP. It is based on a combination of the DEMATEL, SNA, and the ANP. The initial version of the SNAP, the DEMATEL-based ANP, was tested, and the results were very good: the data collection procedure was simplified when compared to the ANP, and the results were very similar to those of the ANP [52].

4. SNAP

There are many multi-criteria decision-making methods that can be used for resource allocation problems. In the previous section, we presented some of the implications of several multi-criteria decision-making methods. The first research question deals with the selection of the most appropriate method for resource allocation in the case of MNU. This research question was answered in the focus group held by the researchers and the research participants. The answer to this question is the SNAP method, which is presented in this section. However, the process of how we concluded that the SNAP is the best choice is explained in the methodology section.
The SNAP integrates two aspects of element priorities:
  • the strength of the element (the importance of the element with respect to the decision-making goal); and
  • the effect of the element on other elements.
It is possible that a certain element is very important for a decision-making goal, but its implementation does not influence the implementation of other elements. In addition, there could be an element that is not the most important element for a decision-making goal, but its implementation will affect the implementation of other elements. So, in the end, both elements could have equal priorities in terms of resource allocation.
In the SNAP, the strength of the element is determined by using the AHP. The effect of the element is determined by applying the PageRank procedure to the matrix of influences, which is modeled by using the DEMATEL scale. Finally, the obtained priorities in both cases are aggregated by using the arithmetic or the geometric mean.
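To make this aggregation step concrete, the following is a minimal Python sketch (with illustrative numbers, not the study’s data) that combines a strength vector and an effect vector by the arithmetic or the geometric mean and renormalizes the result so that the priorities again sum to one.

```python
import numpy as np

# Illustrative priority vectors (not the study's data); both sum to 1.
strength = np.array([0.30, 0.20, 0.25, 0.25])   # AHP: importance with respect to the goal
effect   = np.array([0.10, 0.40, 0.30, 0.20])   # DEMATEL/PageRank: effect on other elements

def aggregate(strength, effect, how="geometric"):
    """Combine the two priority aspects and renormalize so they sum to 1."""
    if how == "arithmetic":
        combined = (strength + effect) / 2.0
    else:
        combined = np.sqrt(strength * effect)    # element-wise geometric mean
    return combined / combined.sum()

print(aggregate(strength, effect, "arithmetic"))
print(aggregate(strength, effect, "geometric"))
```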

4.1. AHP

The AHP is a well-known method and the most often used multi-criteria decision-making method [53]. It is applied to prioritization and selection problems. The method was created by Thomas L. Saaty and is based on the pairwise comparison procedure. There are several steps in the AHP:
  • The structuring of the decision-making problem;
  • Pairwise comparisons of lower level elements with respect to higher level elements. Here, the criteria are compared with respect to the goal, lower level criteria with respect to higher level criteria, and the alternatives with respect to the criteria at the lowest level;
  • After pairwise comparison tables are created, it is possible to calculate the criteria weights and priorities of the alternatives;
  • Performing the sensitivity analysis.
The AHP is explained in detail in the literature [54,55]. The method does not model dependencies and influences between the elements in the structure, so it is to be applied when there are no influences between the criteria. Applying the AHP when there are dependencies and influences between the criteria might not lead to the best solution, and in such a case, applying a network-based method is more favorable. The second step of the method can be explained in more detail as follows [55,56].
Let $n$ be the number of criteria (or alternatives) for which the weights (priorities) $w_i$ have to be determined on the basis of the estimated values of their ratios:

$a_{ij} = w_i / w_j$ (1)
These ratios form the matrix $A$. In the case of consistent estimates, i.e., where $a_{ij} = a_{ik} a_{kj}$ holds, the matrix $A$ satisfies the equation:

$A w = n w$ (2)
The matrix $A$ has the following properties:
  • all its rows are proportional to the first row;
  • all its elements are positive; and
  • $a_{ij} = 1 / a_{ji}$ holds.
Therefore, only one of its eigenvalues differs from zero, and it is equal to $n$. The corresponding eigenvector has real, positive components, which are the priorities (weights) of the alternatives (criteria). Through the additional constraint $\sum_i w_i = 1$, the vector $w$ becomes unique and normalized. If the matrix $A$ contains inconsistent estimates, as is usually the case in real applications, the vector of weights $w$ is obtained by solving the following equation under the condition $\sum_i w_i = 1$, where $\lambda_{max}$ is the largest eigenvalue of the matrix $A$:

$(A - \lambda_{max} I) w = 0$ (3)
Even if the matrix $A$ is no longer consistent, the fact that all elements of $A$ are positive and $a_{ij} = 1 / a_{ji}$ holds is enough to assure that $\lambda_{max}$ is real and that all components of the corresponding eigenvector are real and positive. In this case, we have $\lambda_{max} > n$, and the difference $\lambda_{max} - n$ is used as a basis for measuring the consistency of the estimates. The consistency of estimates is measured with the consistency index given by the following:

$CI = (\lambda_{max} - n) / (n - 1)$ (4)
Using this index, we calculate the consistency ratio:

$CR = CI / RI$ (5)
where $RI$ is the random index, defined as the average consistency index of randomly generated $n \times n$ pairwise comparison matrices. If $CR < 0.1$ holds, then the estimates of the relative importance of the criteria, and therefore the calculated priorities of the alternatives, are considered acceptable. In the opposite case, the reason why the inconsistency of the estimates is unacceptably high must be investigated.
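For illustration, the calculation above can be sketched in a few lines of Python. The comparison matrix is made up, the principal eigenvector is obtained with a generic eigendecomposition rather than any AHP-specific software, and the random index values follow Saaty’s commonly cited RI table; this is a sketch of the computation, not the tool used in the study.

```python
import numpy as np

# Saaty's random index (RI) for n = 1..10 (commonly cited table values).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_weights(A):
    """Principal eigenvector of a reciprocal pairwise comparison matrix,
    normalized to sum to 1, plus the consistency ratio CR = CI / RI."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # index of the largest eigenvalue
    lam_max = eigvals[k].real
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                             # constraint: weights sum to 1
    CI = (lam_max - n) / (n - 1)                # consistency index, Equation (4)
    CR = CI / RI[n]                             # consistency ratio, Equation (5)
    return w, CR

# Illustrative 3 x 3 reciprocal comparison matrix on Saaty's scale.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, CR = ahp_weights(A)
print("weights:", w.round(3), "CR:", round(CR, 3))  # CR < 0.1 -> acceptable
```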

4.2. DEMATEL and the PageRank Centrality

Figure 1 presents the differences between a decision-making problem that contains criteria without influences between them, and a decision-making problem that contains criteria with influences between them.
In Figure 1, criterion 1 influences criterion 3; therefore, it becomes more important. If 1–4 are policy measures, we can interpret this situation as follows: if the implementation of policy measure 1 influences policy measure 3, then policy measure 1 is more important because, with its implementation, measure 3 will also be supported for implementation; therefore, it makes sense to allocate more resources to 1. Conversely, criterion 4 does not influence any other measure, so its implementation will not support the implementation of any other measure, and in that sense, it becomes less important. However, policy measure 4 is not unimportant. It has its importance and strength in terms of the e-learning plan implementation, which brings us to the conclusion that resource allocation should be implemented by combining the strength of each measure and the intensity of its effect on other measures. While the first part can be implemented by using the AHP, modelling the relations among the measures (affecting) is completed by using the DEMATEL and SNAP.
The DEMATEL is a well-known approach in analyzing networked structures. It starts with creating the square matrix of all elements, which is filled with values 0–4 (0 = no influence, 1 = weak influence, … 4 = very strong influence). The steps of this method are explained in detail in the literature [58,59,60]. Here, it is applied as part of the SNAP method, and the SNAP uses only the first DEMATEL step. The SNAP is a relatively new method for analyzing multiple-criteria decision-making problems. It is based on the combination of the ANP and social network analysis (SNA) centrality measures, more precisely, the PageRank centrality measure [47,50,61,62].
PageRank centrality is a special type of eigenvector centrality. Eigenvector centrality for undirected and unweighted networks is calculated using Equation (6) [63]:

$C_E(i) = \frac{1}{\lambda} \sum_{j \in M(i)} C_E(j) = \frac{1}{\lambda} \sum_{j \in N} a_{ij} C_E(j)$ (6)
where:
  • $M(i)$ is the set of neighbors of actor $i$;
  • $\lambda$ is a constant (the maximum eigenvalue); and
  • $a_{ij}$ is an element of the matrix of neighbors $A$.
PageRank centrality is used for directed networks, and there are variants of this measure for weighted and unweighted graphs. PageRank centrality can be calculated using an iterative procedure [64] or using Equation (7):

$\lim_{k \to \infty} A^k Z_0 = \tilde{A}$ (7)
where:
  • $A$ is the matrix of neighbors;
  • $Z_0$ is a one-column matrix whose elements are all equal to $1/N$; and
  • $\tilde{A}$ is the matrix of priorities.
In addition, PageRank centrality includes the calculation of the matrix $G$ using Equation (8):

$G = \alpha \cdot A + (1 - \alpha) \cdot E$ (8)
where:
  • $A$ is the matrix of neighbors;
  • $\alpha$ is the damping factor, in most cases set to $\alpha = 0.85$ [65]; and
  • $E$ is a square matrix with all values equal to 1 ($\dim E = \dim A$).
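The iterative procedure mentioned above can be sketched as a standard PageRank power iteration. This is a generic illustration under the usual formulation, in which the matrix of neighbors is first made column-stochastic and the teleportation term is scaled by $1/N$ (i.e., $G = \alpha S + \frac{1-\alpha}{N} E$), which differs slightly in scaling from Equation (8) as printed; the network below is made up.

```python
import numpy as np

def pagerank(adj, alpha=0.85, tol=1e-10, max_iter=1000):
    """PageRank by power iteration on the Google matrix.
    adj[i, j] > 0 means an edge from node j to node i (column j distributes its score)."""
    A = np.asarray(adj, dtype=float)
    N = A.shape[0]
    col_sums = A.sum(axis=0)
    col_sums[col_sums == 0] = 1.0                       # avoid division by zero for sink nodes
    S = A / col_sums                                    # column-stochastic matrix
    G = alpha * S + (1 - alpha) / N * np.ones((N, N))   # damped "Google" matrix
    z = np.full(N, 1.0 / N)                             # Z0: uniform starting vector
    for _ in range(max_iter):
        z_next = G @ z
        if np.abs(z_next - z).sum() < tol:
            break
        z = z_next
    return z_next / z_next.sum()

# Illustrative directed network of four elements (not the study's data).
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(pagerank(adj).round(3))
```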
As already mentioned, the SNAP is a relatively new method, developed over the last five years, and it eventually evolved into a version that uses PageRank centrality. The steps of the SNAP method are presented as follows (a computational sketch is given after the list):
  • Conducting the AHP;
  • Creating a matrix of influences between the criteria as a starting point. The influences are evaluated by decision makers using the DEMATEL scale (0–4) (matrix $D$);
  • Dividing each value in the previous matrix by the maximum column sum increased by 1 (matrix $A$);
  • Calculating the matrix $I - A$;
  • Calculating the inverse of the matrix $I - A$;
  • Calculating $\tilde{A}$ using Equation (9):
    $\tilde{A} = \sum_{k} A^k = A \cdot (I - A)^{-1}$ (9)
  • Calculating the sum of rows (ΣR) and the sum of columns (ΣC) of $\tilde{A}$ and their difference, $d$. The differences are then shifted by a normalization value $n$, equal to the absolute difference between the highest (H) and the lowest (L) value of $d$; once $n$ is added to the differences ($d + n$), all values are positive, and the criteria weights can be calculated using normalization by sum;
  • Combining the AHP results with the PageRank procedure using the arithmetic or geometric mean.
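The listed steps can be condensed into the following Python sketch. The DEMATEL matrix, the AHP weights, and the function name are illustrative placeholders rather than the values or the spreadsheet formulas used at MNU.

```python
import numpy as np

def snap_priorities(D, ahp_weights, mean="geometric"):
    """SNAP sketch following the steps above: normalize the DEMATEL matrix,
    compute the total-influence matrix A_tilde = A (I - A)^-1, turn the
    row/column sums into positive weights, and combine them with the AHP weights."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    A = D / (D.sum(axis=0).max() + 1)                 # divide by (max column sum + 1)
    A_tilde = A @ np.linalg.inv(np.eye(n) - A)        # Equation (9): A (I - A)^-1
    d = A_tilde.sum(axis=1) - A_tilde.sum(axis=0)     # row sums minus column sums
    shift = abs(d.max() - d.min())                    # normalization value |H - L|
    p = (d + shift) / (d + shift).sum()               # positive values, normalized by sum
    w = np.asarray(ahp_weights, dtype=float)
    if mean == "arithmetic":
        final = (w + p) / 2.0
    else:
        final = np.sqrt(w * p)                        # element-wise geometric mean
    return p, final / final.sum()

# Illustrative 4-element example (DEMATEL scale 0-4 influences, zero diagonal).
D = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [0, 2, 1, 0]])
ahp = np.array([0.40, 0.25, 0.20, 0.15])
p, final = snap_priorities(D, ahp)
print("influence priorities p:", p.round(3))
print("SNAP priorities:", final.round(3))
```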

5. Research Methodology

The main research question in this study is related to identifying the priorities of e-learning policy measures in the e-learning roadmap of MNU for resource allocation purposes.
The steps of the methodology are presented in Figure 2.
  • In the first step, the lead investigators in this research implemented the focus group to select the most appropriate method for resource allocation related to e-learning policy measures. When deciding on the appropriate method, five main criteria had to be considered: (A) can the method model the strength of the e-learning policy, (B) can the method model the effects of policy measures on other policy measures, (C) how is the method accepted in scientific and professional areas (is the method widely accepted and proven as a significant method for resource allocation purposes), (D) the complexity of providing inputs needed for the application of a decision-making method, and (E) the duration of the process of providing inputs. The first three criteria are method-oriented, and the last two criteria are decision-maker-oriented. The primary goal was to select the most appropriate method which can successfully complete the resource allocation priorities, and which is not too complex in terms of providing inputs, since the participants (decision makers) are not experts in decision-making method application, even if they are experts in their respective fields. So, a significant emphasis was placed on the behavioral aspects of the problem. This step resulted in the selection of the AHP and SNAP analysis of the resource allocation, since this combination enables the modelling of both aspects (strength of the policy and effects on other policies), and the methods are widely applied (the AHP is the most often used multi-criteria decision-making method, while the SNAP is both theoretically and practically proven to be much more successful in modeling influences between the criteria than the ANP). Additionally, the complexity of providing inputs is much lower than in other network methods, and the duration of providing inputs is low when compared to other network methods.
  • In the second step, preparations were made for the data collection procedure. Firstly, this included making a presentation on the AHP and SNAP for the participants of the research, with an emphasis on the inputs needed for the implementation of these methods. The inputs for the AHP are related to Saaty’s scale and pairwise comparison procedure. The inputs for the SNAP are related to the DEMATEL scale. It was of the utmost importance to present both methods in a clear and understandable way so that the participants could easily use them during the data collection procedure. Secondly, the data collection forms had to be created. It was planned that the data collection procedure would be completed through an online meeting, so Google Sheets documents were prepared for each method (AHP and SNAP) with a separate sheet for each decision maker, while the main sheet in each document was created for purposes such as (A) monitoring the results of data collection in real time, (B) assisting in the data collection procedure when participants experienced problems, and (C) a discussion after all the data were collected. The data collection forms are attached to this paper as Appendix A and Appendix B. The planned duration of data collecting was 6 h over 2 days, three hours per day.
  • In the third step, the data were collected at a virtual event organized as a part of the AMED project. In the policy measure evaluation, there were 23 participants: persons with managerial functions at MNU (deans, heads of departments) and persons in lecturer and researcher positions. Group decision-making in the AHP was implemented at the level of the pairwise comparison tables, which were integrated into the group table using the geometric mean; in the SNAP, group decision-making was implemented at the level of the DEMATEL tables, where the input data on the effects between the policy measures were integrated into the group DEMATEL table using the arithmetic mean (a sketch of both aggregation rules is given after this list). In the AHP, it is important to track the consistency ratio; the data collection form included this calculation, which made achieving consistent tables easier.
  • In the final step, the reports on the AHP and SNAP analysis were automatically created on the main sheets of each AHP or SNAP document. The results are presented in the following section.
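As a brief illustration of the two aggregation rules mentioned in the third step, the sketch below combines made-up participant inputs: the AHP pairwise comparison tables are aggregated with the element-wise geometric mean (which preserves reciprocity), and the DEMATEL influence tables with the element-wise arithmetic mean.

```python
import numpy as np

# Made-up inputs from three participants (not the MNU data).
# AHP: reciprocal pairwise comparison matrices on Saaty's scale.
ahp_tables = [np.array([[1, 3],   [1/3, 1]]),
              np.array([[1, 5],   [1/5, 1]]),
              np.array([[1, 1/2], [2,   1]])]

# DEMATEL: influence matrices on the 0-4 scale (zero diagonal).
dematel_tables = [np.array([[0, 2], [3, 0]]),
                  np.array([[0, 4], [1, 0]]),
                  np.array([[0, 3], [2, 0]])]

# Element-wise geometric mean preserves the reciprocity a_ij = 1 / a_ji.
group_ahp = np.prod(np.stack(ahp_tables), axis=0) ** (1 / len(ahp_tables))

# Element-wise arithmetic mean of the DEMATEL assessments.
group_dematel = np.mean(np.stack(dematel_tables), axis=0)

print(group_ahp.round(3))
print(group_dematel.round(3))
```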
Figure 3 presents the structure of the participants in the research with respect to their function at MNU. Most of the participants were deans (8), followed by lecturers (6), heads of departments (4), deputy vice-chancellors (3), a quality control manager (1), and a vice-chancellor (1). In addition, 17 out of 23 participants were in a leading position in the organizational structure. This is very important because the decision that had to be made was very strategic since MNU was at that time at a low level of e-learning adoption, and there was a great need for a high level of e-learning application (due to the geography of the Maldives, which results in high infrastructural, travel, and connectivity costs for MNU and their employees and students). Figure 4 presents the hierarchical structure of the AHP model related to the prioritization of e-learning policy measures. The model is drawn from left to right, representing the top-down hierarchy. At the top of the hierarchy, there is a goal which is here defined as a question: Which e-learning policy measure is the most important to achieve a high level of e-learning application in MNU? The question form is selected due to behavioral reasons for filling the pairwise comparison table, which requires answering 21 pairwise comparison questions similar to the main goal question (see Appendix A).

6. Results

In this section, the results of the research are presented. The multi-criteria methods that could be used for resource allocation purposes are compared in Table 2 with respect to the selected criteria. The ANP does not enable the calculation of the strength of the element in the system, and for the participants, it has the high complexity of providing inputs and the high duration of the process. On the other hand, the AHP does not model influences between the elements in the structure, even though it has acceptable complexity and duration. The DEMATEL and ISM do not support the calculation of the strengths of the elements. The ELECTRE, PROMETHEE, VIKOR, and TOPSIS do not model the effects among the criteria in the decision-making problem structure. In addition, since we do not have a full problem structure (goal–criteria–alternatives), but only a goal and alternatives, the calculation of the element strength is difficult. A similar conclusion can be drawn in the case of the DEA. Finally, the SNAP enables the calculation of the strengths of the elements and models the influences between the elements. Additionally, its complexity is acceptable, and the participants evaluated that they were able to provide the inputs needed for the analysis.
In Figure 5, we can see the AHP group results: which policy measure the participants consider the most important for the overall goal (a high level of e-learning application in the teaching processes at MNU). The participants evaluated the identification of the necessary infrastructure as the strongest policy measure. They believe that the implementation of this policy measure will be the most important contribution to the main goal of achieving a high level of e-learning application at MNU.
The individual opinions on the policy measure priorities were not unanimous; some participants agreed more and some less. An additional analysis showed that participants from the same group did not agree either: for example, lecturers (or deans) did not all have the same or similar priorities. A reason for this can be found in the fact that they come from different parts of MNU (different locations) and from different fields of expertise; therefore, they see this issue differently, from their own perspective. An advantage of the AHP is the possibility to aggregate different opinions using the geometric mean; if the opinions of some persons or groups of persons are to be given different importance in the decision, the aggregation with the geometric mean can take this into account. In our case, it was decided at the very beginning that all the participants have the same weight in terms of influence on the final decision, but the importance of each participant group entered the model through the number of participants in the group, as presented in Figure 3. This is the reason why a deeper analysis of the AHP priority results was not needed in our case.
Similar conclusions on the most important factors or measures in e-learning implementation were also made in other papers (mentioned earlier); but, is the infrastructure really the most important, and must MNU invest most in this policy measure? We will answer this question after presenting the results of the SNAP method (which combines the DEMATEL, PageRank, and the AHP).
As mentioned earlier, the SNAP analysis is needed in this problem because the group who worked on the project and participated in policy measure prioritization believed that resource allocation cannot be made considering policy measure priorities only with respect to the goal, but also considering effects and influences among the policy measures. The SNAP combines the AHP results with the PageRank results, so both components are considered as desired by the decision makers.
In Table 3, the group influences among the policy measures which were calculated as the arithmetic means of participants’ assessments using the DEMATEL scale are presented. To collect the data on the influences between the policy measures, the data collection form attached in Appendix B was used. In addition, we calculated the sums of columns and rows. The sums of the rows (the last column in Table 3) represent the total influences of a certain policy measure on the others. On the other hand, the sums of the columns (the last row in Table 3) represent the total influences that other policy measures have on a certain policy measure. This analysis corresponds to incoming and outgoing centrality degree measures from the SNA [66,67,68]. If we focus only on policy measure 6 (which has the highest priority in the AHP analysis), we can see that, in this analysis, it had the second highest priority in terms of influencing other policy measures (outgoing centrality degree), but also the third highest priority in terms of being influenced by other policy measures (incoming centrality degree). In addition, the difference between those two numbers was negative, which means this measure was more influenced by others than it influenced others. This suggests that it will not have the highest priority in the analysis related to the between-measures influences (effects). Similar analyses can be conducted with the other measures in the decision-making problem. In addition, it is mandatory to mention that the analysis related to centrality degree only covers the direct influences between the measures. However, the indirect effects (intermediate effects) must also be included in the analysis, which is the reason for applying the other steps of the SNAP (and not only step 1 and step 6).
Step 6 is presented in Table 4 and Step 7 in Table 5. The other steps are not presented here, but they can be calculated from Table 3 by following the steps of the SNAP method described above.
In the last column of Table 4, we can see the priorities of policy measures with respect to the influences between them. The highest priority was associated with policy measure 3, which is related to advisory services and founding the Centre for e-Learning (CETE). In the discussion of the analysis results, decision makers also agreed that this policy measure will mostly affect other measures. Without the knowledge and advice from the CETE, all the equipment cannot be efficiently used in its full capacity; knowledge empowers all other efforts that are to be undertaken in all other policy measures. In this analysis, policy measure 6 (related to infrastructure) was in 5th place.
In Table 5, we can see the priorities obtained using the AHP (importance with respect to the goal), the DEMATEL-PageRank analysis (vector p, importance with respect to the influences among the policy measures), and the final SNAP results. The SNAP (a) is associated with the arithmetic mean, and the SNAP (g) with the geometric mean in terms of the integration of the AHP and p priorities. The differences between the SNAP (a) and the SNAP (g) appear only in the third decimal place, so we consider the results reliable regardless of which aggregation mean is used. Additionally, a qualitative analysis with the respondents resulted in the conclusion that the priorities obtained by using the SNAP are relevant in terms of resource allocation.
If we observe the rank analysis of the results (Table 6) and calculate the Spearman rank correlation coefficient, we can identify a strong negative correlation between the AHP and the SNAP ($r = -0.61$) and a weak positive correlation between p and the SNAP ($r = 0.14$). However, in terms of resource allocation, relative priorities are much more relevant than ranks, since priorities can easily be transformed into concrete budget allocations for each policy measure. The Pearson correlation coefficient values are medium ($r = 0.41$ in the case of AHP-SNAP and $r = 0.24$ in the case of p-SNAP).
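The correlation check can be reproduced with a few lines; the priority vectors below are placeholders rather than the published values from Table 5 and Table 6.

```python
import numpy as np
from scipy.stats import spearmanr, pearsonr

# Placeholder priority vectors for seven policy measures (not the published values).
ahp  = np.array([0.12, 0.10, 0.15, 0.08, 0.14, 0.25, 0.16])
p    = np.array([0.14, 0.13, 0.21, 0.12, 0.15, 0.13, 0.12])
snap = np.sqrt(ahp * p)
snap = snap / snap.sum()                      # geometric-mean SNAP aggregation

rho_ahp, _ = spearmanr(ahp, snap)             # rank correlation of priorities
rho_p, _   = spearmanr(p, snap)
r_ahp, _   = pearsonr(ahp, snap)              # linear correlation of priorities
r_p, _     = pearsonr(p, snap)
print(round(rho_ahp, 2), round(rho_p, 2), round(r_ahp, 2), round(r_p, 2))
```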
The final discussion on the SNAP priorities ended with the conclusion that participants believe that the SNAP priorities best present the resource allocation needs, and that the total budget is to be divided respecting the SNAP priorities.

7. Discussion

Resource allocation is a very important and delicate issue in managing activities in every organization. At the beginning of the AMED project, MNU was at a low level of e-learning implementation but showed the willingness and expressed the need to apply e-learning at a high level of implementation. The many different locations and students coming from different islands of the Maldives created high costs of learning and teaching processes for both MNU and the students. For this reason, MNU decided to invest in and increase the level of e-learning implementation. Within the scope of the AMED project, the e-learning roadmap and policy measures were created. Their implementation should result in lowering the teaching and learning costs.
The limited amount of money available for achieving the high level of e-learning implementation at MNU must be optimally allocated to seven identified policy measures. Therefore, we set up two main research questions in this study:
  • Which multi-criteria decision-making method is most suitable for the resource allocation of the MNU e-learning policy measures?
  • What are the MNU e-learning policy measure priorities for resource allocation?
    (a)
    What are the MNU e-learning policy measure priorities in terms of the importance of policy measures for MNU?
    (b)
    What are the MNU e-learning policy measure priorities in terms of the importance of affecting other policy measures?
The first research question is answered in the analysis presented in Table 2, which is the result of focus groups of researchers and participants. For the purpose of resource allocation in the case of MNU policy measures, the most suitable method was the SNAP.
The second research question was answered by applying the SNAP method in the case of e-learning policy measures. The idea was to obtain the resource allocation priorities of policy measures so that the priorities were calculated considering the following two components: (1) the importance (strength) of policy measures for the main goal of e-learning implementation, and (2) the importance of policy measures with respect to the influences between them. The first part was implemented using the AHP, and the second part was implemented using the DEMATEL analysis in combination with PageRank centrality. The results were integrated using the arithmetic (geometric) mean, and this hybrid combination of multi-criteria decision-making methods is also known as the SNAP method. The final results are presented in Figure 6.
After the final priorities were obtained, the results were additionally discussed among the participants through focus groups. The ranks of policy measures with respect to their importance for the goal (AHP priorities) matched the ranks of policy measures by the respondents obtained by using direct assessment. The absolute differences of e-learning priorities were additionally discussed, and the participants agreed on them. Similar conclusions were derived for priorities with respect to the effect on other policies and the final priorities. The participants agreed that the SNAP priorities match the way in which the e-learning budget should be allocated to different e-learning policies.

8. Conclusions

The goal of this paper was to obtain the priorities of policy measures in the e-learning roadmap at MNU. Resource allocation related to the strategic documents is a decision of the highest level (strategic decision). Strategic decisions should be made at the highest level in the organizational structure, and that was the case in this study. The participants in the decision-making process were people with responsible roles at the university.
In the present study, we can see that there were differences between the hierarchical and networked approach, as well as that we should be careful in selecting methods to be used in the analysis. If the problem is characterized by the existence of influences (dependencies, effects) between the elements of a decision-making problem, it is opportune to use networked approaches. However, the networked approach must consider the strength of each element with respect to the goal, and not only model the influences between the elements. The SNAP approach is the one that enables both aspects.
After the process of resource allocation was completed by using the SNAP approach, the decision makers made official decisions on resource allocation, but due to the sensitivity of the data, we cannot present the details of the final decisions.
Since the study was implemented during the AMED project, one of the aspects that was tracked during the project was the satisfaction of users (MNU staff) with the project activities. Activities were evaluated with the highest grades, which can lead us to the conclusion that learning the SNAP method and its real-world application was very useful for MNU.

Author Contributions

Conceptualization, N.K., N.B.R. and D.K.; methodology, N.K., N.B.R. and D.K.; software N.K.; validation, N.K., N.B.R. and D.K.; investigation, N.K., N.B.R. and D.K.; resources, N.K., N.B.R. and D.K.; data curation, N.K., N.B.R. and D.K.; writing—original draft preparation, N.K., N.B.R. and D.K.; writing—review and editing, N.K.; visualization, N.K., N.B.R. and D.K.; supervision, N.K., N.B.R. and D.K.; project administration, N.K., N.B.R. and D.K.; funding acquisition, N.K., N.B.R. and D.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data are available upon request to the authors.

Acknowledgments

We would like to thank the participants from MNU for participating in the project AMED and the research. This work was supported by the Croatian Science Foundation under the project IP-2020-02-5071.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

The data collection form for the AHP priorities was implemented as a Google Sheets document with 24 sheets: 23 of them were associated with the individual participants, and the last one was a dashboard which aggregated the results of all the worksheets and presented the results used in this paper (Figure 5).
Instructions: Compare the policy measures in pairs. Write an “x” below the chosen Saaty value in each pair; only one “x” per row is allowed. Please pay attention to the CR value: at the end, it should be below 0.1.
Policy Measure Priorities (as displayed in the form; CR = 0.00):
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines: 0.1429
2. Establishing Legal Framework for e-Learning: 0.1429
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE): 0.1429
4. Implementing Research Projects in the Area of e-Learning: 0.1429
5. Establishing Quality Assurance Framework for e-Learning: 0.1429
6. Identification of the Necessary Infrastructure for e-Learning: 0.1429
7. Improving Student Support Services for e-Learning: 0.1429
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines987654321234567892. Establishing Legal Framework for e-Learning
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines987654321234567893. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE)
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines987654321234567894. Implementing Research Projects in the Area of e-Learning
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines987654321234567895. Establishing Quality Assurance Framework for e-Learning
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines987654321234567896. Identification of the Necessary Infrastructure for e-Learning
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines987654321234567897. Improving Student Support Services for e-learning
2. Establishing Legal Framework for e-Learning987654321234567893. Establishing University Advisory Services for Lecturers through the Centre for e-learning (CETE)
2. Establishing Legal Framework for e-learning987654321234567894. Implementing Research Projects in the Area of e-Learning
2. Establishing Legal Framework for e-Learning987654321234567895. Establishing Quality Assurance Framework for e-Learning
2. Establishing Legal Framework for e-Learning987654321234567896. Identification of the Necessary Infrastructure for e-Learning
2. Establishing Legal Framework for e-Learning987654321234567897. Improving Student Support Services for e-Learning
3. Establishing University Advisory Services for Lecturers through the Centre for e-learning (CETE)987654321234567894. Implementing Research Projects in the Area of e-Learning
3. Establishing University Advisory Services for Lecturers through the Centre For e-Learning (CETE)987654321234567895. Establishing Quality Assurance Framework for e-Learning
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (ETE)987654321234567896. Identification of the Necessary Infrastructure for e-Learning
3. Establishing University Advisory Services for Lecturers through the Centre for e-learning (CETE)987654321234567897. Improving Student Support Services for e-Learning
4. Implementing Research Projects in the Area of e-Learning 987654321234567895. Establishing Quality Assurance Framework for e-Learning
4. Implementing Research Projects in the Area of e-Learning 987654321234567896. Identification of the Necessary Infrastructure for e-Learning
4. Implementing Research Projects in the Area of e-Learning 987654321234567897. Improving Student Support Services for e-Learning
5. Establishing Quality Assurance Framework for e-Learning987654321234567896. Identification of the Necessary Infrastructure for e-Learning
5. Establishing Quality Assurance Framework Ffor e-Learning987654321234567897. Improving Student Support Services for e-Learning
6. Identification of the Necessary Infrastructure for e-Learning987654321234567897. Improving Student Support Services for e-Learning
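To illustrate how the judgments collected with this form are processed, the following is a minimal computational sketch of the standard eigenvector-based AHP procedure (principal-eigenvector priorities and the consistency ratio, using Saaty’s random index RI = 1.32 for seven elements). The 7 × 7 matrix in the sketch is an illustrative placeholder in which all measures are judged equally important; it reproduces the initial values of the empty form (priorities of 0.1429 and CR = 0) and is not a participant’s actual judgment matrix.

```python
import numpy as np

def ahp_priorities(A, random_index=1.32):
    """Principal-eigenvector priorities and consistency ratio of a pairwise matrix."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # index of the principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                        # normalized priority vector
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    return w, ci / random_index            # CR should stay below 0.1

# Placeholder matrix: every pair judged "equal importance" (Saaty value 1)
A = np.ones((7, 7))
w, cr = ahp_priorities(A)
print(np.round(w, 4), cr)                  # seven priorities of 0.1429 each; CR ≈ 0
```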

Appendix B

The data collection form for the DEMATEL-PageRank analysis was implemented in a similar way to the AHP form, on 24 sheets: 23 of them were related to the participants in the decision-making process, and the last one was implemented as a dashboard where the group results appeared (Table 3, Table 4, Table 5 and Table 6 and Figure 6).
Give your opinion on the influences between the elements of the e-Learning Roadmap…
0 = no influence from element X to Y
1 = low influence from element X to Y
2 = medium influence from element X to Y
3 = strong influence from element X to Y
4 = very strong influence from element X to Y
Columns 1–7 correspond to the seven policy measures listed below; the diagonal entries of the influence matrix are fixed at 0, since a measure does not influence itself:
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines
2. Establishing Legal Framework for e-Learning
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE)
4. Implementing Research Projects in the Area of e-Learning
5. Establishing Quality Assurance Framework for e-Learning
6. Identification of the Necessary Infrastructure for e-Learning
7. Improving Student Support Services for e-Learning
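Each completed sheet therefore yields a 7 × 7 direct-influence matrix with entries from 0 to 4 and zeros on the diagonal. The group matrix reported in Table 3 is consistent with a simple arithmetic mean over the participants’ matrices; the sketch below illustrates that aggregation under this assumption, using randomly generated placeholder ratings rather than the study data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder ratings for a few participants (0-4 scale, zero diagonal);
# the study collected 23 such sheets, which are not reproduced here.
participant_matrices = []
for _ in range(3):
    m = rng.integers(0, 5, size=(7, 7)).astype(float)
    np.fill_diagonal(m, 0.0)               # a measure does not influence itself
    participant_matrices.append(m)

# Group direct-influence matrix (assumed here to be the element-wise mean, cf. Table 3)
group_direct = np.mean(participant_matrices, axis=0)
print(np.round(group_direct, 2))
```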

References

  1. Saaty, T.L. Fundamentals of the analytic network process—Dependence and feedback in decision-making with a single network. J. Syst. Sci. Syst. Eng. 2004, 13, 129–157. [Google Scholar] [CrossRef]
  2. Kadoić, N. Characteristics of the Analytic Network Process, a Multi-Criteria Decision-Making Method. Croat. Oper. Res. Rev. 2018, 9, 235–244. [Google Scholar] [CrossRef]
  3. Project AMED—Advancing Higher Education in Maldives through e-Learning Development. Available online: https://amed-project.eu/en/about-project (accessed on 15 October 2023).
  4. Maldives National University. Strategic Plan of the Maldives National University 2013–2017; Maldives National University: Malé, Maldives, 2013. [Google Scholar]
  5. Maldives National University. Strategic Plan of the Maldives National University 2020–2025; Maldives National University: Malé, Maldives, 2020. [Google Scholar]
  6. Maldives National University. Strategic Plan of the Maldives National University 2018–2022; Maldives National University: Malé, Maldives, 2018. [Google Scholar]
  7. ActKnowledge. Theory of Change. Available online: www.theoryofchange.org (accessed on 15 October 2023).
  8. Pawson, R. Nothing as Practical as a Good Theory. Evaluation 2003, 9, 471–490. [Google Scholar] [CrossRef]
  9. World Population Review. Available online: https://worldpopulationreview.com (accessed on 15 October 2023).
  10. Hodges, C.B.; Moore, S.; Lockee, B.B.; Trust, T.; Bond, M.A. The Difference Between Emergency Remote Teaching and Online Learning. Educause. 2020. Available online: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning (accessed on 15 October 2023).
  11. Teräs, M.; Suoranta, J.; Teräs, H.; Curcher, M. Post-COVID-19 Education and Education Technology ‘Solutionism’: A Seller’s Market. Postdigital Sci. Educ. 2020, 2, 863–878. [Google Scholar] [CrossRef]
  12. European Commission. Digital Education Action Plan 2021–2027 Resetting Education and Training for the Digital Age. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions. 2020. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52020DC0624&from=EN (accessed on 15 October 2023).
  13. Kupres, D.; Lanzo, C.; Morer, S. Digitally Mature Education: The Role of Digital Maturity and Educational Leadership in Meaningful Digital Transformation of Schools. In Proceedings of the Central European Conference on Information and Intelligent Systems; University of Zagreb Faculty of Organization and Informatics: Dubrovnik, Croatia, 2022; pp. 287–293. [Google Scholar]
  14. Solar, M.; Sabattin, J.; Parada, V. A maturity model for assessing the use of ICT in school education. Educ. Technol. Soc. 2013, 16, 206–218. [Google Scholar]
  15. Anderson, T. Teaching in an online learning context. In Theory and Practice of Online Learning; Athabasca University Press: Athabasca, AB, Canada, 2004. [Google Scholar] [CrossRef]
  16. Peters, M.A.; Jandrić, P. Dewey’s Democracy and Education in the age of digital reason: The global, ecological and digital turns. Open Rev. Educ. Res. 2017, 4, 205–218. [Google Scholar] [CrossRef]
  17. Azhar, N.A.; Radzi, N.A.M.; Ahmad, W.S.H.M.W. Multi-criteria Decision Making: A Systematic Review. Recent Adv. Electr. Electron. Eng. 2021, 14, 779–801. [Google Scholar] [CrossRef]
  18. Wang, C.-N.; Yang, F.-C.; Vo, T.M.N.; Nguyen, V.T.T.; Singh, M. Enhancing Efficiency and Cost-Effectiveness: A Groundbreaking Bi-Algorithm MCDM Approach. Appl. Sci. 2023, 13, 9105. [Google Scholar] [CrossRef]
  19. Wang, C.-N.; Yang, F.-C.; Vo, N.T.M.; Nguyen, V.T.T. Enhancing Lithium-Ion Battery Manufacturing Efficiency: A Comparative Analysis Using DEA Malmquist and Epsilon-Based Measures. Batteries 2023, 9, 317. [Google Scholar] [CrossRef]
  20. Malik, D.A.A.; Yusof, Y.; Khalif, K.M.N.K. A view of MCDM application in education. J. Phys. Conf. Ser. 2021, 1988, 012063. [Google Scholar] [CrossRef]
  21. Adem, A.; Çakıt, E.; Dağdeviren, M. Selection of suitable distance education platforms based on human–computer interaction criteria under fuzzy environment. Neural Comput. Appl. 2022, 34, 7919–7931. [Google Scholar] [CrossRef] [PubMed]
  22. Atıcı, U.; Adem, A.; Şenol, M.B.; Dağdeviren, M. A comprehensive decision framework with interval valued type-2 fuzzy AHP for evaluating all critical success factors of e-learning platforms. Educ. Inf. Technol. 2022, 27, 5989–6014. [Google Scholar] [CrossRef] [PubMed]
  23. Chowdhury, P.; Paul, S.K. Applications of MCDM methods in research on corporate sustainability. Manag. Environ. Qual. Int. J. 2020, 31, 385–405. [Google Scholar] [CrossRef]
  24. de Souza, D.G.B.; dos Santos, E.A.; Soma, N.Y.; da Silva, C.E.S. MCDM-Based R&D Project Selection: A Systematic Literature Review. Sustainability 2021, 13, 11626. [Google Scholar] [CrossRef]
  25. Chejarla, K.C.; Vaidya, O.S.; Kumar, S. MCDM applications in logistics performance evaluation: A literature review. J. Multi-Criteria Decis. Anal. 2021, 29, 274–297. [Google Scholar] [CrossRef]
  26. Makki, A.A.; Alqahtani, A.Y.; Abdulaal, R.M.S.; Madbouly, A.I. A Novel Strategic Approach to Evaluating Higher Education Quality Standards in University Colleges Using Multi-Criteria Decision-Making. Educ. Sci. 2023, 13, 577. [Google Scholar] [CrossRef]
  27. Ajripour, I.; Hanne, T. Using the Fuzzy Best Worst Method for Evaluating Strategic Planning Models. Processes 2023, 11, 1284. [Google Scholar] [CrossRef]
  28. Lee, C.W.; Kwak, N.K. Strategic Enterprise Resource Planning in a Health-Care System Using a Multicriteria Decision-Making Model. J. Med. Syst. 2009, 35, 265–275. [Google Scholar] [CrossRef]
  29. Kangas, J.; Kangas, A.; Leskinen, P.; Pykäläinen, J. MCDM methods in strategic planning of forestry on state-owned lands in Finland: Applications and experiences. J. Multi-Criteria Decis. Anal. 2001, 10, 257–271. [Google Scholar] [CrossRef]
  30. Muhammad, A.; Shaikh, A.; Naveed, Q.N.; Qureshi, M.R.N. Factors Affecting Academic Integrity in E-Learning of Saudi Arabian Universities. An Investigation Using Delphi and AHP. IEEE Access 2020, 8, 16259–16268. [Google Scholar] [CrossRef]
  31. Alqahtani, A.Y.; Rajkhan, A.A. E-Learning Critical Success Factors during the COVID-19 Pandemic: A Comprehensive Analysis of E-Learning Managerial Perspectives. Educ. Sci. 2020, 10, 216. [Google Scholar] [CrossRef]
  32. Naveed, Q.N.; Qureshi, M.R.N.; Tairan, N.; Mohammad, A.; Shaikh, A.; Alsayed, A.O.; Shah, A.; Alotaibi, F.M. Evaluating critical success factors in implementing E-learning system using multi-criteria decision-making. PLoS ONE 2020, 15, e0231465. [Google Scholar] [CrossRef] [PubMed]
  33. Siew, L.W.; Hoe, L.W.; Fai, L.K.; Bakar, M.A.; Xian, S.J. Analysis on the e-Learning Method in Malaysia with AHP-VIKOR Model. Int. J. Inf. Educ. Technol. 2021, 11, 52–58. [Google Scholar] [CrossRef]
  34. Naveed, Q.N.; Qahmash, A.I.; Al-Razgan, M.; Qureshi, K.M.; Qureshi, M.R.N.M.; Alwan, A.A. Evaluating and Prioritizing Barriers for Sustainable E-Learning Using Analytic Hierarchy Process-Group Decision Making. Sustainability 2022, 14, 8973. [Google Scholar] [CrossRef]
  35. De Castro-Pardo, M.; De la Fuente-Cabrero, C.; Laguna-Sanchez, P.; Perez-Rodriguez, F. Combining ahp and goal programming in the context of the assessment of e-learning. Int. J. Anal. Hierarchy Process. 2019, 11, 301–312. [Google Scholar] [CrossRef]
  36. Al Nawaiseh, A.J. Evaluating Software Quality in E-Learning System by Using the Analytical Hierarchy Process (AHP) Approach. In The Effect of Information Technology on Business and Marketing Intelligence Systems; Springer International Publishing: Cham, Switzerland, 2023; pp. 365–387. [Google Scholar] [CrossRef]
  37. Sun, J.; Fu, L.; Liu, J.; Wu, J.; Chen, Y. A Learning Efficiency Evaluation Model for E-Learning Platforms Based on Analytic Hierarchy Process (AHP). In Proceedings of the 2021 33rd Chinese Control and Decision Conference (CCDC), Kunming, China, 22–24 May 2021; pp. 2237–2242. [Google Scholar] [CrossRef]
  38. Wang, C.S.; Lin, S.L. How instructors evaluate an e-learning system? An evaluation model combining fuzzy AHP with association rule mining. J. Internet Technol. 2019, 20, 1947–1959. [Google Scholar] [CrossRef]
  39. Priska, M.A.; Aulia, D.; Muslim, E.; Marcelina, L. Developing a Framework to Evaluate E-learning System at Higher Education in Indonesia. In Proceedings of the 2020 the 4th International Conference on Education and E-Learning, Yamanashi, Japan, 6–8 November 2020; ACM: New York, NY, USA, 2020; pp. 27–32. [Google Scholar] [CrossRef]
  40. Toan, P.N.; Dang, T.-T.; Hong, L.T.T. E-Learning Platform Assessment and Selection Using Two-Stage Multi-Criteria Decision-Making Approach with Grey Theory: A Case Study in Vietnam. Mathematics 2021, 9, 3136. [Google Scholar] [CrossRef]
  41. Çelikbilek, Y.; Tüylü, A.N.A. Prioritizing the components of e-learning systems by using fuzzy DEMATEL and ANP. Interact. Learn. Environ. 2019, 30, 322–343. [Google Scholar] [CrossRef]
  42. Lee, H.-I.; Chiu, S.-H.; Chen, X.-N.; Liao, X.-Z.; Lin, T.-Y. What Are the Key Barriers for the Course Comprehension of Chinese Students in Lecture of French Higher Education? In Proceedings of the 2020 11th International Conference on E-Education, E-Business, E-Management, and E-Learning, Osaka, Japan, 10–12 January 2020; ACM: New York, NY, USA, 2020; pp. 125–129. [Google Scholar] [CrossRef]
  43. Jeong, J.S.; González-Gómez, D. Adapting to PSTs’ Pedagogical Changes in Sustainable Mathematics Education through Flipped E-Learning: Ranking Its Criteria with MCDA/F-DEMATEL. Mathematics 2020, 8, 858. [Google Scholar] [CrossRef]
  44. Ghassami, F.; Shobeiri, S.M.; Larijani, M.; Rad, S.F. Choosing the most appropriate method of teaching sustainable development using hybrid algorithm of DEMATEL-ANP and TOPSIS in fuzzy approach (A case study of technical and vocational schools). J. Environ. Sci. Technol. 2019, 32, 107–123. [Google Scholar] [CrossRef]
  45. Hossain, G.M.S.; Huang, W.; Kaium, M.A. Evaluating Critical Success Factors for Adoption Decision of e-Learning Facilities in Bangladesh by Using DEMATEL Approach. Int. J. e-Educ. e-Bus. e-Manag. e-Learn. 2020, 10, 182–204. [Google Scholar] [CrossRef]
  46. Mehta, K.; Sharma, R. Prioritizing the Critical Success Factors of E-Learning Systems by Using DEMATEL. In Redefining Virtual Teaching Learning Pedagogy; Wiley: Hoboken, NJ, USA, 2023; pp. 401–420. [Google Scholar] [CrossRef]
  47. Kadoić, N. Nova Metoda za Analizu Složenih Problema Odlučivanja Temeljena na Analitičkom Mrežnom Procesu i Analizi Društvenih Mreža [A New Method for Analysing Complex Decision-Making Problems Based on the Analytic Network Process and Social Network Analysis]. Doctoral Dissertation, University of Zagreb, Zagreb, Croatia, 2018. [Google Scholar]
  48. Đurek, V.; Kadoić, N.; Dobrović, Ž. Digital Maturity of Higher Education Institution: A meta model of the Analytical Network Process (ANP) and Decision Expert (DEX). In Proceedings of Central European Conference on Information and Intelligent Systems 2018; Strahonja, V., Kirinić, V., Eds.; Fakultet Organizacije i Informatike: Varaždin, Croatia, 2018; pp. 223–230. [Google Scholar]
  49. Kadoic, N.; Redep, N.B. Ranking the balanced scorecard goals of higher education institutions using the centrality measures. In Proceedings of the 11th International Conference on Education and New Learning Technologies, Palma, Spain, 1–3 July 2019; pp. 7366–7373. [Google Scholar] [CrossRef]
  50. Kadoić, N.; Redep, N.B.; Divjak, B. Application of PageRank centrality in multi-criteria decision making. In Proceedings of the 15th International Symposium on Operational Research SOR 2019, Bled, Slovenia, 25–27 September 2019; pp. 54–59. [Google Scholar]
  51. Maček, D.; Magdalenić, I.; Ređep, N.B. A Model for the Evaluation of Critical IT Systems Using Multicriteria Decision-Making with Elements for Risk Assessment. Mathematics 2021, 9, 1045. [Google Scholar] [CrossRef]
  52. Schulze-González, E.; Pastor-Ferrando, J.-P.; Aragonés-Beltrán, P. Testing a Recent DEMATEL-Based Proposal to Simplify the Use of ANP. Mathematics 2021, 9, 1605. [Google Scholar] [CrossRef]
  53. Kadoić, N.; Ređep, N.B.; Divjak, B. E-learning decision making: Methods and methodologies. In Re-Imagining Learning Scenarios; European Distance and E-Learning Network: Budapest, Hungary, 2016; p. 24. [Google Scholar]
  54. Saaty, T.L. Fundamentals of Decision Making and Priority Theory with the Analytic Hierarchy Process; RWS Publications: Pittsburgh, PA, USA, 1994. [Google Scholar]
  55. Saaty, T.L. Decision making with the analytic hierarchy process. Int. J. Serv. Sci. 2008, 1, 83–98. [Google Scholar] [CrossRef]
  56. Begičević, N.; Divjak, B.; Hunjak, T. Prioritization of e-learning forms: A multicriteria methodology. Central Eur. J. Oper. Res. 2007, 15, 405–419. [Google Scholar] [CrossRef]
  57. Kadoić, N.; Divjak, B.; Begičević Ređep, N. Integrating the DEMATEL with the analytic network process for effective decision-making. Cent. Eur. J. Oper. Res. 2018, 27, 653–678. [Google Scholar] [CrossRef]
  58. Lin, C.-L.; Hsieh, M.-S.; Tzeng, G.-H. Evaluating vehicle telematics system by using a novel MCDM techniques with dependence and feedback. Expert Syst. Appl. 2010, 37, 6723–6736. [Google Scholar] [CrossRef]
  59. Hung, S.-J. Activity-based divergent supply chain planning for competitive advantage in the risky global environment: A DEMATEL-ANP fuzzy goal programming approach. Expert Syst. Appl. 2011, 38, 9053–9062. [Google Scholar] [CrossRef]
  60. Tsai, W.-H.; Chou, W.-C. Selecting management systems for sustainable development in SMEs: A novel hybrid model based on DEMATEL, ANP, and ZOGP. Expert Syst. Appl. 2009, 36, 1444–1458. [Google Scholar] [CrossRef]
  61. Kadoić, N.; Ređep, N.B.; Divjak, B. A new method for strategic decision-making in higher education. Central Eur. J. Oper. Res. 2017, 26, 611–628. [Google Scholar] [CrossRef]
  62. Dzeko, M.; Kadoic, N.; Dobrovic, Z. Metamodeling SNAP, a Multi-Criteria Method for Effective Strategic Decision Making on e-Learning Issues. In Proceedings of the 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2019; pp. 849–853. [Google Scholar] [CrossRef]
  63. Mincer, M.; Niewiadomska-Szynkiewicz, E. Application of social network analysis to the investigation of interpersonal connections. J. Telecommun. Inf. Technol. 2012, 2012, 83–91. [Google Scholar]
  64. Horvat, D.; Munđar, D. Rangiranje web stranica [Ranking web pages]. Osječki Mat. List 2017, 17, 51–62. [Google Scholar]
  65. Xing, W.; Ghorbani, A. Weighted PageRank algorithm. In Proceedings of the Second Annual Conference on Communication Networks and Services Research, Fredericton, NB, Canada, 21–21 May 2004; pp. 305–314. [Google Scholar] [CrossRef]
  66. Knoke, D.; Yang, S. Social Network Analysis (Quantitative Applications in the Social Sciences), Series: Quantitative Applications in the Social Sciences, 2nd ed.; SAGE Publications, Inc.: Thousand Oaks, CA, USA, 2008. [Google Scholar]
  67. Wasserman, S.; Faust, K. Social Network Analysis: Methods and Applications, 1st ed.; Cambridge University Press: Cambridge, UK, 1994. [Google Scholar]
  68. Kadoić, N.; Divjak, B.; Redep, N.B. Differences among social network structures in the private sector, politics and NGOs in Croatia. TEM J. 2017, 6, 839. [Google Scholar] [CrossRef]
Figure 1. Criteria with and without influences between them [57].
Figure 2. Research methodology.
Figure 3. Research participants.
Figure 4. Hierarchical model for AHP analysis.
Figure 5. AHP priorities of e-learning policy measures.
Figure 6. The AHP, DEMATEL-PageRank (p), and SNAP priorities of e-learning policy measures.
Table 1. e-Learning policy measures.
Measure | Description
Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines | Setting up a digital learning repository within the library in which the existing and future courses and learning objects will be stored; developing practical guidelines and PD training for teachers on how to create OER, blended, and online courses; and developing similar guidelines on how to use provided resources for students; as well as standardizing the (minimum) requirements for each course will contribute to increasing the number of developed online/blended study courses.
Establishing a Legal Framework for e-Learning | Ensuring the accreditation and recognition of an online/blended course; promoting and remunerating teaching, research, and administrative staff who participate in e-learning projects (through the Workload and Promotion Policy and Guidelines); providing introductory obligatory didactic/digital PD training for new lecturers; and increasing the capacity for legal advisory services (on the copyright, reuse, and promotion of e-learning resources) will contribute to a comprehensive legal management framework for e-learning development and assessment.
Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE) | Promoting e-learning to students; forming a permanent advisory team to support teachers and students in e-learning; delivering PD training for technical, support, and administrative staff in the area of e-learning; centralizing the offer of PD courses for e-learning; and centralizing the CETE will contribute to a centralized advisory service for lecturers and students at the CETE.
Implementing Research Projects in the Area of e-Learning | Developing and adopting guidelines for the development of e-learning resources based on pedagogical principles; forming an e-learning research unit at the CETE as a link between the research and practice of e-learning; and enhancing application and participation in international projects and projects within the industry in the area of e-learning will contribute to the forming of the e-learning and innovative teaching research unit at the CETE.
Establishing a Quality Assurance Framework for e-Learning | Introducing data analytics and a protection policy; adopting procedures and criteria for e-learning courses for meeting the MNU QA requirements; and upgrading and adopting a detailed competency framework for blended course design based on the MQA competency document and international standards will contribute to a higher quality of e-learning courses in line with the MNU standard quality assurance procedure.
Identification of the Necessary Infrastructure for e-Learning | Developing a cost–benefit analysis for the MNU IT infrastructure investments and a realistic action plan; re-structuring and increasing IT department capacities; negotiating special higher bandwidth rates with the telecommunication providers; upgrading audio and video equipment for students and teachers; the planning of the long-term support of the LMS platform; setting up campus infrastructure maintenance, including a security and data protection policy; and equipping outreach centers for disadvantaged students will result in an adopted CBA and an action plan for IT infrastructure investments.
Improving Student Support Services for e-Learning | Training and forming a network of e-facilitators; creating an integrated web portal with student e-services; and providing accessibility support for students with special needs will contribute to the improved key student qualifications for e-learning.
Table 2. Comparison of resource allocation methods in the case of MNU.
Method | Calculates the Strength of the Element | Models the Effects of the Element | Method Acceptance in the Scientific Community | Complexity of Providing Inputs | Duration of the Process
ANP | No | Yes | High | High | High
AHP | Yes | No | High | Medium | Medium
DEMATEL | No | Yes | High | Low | Medium
ISM | No | Yes | Medium | Low | Medium
ELECTRE | Yes * | No | High | Medium | Medium
PROMETHEE | Yes * | No | High | Medium | Medium
DEA | Yes ** | No | High | Medium | Medium
VIKOR | Yes * | No | High | Medium | Medium
TOPSIS | Yes * | No | High | Medium | Medium
SNAP | Yes | Yes | Medium | Medium | Medium
* The method uses the strengths of the elements but does not have a procedure for their calculation. ** The strengths of the elements are calculated with respect to the alternatives’ values, not independently of them.
Table 3. Influences between the policy measures (full names of policy measures are not displayed in the first row).
Measure | 1 | 2 | 3 | 4 | 5 | 6 | 7 | SUM
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines | 0.00 | 1.93 | 2.50 | 2.40 | 3.13 | 2.87 | 2.73 | 15.57
2. Establishing a Legal Framework for e-Learning | 2.27 | 0.00 | 1.93 | 2.20 | 2.67 | 2.64 | 2.36 | 14.06
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE) | 2.60 | 2.40 | 0.00 | 2.27 | 2.93 | 3.00 | 3.14 | 16.34
4. Implementing a Research Project in the Area of e-Learning | 2.33 | 2.54 | 2.67 | 0.00 | 2.85 | 3.00 | 3.08 | 16.46
5. Establishing a Quality Assurance Framework for e-Learning | 3.29 | 2.62 | 2.55 | 3.17 | 0.00 | 2.92 | 2.92 | 17.45
6. The Identification of the Necessary Infrastructure for e-Learning | 3.42 | 2.50 | 2.27 | 2.42 | 2.83 | 0.00 | 3.25 | 16.69
7. Improving Student Support Services for e-Learning | 2.83 | 2.08 | 2.18 | 2.67 | 2.92 | 2.50 | 0.00 | 15.18
SUM | 16.74 | 14.07 | 14.10 | 15.12 | 17.33 | 16.93 | 17.48 |
Table 4. Priorities of policy measures with respect to the influences between them.
Measure | 1 | 2 | 3 | 4 | 5 | 6 | 7 | ΣR | ΣC | d | d + n | p
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines | 0.84 | 0.81 | 0.83 | 0.88 | 1.00 | 0.97 | 0.99 | 6.31 | 6.74 | −0.43 | 1.16 | 0.105
2. Establishing a Legal Framework for e-Learning | 0.87 | 0.65 | 0.74 | 0.80 | 0.90 | 0.89 | 0.90 | 5.75 | 5.74 | 0.01 | 1.60 | 0.144
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE) | 0.99 | 0.85 | 0.74 | 0.90 | 1.03 | 1.01 | 1.04 | 6.56 | 5.76 | 0.80 | 2.39 | 0.215
4. Implementing a Research Project in the Area of e-Learning | 0.99 | 0.86 | 0.87 | 0.80 | 1.03 | 1.01 | 1.04 | 6.60 | 6.15 | 0.45 | 2.04 | 0.183
5. Establishing a Quality Assurance Framework for e-Learning | 1.07 | 0.90 | 0.91 | 0.98 | 0.94 | 1.06 | 1.08 | 6.94 | 6.91 | 0.04 | 1.63 | 0.146
6. The Identification of the Necessary Infrastructure for e-Learning | 1.04 | 0.87 | 0.86 | 0.92 | 1.04 | 0.88 | 1.06 | 6.67 | 6.75 | −0.08 | 1.51 | 0.136
7. Improving Student Support Services for e-Learning | 0.95 | 0.79 | 0.80 | 0.87 | 0.97 | 0.94 | 0.84 | 6.16 | 6.96 | −0.79 | 0.80 | 0.072
ΣC | 6.74 | 5.74 | 5.76 | 6.15 | 6.91 | 6.75 | 6.96
H = 0.80; L = −0.79; n = 1.59; sum of the d + n column = 11.14.
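The quantities reported in Table 4 are consistent with the usual DEMATEL bookkeeping: ΣR and ΣC are the row and column sums of the total-relation matrix, d = ΣR − ΣC, H and L are the largest and smallest values of d, and n = H − L. The published p values also coincide, up to rounding, with normalizing d + n so that it sums to one; whether the authors obtained p this way or through a PageRank computation on the influence network (as the DEMATEL-PageRank label suggests) is not restated in this appendix. The sketch below is therefore an approximation under those assumptions; it uses one common normalization of the direct matrix and is not guaranteed to reproduce the total-relation entries of Table 4 exactly.

```python
import numpy as np

def dematel_summary(D):
    """Total-relation matrix and Table 4 style indicators for a direct-influence matrix D."""
    s = max(D.sum(axis=1).max(), D.sum(axis=0).max())   # one common normalization constant
    N = D / s
    T = N @ np.linalg.inv(np.eye(D.shape[0]) - N)       # total-relation matrix
    R, C = T.sum(axis=1), T.sum(axis=0)                 # row sums and column sums
    d = R - C                                           # net influence of each measure
    n = d.max() - d.min()                               # H - L, the shift used in Table 4
    p = (d + n) / (d + n).sum()                         # normalized shifted net influence
    return T, R, C, d, p

# D would be the 7 x 7 group direct-influence matrix from Table 3.
```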
Table 5. Priorities of policy measures with respect to the influences between the measures.
Measure | p | AHP | SNAP (a) | SNAP (g)
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines | 0.105 | 0.154 | 0.130 | 0.130
2. Establishing a Legal Framework for e-Learning | 0.144 | 0.106 | 0.125 | 0.127
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE) | 0.215 | 0.138 | 0.176 | 0.176
4. Implementing a Research Project in the Area of e-Learning | 0.183 | 0.107 | 0.145 | 0.144
5. Establishing a Quality Assurance Framework for e-Learning | 0.146 | 0.147 | 0.147 | 0.150
6. The Identification of the Necessary Infrastructure for e-Learning | 0.136 | 0.192 | 0.164 | 0.165
7. Improving Student Support Services for e-Learning | 0.072 | 0.154 | 0.113 | 0.108
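The SNAP (a) and SNAP (g) columns of Table 5 are consistent with, respectively, the arithmetic mean and the normalized geometric mean of the p and AHP priority vectors. The sketch below recomputes both aggregations from the rounded values published in the table; it matches Table 5 up to the rounding of those inputs and is an inference from the table, not a restatement of the authors’ code.

```python
import numpy as np

# Rounded priority vectors as published in Table 5
p   = np.array([0.105, 0.144, 0.215, 0.183, 0.146, 0.136, 0.072])  # DEMATEL-PageRank
ahp = np.array([0.154, 0.106, 0.138, 0.107, 0.147, 0.192, 0.154])  # AHP

snap_a = (p + ahp) / 2            # arithmetic aggregation, cf. the SNAP (a) column
g = np.sqrt(p * ahp)
snap_g = g / g.sum()              # normalized geometric aggregation, cf. SNAP (g)

print(np.round(snap_a, 3))        # close to 0.130, 0.125, 0.176, 0.145, 0.147, 0.164, 0.113
print(np.round(snap_g, 3))        # close to 0.130, 0.127, 0.176, 0.144, 0.150, 0.165, 0.108
```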
Table 6. Priorities of the policy measures with respect to the influences between the measures.
Measure | p | AHP | SNAP
1. Developing e-Learning Resources for Study Programs and Setting University-Wide Guidelines | 6 | 2 | 5
2. Establishing a Legal Framework for e-Learning | 4 | 7 | 6
3. Establishing University Advisory Services for Lecturers through the Centre for e-Learning (CETE) | 1 | 5 | 1
4. Implementing a Research Project in the Area of e-Learning | 2 | 6 | 4
5. Establishing a Quality Assurance Framework for e-Learning | 3 | 4 | 3
6. The Identification of the Necessary Infrastructure for e-Learning | 5 | 1 | 2
7. Improving Student Support Services for e-Learning | 7 | 3 | 7
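Table 6 appears to report the rank positions (1 = highest priority) implied by the p, AHP, and SNAP priority vectors. A short sketch of that conversion, applied to the rounded Table 5 values (with any ties broken by the original ordering), reproduces the columns of Table 6:

```python
import numpy as np

def ranks(priorities):
    """Rank positions (1 = highest priority) for a vector of priorities."""
    order = np.argsort(-np.asarray(priorities), kind="stable")
    r = np.empty(len(priorities), dtype=int)
    r[order] = np.arange(1, len(priorities) + 1)
    return r

p    = [0.105, 0.144, 0.215, 0.183, 0.146, 0.136, 0.072]
ahp  = [0.154, 0.106, 0.138, 0.107, 0.147, 0.192, 0.154]
snap = [0.130, 0.125, 0.176, 0.145, 0.147, 0.164, 0.113]   # SNAP (a) column of Table 5

print(ranks(p))     # [6 4 1 2 3 5 7]
print(ranks(ahp))   # [2 7 5 6 4 1 3]
print(ranks(snap))  # [5 6 1 4 3 2 7]
```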
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
