Article

Academics’ Perceptions on Quality in Higher Education Shaping Key Performance Indicators

by Emmanouil Varouchas 1,*, Miguel-Ángel Sicilia 2 and Salvador Sánchez-Alonso 2
1 School of Business and Economics, Department of Computer Information Systems, The American College of Greece—Deree, Ag. Paraskevi Campus, Athens GR-153 42, Greece
2 Department of Computer Science, University of Alcalá, 28801-Alcalá de Henares, Madrid, Spain
* Author to whom correspondence should be addressed.
Sustainability 2018, 10(12), 4752; https://doi.org/10.3390/su10124752
Submission received: 21 August 2018 / Revised: 29 November 2018 / Accepted: 10 December 2018 / Published: 13 December 2018

Abstract:
Institutions in higher education (HE) continuously strive to develop and deliver impactful educational programs. At the same time, they should continue to fulfill their mission to educate students in basic applied subjects and in parallel respond to the need to equip students with new skills. For this reason, higher education institutions (HEIs) perform periodical curricular reviews adhering to internal and external quality assurance systems. The subsequent curricular reforms are of a transformative nature, preparing graduates to tackle the challenges of globalization, unemployment and vanishing professions. For these reforms to lead to sustainable curricula, the integration of quality into educational programs is instrumental. A suggested way of achieving a transformative stance is to provide the context for the application and diffusion of quality metrics in teaching and learning. This research intends to provide a discussion of key performance indicators (KPIs) related to quality. This paper presents the second round of qualitative interviews with higher education administrators and professors as a promising vehicle for advancing towards the formulation of KPIs based on their understanding of the different independent dimensions of the quality construct. These KPIs will provide valuable insights into improving teaching, learning and assessment and will eventually lead to sustainable curricula. Research findings outline the significance of the time invested to design and update a course, indicate that technology-enhanced learning solutions are perceived as key quality drivers, and point out the need to align courses with industry requirements and real-world problems. Additionally, findings indicate that the quality and impact of teaching and learning are promoted by the multi/inter-disciplinary character of a course, the engagement of students in interactive discussions and student research as part of summative assessment.
The main contribution of this research is an analytic discussion of perceptions of higher education administrators and professors about quality, leading to a significant enrichment of the relevant literature. A set of innovative generic KPIs which can be used in multidimensional quality assessment in higher education is eventually proposed.

1. Introduction

Without doubt, the university is no longer a quiet place to teach and do scholarly work at a measured pace and contemplate the universe as in centuries past. It is a big, complex, demanding, competitive business, requiring large-scale ongoing investment [1]. Higher education today is challenged by global unrest, regulatory compliance, technology disruption, emphasis on admissions, societal demand for better education and the fact that students are rewriting the rules. To be able to respond to these challenges, HEIs need to adapt quickly and change what they have been doing in a mediocre way, capitalize on what they do excellently and make quality a necessary ingredient of their core competency: the curricula. To achieve this, academics need to view quality as a means of continuous improvement and realize that curricula are continuously evolving, living structures. In other words, quality should be considered as the pivotal instrument for the transformation of HEIs.
Due to the increase in societal demand for higher education, the needs for diverse skills required in the context of globalization (exploratory skills, exploitation skills, management skills, moral and ethical skills, etc.), and the processes of internationalization and diversification in higher education, a growing concern has emerged regarding the quality of higher education inputs, processes and outcomes [2]: the concern of defining simple, measurable quality indicators. At the same time, the negative effects of the heavy reliance on control by such indicators have been highlighted [3].
Considering the above points, this research was mainly motivated by the fact that there is a lack of methodologies and tools for measuring quality factors in higher education teaching, learning and assessment, and for producing quality metrics in support of closing the loop from measuring quality to curriculum enhancement and possibly reform. Additionally, the aim is set so that the findings of this research will benefit higher education stakeholders and policy makers internationally in further understanding the value of quality metrics on teaching, learning and the curriculum for the advancement of the education offered to students.
The main purpose of this paper is to present the main methodological work as it relates to Steps 3 and 6 of the research model presented in Section 3, Figure 1, and the key findings of two rounds of qualitative research, presented in Section 4.1 and Section 4.2, respectively.
Round 1 was performed to investigate the constructive perceptions of higher education administrators regarding the determinants of quality (see Appendix A for the questionnaire). For this purpose, a thorough critical literature review resulted in the specification of several parameters which define quality. In Section 4.1, the key findings of this qualitative analysis are presented.
The main purpose of the qualitative research in Round 2 was to construct a theoretical framework about an integrated model of quality in higher education, aiming to understand metrics or key performance indicators for the main dimensions of the tested model (see Appendix B for questionnaire). In Section 4.2, the key findings of this qualitative analysis are presented.

2. Literature Review

The quality of a university and its measurement have been on the agenda of university policy since the 1980s [4]. It is worth highlighting McDonald’s [5] notion of quality assurance: he claims that “in higher education, quality should not be rigidly ‘defined’, but seen as a flexible notion used in ways that are appropriate for the particular circumstances”. Additionally, quality policies should be tailor-made to the institution’s goals and objectives, mission and affected stakeholders. Moreover, the view of De Ketele [6] that quality is a concept which is difficult to define due to its multidimensional and relative nature is acknowledged. In the same sense, Sanyal and Martin [2] suggest that because quality means different things to different stakeholders and it is difficult to reconcile all of them, the definition of quality is a political process.
Furthermore, Deming [7] borrows ideas from the world of business to justify the need for quality in higher education. He says: “How can quality of teaching, learning and curriculum be improved? Is it enough to say that we as tutors, teachers, professors, staff or management of an educational institution are doing their best efforts? It is almost obvious that if everyone is doing their best efforts towards a different direction, efforts most probably will not bring the expected result. For individual best efforts to be effective, there is a need of a common vision, goals, and guidance. Ultimately there is a need for an orchestrating plan and a specific process towards the achievement of better quality”.
Against Deming’s ideas, McDonald’s [5] notion of quality assurance is posed: “Quality in higher education is not the simple concept that it can be in commerce, and industry. Quality may have one or more meanings, depending on the stakeholder, the relevant goals and objectives, and the mission of the institution. Thus in higher education, quality should not be rigidly ‘defined’, but seen as a flexible notion used in ways that are appropriate for the particular circumstances”.
As university education is evolving, McLean [8] points out that “individuals and institutions can be transformed for better and worse whether or not we are seeking radical change”. We agree with McLean’s point of view, and we further develop it by saying that the academic ‘transformation’ encompasses innovative teaching methods and pedagogies, more technology-infused curricula and the measurement of the above.
On another note, and referring to sustainable curricula, according to Sterling [9], sustainability is not just another issue to be added to a curriculum, but rather can be a gateway to a different view of curriculum, pedagogy, organizational change, policy, and ethos. At the same time, HEIs are expected to play a significant role in contributing to creating a more sustainable world through their major functions of education, research and outreach [10]. Considering the points of view of Sterling [9] and Fadeeva and Mochizuki [10], there is an evident correlation between quality education and sustainable development. One of the challenges academic institutions in higher education are facing is that of planning for and ensuring the sustainability of their academic programs. This is probably the biggest challenge, since in its epicenter lies the development of quality curricula—the core competency of higher education institutions. At this point, it is necessary to clarify that the perspective from which the term “sustainability” is viewed emphasizes how education can become more sustainable, and not education for sustainability, which involves mainly environmental theories and practices.
In further reviewing the literature of higher education research, two dominant complementary perceptions of quality have been identified. From one standpoint, quality is understood as the resulting outcome of many contributing factors, to which well-documented measurement systems attach values. Consequently, the measurement and management of quality is a matter which keeps higher education stakeholders from agreeing to apply a standardized set of tools and measurable indicators; notably, customer perception, value and repurchase intention have been investigated lately as purely external factors [11], but here, focus is placed on the internal factors. One critical research problem associated with this approach is directly linked to the perceptions of value metrics of overall quality, which then may be connected to perceptions and measures of value as perceived by students [12]. From a practical point of view, transparent mechanisms for the measurement and control of quality need to be established. From the other standpoint, quality is perceived as a continuous improvement process; thus, it is important to clarify and to support all the transformative stages that constitute the life cycle of quality development within an academic institution. This second approach is quite complex in terms of conceptual modelling requirements, mainly because of the great variety of institutions’ missions, goals, and the legislation under which they operate. In the following section, the drivers and methodological approach for the study of quality perceptions in higher education are presented.
The researchers’ notion of quality in higher education, from the standpoint of practitioners involved in teaching, student advising, and designing courses and academic programs, is that quality in education is a multidimensional issue with several interwoven dimensions, such as:
  • Quality in teaching, learning and the curriculum;
  • The quality of the country’s education system;
  • Quality in facilities, academic resources and support;
  • Quality in the internal and external quality assurance framework;
  • Quality in learning outcomes and graduates’ knowledge and skills.
This notion has been primarily informed by the researchers’ teaching, research and academic administration professional experience in higher education. To maintain high quality standards in all dimensions, HEIs have the responsibility to adjust and develop strategies to respond rapidly to the changes in student learning needs, emerging skills, legislation and the global economy, and mandates from stakeholders. As a result, HEIs are faced with the need to reform many of their existing management practices and mindsets. To this end, key performance indicators are a fundamental concept in measuring performance in multiple contexts [13]. Even though HEIs are required to keep track of KPIs for external regulatory compliance purposes as well as for the internal administration of resources, there is a lack of a standardized set of KPIs measuring quality in multiple dimensions, and especially quality in teaching, learning and the curriculum. The main reason for this is that it is hard to capture in a KPI “qualitative indicators” such as descriptions, observations, comparisons based on non-numerical data, the assessment of the degree of student learning and the overall student experience from an academic program of study. Thus, this paper deals with only the first dimension mentioned above, namely teaching, learning and the curriculum. According to Chalmers [14], these performance indicators typically do not involve generating the quantity of outcomes in the form of numerical data but measure complex processes and results in terms of their quality and impact. On the other hand, “quantitative indicators” are defined as those associated with the measurement of quantity or amount and are expressed as numerical values: something to which meaning or value is given by assigning it a number [14].

3. Research Methods

Through the findings of this research, the primary aim is to fill the gap of missing KPIs to be used by universities for measuring quality in teaching, learning and the curriculum. The formulation of the aforementioned aim stems from reviewing the literature on measuring quality dimensions in teaching and curriculum design, from our experience as academics and the need for continuous improvement in academic programs. As a result, the following four main drivers prompted this research:
  • The lack of applied methodologies focusing on the integration of curriculum design, delivery and outcome assessment;
  • The need for transparent mechanisms for the measurement and control of quality in curricula;
  • The need to inform the curriculum design process with quality perceptions for a learner-centric focus;
  • The need to investigate effective knowledge dissemination methods of tacit knowledge with the support of innovative learning management systems.
The methodological approach followed was initially presented in a paper by Varouchas, Lytras and Sicilia in 2016 [15] and involves the seven steps outlined below:
Step 1:
Conduct a literature review: overview of quality variables to be used in the design of the research tool;
Step 2:
Design of the initial research model, mostly informed by the critical review of the literature;
Step 3:
Perform focused qualitative research for perceptions of higher education administrators to inform and to update the initial research model;
Step 4:
Revise the research model informed by a critical literature review and by perceptions of key higher education stakeholders;
Step 5:
Run a quantitative analysis related to quality metrics: application of data mining techniques to the data collected;
Step 6:
Develop a prototype research instrument for the collection of data on hermeneutic factors of quality (data collection from higher education academics in Greece and abroad);
Step 7:
Finalize the instrument for measuring quality KPIs and implications of the research.
The literature review involved desktop research and the compilation of at least 100 scientific articles published in indexed impact factor journals for quality assessment in higher education, total quality management and knowledge dissemination over the last 10–15 years (Step 1 of the methodology). The initial research model was informed by the critical review of the literature and provided the basis for the two rounds of quality-focused structured interviews.
Following this, the first round of qualitative research through structured interviews with 10 higher education administrators and professors in Greece was performed, aiming at a more thorough understanding of the perceptions on quality components in higher education and informing the initial research model (Step 3 of the methodology). More specifically, interviewees included academic department heads from the School of Business and the School of Liberal Arts and Sciences at Deree—The American College of Greece, all professors in various disciplines such as Information Management, International Business, Finance, Tourism and Hospitality, Psychology, and English. The outcome of their input was used to complement the literature review and shape a structured questionnaire, which formed the main instrument for the collection of data from the higher education community across the world and then was used for quantitative analysis.
Afterwards, as indicated by Step 5 of the methodology, the quantitative study derived a three-tier content, process, and engagement model with 20 quality factors, and highlighted a set of key performance indicators for further investigation (see Section 4, Figure 3).
Finally, a second round of qualitative research through focused structured interviews was performed with 13 higher education administrators and professors from Greece and abroad, aiming at a more thorough understanding of the perceptions of quality components in higher education and at producing key performance indicators (Step 6 of the methodology). Interviewees included deans and academic department heads from the School of Business and the School of Liberal Arts and Sciences at The American College of Greece, and professors in various disciplines such as Information Management, International Business, Finance, Tourism and Hospitality, Psychology, and English. More specifically, the interviewee list included professors from Greece, the United States, the United Kingdom and Spain. Interviewees were selected because of their willingness to participate and contribute to this research, their deep knowledge of teaching and assessment practices and their experience in administering academic units at their universities. At this point, it is necessary to clarify that the second round of interviews did not include the same participants as Round 1.
Regarding the methodology adopted for analyzing the qualitative data gathered from the interviews, the constant comparison method was used. As Maykut and Morehouse [16] point out, “words are the way that most people come to understand their situations; we create our world with words; we explain ourselves with words; we defend and hide ourselves with words”. Thus, in qualitative data analysis and presentation, “the task of the researcher is to find patterns within those words and to present those patterns for others to inspect while at the same time staying as close to the construction of the world as the participants originally experienced it”. Qualitative data analysis involving identifying, coding, and categorizing patterns found in respondents’ perceptions was performed. More specifically, a line-by-line analysis of the text of the responses was performed, and codes were given to words or phrases that represented units of data associated with a concept. Then, quality perceptions were grouped into categories that best fit the data. The categories were related directly to the questions asked in the structured interview.
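As a concrete illustration, the mechanics of this coding step can be sketched in a few lines of Python. The sample responses, keywords and code labels below are invented for demonstration; real constant-comparison coding is an interpretive, iterative process performed by the researcher, not simple keyword matching.

```python
from collections import defaultdict

# Hypothetical sample responses; these are invented for illustration,
# not data from the study.
responses = [
    "Preparation time is essential for addressing learning outcomes.",
    "I use Moodle and clicker technology to get instant feedback.",
    "Live industry assignments build teamwork and leadership skills.",
]

# Each keyword stands for a unit of data associated with a concept;
# matching units are grouped under a conceptual code (category).
codebook = {
    "preparation": "preparation_time",
    "learning outcomes": "outcome_alignment",
    "moodle": "technology_use",
    "clicker": "technology_use",
    "industry": "industry_engagement",
    "teamwork": "transferable_skills",
}

# Line-by-line pass: assign codes to phrases, then group responses
# into the categories that best fit the data.
categories = defaultdict(set)
for i, text in enumerate(responses):
    lowered = text.lower()
    for keyword, code in codebook.items():
        if keyword in lowered:
            categories[code].add(i)  # response i exhibits this concept

print({code: sorted(ids) for code, ids in categories.items()})
```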
As far as the number of participants is concerned, according to Baker, Edwards and Doidge [17], the amount of qualitative data does not depend on the number of interviews but on the depth of the interview and how well the researcher uncovers the participants’ thoughts. Additionally, a small number of participants can offer researchers insights into research projects that target participants from a specific group (e.g., department heads, faculty).
In the next section, the key findings of the qualitative analysis are presented.

4. Results and Discussion

4.1. First Round of Qualitative Research

The detailed research design presented in the previous section has supported the collection of a significant amount of qualitative data from higher education administrators. In this section, the qualitative analysis of the data collected has a threefold objective:
  • First, to analyze the basic perceptions of higher education administrators and professors in terms of the complementary value components of quality. The objective is that the integration of the perceived complementary aspects will inform a detailed mapping of quality metrics;
  • Second, to reveal several concerns and limitations as perceived by administrators and professors related to the integration of the quality value components to the design of learning content and academic programs;
  • Third, to emphasize the understanding of hidden or existing relationships between quality perceptions and performance indicators from different perspectives. Thus, the next methodological step will lead to the identification of several qualitative key performance indicators.
Several value components are revealed, and their interpretation may guide the justification of various initiatives in higher education organizations. Additionally, several quality perceptions of interviewees, and the main arguments outlined in their statements, were mentioned repeatedly in the clear majority of responses; at the same time, respondents’ opinions on these perceptions coincided across responses.
In a synthesis of their perceptions, respondents’ arguments have been clustered to formulate a set of aspects of perceptions considered critical in integrating quality in the educational process. The key arguments provided refer to teaching qualifications instructors need to hold, together with research activity they demonstrate every academic year. Thus, well-qualified and research-active faculty are able to inform their teaching through research in their field and at the same time assist students in reaching learning outcomes at the course and program level. Additionally, respondents argue that teaching content should be customized to address course learning outcomes and different student learning styles. In this way, students will be motivated to engage in active learning and consequently develop skills in team work, problem-solving, technology and innovation among others. Moreover, most respondents pointed out the importance of the integration and application of theoretical knowledge into addressing real-life problems and situations. This could be achieved through innovative assessments and student engagement with the industry and job market.
It was interesting to observe that respondents with academic administration experience and service in university committees agree that quality can only be maintained through an established quality assurance system, with clear, automated procedures geared toward promoting quality outcomes.
The synthesis of the previous perceptions provides numerous interesting insights. A first interpretation of the commentary on quality perceptions is provided in the proposed model in Figure 2. A three-dimensional value integration space for quality value components is well defined and is linked with the critical theoretical model that was presented in previously published work. According to Varouchas and Sicilia [18], this value space includes three dimensions and 20 value components which require further investigation (see Figure 2 below).

4.2. A Second Round of Qualitative Research—Drafting KPIs for Quality Measurement

Having developed the key contribution summarized in Figure 2 above, a subsequent thread of qualitative research strategy is required to elaborate and confirm several quality measurements. For this reason, a second round of focused structured interviews with key stakeholders (academic deans and department heads) was conducted. As mentioned in the introduction, the main purpose of this qualitative research is to understand some metrics or key performance indicators for the main dimensions of the tested model.
Once more, the constant comparison method was used to analyze the qualitative data gathered from 13 interviews with academic administrators and professors in higher education. Following the analysis and codification of a detailed research agenda, a summary of the main findings which include constructive responses for the formulation of constructs and candidate KPIs is presented below.
Construct 1: Time for Preparation of courses/effort invested in design
Question 1: How much effort do you make in the preparation to teach a required course in your discipline? Do you believe that the time you invest in the preparation of content is a key ingredient of quality? Elaborate on this statement.
Construct #1 Summary of Findings—Key Quotes
  • To teach a course effectively, one would need over 3 h of preparation per credit hour per week; in addition to this, a faculty member needs to be constantly informing oneself on developments in their field of expertise, which adds significantly to the minimum preparation time cited above.
  • Preparation of the content is a key quality factor in teaching for two main reasons: for addressing learning objectives and outcomes and for making the course interesting to students.
  • I would guess that it would take me between 25 to 35 h to teach a required course in economics or finance. This time differs for the principles classes, which I have taught for decades and which are easier for me to prepare, while upper level classes typically require more time. Yes, I certainly do believe that the time spent is an indicator of quality.
The main finding is that the time devoted to preparing a course is critical, as is the time devoted to updating material and to engaging students with the learning content and context. One generic KPI which will be further developed in future research is recommended:
  • Preparation time = developments of the field + frequency of teaching + motivation time + engagement scenario + core knowledge
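Read as a sum of time components, this candidate KPI could be computed as follows. This is a minimal sketch under the assumption that each term is logged in hours per week for a given course; the parameter names mirror the formula above, and the sample figures are illustrative, not values prescribed by the study.

```python
def preparation_time(field_developments, frequency_of_teaching,
                     motivation_time, engagement_scenario, core_knowledge):
    """Candidate KPI: total preparation time as the sum of its
    components (all assumed to be logged in hours per week)."""
    return (field_developments + frequency_of_teaching
            + motivation_time + engagement_scenario + core_knowledge)

# Illustrative figures for one course (hours per week).
total = preparation_time(field_developments=2.0, frequency_of_teaching=1.0,
                         motivation_time=0.5, engagement_scenario=1.5,
                         core_knowledge=3.0)
print(total)  # 8.0
```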
Construct 2: Technology-enhanced learning utilization
Question 2: Which are the main technologies you deploy in your classroom? Can you elaborate on the added value contribution of the use of information & communication technologies (ICTs) in your classes? For example, what do you think about the use of educational videos from YouTube? Are there any prerequisite factors for the use of the technology in the classroom?
Construct #2 Summary of Findings—Key Quotes
  • I use PowerPoint and videos in every lecture (videos that present company cases or examples), and a simulation game.
  • Moodle provides the basis for most of my modules. It is important that the taught content of the video is placed properly in the context of the LOs and assessment units I have designed. I have also experimented with ‘clicker’ technology in larger lectures—to get some instant feedback from the students on the degree to which an important element of a course has been understood.
  • Blackboard tools (i.e., journals, blogs, and discussion forums) contribute to making the class more interactive and facilitate exchanges both between the instructors and among students. They also allow class discussions to be extended online and facilitate the supervision of projects (work in progress), peer review, and the coordination of group assignments between students. As far as teaching is concerned, I use PowerPoints in which I frequently embed audio-visuals, links to interesting articles or research findings, as well as educational videos. The use of ICT is essential for today’s teaching environment: It can be used to illustrate elaborate ideas or concepts in a student-friendly way; it promotes a more interactive approach to teaching and learning; it can facilitate class discussions; and it is compatible with the habits of the generation of “digital natives” and our culture’s emphasis on the visual.
  • The main technologies that I use are Blackboard/Canvas, Excel and video content from sources like Khan Academy, YouTube, TED talks and Merlot. Current media such as CNN and CNBC are also used.
The main finding, evident in the responses, is that technology-enhanced learning solutions are perceived as key quality drivers in higher education. A variety of approaches and technologies are available. One generic KPI which will be further developed in future research is recommended:
  • Technology infusion = (blended + CMS)/traditional
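One plausible operationalization of this ratio is technology-mediated contact hours (blended delivery plus course-management-system-supported activity) over traditional lecture hours. The sketch below makes that assumption explicit; the hour figures are invented for illustration and the study does not fix the units.

```python
def technology_infusion(blended_hours, cms_hours, traditional_hours):
    """Candidate KPI: ratio of technology-mediated hours (blended
    delivery + CMS-supported activity) to traditional lecture hours."""
    if traditional_hours == 0:
        raise ValueError("traditional_hours must be non-zero")
    return (blended_hours + cms_hours) / traditional_hours

# Illustrative figures for one course in an academic term.
ratio = technology_infusion(blended_hours=10, cms_hours=5,
                            traditional_hours=30)
print(ratio)  # 0.5
```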
Construct 3: Academia–industry partnerships
Question 3: To which extent do you use industry project engagements in your classes? Can you name some transferable skills acquired by students through these engagements?
Construct #3 Summary of Findings—Key Quotes
  • My aim is to use live assignments in all my courses, but I try also to maintain the relevant equilibrium in the themes of the assignment, and the topics. Transferable skills could be professionalism, teamwork, and leadership.
  • Industry project engagements provide students with practical problem-solving skills; realistic development goals; customer-facing skills; and project management, planning and reporting skills.
  • Executives from the company deliver the project brief to students, deliver company presentations, provide support to student teams, and they attend the final student presentations. Transferrable skills are built through these projects in varying degrees: communication skills, reporting skills, presentation skills and teamwork skills, and leadership, time management, and negotiation skills.
The main finding is that most respondents recognize the need to align their courses with industry requirements and real-world problems. Thus, a critical component in the proposed KPIs is related to industry orientation and alignment. Two generic KPIs which will be further developed in future research are recommended:
  • Industry alignment = number of case study analyses per course × time allocated per analysis / total course teaching hours in an academic term
  • Interaction with practitioners = number of interactions per course per academic term
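Both candidate KPIs are straightforward to compute once the underlying counts are recorded. The sketch below assumes case-study time is tracked in teaching hours and practitioner interactions are logged as discrete events; all names and figures are invented for illustration.

```python
def industry_alignment(case_studies, hours_per_analysis, total_teaching_hours):
    """Candidate KPI: share of a term's teaching hours devoted to
    case study analyses."""
    return case_studies * hours_per_analysis / total_teaching_hours

def practitioner_interactions(events):
    """Candidate KPI: count of interactions with practitioners
    (e.g., guest lectures, project briefs) per course per term."""
    return len(events)

# Illustrative figures: 4 case studies of 3 hours each in a 48-hour course,
# plus two logged practitioner events.
alignment = industry_alignment(4, 3.0, 48.0)
interactions = practitioner_interactions(["guest lecture", "project brief"])
print(alignment, interactions)  # 0.25 2
```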
Construct 4: Students’ research outcomes and quality
Question 4: Do you have any criteria for measuring the quality of the research work of your students? Are you interested in measuring the dissemination of their work? For example, how many research papers are published from students’ coursework?
Construct #4 Summary of Findings—Key Quotes
  • Yes, I would be interested in measuring the dissemination of students’ work. I am currently looking for the relevant student journals that they could use to publish very good papers from their coursework.
  • I do not have any criteria measuring the quality of my students’ research work.
  • Important criteria are the appropriate use of the suggested research methodology, the quality and appropriate use of sources (updated bibliography, classic works, relevance to the specific topic, referencing/citations), concept use (terminology) and concept development, sociological relevance, application of theory, connection of research findings to the relevant bibliography, organization and focus of the paper, development and clarity of argument, use of language, and technical issues.
  • We have a rubric and marking scheme that we use to evaluate students’ research work. I would be interested in measuring the dissemination of their work.
The main finding is that most respondents recognize that student research work improves the quality perceptions of a course and its impact. Thus, a critical component in the proposed KPIs is related to research works and depth. Two generic KPIs which will be further developed in future research are recommended:
  • Research works = number of student research works delivered per academic program
  • Research work depth = number of student works published in peer reviewed conferences
Construct 5: Engagement
Question 5: Do you promote discussion on a given topic among students in your classroom? Are you interested in the generation of new ideas on the topic discussed coming from students? How do you balance critical thinking and knowledge transfer in your lectures? Do you have any good recommendations, for example, 50% knowledge transfer and 50% critical thinking?
Construct #5 Summary of Findings—Key Quotes
  • It is difficult to balance critical thinking and knowledge on a 50:50 basis, but given the nature of our discipline, that is, philosophy, and the relevant courses, it is fundamental to combine these two components. I try to maintain at least a 40% critical thinking to 60% knowledge balance.
  • I think there is no rule about balancing critical thinking and knowledge transfer; everything depends on the cohort, and that is the golden rule for me.
  • Elaborating business cases serves the purpose of in-class discussion and exchange of ideas. In order to gain critical knowledge, students should have grasped theory as well as alternative interpretative frameworks. Thus, balancing knowledge and critical thinking is not a task which is easily accomplished.
  • I use educational videos as the starting point of a discussion, or alternatively a case study, a graph, or some other visual, asking students to interpret and elaborate on the relevant topic. I ask students to contribute as I am presenting new material, to express their views, share experiences, and provide illustrations. Allocating about one third of class time to class discussion should be appropriate.
The main finding is that most respondents recognize that engaging students in interactive discussions promotes the quality and impact of teaching and learning. Additionally, most respondents replied that balancing knowledge and critical thinking is not a task which is easily accomplished; for this reason, references to balance have been excluded from the proposed KPI. Thus, a critical component in the proposed KPIs is related to engagement. One generic KPI which will be further developed in future research is recommended:
  • Engagement = documented discussions/total number of 50-min lectures per course
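A minimal Python sketch of the proposed engagement KPI follows; the function name and the example counts are illustrative assumptions, not figures from the study:

```python
def engagement_kpi(documented_discussions: int, num_lectures: int) -> float:
    """Proposed KPI: documented discussions / total 50-minute lectures per course."""
    if num_lectures <= 0:
        raise ValueError("a course must have at least one lecture")
    return documented_discussions / num_lectures


# Example: 26 documented discussions across 39 fifty-minute lectures
print(engagement_kpi(26, 39))  # ~0.667: a discussion in roughly two of every three lectures
```

A value of 1.0 would indicate one documented discussion per lecture; values above 1.0 are possible if several discussions are documented in a single session.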
Construct 6: Competencies and skills
Question 6: Do you constantly associate learning objectives with transferable skills? Do you assign a specific number of teaching assignments to students? Can you give an example stating key elements in such an assignment? For example: in order to promote critical thinking, I design the X assignment.
Construct #6 Summary of Findings—Key Quotes
  • In my courses, a cognitive skill in relation to problem solving is assessed through case study analysis. Students need to analyze and solve a real case, using the Harvard case study methodology.
  • In my field, that is, English, learning outcomes are directly related to transferable skills. All assignments require that students exercise their critical thinking skills by unpacking layers of meaning in various types of texts.
  • Research projects are typically connected to specific learning outcomes in my courses.
  • Learning outcomes are directly related to skills acquisition.
  • In my technology introductory course, students are assigned the development of a video which they share with their classmates through Blackboard. Then, based on a rubric I give them, they evaluate and rate their classmates’ videos.
The main finding is that most respondents recognize that practical and transferable skills, as well as broader competencies, promote the quality and impact of teaching and learning. Thus, a critical component in the proposed KPIs is related to competencies and skills. One generic KPI which will be further developed in future research is recommended:
  • Skillset = number of intended skills per course/average class grade per course
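The skillset KPI can likewise be sketched in Python, implemented exactly as the ratio proposed above (note that, as stated, a higher average grade lowers the indicator). The function name and the 10-point grading scale in the example are assumptions for illustration:

```python
def skillset_kpi(intended_skills: int, average_class_grade: float) -> float:
    """Proposed KPI: number of intended skills per course / average class grade per course."""
    if average_class_grade <= 0:
        raise ValueError("average class grade must be positive")
    return intended_skills / average_class_grade


# Example: 6 intended skills in a course with an average grade of 8.0 (10-point scale)
print(skillset_kpi(6, 8.0))  # 0.75
```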
Construct 7: Inter/multi-disciplinary character
Question 8: What about the interdisciplinary character of the courses you teach? Can you name how many contributions from different disciplines you utilize in teaching your courses? For example, in the X course I teach, I use main contributions from four disciplines: Computer Science, Sociology, Psychology and News Media.
Construct #7 Summary of Findings—Key Quotes
  • The field I teach is interdisciplinary by its nature. We use concepts from different disciplines and emphasize the importance of economic, social and ecological dimensions of environmental issues. I try to present as many perspectives as possible so that students make connections with their disciplines. I ask students to reflect on how each discipline could help study a problem and help towards its solution. Information technology, math, different branches of natural sciences, social sciences (sociology, economics), law, ethics, and policy making are some of the disciplines that are involved in the study of the topics I present.
  • We live our lives in an interdisciplinary, multicultural and global fashion and our students should be educated like that to be successful citizens and employees. All of my classes have content from Politics, Geopolitics and Sociology.
The main finding is that most respondents recognize that the multi/inter-disciplinary character of a course promotes the quality and impact of teaching and learning. Thus, a critical component in the proposed KPIs is related to a course’s inter/multi-disciplinary character. One generic KPI which will be further developed in future research is recommended:
  • Inter-disciplinary Character = number of disciplines applied in teaching material in a course
Construct 8: Metrics
Question 9: If you were asked to write down a formula for quality in higher education, what factors would you include? For example, quality = time allowed for preparation + pedagogy + student engagement.
Question 10: Name one metric from your own perception for the quality of education in higher education. For example, “quality metric #1 = # of students passing a course / # of total students enrolled in this course” or “quality metric #2 = # of papers presented in conferences / # of papers delivered in a course assessment from students”.
Construct #8 Findings
In response to question 9, the interview participants suggested different formulas for the measurement of quality (QFs), based on their teaching experience and active involvement in curriculum design and review:
  • QF #1 = Time allowed for preparation + scholarship/academic expertise + pedagogies + student engagement;
  • QF #2 = Selected students + meaning of knowledge + engagement + dedication;
  • QF #3 = Faculty expertise + pedagogies + high academic standards;
  • QF #4 = Planning + preparation + personality + pedagogy + physical environment + assessment.
Similarly, in response to question 10, the following quality metrics (QMs) were suggested by interview participants and are summarized below:
  • QM #1 = Papers presented in conferences;
  • QM #2 = Job positions which business students get 5 years following graduation;
  • QM #3 = Successful teaching of transferable skills;
  • QM #4 = Synthesis of concepts;
  • QM #5 = Ability for independent study;
  • QM #6 = Ability to solve problems;
  • QM #7 = Ability to collaborate in teams;
  • QM #8 = Number and quality of faculty publications;
  • QM #9 = Student satisfaction and happiness;
  • QM #10 = Student engagement;
  • QM #11 = Number of students with high/good performance in course assessments.
Table 1 below lists nine generic KPIs derived from the research findings.
Figure 3 below introduces the integrative model for the study of quality perceptions in higher education, which, together with the nine generic KPIs shown in Table 1, will provide the basis for future research. These KPIs will be applied to measure quality dimensions and produce quality metrics which will eventually be used by academic administrators and decision makers for quality enhancements leading to the sustainability of higher education curricula.

5. Conclusions and Outlook

Despite common agreement among academics on the importance of quality in higher education, a consensus on its conceptualization has not yet been reached. Quality measurements stemming from KPIs provide the basis for rethinking the curriculum and enhancing the pedagogical strategies for developing sustainable higher education programs of study. In the view of Yarime and Tanaka [19], the content and delivery of these programs will reflect inter-disciplinary systems thinking, dynamics and analysis for all majors, disciplines and professional degrees: education would have the same lateral rigor across the disciplines as the vertical rigor within them. A key result of the research findings is that quality indicators can be encapsulated in KPIs to measure multiple dimensions of quality in higher education. It is in the hands of HEIs to decide when and how to thoughtfully and effectively integrate quality metrics into their systematic quality assurance processes to achieve greater efficiency and accountability within their organizations [20]. Additionally, the significance of measuring quality will make faculty, the main actors in quality assurance, realize that they hold an instrumental and challenging role in the quality assessment process, and are not simply entities who have to perform yet another clerical and time-consuming task.
The findings suggest the need for additional inquiry in future work, especially in two directions. The first is refining and standardizing the KPIs and developing a software tool for measuring them; standardization requires further research in more European universities which comply with the Bologna Process, have implemented a quality assurance system, and offer accredited and/or validated degrees. The second is applying quality metrics to maintain academic program sustainability; activities for sustainability at higher education institutions should involve interdisciplinary cooperation and close collaboration with diverse stakeholders in society. The plan is therefore to continue working on designing and testing the generic KPIs developed here (see Table 1), through pilot testing of their application in three undergraduate courses in Greece and Spain within 2019. In future research, the KPIs will be codified in the most appropriate category as shown by the analysis performed and further discussed through in-depth interviews with higher education administrators and faculty, to validate them further and consider how to measure them. These KPIs are one of the learning analytics aspects which have received growing attention in recent years [21]. The methods and analysis results of learning analytics will directly affect decision-making and strategy in higher education [22]. According to Rajkaran and Mammen [23], decisions will lead to establishing short-, medium- and long-term measures by academic departments to attain the strategic goals of the HEI in which they operate. This point of view is supported by Asif et al. [24], who state that the emergence of critical factors of total quality management in higher education has implications such as the development of control mechanisms, including measures for performance evaluation.
Finally, the metrics produced by the measurement of KPIs will provide the necessary intelligence to decision and policy makers towards enhancing university curricula (see Construct #8 Findings). The latter will be a key ingredient for ensuring the sustainability of higher education institutions.

Author Contributions

Conceptualization, E.V.; Methodology, E.V., M.-Á.S. and S.S.-A.; Resources, E.V.; Supervision, M.-Á.S.; Writing—original draft, E.V.; Writing—review & editing, M.-Á.S. and S.S.-A.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank all colleagues at the American College of Greece and University of Alcalá in Spain for their valuable contribution to the study.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

The structured interview questionnaire of Round 1:

Appendix B

The structured interview questionnaire of Round 2:

References

  1. Skilbeck, M. The University Challenged. A Review of International Trends and Issues with Particular Reference to Ireland; The Higher Education Authority: Dublin, Ireland, 2001; p. 7. ISBN 0-904556-76-X. [Google Scholar]
  2. Sanyal, B.C.; Martin, M. Quality assurance and the role of accreditation: An overview. In Higher Education in the World 2007: Accreditation for Quality Assurance: What Is at Stake? Palgrave Macmillan: New York, NY, USA, 2007; pp. 3–23. [Google Scholar]
  3. Aas, G.H.; Askling, B.; Dittrich, K.; Froestad, W.; Haug, P.; Lycke, K.H.; Moitus, S. Assessing Educational Quality: Knowledge Production and the Role of Experts; European Association for Quality Assurance in Higher Education (ENQA): Brussels, Belgium, 2009; ISBN 978-952-5539-30-1. [Google Scholar]
  4. Vroeijenstijn, A.I. Improvement and Accountability: Navigating between Scylla and Charybdis. Guide for External Quality Assessment in Higher Education; Taylor and Francis: Bristol, PA, USA, 1995; p. 24. ISBN 1853025461. [Google Scholar]
  5. McDonald, R.; Van Der Horst, H. Curriculum alignment, globalization, and quality assurance in South African higher education. J. Curric. Stud. 2007, 39, 6. [Google Scholar] [CrossRef]
  6. De Ketele, J.M. The social relevance of higher education. In Global University Network for Innovation (GUNi) Report on Higher Education in the World 3; Palgrave: London, UK, 2008; pp. 55–59. [Google Scholar]
  7. Deming, W. Improvement of quality and productivity through action by management. Natl. Product. Rev. 2000, 1, 12–22. [Google Scholar] [CrossRef]
  8. McLean, M. Pedagogy and the University; Continuum International Publishing Group: London, UK, 2006; p. 15. ISBN 9781847141910. [Google Scholar]
  9. Sterling, S. Higher Education, Sustainability, and the Role of Systemic Learning. In Higher Education and the Challenge of Sustainability: Problematics, Promise, and Practice; Corcoran, P.B., Wals, A.E.J., Eds.; Kluwer Academic Press: Dordrecht, The Netherlands, 2004; pp. 47–70. [Google Scholar]
  10. Fadeeva, Z.; Mochizuki, Y. Higher Education for Today and Tomorrow: University Appraisal for Diversity, Innovation and Change towards Sustainable Development. Sustain. Sci. 2010, 5, 249–256. [Google Scholar] [CrossRef]
  11. Dlačić, J.; Arslanagić, M.; Kadić-Maglajlić, S.; Marković, S.; Raspor, S. Exploring perceived service quality, perceived value, and repurchase intention in higher education using structural equation modelling. Total Qual. Manag. Bus. Excell. 2014, 25, 141–157. [Google Scholar] [CrossRef]
  12. Woodall, T.; Hiller, A.; Resnick, S. Making sense of higher education: Students as consumers and the value of the university experience. Stud. High. Educ. 2014, 39, 48–67. [Google Scholar] [CrossRef]
  13. Suryadi, K. Key Performance Indicators Measurement Model Based on Analytic Hierarchy Process and Trend-Comparative Dimension in Higher Education Institution. In Proceedings of the 9th International Symposium on the Analytic Hierarchy Process for Multi-criteria Decision Making (ISAHP), Viña del Mar, Chile, 2–6 August 2007. [Google Scholar]
  14. Chalmers, D. Teaching and learning quality indicators in Australian universities. In Proceedings of the Institutional Management in Higher Education (IMHE) conference, Paris, France, 8–10 September 2008. [Google Scholar]
  15. Varouchas, E.; Lytras, M.; Sicilia, M.A. Understanding Quality Perceptions in Higher Education: A Systematic Review of Quality Variables and Factors for Learner Centric Curricula Design. In EDULEARN16—8th Annual International Conference on Education and New Learning Technologies; IATED: Barcelona, Spain, 2016; pp. 1029–1035. [Google Scholar]
  16. Maykut, P.; Morehouse, R. Beginning Qualitative Research, A Philosophic and Practical Guide; The Falmer Press: London, UK, 1994; p. 18. ISBN 0-7507-0272-9. [Google Scholar]
  17. Baker, S.E.; Edwards, R.; Doidge, M. How Many Qualitative Interviews Is Enough? Expert Voices and Early Career Reflections on Sampling and Cases in Qualitative Research; National Centre for Research Methods Review Paper; Economic & Social Research Council (ESRC): Swindon, UK, 2012; Available online: http://eprints.brighton.ac.uk/11632/1/how_many_interviews.pdf (accessed on 29 November 2018).
  18. Varouchas, E.; Sicilia, M.A. A Qualitative Review of Academics Perceptions on Quality in Higher Education: A Key Performance Indicators Approach. In EDULEARN17—9th Annual International Conference on Education and New Learning Technologies; IATED: Barcelona, Spain, 2017; pp. 10166–10173. [Google Scholar]
  19. Yarime, M.; Tanaka, Y. The Issues and Methodologies in Sustainability Assessment Tools for Higher Education Institutions: A Review of Recent Trends and Future Challenges. J. Educ. Sustain. Dev. 2012, 6, 63–77. [Google Scholar] [CrossRef]
  20. Burke, J.C.; Minassians, H. Linking State Resources to Campus Results: From Fad to Trend—The Fifth Annual Survey; Rockefeller Institute of Government: New York, NY, USA, 2001; p. 7. [Google Scholar]
  21. Lytras, M.D.; Aljohani, N.R.; Visvizi, A.; De Pablos, P.O.; Gasevic, D. Advanced Decision-Making in Higher Education: Learning Analytics Research and Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 937–940. [Google Scholar] [CrossRef]
  22. Zhang, J.; Zhang, X.; Jiang, S.; de Pablos, P.O.; Sun, Y. Mapping the Study of Learning Analytics in Higher Education. Behav. Inf. Technol. 2018, 37, 1142–1155. [Google Scholar] [CrossRef]
  23. Rajkaran, S.; Mammen, K.J. Identifying Key Performance Indicators for Academic Departments in a Comprehensive University through a Consensus-Based Approach: A South African Case Study. J. Sociol. Soc. Anthropol. 2014, 5, 283–294. [Google Scholar] [CrossRef]
  24. Asif, M.; Awan, M.U.; Khan, M.K.; Ahmad, N. A Model for Total Quality Management in Higher Education. Qual. Quant. 2013, 47, 1883–1904. [Google Scholar] [CrossRef]
  25. Varouchas, E.; Sicilia, M.-A.; Sánchez-Alonso, S. Towards an Integrated Learning Analytics Framework for Quality Perceptions in Higher Education: A 3-Tier Content, Process, Engagement Model for Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 1129–1141. [Google Scholar] [CrossRef]
Figure 1. Methodological steps in the research.
Figure 2. A methodological framework for quality perceptions in higher education (Varouchas and Sicilia, 2017).
Figure 3. An integrative model for the study of quality perceptions in higher education.
Table 1. Generic key performance indicators (KPIs).
Generic KPIs
  • Preparation time = developments in the field + frequency of teaching + motivation time + engagement scenario + core knowledge
  • Technology infusion = (blended + CMS)/traditional
  • Industry alignment = number of case study analyses per course × time allocated per analysis/total course teaching hours in an academic term
  • Interaction with practitioners = number of interactions per course per academic term
  • Research works = number of research works delivered per major
  • Research work depth = number of student works published in peer-reviewed conferences
  • Engagement = documented discussions/total number of 50-min lectures per course
  • Skillset = number of intended skills per course/average class grade per course
  • Interdisciplinary character = number of disciplines involved in teaching material of course
