Assessment and Evaluation in Higher Education—Series 2

A special issue of Education Sciences (ISSN 2227-7102). This special issue belongs to the section "Higher Education".

Deadline for manuscript submissions: closed (30 November 2022) | Viewed by 34286

Special Issue Editors


Dr. Sandra Raquel Gonçalves Fernandes
Guest Editor
Department of Psychology and Education, Universidade Portucalense Infante D. Henrique, 4200-072 Porto, Portugal
Interests: higher education; active learning; student assessment; teacher evaluation; project-based learning (PBL); curriculum development; education management and administration

Dr. Marta Abelha
Guest Editor
Department of Education and Distance Learning, Universidade Aberta (UAb), 1000-013 Lisboa, Portugal
Interests: teacher education; assessment and evaluation in education; project-based learning (PBL); higher education; curriculum development; teacher collaboration

Dr. Ana Teresa Ferreira-Oliveira
Guest Editor
CISAS, Escola Superior de Tecnologia e Gestão, Instituto Politécnico de Viana do Castelo, 4900-498 Viana do Castelo, Portugal
Interests: higher education development and evaluation; active learning; service-learning; teachers’ development and collaboration; skills development and evaluation; process-based approach

Special Issue Information

Dear Colleagues,

This Special Issue of Education Sciences focuses on research and practice concerning assessment and evaluation in higher education. Based on the assumption that assessment and evaluation are crucial processes for the advancement of higher education, this Special Issue intends to contribute to this line of research.

Recent research on assessment in higher education shows that the most favored assessment methods continue to be examinations administered at a single point in time. Despite policy recommendations, student assessment therefore remains largely focused on a single episode and is much less aimed at promoting the development of students’ competences, which, in most cases, remain outside the assessment process. The impact of assessment methods and tasks on students’ learning processes, including their motivation, engagement, and approaches to learning, is an important issue to be discussed. Assessment should be seen as an educational development process rather than a final outcome or finishing line. For these reasons, this Special Issue welcomes submissions regarding the different dimensions and forms of assessment within higher education, including topics related to assessment methods (formative and summative), assessment purposes (assessment of/for/as learning), assessment rubrics, and the assessment of learning outcomes (knowledge and skills). In this line of research, we also acknowledge the need to develop research on the assessment of active learning approaches and new pedagogical methodologies based on student-centered assessment practices, which include, but are not restricted to, self- and peer-assessment and other methods that engage students directly in their own assessment.

Currently, the challenges, strengths, and opportunities of online and virtual assessment, which were already on the research agenda, have become of greater concern due to the impact of the COVID-19 pandemic, with implications for all stakeholders. It is therefore relevant and urgent to bring new and alternative forms of assessment, including those suited to distance learning, to higher education practitioners and stakeholders.

This Special Issue also recognizes the need for the continuous improvement, development, and evaluation of higher education institutions (HEIs). For this reason, we welcome research on the evaluation of HEIs’ organizational management processes, the evaluation of teacher performance, performance appraisal processes, the evaluation of HEI performance, pedagogical innovation in HEIs, curriculum change, the evaluation of training programs, and the evaluation of institutional and inter-institutional practices of innovation.

Original and unpublished works addressing these topics, including empirical studies, research articles, reviews, case studies, and concept papers, will be considered for acceptance in this issue. We look forward to your contributions!

Dr. Sandra Raquel Gonçalves Fernandes
Dr. Marta Abelha
Dr. Ana Teresa Ferreira-Oliveira
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website and then completing the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Education Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • assessment of student learning
  • development and assessment of competences
  • assessment methods and practices
  • online assessment and distance learning
  • COVID-19 impact on assessment practices
  • research in assessment and evaluation
  • evaluation of teacher performance
  • evaluation of pedagogical innovation
  • evaluation of curriculum and education programs
  • organizational evaluation of higher education institutions
  • performance evaluation of higher education institutions
  • evaluation of institutional and inter-institutional practices of innovation

Published Papers (11 papers)


Research


19 pages, 2705 KiB  
Article
Online Peer Assessment for Learning: Findings from Higher Education Students
by Paula Loureiro and Maria João Gomes
Educ. Sci. 2023, 13(3), 253; https://doi.org/10.3390/educsci13030253 - 27 Feb 2023
Cited by 10 | Viewed by 3644
Abstract
Assessment practices in the higher education (HE) context have undergone profound changes over recent years, particularly regarding their purpose, strategies, and available resources. This exploratory study seeks to analyze, through the perceptions of HE students, the contribution and adequacy of an assessment-for-learning strategy, namely, online peer assessment (OPA), inspired by the conceptual framework of the PrACT Model, a framework that aims to contribute to the dissemination of alternative assessment practices. The main data collection technique was a survey questionnaire, and the study participants (n = 16) were students from a higher education institution in Portugal. Results point to a lack of student experience with the practice of OPA and are discussed in relation to the dimensions of the PrACT framework. From the students’ perspective, OPA is considered an adequate alternative digital assessment strategy, contributing to student motivation as well as to the development of cognitive, metacognitive, and digital skills.

18 pages, 4401 KiB  
Article
Rubric’s Development Process for Assessment of Project Management Competences
by Mariane Souza, Élida Margalho, Rui M. Lima, Diana Mesquita and Manuel João Costa
Educ. Sci. 2022, 12(12), 902; https://doi.org/10.3390/educsci12120902 - 9 Dec 2022
Cited by 2 | Viewed by 2527
Abstract
Assessment rubrics are recognized for their positive effects, being defined as evaluative instruments that establish assessment criteria and performance levels. In this sense, assessment rubrics can be associated with professional practices for more authentic assessment processes. In the context of Project Management, the International Project Management Association (IPMA) has developed a framework that establishes the individual competences for professionals working in the area, the Individual Competence Baseline (ICB). The objective of this study is to propose a process of rubric development for competence assessment in Project Management. A rubric for the Leadership competence was developed to show the applicability and relevance of the proposed process. The research methodology adopted in the study was Design Science Research. The application and evaluation of this rubric in a pilot study show that the rubric development process allowed the creation of a specific rubric for the assessment of leadership competence. This paper offers guidance to those who need to develop and assess project management competences and is intended to propose a replicable process for the other ICB competences.
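By way of illustration only, a competence rubric of this kind can be represented in code as criteria mapped to descriptors for ordered performance levels. The Python sketch below is hypothetical: the criteria, descriptors, and scoring rule are invented for this page and are not the IPMA/ICB-based instrument developed in the study.

# Hypothetical sketch: a rubric as criteria mapped to descriptors for
# four ordered performance levels. Invented content, not the authors' rubric.
LEVELS = ["Insufficient", "Developing", "Proficient", "Excellent"]

leadership_rubric = {
    "Provides direction to the team": [
        "Rarely communicates goals",
        "Communicates goals when prompted",
        "Communicates clear goals regularly",
        "Aligns the team around shared goals and adapts them",
    ],
    "Manages conflict": [
        "Avoids conflict",
        "Addresses conflict only with support",
        "Mediates conflict independently",
        "Anticipates and defuses conflict constructively",
    ],
}

def average_level(rubric, observed):
    """Average the observed level index (0-3) across all criteria."""
    return sum(observed[criterion] for criterion in rubric) / len(rubric)

# Example: one assessed student
print(average_level(leadership_rubric, {
    "Provides direction to the team": 2,   # Proficient
    "Manages conflict": 3,                 # Excellent
}))  # 2.5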

11 pages, 223 KiB  
Article
Analysis of Quality Teaching and Learning from Perspective of University Students
by Marek Vaclavik, Martin Tomasek, Iva Cervenkova and Barbara Baarova
Educ. Sci. 2022, 12(11), 820; https://doi.org/10.3390/educsci12110820 - 16 Nov 2022
Cited by 1 | Viewed by 2134
Abstract
This paper presents the results of empirical research focused on the quality of teaching and learning methods from the perspective of master’s students at a Czech university. The research focused on learning outcomes, teaching forms and methods, and the use of ICT, following a quantitative survey in this area which showed the need to examine the topic in depth and in a broader context. Data for the qualitative research were collected through in-depth interviews; the primary research method was focus groups. The data were processed and analysed using coding techniques. The results show that students prefer teaching and learning outcomes associated with use in their future practice. The teaching forms depend on the teacher’s style rather than on the description declared in the curriculum. Contrary to most current practice, students prefer teaching methods that lead to active learning. Advantages are identified where ICT is involved in teaching in a way that is meaningful and has a positive impact on students’ learning; however, the effect depends on how the teaching forms are used.
24 pages, 1172 KiB  
Article
An Interval AHP Technique for Classroom Teaching Quality Evaluation
by Ya Qin, Siti Rahayu Mohd. Hashim and Jumat Sulaiman
Educ. Sci. 2022, 12(11), 736; https://doi.org/10.3390/educsci12110736 - 24 Oct 2022
Cited by 2 | Viewed by 1619
Abstract
Classroom teaching evaluation is one of the most important ways to improve the quality of mathematics teaching in higher education, and it is also a group decision-making problem. Meanwhile, the evaluation process involves uncertain information. In order to deal with this uncertainty in classroom teaching quality evaluation and obtain a reliable and accurate result, an interval analytic hierarchy process (I-AHP) is employed. To begin with, the modern evaluation instrument RTOP is adapted to make it more consistent with the characteristics of the discipline. In addition, the evaluation approach is built using the I-AHP method, and the weights of the criteria and of the assessors are derived. Thirdly, a case study is presented to verify the feasibility of the approach for evaluating the quality of classroom teaching in mathematics. Finally, a comprehensive evaluation of classroom quality in an interval-number environment is conducted, and analyses and comparisons of the results are discussed to show that the proposed approach is sound and has a strong ability to deal with uncertainty.
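As a minimal illustration of the interval arithmetic behind such an approach (not the model built in the paper), the Python sketch below aggregates interval criterion weights and interval ratings into an interval overall score; the three criteria and all numbers are invented.

# Minimal sketch of interval-weighted aggregation of the kind used in
# interval AHP evaluation. Weights and ratings are intervals (low, high);
# all numbers are invented and this is not the model built in the paper.

def interval_weighted_sum(weights, ratings):
    """Aggregate interval ratings with interval weights (all values positive).

    weights, ratings: lists of (low, high) tuples of equal length.
    Returns the (low, high) interval of the weighted sum.
    """
    low = sum(w_lo * r_lo for (w_lo, _), (r_lo, _) in zip(weights, ratings))
    high = sum(w_hi * r_hi for (_, w_hi), (_, r_hi) in zip(weights, ratings))
    return (low, high)

# Three hypothetical criteria with interval weights, and one lecturer's
# interval ratings on a 0-1 scale.
weights = [(0.25, 0.35), (0.30, 0.40), (0.30, 0.40)]
ratings = [(0.6, 0.8), (0.7, 0.9), (0.5, 0.7)]

print(interval_weighted_sum(weights, ratings))  # approximately (0.51, 0.92)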

27 pages, 892 KiB  
Article
Ready for a Career in the Agriculture Sector in Egypt? Perceptions of Students, Faculty, and Employers on the Value of Essential Technical and Employable Skills
by Ramjee P. Ghimire, D. Hashini Galhena Dissanayake, Karim Maredia, Nanda P. Joshi and Paul Ebner
Educ. Sci. 2022, 12(10), 713; https://doi.org/10.3390/educsci12100713 - 17 Oct 2022
Cited by 1 | Viewed by 1551
Abstract
High unemployment among college graduates has been a major concern in Egypt for many years. A mismatch in technical competencies, a lack of job-oriented skills, and gender inequity in education and careers pose major constraints on Egyptian youth seeking employment. Information about whether the gender of the mentor has any effect on the quality of mentoring is also nonexistent. Using web and in-person survey data from agricultural students, faculty, and potential private-sector agribusiness employers, this paper investigates whether there are significant differences in how male and female students apply and use career guidance to prepare for career prospects and align with industry needs. The results will validate whether there are significant differences between male and female faculty in their perceptions of the relevance of technical and employable skills, as well as in students’ use and application of career guidance and mentoring to increase their prospects with employers. The findings will be used to develop interventions to help align student skills with employer expectations and upgrade faculty competencies.

12 pages, 765 KiB  
Article
E-Learning Courses Evaluation on the Basis of Trainees’ Feedback on Open Questions Text Analysis
by Dimitrios O. Tsimaras, Stylianos Mystakidis, Athanasios Christopoulos, Emmanouil Zoulias and Ioannis Hatzilygeroudis
Educ. Sci. 2022, 12(9), 633; https://doi.org/10.3390/educsci12090633 - 18 Sep 2022
Cited by 3 | Viewed by 2229
Abstract
Life-long learning is a necessity associated with the requirements of the fourth industrial revolution. Although distance online education had already played a major role in the evolution of the modern education system, its share grew dramatically because of the COVID-19 pandemic outbreak and the social distancing measures that were imposed. However, the quick and extensive adoption of online learning tools also highlighted the multidimensional weaknesses of online education and the needs that arise when considering such practices. To this end, the ease of collecting digital data, as well as the overall evolution of data analytics, enables researchers, and by extension educators, to systematically evaluate the pros and cons of such systems. For instance, advanced data mining methods can be used to find potential areas of concern or to confirm elements of excellence. In this work, we used text analysis methods on data that emerged from participants’ feedback in online lifelong learning programmes for professional development. We analysed 1890 Greek text-based answers of participants to open evaluation questions using standard text analysis processes. We produced 7-gram tokens from the words in the texts, from which we constructed meaningful sentences and characterized them as positive or negative. We introduced a new metric, called acceptance grade, to quantitatively evaluate their positive or negative content with respect to the online courses. We based our evaluation on the top 10 sentences of each category (positive, negative). Validation of the results via two external experts and data triangulation showed an accuracy of 80%.
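For readers unfamiliar with the ingredients mentioned above, the toy Python sketch below shows what n-gram extraction from a free-text answer and a crude lexicon-based positive/negative score could look like; the lexicon, the scoring rule, and the reuse of the name "acceptance grade" are invented for illustration and do not reproduce the authors' pipeline or metric.

# Toy illustration: n-gram windows over a tokenised answer, plus a crude
# lexicon-based polarity score. Lexicon and scoring rule are invented;
# this is not the authors' text-analysis pipeline or their metric.

def ngrams(tokens, n=7):
    """Return all n-grams (as tuples) over a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

POSITIVE = {"helpful", "clear", "excellent", "engaging"}
NEGATIVE = {"confusing", "slow", "difficult", "boring"}

def acceptance_grade(answer):
    """Crude polarity score in [-1, 1] based on lexicon hits."""
    tokens = answer.lower().split()
    pos = sum(token in POSITIVE for token in tokens)
    neg = sum(token in NEGATIVE for token in tokens)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

answer = "the platform was clear and the tutor engaging but the pace slow"
print(ngrams(answer.split(), n=7)[0])  # first 7-token window
print(acceptance_grade(answer))        # 0.333...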

19 pages, 1349 KiB  
Article
Categorized and Correlated Multiple-Choice Questions: A Tool for Assessing Comprehensive Physics Knowledge of Students
by Shabnam Siddiqui
Educ. Sci. 2022, 12(9), 575; https://doi.org/10.3390/educsci12090575 - 23 Aug 2022
Cited by 1 | Viewed by 2325
Abstract
An efficacious assessment tool is as necessary for improving physics education as are innovative and effective methods of teaching physics. Most tests focus on evaluating students’ knowledge in specific areas such as conceptual understanding and quantitative and analytical problem-solving skills. Testing students’ critical thinking has remained a difficult task, and testing students’ comprehensive knowledge is even more challenging. We present here a new assessment tool, “Categorized and Correlated Multiple-Choice Questions” (CCMCQs), for evaluating students’ comprehensive physics knowledge. The tool consists of correlated questions posed in three sub-categories of physics: (i) conceptual understanding, (ii) critical thinking, and (iii) quantitative understanding. The questions are structured to first pose a conceptual question, which is then correlated with a critical thinking question, which in turn is correlated with a quantitative question. Thus, all three questions are correlated with each other, and such correlated questions help student and teacher alike to identify learning deficiencies more accurately and guide students to self-correct their knowledge of physics by providing appropriate direction. Further, we discuss the outcomes of a one-semester study of CCMCQs using data obtained from an introductory physics course.

20 pages, 985 KiB  
Article
Effectiveness of Doctoral Defense Preparation Methods
by Eva O. L. Lantsoght
Educ. Sci. 2022, 12(7), 473; https://doi.org/10.3390/educsci12070473 - 8 Jul 2022
Cited by 2 | Viewed by 2623
Abstract
The doctoral defense is an important step towards obtaining the doctoral degree, and preparation is necessary. In this work, I explore the relation between the way in which a doctoral candidate prepares for the defense and two important aspects of the defense: its outcome and the student’s perception during and after it. I carried out an international survey with 11-point Likert-scale, multiple-choice, and open-ended questions on the doctoral defense and analyzed the data from the 204 completed surveys using quantitative and qualitative methods. These methods included statistical tests of the correlation between preparation, on the one hand, and the defense outcome and student perception, on the other. I used an inductive thematic analysis of the open-ended survey questions to gain deeper insight into the way candidates prepared for their defense. I found that candidates most often prepare by making their presentation, reading their thesis, and practicing for the defense. The most effective measure is the mock defense, followed by a preparatory course. The conclusion of this work is that doctoral candidates need to understand the format of their defense in order to prepare properly, and that universities should explore either individual pathways to the defense or pilots using a mock defense and/or a preparatory course to prepare their doctoral candidates.

16 pages, 1321 KiB  
Article
The Influence of Students’ Self-Determination and Personal Achievement Goals in Learning and Engagement: A Mediation Model for Traditional and Nontraditional Students
by Ana Rothes, Marina S. Lemos and Teresa Gonçalves
Educ. Sci. 2022, 12(6), 369; https://doi.org/10.3390/educsci12060369 - 25 May 2022
Cited by 4 | Viewed by 5813
Abstract
Self-determination theory (SDT) and achievement goal theory (AGT) assume that students’ level of self-determination and the goals they pursue in class are important factors in engagement and learning. The aims of this study were to: (1) investigate the links between students’ types of motivation and personal achievement goals; (2) explore how these two sets of variables relate to learning and engagement, examining mediation effects; and (3) understand the specificities of nontraditional versus traditional students regarding the way these variables relate to each other. The study used a sample of 361 Portuguese adult students: 138 traditional (younger than 25 years old) and 223 nontraditional (active adults returning to education, aged 25 or older). The instruments used were the Self-Regulation Questionnaire—Learning, the Personal Achievement Goal Orientations Scale, the Adult Learning Strategies Evaluation Scale, and the Behavioral Engagement Questionnaire. Path analysis for the total sample revealed that mastery goals mediated the relationship between autonomous motivation and all educational outcomes, and that performance-avoidance goals mediated the relationship between introjected regulation, external regulation, and behavioral and emotional engagement. Multiple-group path analysis revealed a much stronger pattern of relationships for nontraditional students, especially between the SDT and AGT variables. The theoretical and practical implications of the study are discussed.

20 pages, 2606 KiB  
Article
Why Do Students Prefer Augmented Reality: A Mixed-Method Study on Preschool Teacher Students’ Perceptions on Self-Assessment AR Quizzes in Science Education
by Angelos Sofianidis
Educ. Sci. 2022, 12(5), 329; https://doi.org/10.3390/educsci12050329 - 8 May 2022
Cited by 12 | Viewed by 5501
Abstract
Students’ perceptions of AR applications have gained researchers’ interest in the field of ICT-enhanced teaching and learning, especially in recent years. The current study investigates students’ perceptions of the learning and immersive experiences gained from using AR quizzes for formative self-assessment in a university science education course over one semester. The research followed a mixed-method approach, and the data were collected sequentially through questionnaires and focus group discussions and analysed by descriptive statistical analysis and thematic analysis, respectively. Fifty-one (51) students participated in the quantitative data collection procedure, and ten (10) of them participated in the focus groups. The results indicate that students are in favor of AR quizzes and justify their stance with reference to learning gains and immersive experiences. AR was highlighted as playing a significant role by creating an engaging environment of immersion. The findings support students’ positive stance toward the combination of AR and formative self-assessment and highlight the role of immersion supported by AR technologies. Additionally, given the relatively long period of application, the findings cast doubt on the influence of the novelty effect on students’ positive stance toward AR.

Other


12 pages, 255 KiB  
Concept Paper
Towards a Framework to Support the Implementation of Digital Formative Assessment in Higher Education
by Sila Kaya-Capocci, Michael O’Leary and Eamon Costello
Educ. Sci. 2022, 12(11), 823; https://doi.org/10.3390/educsci12110823 - 17 Nov 2022
Cited by 7 | Viewed by 2999
Abstract
This paper proposes a framework to support the use of digital formative assessment in higher education. The framework is informed by key principles and approaches underpinning effective formative assessment and, more specifically, by approaches to formative assessment that leverage the functionalities of technology. The overall aim is to provide a structured conceptualisation of digital formative assessment that supports the planning of lectures and other teaching and learning activities in higher education classrooms. At the heart of the framework is a 12-cell grid comprising four key formative assessment strategies (sharing learning intentions and success criteria, questioning and discussion, feedback, and peer- and self-assessment) crossed with three functionalities of technology (sending and displaying, processing and analysing, and interactive environments). These functionalities are used as the basis for integrating digital tools into formative assessment for effective teaching and learning processes. For each cell in the grid, an exemplary digital formative assessment practice is described. The paper highlights the framework’s potential for enhancing the practice of digital formative assessment and its significance in light of the ongoing digital transformation, and it concludes by suggesting a programme of research that might be undertaken to evaluate its utility and impact in higher education contexts.