Article

Incentivizing Student Participation in QAS Questionnaires: An Evaluation of a Guaranteed Prize System at the University of Malaga

by Cristina Vereda-Alonso 1,*, Maria del Mar Cerrillo-Gonzalez 2, Cesar Gomez-Lahoz 2, Maria Villen-Guzman 2 and Carlos Vereda-Alonso 2

1 Department of English, French, and German Philology, Faculty of Philosophy and Letters, University of Malaga, Campus de Teatinos, 29071 Malaga, Spain
2 Department of Chemical Engineering, Faculty of Science, University of Malaga, Campus de Teatinos, 29071 Malaga, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(3), 216; https://doi.org/10.3390/educsci14030216
Submission received: 16 December 2023 / Revised: 14 February 2024 / Accepted: 17 February 2024 / Published: 21 February 2024
(This article belongs to the Special Issue Higher Education Quality Assurance)

Abstract

This paper investigates the effects of a guaranteed prize incentive, in the form of an extra score, on student engagement in the quality assurance system (QAS) questionnaires employed for evaluating teaching performance at the University of Malaga. The incentive system aims to counteract declining participation rates and mitigate potential survey fatigue among students. Employing a comprehensive dataset spanning multiple academic years and subjects, the study utilized statistical analyses to evaluate the incentive’s effectiveness, considering its potential impact on both final grades and QAS questionnaire outcomes. The results demonstrate a substantial increase in participation rates, with over 85% of students acknowledging the motivating influence of the incentive. However, concerns regarding the compromise of anonymity arose among 40% of students, possibly linked to the physical presence of teachers during the verification process of the QAS questionnaire submission. The statistical analyses raise questions about the incentive’s influence on students’ final grades while indicating that the incentive system does not significantly affect the results of the QAS questionnaires. The study contributes valuable insights into the complexities of incentivizing student participation in teaching assessments within the higher education landscape. To the best of our knowledge, there are few publications that investigate the use of an additional score as an incentive for students’ participation in QAS questionnaires.

1. Introduction

The adaptation of studies to the European higher education area (EHEA) has led Spanish universities to develop systematic evaluation mechanisms for the teaching–learning process in terms of methodology and teaching competencies [1]. To enhance quality in line with the standards outlined by the European Association for Quality Assurance in Higher Education (ENQA), universities have established internal and external systematic mechanisms for evaluating the quality of their activities. The official regulations in Spain [2,3,4] mandate that university programs must incorporate an internal quality assurance system encompassing, among other aspects, evaluation and improvement procedures for teaching and staff. These programs are required to adhere to the criteria and guidelines for quality assurance outlined in the EHEA (Standards and Guidelines for Quality Assurance in the European Higher Education Area, ESG).
In this context, the evaluation of teaching activities has become a significant aspect of this assessment. Consequently, the “DOCENTIA” programs were initiated in Spanish universities in 2007 [5], aligning with standards established by internationally recognized organizations for teacher evaluation. These standards, particularly those outlined in “The Personnel Evaluation Standards” by The Joint Committee of Standards for Educational Evaluation [6], serve as a reference for designing, developing, and evaluating teacher assessments. DOCENTIA programs provide a procedure for evaluating the teaching activity of university staff across all their areas of action by analyzing four basic dimensions: (1) teaching planning, (2) teaching development, (3) results, and (4) innovation and improvement. The various stakeholders engaged in the learning process, such as department heads, deans, the teaching innovation service, teachers, and students, participate in assessing these four dimensions. Notably, the students’ perspective currently holds a significant influence on the appraisal of teaching quality [5,7]. Moreover, in the context of evaluation and improvement procedures, the established protocol includes deliberate mechanisms linking teacher evaluation to their training, acknowledgment, and promotion, as outlined in the support guide of the verification program [4].
In this sense, Spanish universities have developed questionnaires that incorporate the assessment of various items related to the teaching–learning process for each subject and for all the teachers involved in that subject. These questionnaires are normally filled out voluntarily and anonymously by students at the end of the semester. Table 1 shows the questionnaire used at the University of Malaga, indicating the three dimensions to which the students’ evaluations contribute. The items are assessed using a Likert scale with a scoring range from 1 (completely disagree) to 5 (completely agree). The results of these questionnaires contribute 28% to the overall evaluation of teaching activity, while the remaining evaluation is contributed by teaching managers (30%), the innovation service (30%), and teachers (12%).
For teachers undergoing the accreditation process for promotion, the assessment is expressed numerically on a scale from 0 to 100, with the following categorization: unfavorable (<50), favorable (50–69), very favorable (70–89), and excellent (>90).
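To make the aggregation concrete, the following minimal Python sketch (our own illustration; the function names are hypothetical, and the code does not come from the university’s quality service) combines the four weighted contributions described above and maps the resulting 0–100 score to its accreditation band. The published bands are given as integer ranges, so the handling of boundary values (e.g., exactly 90) is our assumption.

```python
def overall_evaluation(students: float, managers: float,
                       innovation: float, teachers: float) -> float:
    """Combine the four evaluation sources (each on a 0-100 scale)
    using the weights reported for the University of Malaga:
    students 28%, teaching managers 30%, innovation service 30%,
    teachers 12%."""
    return (0.28 * students + 0.30 * managers
            + 0.30 * innovation + 0.12 * teachers)


def accreditation_band(score: float) -> str:
    """Map a 0-100 score to its band; boundary handling is assumed."""
    if score < 50:
        return "unfavorable"
    if score < 70:
        return "favorable"
    if score < 90:
        return "very favorable"
    return "excellent"


# Example: a teacher rated 80 by students, 75 by managers,
# 70 by the innovation service, and 90 by the teachers' dimension.
score = overall_evaluation(80, 75, 70, 90)
print(score, accreditation_band(score))  # 76.7 very favorable
```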
Questionnaires are a valuable tool in the assessment of teaching quality, enabling the collection of feedback from students and the generation of easily analyzable quantitative data [8,9]. The anonymity provided by questionnaires encourages students to express their opinions freely. However, the widespread use of surveys, driven by the desire to ensure teaching quality, has resulted in potential survey fatigue among students [10,11]. This saturation may affect the quality of responses, as individuals can become overwhelmed by the number of surveys they encounter. Furthermore, student participation in surveys is typically voluntary, and this fatigue can result in low participation, leading to potential underrepresentation and biased feedback.
Making survey participation mandatory could raise ethical concerns, as it infringes upon students’ autonomy and may generate coerced or insincere responses. The imposition of mandatory participation can lead to student resentment and compromise the integrity of the feedback. In such a scenario, students may provide responses just to fulfill the requirement, potentially undermining the purpose of collecting meaningful feedback. Incentivizing students to engage in surveys emerges as a potential solution [12,13]. While incentives can enhance participation rates, the choice of incentives is crucial for ensuring the integrity of feedback.
This study focused on the use of incentives to encourage student participation in the questionnaires for evaluating teaching quality. It is worth noting that we detected a lack of publications in the existing literature exploring the implementation of a supplementary evaluation score as a motivational tool for student participation in QAS questionnaires. In this context, the proposed incentive system provides an extra score equivalent to 5% of the overall attainable grade for the student. The QAS questionnaires are provided by the university’s quality service and distributed by the teacher. The responses, which serve as the primary data source in this study, were collected from the student participants through an online platform. It is crucial to emphasize that this online format ensures the anonymity of the student responses. The research was conducted across a total of five subjects, including diverse undergraduate and master’s courses from two different degree programs of the University of Malaga, to guarantee the representativeness of the study results. With the aim of assessing the impact of the incentive system, the results were compared with those from previous academic years where the incentive system was not employed. The proposed questions that served as a guide for this research are the following:
  • How does the proposed guaranteed prize incentive system influence student participation in quality assessment questionnaires?
  • Is there any significant relationship between the incentive system and students’ final grades?
  • How does the incentive system influence the student’s perception regarding the anonymity of their responses to the survey?
The assessment of the impact of the proposed prize incentive system is a critical aspect of this research. Understanding how incentives influence student engagement in the evaluation process is essential for improving the effectiveness of questionnaires as a tool for assessing teaching quality. The proposed methodology aims to assess the motivational impact of the incentive system. Studying the potential relationship between the incentive system and the students’ final grades is crucial to understand its implications on academic outcomes. Finally, this paper explores students’ perceptions regarding the anonymity of their questionnaire responses, focusing on understanding their concerns regarding the confidentiality of their feedback.

2. Materials and Methods

The study was conducted as part of an educational innovation project at the University of Malaga (see funding section), one of whose objectives was to evaluate the use of incentives in assessing teaching quality.

2.1. Participants

The initiative was implemented in five subjects across various undergraduate and master’s programs, as detailed in Table 2. Students participating in this study were enrolled in at least one of these five subjects. Participants ranged in age from 20 to 24 years, and the sample comprised 46% men and 54% women.
In all the programs where the study was conducted, each semester comprises six subjects, and it is common for each of these subjects to be taught by more than one teacher. Hence, the number of questionnaires to which the students are subjected ranges between six and twelve per semester. This substantial volume of surveys has led to a decline in student participation in this evaluation process. Specifically, the teachers of the subjects analyzed in the present study observed the beginning of this decline between 2016 and 2018. Within the extensive data published in the public information section of the quality assurance system (QAS) of the University of Malaga, information on the student response rate to the QAS questionnaire (IN_45) is available only from the 2019–2020 academic year [14]. The values of this response rate for the entire university are alarmingly low: 5.20%, 14.69%, and 12.40% for the academic years from 2019–2020 to 2021–2022.
Based on these data, the necessity for implementing a student incentive system to encourage questionnaire completion is evident. Moreover, the design of this incentive should be compatible with the official procedure employed for collecting responses to these questionnaires.

2.2. Instruments

Questionnaires serve as the primary tool for gathering data from participants. This study employed two types of questionnaires: the official QAS questionnaire of the University of Malaga (Table 1) and a concise, self-developed questionnaire designed to gather information regarding the implementation of the new incentive system.
Furthermore, the distribution of students’ final grades in the academic years under investigation was obtained while maintaining the anonymity of the students. This information was sourced from the professors responsible for each of the subjects.

2.2.1. Official QAS Questionnaire

The official QAS questionnaire model is prescribed by the quality service of the University of Malaga. Its structure is common to all programs offered at the university and cannot be modified. Naturally, the participants’ responses to these questionnaires are anonymous.
The current model presented in Table 1 has been in use since the 2017–2018 academic year. It represents the culmination of an evolutionary process from previous questionnaires; it initially comprised 35 questions (before 2007–2008), was then reduced to 22 questions (until 2016–2017), and then reached the current version with 14 questions. This reduction in the number of questions is designed to mitigate survey fatigue among participants.

2.2.2. Questionnaire about the New Incentive System

This questionnaire (Table 3) is intentionally concise to avoid overwhelming students with an excessive number of questions. The participants’ responses to this questionnaire are anonymous. Its aim is to gather information regarding the following aspects:
  • Evaluating students’ satisfaction with the incentive system;
  • Determining the potential compromise of anonymity resulting from the teacher’s presence during the QAS questionnaire;
  • Verifying if participation in official QAS questionnaires across all subjects taught in that semester remained low.

2.3. Procedure

2.3.1. Official QAS Questionnaire

Initially, prior to the 2018–2019 academic year, the compilation of the questionnaires was entrusted to a group of students who received a scholarship to carry out this task (1–2 students per center). These students assumed the role of surveyors, actively engaging with their peers in class to conduct the surveys. During the survey process, the teacher temporarily left the classroom for approximately 15 min, allowing the surveyors to explain the significance of gathering students’ opinions for the evaluation of teaching quality. The students manually completed the questionnaires on paper, ensuring their anonymity. This collection system offered the advantage of fostering a sense of peer-to-peer encouragement: a fellow student, familiar to their classmates, served as a motivating force, encouraging active participation in the questionnaire process.
In the 2018–2019 academic year, the system for collecting surveys changed to the current system. The main change involved transitioning from in-person surveys to online surveys, eliminating the role of the student surveyor and placing the responsibility of survey administration on the teaching staff. The current process at the University of Malaga is outlined below, following the guidelines from the quality service of the university. The questionnaires are filled out online, and there are two possible mechanisms for distributing them to students:
1. On-line distribution: The access code to the questionnaire is sent to students through forums, email, etc. The teacher specifies the period during which the questionnaire will be available, typically spanning several days.
2. In-class distribution: At the beginning of a class, the teacher provides the access code to the students present in the class. Students can then complete the questionnaire using their mobile devices, requiring the teacher to briefly leave the class for approximately 15 min.
In both procedures, students are required to identify themselves to access the questionnaire. The teacher must communicate to students that the quality service of the University ensures the anonymity of the survey. The identification process is implemented to prevent errors and duplications and to ensure that each teacher and corresponding subject are accurately evaluated. In cases of duplicate assessments, the last survey completed by the student is used.
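As an illustration of the deduplication rule just described, the short pandas sketch below keeps only the last submission per student, teacher, and subject. The column names, timestamps, and values are our own assumptions for illustration; they do not reflect the actual schema used by the quality service.

```python
import pandas as pd

# Three submissions: student 101 answered twice for the same
# teacher/subject pair, so only their later submission is kept.
submissions = pd.DataFrame({
    "student_id": [101, 101, 102],
    "teacher": ["A", "A", "A"],
    "subject": ["SOCP", "SOCP", "SOCP"],
    "submitted_at": pd.to_datetime(
        ["2023-01-10 10:00", "2023-01-10 10:05", "2023-01-10 10:02"]),
    "q14": [4, 5, 3],  # answer to question 14, for example
})

deduped = (submissions
           .sort_values("submitted_at")
           .drop_duplicates(subset=["student_id", "teacher", "subject"],
                            keep="last"))
print(deduped)
```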
Of the two options for distributing the questionnaires, only the second one allows us to reliably determine whether the student has participated, and thus, it enables us to encourage that participation.
The incentive employed involves providing an extra score equivalent to 5% of the total achievable grade for the student. When two teachers are involved in teaching a subject, as in the IMS subject, the extra score is extended to 10%: 5% for each questionnaire associated with each teacher. The grading system in Spain ranges from 0 to 10 points. In theory, this supplementary score could enable the student to surpass the maximum grade of 10; if a student attains the maximum grade (10), the additional score is not applied. While this detail may seem insufficiently motivating for students who consistently achieve high grades, it is reasonable to assume that such students would not refrain from participating in any activity that could enhance their final grade. Naturally, teachers communicate the incentive mechanics clearly to students, ensuring a thorough understanding of the potential impact on their overall grades.
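The arithmetic of the incentive can be summarized in a few lines. The sketch below is our own illustration of the rule described above (the function name is hypothetical): each verified questionnaire adds 5% of the total achievable grade, i.e., 0.5 points on the Spanish 0–10 scale, and the final grade is capped at 10.

```python
def grade_with_incentive(base_grade: float, questionnaires: int) -> float:
    """Add 0.5 points (5% of the 10-point maximum) per verified
    questionnaire and cap the result at the maximum grade of 10."""
    bonus = 0.5 * questionnaires
    return min(10.0, base_grade + bonus)


print(grade_with_incentive(8.7, 2))  # subject with two teachers -> 9.7
print(grade_with_incentive(9.8, 1))  # capped at 10.0
```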
As previously mentioned, the surveys are administered in the face-to-face class by the teacher. However, this incentive system deviates from the directive of the university’s quality service, which specifies that the teacher should leave the classroom during the survey. In this innovative procedure, the teacher remains apart from the students in a corner of the classroom. When a student completes and submits the online questionnaire, they ask the teacher to approach and view the acknowledgment screen presented by the online application upon survey submission. Importantly, the student’s answers to the questionnaire are not visible on the device screen during this verification. After confirming the student’s participation, the teacher records the student’s name to assign the corresponding extra score. To maintain transparency and motivation, the list of participating students along with their respective additional scores is published on the subject webpage the following day, providing timely feedback. The results of the questionnaires from one semester are provided to the professor in the subsequent semester, once the evaluation for the preceding semester has concluded.

2.3.2. Questionnaire about the New Incentive System

During the final week of classes, the teacher distributes a printed version of this short questionnaire in class. Students anonymously complete the questionnaire within approximately 10 min. Subsequently, a student volunteer collects the responses from their classmates and delivers them collectively to the teacher.
The new incentive system was initially implemented in two of the subjects involved in this study (SOCP and IMS). Following the confirmed success of the incentive in encouraging questionnaire participation, the system was subsequently extended to the remaining subjects.
In a university context, assessing the effectiveness of an incentive system for enhancing student participation in questionnaires goes beyond its success in increasing that engagement. It requires an examination of whether the extra score significantly affects a student’s final grade and an analysis of its impact on the outcomes of the QAS questionnaires. The students’ final grade should predominantly reflect the assessment of acquired skills related to the subject. To ascertain these impacts, a statistical comparison was conducted, analyzing the grades from academic years implementing the incentive system against those from previous courses where this incentive was not employed. Similarly, the outcomes of the QAS questionnaires were also compared for those same academic years. These comparative analyses serve as a comprehensive approach to gauge the system’s influence on participation, academic outcomes, and feedback from questionnaires, ensuring a detailed assessment of its overall effectiveness.

3. Results

Figure 1 shows the evolution of student participation in QAS questionnaires for the subjects SOCP and IMS. The modifications in the system for collecting questionnaire responses are also highlighted, along with the period of COVID-19 confinement in Spain. The gaps in the trends result from the fact that none of the authors held the teaching responsibility for the subject during those periods. Consequently, the feedback from those questionnaires remains private to safeguard the personal assessments of the instructors who taught the subject in those academic years.
As can be seen, participation in the questionnaires started declining in both subjects from 2016. The shift to online survey administration did not stop this trend. The slight increase in participation observed during the COVID-19 confinement may be attributable to the exceptional circumstances of the confinement and the widespread imperative for interpersonal engagement. Nevertheless, participation in the academic year 2020–2021 remained consistently low. Only with the implementation of the incentive system did a significant surge in participation become evident.
The significant enhancement in participation observed in those two subjects during the 2021–2022 academic year led to the extension of the incentive system to the remaining subjects included in this study in the 2022–2023 academic year. Table 4 presents the results obtained in all five subjects over the three academic years under investigation.
Table 5 shows the responses of students to the short questionnaire regarding the new incentive system. The overall participation rate of students in this questionnaire was 70%.
The advantage of the incentive used in this study is that it imposes no monetary cost on the institution; it involves an extra score added to the student’s final grade. However, as mentioned earlier, it is essential to verify that this additional score does not significantly affect the overall final grades of students, ensuring that it does not unduly influence the academic assessment process. The box-and-whisker diagram in Figure 2 compares the grades achieved by students over the three academic years under examination. Because the same teacher taught the SOCP subject for over five years, its results span two additional academic years, taking advantage of the available data.
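For readers who wish to reproduce this kind of comparison, the following matplotlib sketch draws a box-and-whisker diagram from per-year grade lists. The grade values below are placeholders for illustration only; they are not the study data.

```python
import matplotlib.pyplot as plt

# Anonymized final grades grouped by academic year (dummy values).
grades_by_year = {
    "2020-21": [5.1, 6.3, 7.0, 4.8, 8.2, 6.1],
    "2021-22": [5.5, 6.1, 7.4, 6.8, 5.9, 7.2],
    "2022-23": [6.0, 6.6, 7.1, 5.2, 8.0, 6.4],
}

fig, ax = plt.subplots()
ax.boxplot(list(grades_by_year.values()),
           labels=list(grades_by_year.keys()))
ax.set_xlabel("Academic year")
ax.set_ylabel("Final grade (0-10)")
plt.show()
```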
At first glance, there seems to be no substantial impact of the incentive on the final grades of the two mandatory subjects, SOCP and IMS. Furthermore, if any effect were present in IMS, it would apparently run against what was expected. In the TCS subject, the upward trend in the average grade seems to precede the introduction of the incentive. The average grade trends in the MENPP and PASEL subjects appear to exhibit more randomness.
To quantify whether the incentive significantly affects final grades, a one-way analysis of variance (ANOVA) [15] was employed to examine whether the differences between the average grades across academic years are statistically significant. Table 6 summarizes the ANOVA results for all subjects at a significance level of 0.05, treating academic years as distinct groups.
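As an indication of how such a test can be run, the sketch below applies scipy’s one-way ANOVA to grade lists grouped by academic year and compares the F statistic with the critical value at α = 0.05, mirroring the layout of Table 6. The grade values are placeholders, not the study data.

```python
from scipy import stats

# Anonymized final grades per academic year (dummy values).
g2021 = [5.1, 6.3, 7.0, 4.8, 8.2, 6.1]
g2122 = [5.5, 6.1, 7.4, 6.8, 5.9, 7.2]
g2223 = [6.0, 6.6, 7.1, 5.2, 8.0, 6.4]

f_stat, p_value = stats.f_oneway(g2021, g2122, g2223)

# Critical F at alpha = 0.05 with k - 1 and N - k degrees of freedom.
k = 3
n = len(g2021) + len(g2122) + len(g2223)
f_crit = stats.f.ppf(0.95, dfn=k - 1, dfd=n - k)

print(f"F = {f_stat:.3f}, F critical = {f_crit:.3f}, "
      f"significant: {f_stat > f_crit}")
```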
Furthermore, we can take a step further and examine whether specific groups differ significantly from one another in the four subjects where ANOVA results indicate significant differences in average grades across the three academic years under study. Table 7 presents the results of pairwise multiple comparison tests, highlighting the academic years with incentive in bold.
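A sketch of the corresponding pairwise procedure, using statsmodels’ implementation of Tukey’s HSD on the same kind of grouped grade data (again with placeholder values, not the study data), is shown below. For each pair of academic years, it reports whether the difference in means is significant at α = 0.05, which is the comparison summarized in Table 7.

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Dummy grades stacked into one array with a matching group label
# (academic year) for each observation.
grades = np.array([5.1, 6.3, 7.0, 4.8, 8.2, 6.1,
                   5.5, 6.1, 7.4, 6.8, 5.9, 7.2,
                   6.0, 6.6, 7.1, 5.2, 8.0, 6.4])
years = np.array(["2020-21"] * 6 + ["2021-22"] * 6 + ["2022-23"] * 6)

result = pairwise_tukeyhsd(endog=grades, groups=years, alpha=0.05)
print(result)  # one row per pair: mean difference, adjusted p, reject?
```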
Finally, the outcomes of the QAS questionnaires were also compared for the same academic years across all subjects, except for MENPP, as it has questionnaire results available only for the 2022–2023 academic year. These comparisons are presented in Table 8, Table 9, Table 10 and Table 11.

4. Discussion

Survey fatigue represents a facet of respondent burden, commonly defined as the time and effort required to participate in a survey [16]. Survey fatigue can arise in both scenarios: surveys with a substantial number of questions and the administration of consecutive surveys. In the former, it typically manifests as a decline in response rates towards the end of the survey. Conversely, in the latter, it is characterized by a reduction in participant engagement [10].
In our study, we observed fatigue primarily due to the consecutive administration of questionnaires to students within the same semester. This is evidenced by the decline in questionnaire participation over time, as shown in Figure 1. Interestingly, we did not observe fatigue attributable to the number of questions within the survey, as the response rates remained consistent across all 14 questions included in the quality assurance system (QAS) questionnaire. This finding aligns with previous research indicating that the length of the survey itself may not be the sole determinant of survey fatigue [10]. Instead, our results suggest that the timing and frequency of survey administration play a significant role in exacerbating respondent burden and subsequent survey fatigue among participants.
As can be seen in the results presented in Figure 1 and Table 4, the incentive system consistently leads to a significant increase in student participation in QAS questionnaires across all subjects. The effectiveness of incentivized surveys in increasing participation has been demonstrated in various non-educational environments [13,17,18]. Incentives such as gift vouchers, participation in raffles, and lotteries are commonly employed. The educational environment appears to be no exception, and a guaranteed prize [19,20,21] is more likely to be successful, as observed in this experience.
The following theories provide frameworks for understanding how incentives influence response rates in surveys by considering factors such as perceived value, costs, and rewards [22,23]:
1. Social exchange theory: This theory posits that individuals weigh the costs and benefits of participating in an activity. In the context of surveys, respondents evaluate the effort required to complete the survey against the perceived rewards or incentives offered. If the perceived benefits outweigh the costs, respondents are more likely to participate;
2. Leverage saliency theory: This theory suggests that respondents are more likely to participate in surveys when they perceive the incentives offered as valuable and relevant. In other words, incentives that are salient and meaningful to respondents are more likely to motivate participation;
3. Benefit–cost theory: This theory emphasizes the comparison between the benefits gained from participating in a survey (such as incentives or rewards) and the costs associated with participation (such as time and effort). If the perceived benefits exceed the perceived costs, respondents are more likely to participate.
The results from the short questionnaire administered in the last week of class indicate that students are very satisfied with the incentive system (Table 5). Overall, 88% of students acknowledged that the incentive in the form of an extra score played a significant role in motivating their participation in the QAS questionnaires. This positive response suggests that the extra score is an effective incentive, aligning with the idea discussed above that a guaranteed prize is a favorable motivating factor. Our opinion aligns with the literature findings [13,22,23,24], indicating that using a guaranteed prize to incentivize questionnaire participation is consistent with leverage saliency theory. The extra score seems to exert more influence over students’ decisions to complete the survey than the time and effort required.
However, a potential concern arises: approximately 40% of students expressed concern about the compromise of anonymity within the incentive system. This concern appears to arise from the physical presence of the teacher during the verification process of the questionnaire submission, even though the device screen only displays an acknowledgment upon survey submission. Addressing this concern could enhance the perceived confidentiality of the process. A solution to the anonymity issue could be for the quality service of the university to take on the responsibility of verifying students’ submissions. This could be achieved through an email notification sent to the teacher with a list of students who have successfully submitted the QAS questionnaire, eliminating the need for direct teacher–student interaction during that verification process.

Finally, students reported completing an average of 3.8 questionnaires in the current semester, which is significantly lower than the expected range of 6 to 12 questionnaires. This suggests a notable decline in participation, reinforcing the notion of survey fatigue among students, in line with the consecutive administration of surveys [10]. Addressing this issue is crucial for maintaining the effectiveness of the evaluation process and ensuring a representative response rate.
Regarding whether the incentive significantly influences the academic assessment process, the ANOVA results summarized in Table 6 indicate that the average grades across the five academic years did not differ significantly in the SOCP subject. This indicates that the incentive system did not have a significant impact on the academic assessment in that subject. In contrast, for the remaining four subjects, the F values exceed their critical values, suggesting that the observed differences are unlikely to be attributable to random chance. This indicates that some factor is affecting the average grade, but it does not directly identify the incentive as the cause of the observed effect.
The results of pairwise multiple comparison tests for these four subjects (Table 7) are not conclusive. For IMS, Tukey’s test [15] suggested no significant difference between the last two academic years (with incentive); nevertheless, as shown in Figure 2, the decrease in average grade from 6.8 (2020–2021) to 5.8 (2021–2023) is not consistent with an extra score. For TCS, the pairwise multiple comparison test also indicated no significant difference between the last two academic years, one without incentive and the other with it, suggesting no significant impact of the incentive system on the average final grade. Similar discrepancies were found in the Tukey’s comparison tests across academic years for the MENPP and PASEL subjects. In conclusion, the non-significance of some of the Tukey’s post hoc tests does not definitively negate the global significance found in the ANOVAs for those four subjects; it suggests that there may not be clear evidence of specific pairwise differences. The lack of clarity on the incentive’s effect on the final grade could be attributed to factors such as sample size, effect size, or heterogeneity within academic years. However, this does not necessarily imply that the incentive has no effect; rather, it highlights the complexity of interpreting results in the context of various statistical considerations. Therefore, the influence of the incentive on the final grade remains uncertain or inconclusive based on the current analysis, which emphasizes the need for further investigation or consideration of additional factors to draw more definitive conclusions.
Regarding whether the incentive significantly affects the outcomes of the official QAS questionnaires (Table 8, Table 9, Table 10 and Table 11), no significant impact on the results of any question within the QAS questionnaire was detected for the SOCP, IMS, and TCS subjects at a significance level of 0.05. For the PASEL subject, the questionnaire results, available for only the last two academic years, were analyzed, with notably low participation (7%; six student responses) in the 2021–2022 academic year. This could potentially explain why questions 5, 7, 9, and 13 appeared to be influenced by the incentive system, exhibiting an increase in their assessment when the incentive was applied.
Finally, when examining the crucial question, “14. I am satisfied with the teaching performance of this teacher”, it becomes evident that the incentive system does not impact students’ satisfaction with the teaching performance across all subjects. This finding aligns with the results of other authors who did not find significant differences in response distributions among groups that received incentives or not [13,25,26,27], although it should be highlighted that those works are not in the education area.

5. Conclusions

The guaranteed prize incentive consisting of an extra score has proven to be highly effective in significantly boosting student participation rates in the QAS questionnaires used for teaching performance assessment at the University of Malaga. The notably positive feedback from students, with over 85% acknowledging that the incentive motivates their engagement, demonstrates its success in encouraging active participation. However, the incentive system should address a noteworthy concern raised by 40% of students regarding the compromise of anonymity in the QAS questionnaire due to the implementation of the incentive. This concern seems to arise from the physical presence of teachers during the verification process of the questionnaire submission. We suggest that a third-party entity, such as the quality service of the University, should take responsibility for verifying student submissions.
The emergence of survey fatigue is evident in the low number of QAS questionnaires that students reported completing per semester, aligning with the official participation rates published by the university. This highlights the importance of addressing this issue to maintain the effectiveness of the assessment process and ensure a representative response rate.
The analysis of the influence of the extra score incentive on final grades remains inconclusive. While no significant differences were found in the final grades for one subject across academic years both with and without incentives, the results for the remaining subjects did not provide a clear indication of the incentive’s impact. This underscores the complexity of interpreting the influence of incentives on academic outcomes and highlights the need for further investigation or consideration of additional factors for more definitive conclusions.
Regarding the QAS questionnaire results, the study indicates that the incentive system does not significantly affect the results in any of the subjects studied.

Author Contributions

Conceptualization, C.V.-A. (Cristina Vereda-Alonso) and C.V.-A. (Carlos Vereda-Alonso); data curation, M.d.M.C.-G. and M.V.-G.; formal analysis, C.G.-L. and C.V.-A. (Carlos Vereda-Alonso); funding acquisition, C.V.-A. (Carlos Vereda-Alonso); investigation, C.V.-A. (Cristina Vereda-Alonso), M.d.M.C.-G., C.G.-L., M.V.-G. and C.V.-A. (Carlos Vereda-Alonso); methodology, C.G.-L. and C.V.-A. (Carlos Vereda-Alonso); project administration, C.V.-A. (Carlos Vereda-Alonso); resources, C.V.-A. (Cristina Vereda-Alonso), M.d.M.C.-G., C.G.-L., M.V.-G. and C.V.-A. (Carlos Vereda-Alonso); software, M.d.M.C.-G. and M.V.-G.; supervision, C.V.-A. (Cristina Vereda-Alonso) and C.G.-L.; validation, C.V.-A. (Cristina Vereda-Alonso), M.d.M.C.-G. and M.V.-G.; visualization, C.V.-A. (Cristina Vereda-Alonso), C.G.-L., M.V.-G. and C.V.-A. (Carlos Vereda-Alonso); writing—original draft, C.V.-A. (Carlos Vereda-Alonso); writing—review and editing, C.V.-A. (Cristina Vereda-Alonso) and C.G.-L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Servicio de Formación e Innovación del Vicerrectorado de Personal Docente e Investigador de la Universidad de Málaga, under the Innova Project PIE22-069 (IP Carlos Vereda-Alonso). The APC was partially funded by Universidad de Málaga.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki. No approval by the Institutional Ethics Committee was necessary, as all data were collected anonymously from capable, consenting adults.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available upon request to the authors.

Acknowledgments

The authors acknowledge funding for open-access charge by Universidad de Málaga/CBUA. We extend our gratitude to Carlos Vereda Alonso for his philanthropic provision of the aforementioned funds, which will be reimbursed to him by the funding entity within several months, without accruing any interest.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Moreno-Murcia, J.A.; Torregrosa, Y.S.; Pedreño, N.B. Questionnaire Evaluating Teaching Competencies in the University Environment. Evaluation of Teaching Competencies in the University. J. New Approaches Educ. Res. 2015, 4, 54–61. [Google Scholar] [CrossRef]
  2. BOE-A-2007-18770 Real Decreto 1393/2007, de 29 de Octubre, Por El Que Se Establece La Ordenación de Las Enseñanzas Universitarias Oficiales. Available online: https://www.boe.es/buscar/act.php?id=BOE-A-2007-18770 (accessed on 25 November 2023).
  3. BOE-A-2021-15781 Real Decreto 822/2021, de 28 de Septiembre, Por El Que Se Establece La Organización de Las Enseñanzas Universitarias y Del Procedimiento de Aseguramiento de Su Calidad. Available online: https://www.boe.es/buscar/act.php?id=BOE-A-2021-15781#dd (accessed on 25 November 2023).
  4. Guía de Apoyo: Para la elaboración de la MEMORIA DE VERIFICACIÓN de Títulos Universitarios Oficiales (Grado y Máster). ANECA. 2023. Available online: https://www.aneca.es/documents/20123/63546/UEEII_Guia+de+Apoyo_v2_21022023.pdf/9b2b275c-7313-3f5b-ece7-b5fdeb8763f8?t=1681474871054 (accessed on 25 November 2023).
  5. Sistema de garantía de la calidad de la docencia. ANECA. 2021. Available online: https://www.aneca.es/sistema-garantia-calidad-docencia (accessed on 25 November 2023).
  6. Gullickson, A.R.; Howard, B.B. The Personnel Evaluation Standards: How to Assess Systems for Evaluating Educators, 2nd ed.; Corwin Press u.a: Thousand Oaks, CA, USA, 2009; ISBN 0-7619-7508-X. [Google Scholar]
  7. González Ramírez, T. Evaluación y Gestión de La Calidad Educativa: Un Enfoque Metodológico; Biblioteca de Educación; Aljibe: Archidona, Spain, 2000; ISBN 84-95212-83-8. [Google Scholar]
  8. Kember, D.; Leung, D.Y.P. Development of a Questionnaire for Assessing Students’ Perceptions of the Teaching and Learning Environment and Its Use in Quality Assurance. Learn. Environ. Res. 2009, 12, 15–29. [Google Scholar] [CrossRef]
  9. Cashin, W.E. Students Do Rate Different Academic Fields Differently. New Dir. Teach. Learn. 1990, 1990, 113–121. [Google Scholar] [CrossRef]
  10. Porter, S.R.; Whitcomb, M.E.; Weitzer, W.H. Multiple Surveys of Students and Survey Fatigue. New Dir. Institutional Res. 2004, 2004, 63–73. [Google Scholar] [CrossRef]
  11. Fass-Holmes, B. Survey Fatigue--What Is Its Role in Undergraduates’ Survey Participation and Response Rates? J. Interdiscip. Stud. Educ. 2022, 11, 56–73. [Google Scholar]
  12. Betancourt, N.; Wolff-Eisenberg, C. Surveying Community College Students: Strategies for Maximizing Engagement and Increasing Participation. Ithaka S+R. 2019. Available online: https://sr.ithaka.org/publications/surveying-community-college-students/ (accessed on 25 November 2023). [CrossRef]
  13. Singer, E.; Ye, C. The Use and Effects of Incentives in Surveys. ANNALS Am. Acad. Political Soc. Sci. 2013, 645, 112–141. [Google Scholar] [CrossRef]
  14. Servicio de Calidad, Planificación Estratégica y Responsabilidad Social—Resultados UMA, Centros y Títulos—Universidad de Málaga. Available online: https://www.uma.es/calidad/info/142287/resultados-uma-centros-y-titulos/ (accessed on 12 December 2023).
  15. Montgomery, D.C. Design and Analysis of Experiments, 9th ed.; John Wiley & Sons: New York, NY, USA, 2017; ISBN 0-471-31649-0. [Google Scholar]
  16. Sharp, L.M.; Frankel, J. Respondent Burden: A Test of Some Common Assumptions. Public Opin. Q. 1983, 47, 36–53. [Google Scholar] [CrossRef]
  17. Brown, J.A.; Serrato, C.A.; Hugh, M.; Kanter, M.H.; Spritzer, K.L.; Hays, R.D. Effect of a Post-Paid Incentive on Response Rates to a Web-Based Survey. Surv. Pract. 2016, 9. [Google Scholar] [CrossRef]
  18. McKernan, S.C.; Reynolds, J.C.; McInroy, B.; Damiano, P.C. Randomized Experiment on the Effect of Incentives and Mailing Strategy on Response Rates in a Mail Survey of Dentists. J. Public Health Dent. 2022, 82, 484–490. [Google Scholar] [CrossRef] [PubMed]
  19. Royal, K.D.; Flammer, K. Survey Incentives in Medical Education: What Do Students Say Will Entice Them to Participate in Surveys? Med. Sci. Educ. 2017, 27, 339–344. [Google Scholar] [CrossRef]
  20. Blaney, J.M.; Sax, L.J.; Chang, C.Y. Incentivizing Longitudinal Survey Research: The Impact of Mixing Guaranteed and Non-Guaranteed Incentives on Survey Response. Rev. High. Educ. 2019, 43, 581–601. [Google Scholar] [CrossRef]
  21. Crews, T.B.; Curtis, D.F. Online Course Evaluations: Faculty Perspective and Strategies for Improved Response Rates. Assess. Eval. High. Educ. 2011, 36, 865–878. [Google Scholar] [CrossRef]
  22. Koskey, K.L.K.; Cain, B.; Sondergeld, T.A.; Alvim, H.G.; Slager, E.M. A Mixed-Methods Investigation of Factors and Scenarios Influencing College Students’ Decision to Complete Surveys at Five Mid-Western Universities. Mid-West. Educ. Res. 2015, 27, 3–30. [Google Scholar]
  23. Porterfield, V.; Brescia, S. The Effect of Incentives on Student Surveys: An IR Perspective; NEAIR: Jersey City, NJ, USA, 2017; pp. 126–157. [Google Scholar]
  24. Laguilles, J.S.; Williams, E.A.; Saunders, D.B. Can Lottery Incentives Boost Web Survey Response Rates? Findings from Four Experiments. Res. High. Educ. 2011, 52, 537–553. [Google Scholar] [CrossRef]
  25. Ryu, E.; Couper, M.P.; Marans, R.W. Survey Incentives: Cash vs. In-Kind; Face-to-Face vs. Mail; Response Rate vs. Nonresponse Error. Int. J. Public Opin. Res. 2006, 18, 89–106. [Google Scholar] [CrossRef]
  26. Keating, N.L.; Zaslavsky, A.M.; Goldstein, J.; West, D.W.; Ayanian, J.Z. Randomized Trial of $20 Versus $50 Incentives to Increase Physician Survey Response Rates. Med. Care 2008, 46, 878. [Google Scholar] [CrossRef] [PubMed]
  27. Cantor, D.; O’Hare, B.C.; O’Connor, K.S. The Use of Monetary Incentives to Reduce Nonresponse in Random Digit Dial Telephone Surveys. In Advances in Telephone Survey Methodology; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2007; pp. 471–498. ISBN 978-0-470-17340-4. [Google Scholar]
Figure 1. Evolution of student participation in QAS questionnaires for the subjects SOCP and IMS.
Figure 2. Trends in students’ final grades across academic years in the five subjects: (a) SOCP; (b) IMS; (c) TCS; (d) MENPP; (e) PASEL.
Table 1. Questionnaire for students at the University of Malaga. Each item is rated from 1 (completely disagree) to 5 (completely agree), with an additional NR/DK option.

| No. | Question |
|-----|----------|
| | **Teaching Planning** (Teaching and Learning Planning: design of the programs/teaching guides/subject guide) |
| 1 | The teacher informs about the different aspects of the teaching guide or program of the subject (objectives, activities, syllabus contents, methodology, bibliography, evaluation procedure, etc.). |
| | **Teaching Development** (Instructional Development: adherence to the planned activities) |
| 2 | The teacher complies with the planning established in the teaching guide/program of the subject (objectives, evaluation systems, bibliography, and other recommended sources of information). |
| 3 | The planned theoretical and practical activities have been coordinated. |
| | Teaching methodology |
| 4 | The teacher organizes the activities conducted in class well. |
| 5 | Utilizes teaching resources (blackboard, transparencies, audiovisual media, and support materials on the virtual network) to facilitate learning. |
| | Teaching competencies developed by the teacher |
| 6 | Explains clearly and confidently and highlights important content. |
| 7 | The teacher is concerned about the level of comprehension of their explanations. |
| 8 | Resolves any questions or concerns that may arise. |
| 9 | Encourages students to develop an interest in the subject (fluent and spontaneous communication). |
| 10 | Is respectful in dealing with students. |
| | Learning Evaluation (evaluation system) |
| 11 | I have a clear understanding of what I need to learn to pass this subject. |
| 12 | I consider the established evaluation criteria and systems adequate to assess my learning. |
| | **Results** (Effectiveness) |
| 13 | The activities undertaken (theoretical sessions, practical exercises, individual work, group projects, etc.) contribute to the achievement of the subject’s objectives. |
| | Student satisfaction |
| 14 | I am satisfied with the teaching performance of this teacher. |
Table 2. Characteristics of subjects under study.

| Level | Program | Subject | Acronym | Course | Character |
|---|---|---|---|---|---|
| Undergraduate | Chemical Engineering | Simulation and Optimization of Chemical Process | SOCP | 4 | Mandatory |
| Undergraduate | Chemical Engineering | Integrated Management Systems | IMS | 3 | Mandatory |
| Undergraduate | Chemical Engineering | Treatment of Contaminated Soil | TCS | 4 | Optional |
| Master | Chemical Engineering | Mass Exchange Network for Pollution Prevention | MENPP | 1 | Optional |
| Undergraduate | English Studies | Psycholinguistics Applied to the Study of the English Language | PASEL | 3 | Optional |
Table 3. Questionnaire about the new incentive system.

| No. | Question | Response |
|---|---|---|
| 1 | Has the additional score motivated you to participate in the official QAS questionnaire? | YES / NO |
| 2 | Do you believe that the procedure followed for awarding this extra score has compromised anonymity? | YES / NO |
| 3 | Please indicate approximately the number of official QAS questionnaires that you have completed during the current semester. | Open numeric answer |
Table 4. Students enrolled in the subjects and their participation in the QAS questionnaires.

| Subject | Enrollment (20–21 / 21–22 / 22–23) | Participation in Questionnaires (20–21 / 21–22 / 22–23) | Participation-to-Enrollment Ratio (20–21 / 21–22 / 22–23) |
|---|---|---|---|
| SOCP | 59 / 42 / 36 | 4 / 29 * / 29 * | 7% / 69% / 81% |
| IMS | 40 / 48 / 41 | 2 / 33 * / 31 * | 5% / 69% / 76% |
| TCS | 16 / 7 / 18 | 10 / 6 / 18 * | 63% / 86% / 100% |
| MENPP | 10 / 15 / 12 | 0 / 0 / 7 * | 0% / 0% / 58% |
| PASEL | 84 / 88 / 96 | 0 / 6 / 24 * | 0% / 7% / 25% |

* The incentive system was applied in that academic year (shaded cells in the original layout).
Table 5. Results of the questionnaire about the new incentive system.

| No. | Question | YES | NO | NR/DK |
|---|---|---|---|---|
| 1 | Has the additional score motivated you to participate in the official QAS questionnaire? | 121 | 14 | 2 |
| 2 | Do you believe that the procedure followed for awarding this extra score has compromised anonymity? | 52 | 84 | 1 |
| 3 | Please indicate approximately the number of official QAS questionnaires that you have completed during the current semester. | Mean = 3.8; standard deviation = 2.8 | | |
Table 6. ANOVA results for all subjects at a significance level of α = 0.05.

| Subject | Sum of Squares (Between) | df (Between) | Sum of Squares (Within) | df (Within) | F | F Critical | Significant |
|---|---|---|---|---|---|---|---|
| SOCP | 47.4 | 4 | 913.2 | 182 | 2.362 | 2.421 | No |
| IMS | 21.1 | 2 | 133.9 | 95 | 7.500 | 3.092 | Yes |
| TCS | 18.8 | 2 | 31.5 | 36 | 10.756 | 3.259 | Yes |
| MENPP | 27.7 | 2 | 46.9 | 30 | 8.862 | 3.316 | Yes |
| PASEL | 30.4 | 2 | 267.3 | 139 | 7.902 | 3.061 | Yes |
Table 7. Tukey’s tests at a significance level of α = 0.05.

| Subject | Academic Years Compared | Difference in Averages | Standard Error | q Critical | Critical Difference | Significant |
|---|---|---|---|---|---|---|
| IMS | 20–21 and **21–22** | 0.92 | 0.212 | 3.367 | 0.71 | Yes |
| IMS | **21–22** and **22–23** | 0.20 | 0.201 | 3.367 | 0.68 | No |
| IMS | 20–21 and **22–23** | 1.11 | 0.214 | 3.367 | 0.72 | Yes |
| TCS | 20–21 and 21–22 | 1.03 | 0.319 | 3.457 | 1.10 | No |
| TCS | 21–22 and **22–23** | 0.48 | 0.312 | 3.457 | 1.08 | No |
| TCS | 20–21 and **22–23** | 1.51 | 0.231 | 3.457 | 0.80 | Yes |
| MENPP | 20–21 and 21–22 | 1.41 | 0.384 | 3.486 | 1.34 | Yes |
| MENPP | 21–22 and **22–23** | 2.11 | 0.362 | 3.486 | 1.26 | Yes |
| MENPP | 20–21 and **22–23** | 0.70 | 0.398 | 3.486 | 1.39 | No |
| PASEL | 20–21 and 21–22 | 0.27 | 0.212 | 3.350 | 0.71 | No |
| PASEL | 21–22 and **22–23** | 1.05 | 0.198 | 3.350 | 0.66 | Yes |
| PASEL | 20–21 and **22–23** | 0.79 | 0.200 | 3.350 | 0.67 | Yes |

The academic years in which the incentive system was applied are marked in bold.
Table 8. ANOVA results for QAS questionnaires in the SOCP subject (α = 0.05).

| Question No. | Sum of Squares (Between) | df (Between) | Sum of Squares (Within) | df (Within) | F | F Critical | Significant |
|---|---|---|---|---|---|---|---|
| 1 | 0.327 | 2 | 22.03 | 58 | 0.431 | 3.156 | No |
| 2 | 0.222 | 2 | 24.62 | 59 | 0.266 | 3.153 | No |
| 3 | 0.290 | 2 | 20.76 | 59 | 0.412 | 3.153 | No |
| 4 | 0.684 | 2 | 19.73 | 60 | 1.040 | 3.150 | No |
| 5 | 0.180 | 2 | 22.21 | 59 | 0.239 | 3.153 | No |
| 6 | 1.447 | 2 | 19.63 | 60 | 2.211 | 3.150 | No |
| 7 | 0.343 | 2 | 11.93 | 59 | 0.848 | 3.153 | No |
| 8 | 0.018 | 2 | 0.97 | 59 | 0.561 | 3.153 | No |
| 9 | 1.019 | 2 | 29.45 | 59 | 1.021 | 3.153 | No |
| 10 | 0.171 | 2 | 16.43 | 60 | 0.312 | 3.150 | No |
| 11 | 0.107 | 2 | 9.59 | 59 | 0.330 | 3.153 | No |
| 12 | 0.370 | 2 | 25.06 | 60 | 0.443 | 3.150 | No |
| 13 | 0.313 | 2 | 22.00 | 60 | 0.427 | 3.150 | No |
| 14 | 0.140 | 2 | 18.83 | 59 | 0.220 | 3.153 | No |
Table 9. ANOVA results for QAS questionnaires in the IMS subject (α = 0.05).

| Question No. | Sum of Squares (Between) | df (Between) | Sum of Squares (Within) | df (Within) | F | F Critical | Significant |
|---|---|---|---|---|---|---|---|
| 1 | 0.093 | 2 | 17.44 | 63 | 0.168 | 3.143 | No |
| 2 | 0.117 | 2 | 13.73 | 64 | 0.272 | 3.140 | No |
| 3 | 0.295 | 2 | 13.30 | 63 | 0.698 | 3.143 | No |
| 4 | 1.074 | 2 | 14.99 | 58 | 2.077 | 3.156 | No |
| 5 | 0.027 | 2 | 32.09 | 63 | 0.027 | 3.143 | No |
| 6 | 0.596 | 2 | 36.15 | 64 | 0.527 | 3.140 | No |
| 7 | 0.747 | 2 | 56.24 | 63 | 0.418 | 3.143 | No |
| 8 | 1.023 | 2 | 18.05 | 64 | 1.813 | 3.140 | No |
| 9 | 2.104 | 2 | 35.74 | 62 | 1.825 | 3.145 | No |
| 10 | 0.073 | 2 | 7.39 | 64 | 0.314 | 3.140 | No |
| 11 | 0.539 | 2 | 47.88 | 64 | 0.360 | 3.140 | No |
| 12 | 0.843 | 2 | 27.28 | 63 | 0.973 | 3.143 | No |
| 13 | 0.047 | 2 | 22.40 | 64 | 0.067 | 3.140 | No |
| 14 | 0.919 | 2 | 17.40 | 63 | 1.664 | 3.143 | No |
Table 10. ANOVA results for QAS questionnaires in the TCS subject (α = 0.05).

| Question No. | Sum of Squares (Between) | df (Between) | Sum of Squares (Within) | df (Within) | F | F Critical | Significant |
|---|---|---|---|---|---|---|---|
| 1 | 1.705 | 2 | 32.68 | 31 | 0.809 | 3.305 | No |
| 2 | 0.502 | 2 | 27.73 | 31 | 0.281 | 3.305 | No |
| 3 | 0.021 | 2 | 21.62 | 30 | 0.014 | 3.316 | No |
| 4 | 1.626 | 2 | 26.84 | 31 | 0.939 | 3.305 | No |
| 5 | 0.763 | 2 | 20.68 | 31 | 0.572 | 3.305 | No |
| 6 | 1.932 | 2 | 27.70 | 30 | 1.046 | 3.316 | No |
| 7 | 0.391 | 2 | 25.85 | 30 | 0.227 | 3.316 | No |
| 8 | 0.920 | 2 | 20.84 | 31 | 0.684 | 3.305 | No |
| 9 | 2.252 | 2 | 31.93 | 30 | 1.058 | 3.316 | No |
| 10 | 0.602 | 2 | 19.46 | 30 | 0.464 | 3.316 | No |
| 11 | 2.771 | 2 | 31.61 | 31 | 1.359 | 3.305 | No |
| 12 | 1.391 | 2 | 26.84 | 31 | 0.803 | 3.305 | No |
| 13 | 1.129 | 2 | 20.90 | 31 | 0.838 | 3.305 | No |
| 14 | 1.705 | 2 | 24.68 | 31 | 1.071 | 3.305 | No |
Table 11. ANOVA results for QAS questionnaires in the PASEL subject (α = 0.05).

| Question No. | Sum of Squares (Between) | df (Between) | Sum of Squares (Within) | df (Within) | F | F Critical | Significant |
|---|---|---|---|---|---|---|---|
| 1 | 0.411 | 1 | 15.59 | 30 | 0.792 | 4.171 | No |
| 2 | 0.001 | 1 | 17.87 | 30 | 0.001 | 4.171 | No |
| 3 | 1.260 | 1 | 24.24 | 30 | 1.559 | 4.171 | No |
| 4 | 1.383 | 1 | 32.62 | 30 | 1.272 | 4.171 | No |
| 5 | 2.658 | 1 | 14.22 | 30 | 5.608 | 4.171 | Yes |
| 6 | 1.750 | 1 | 34.47 | 30 | 1.523 | 4.171 | No |
| 7 | 3.159 | 1 | 20.56 | 30 | 4.609 | 4.171 | Yes |
| 8 | 0.244 | 1 | 21.47 | 30 | 0.342 | 4.171 | No |
| 9 | 8.880 | 1 | 29.59 | 30 | 9.004 | 4.171 | Yes |
| 10 | 0.394 | 1 | 11.07 | 30 | 1.069 | 4.171 | No |
| 11 | 1.143 | 1 | 16.86 | 30 | 2.034 | 4.171 | No |
| 12 | 0.582 | 1 | 12.77 | 29 | 1.320 | 4.183 | No |
| 13 | 5.855 | 1 | 25.50 | 29 | 6.658 | 4.183 | Yes |
| 14 | 1.200 | 1 | 30.00 | 28 | 1.120 | 4.196 | No |