Article

Assessment-Focused Pedagogical Methods for Improving Student Learning Process and Academic Outcomes in Accounting Disciplines

by Mădălina Dumitru and Voicu D. Dragomir *
Faculty of Accounting and Management Information Systems, Bucharest University of Economic Studies, Piata Romana 6, 010374 Bucharest, Romania
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(3), 263; https://doi.org/10.3390/educsci15030263
Submission received: 10 January 2025 / Revised: 14 February 2025 / Accepted: 18 February 2025 / Published: 20 February 2025

Abstract: The objective of this study is to present and validate a pedagogical method based on practice testing and student-generated questions, delivered in a blended learning environment. The research is founded on assessment-based approaches in two consecutive management accounting disciplines (management accounting, followed by performance measurement and control) at the most prestigious economics university in Romania. Our study is motivated by the desire to improve the student learning process, as students generally find management accounting difficult. The moment is especially significant given the large-scale adoption of blended learning after the COVID-19 pandemic. Data were collected over two semesters, starting with the return to traditional learning after the lockdown. A new variable labeled “consistent learning” was developed to account for student participation in these learning strategies throughout the semester. The sample comprised 107 students. Hypotheses were formulated to identify and test learning patterns within and between these disciplines using self-determination theory. The results show that learning outcomes are positively correlated with consistent learning for both disciplines. Two clusters were identified: involved learners versus a voluntary non-involvement group. For all learning outcomes, the group that adopted the learning strategy had significantly better results at the end of the semester than the rest of the sample. This study offers an opportunity for professors, showing that the implementation of assessment-based learning strategies in a blended environment leads to significant improvements in student learning outcomes in related disciplines.

1. Introduction

Management accounting is considered challenging by many students. It is taught in fewer classes than financial accounting, and it requires prior knowledge of economics, mathematics, and marketing (Van den Brink et al., 2003), an understanding of what happens in practice, a degree of data interpretation, and decisions that students must make on their own. Mastering the technical part requires time and consistent practice. The approach described in this article is intended to support the student learning process throughout the semester. It combines practice testing and student-generated questions in a blended learning environment. Thus, the research question of our paper is the following: How are the student learning process and outcomes (in the field of management accounting) influenced by the application of a pedagogical method consisting of practice testing and student-generated questions in a blended learning environment?
The present study was motivated by requests from our previous cohorts for a more practical approach to management accounting. The research environment was the largest university of economic studies in Romania. The implementation of the learning techniques was the intervention of choice in a quasi-experimental design. The students in the selected cohort were enrolled at the bachelor’s degree level at the Faculty of Accounting and Management Information Systems, and they studied two mandatory disciplines: management accounting (MA, in the second semester of the second year of undergraduate studies), followed by performance measurement and control (PMC) in the next semester. The research was conducted between February 2022 and February 2023 on a sample of 107 students, with 214 individual observations on multiple indicators.
To apply the learning techniques, the following approach was used. Students were asked to choose a simple business model and prepare two applications each week based on the previous lecture. They also had a new quiz each week on the university’s online platform. Pedagogical methods that support student engagement such as cases, flipped classroom, online materials, and problem solving could improve the student learning process and have long-term effects (Downen & Hyde, 2016; Fogarty & Goldwater, 2024; Frick et al., 2020; Van den Brink et al., 2003). Methods centered on asking accounting students to perform various tasks, such as practice testing or generating questions, are linked to deeper knowledge and, consequently, improved learning outcomes (Blondeel et al., 2023a; Chu & Libby, 2010; Frick et al., 2020; Potter & Johnston, 2006).
The two types of learning techniques included in this study, practice testing and question generation, require stronger student engagement than only reviewing lecture notes. One of the objectives of the pedagogical method is the long-term retention of information. Accumulated knowledge can be applied later, as is the case with the PMC discipline. The pedagogical techniques described below are part of the learning process itself; rather than only evaluating the students, they encourage students to review the aspects discussed before engaging in quizzes and homework. In our research, the tasks were completely voluntary and designed within the requirements of self-determination theory (Deci & Ryan, 2012). The study was based on rich quantitative data (collected on the university’s online platform) to test a variety of hypotheses that provide an integrated perspective on the learning process.
The present article makes several contributions to the scientific literature. First, the pedagogical technique was applied for two semesters (28 weeks) and for all aspects covered in the syllabus. Consequently, our contribution quantifies a comprehensive and systematic learning experience. In contrast, most previous contributions regarding accounting education were descriptive, based on case studies or questionnaires (Apostolou et al., 2021). Second, we developed a new indicator of “consistent learning” to account for the adoption of learning techniques by students throughout the two semesters. This indicator, the opposite of procrastination, is hypothesized to correlate with students’ academic performance outcomes and attitudes toward learning.
Our paper is organized as follows. First, we introduce self-determination theory, which represents the conceptual framework of our study. Second, we focus on the literature review of the selected learning techniques. Third, in the Methods section, we connect the hypotheses to the relevant statistical methods. The results are linked to the hypotheses in a logical sequence. Finally, the discussion and the implications for educators and academic managers are explored.

2. Literature Review

2.1. Theoretical Framing: Self-Determination Theory (SDT)

The theoretical framework for this study is represented by SDT, “one of the most prominent theories to explain human motivation” (Kunz, 2015). “SDT places its emphasis on people’s inherent motivational propensities for learning and growing, and how they can be supported” (Ryan & Deci, 2020). In the context of conducting this research (i.e., at the end of two years of exclusive online learning due to the COVID-19 pandemic), we hypothesized that the students were motivated to readjust to face-to-face learning and obtain the learning outcomes they expected. SDT refers to three basic psychological needs: autonomy, competence, and relatedness. These are positive psychological needs that must be nurtured by professors to support the individual development of the students.
Autonomy means that students choose the learning activities in which they want to participate. Autonomous motivation is opposed to controlled motivation: while autonomous motivation deals with what people do willingly, controlled motivation deals with activities conducted because of external requirements (Van der Hauwaert et al., 2022). Students’ activities fall somewhere between autonomy and control (Ryan & Deci, 2000), with the balance shifting constantly. Students are autonomously involved in activities that give them further knowledge. Having a choice might improve their performance and curiosity (Murayama et al., 2015; Schutte & Malouff, 2019). The voluntary nature of participation in the learning strategy is a significant strength of the present study. It is also an ethical requirement, covered by institutional regulations that prohibit discrimination in the application of pedagogical methods.
Student motivation can be classified as intrinsic or extrinsic. Research shows that academic results and student engagement are positively associated with intrinsic motivation (Froiland & Worrell, 2016). Some students enjoy doing their schoolwork, experiencing intrinsic motivation and feeling autonomous (Froiland & Worrell, 2016; Kunz, 2015). There is also autonomous extrinsic motivation, which arises when people accept external regulations and integrate them into their value system, internalizing the behavior and acting in the desired way (Ryan & Deci, 2020; Van der Hauwaert et al., 2022).
Competence refers to students’ need to feel that they can achieve the purpose of their studies and succeed when given appropriate tasks and opportunities for growth. Regarding this psychological need, performance measures (e.g., grades) can motivate students to further increase their autonomy (Groen et al., 2017). Feeling competent, effective, and challenged can be reinforced by positive feedback (Ryan & Deci, 2020) or by a reward for exceeding the performance of one’s peers (Cameron, 2001).
Relatedness consists of the students’ need to feel that they can build “positive and mutually satisfying relationships, characterized by a sense of closeness and trust” (Center for Self-Determination Theory, 2023) during their learning process. This need is satisfied through interaction with other students and the professor, while gaining a feeling of mutual respect. A good student–teacher relationship generates student engagement during lectures, positive feelings toward the lecture, confidence, and good communication of learning needs. Relatedness has a positive effect on increasing autonomous motivation (Groen et al., 2017).
The pedagogical method influences the degree to which the three types of student needs are satisfied. Self-determination improves student motivation and self-regulated learning strategies (Ryan & Deci, 2020). Therefore, deeper engagement with learning comes from the degree to which the professor’s approach and student needs are convergent, as indicated by the SDT.
We based our research on the fact that self-determination is the strongest predictor of learning outcomes (Gagné & Deci, 2005). We predict that students’ perception of a learning strategy as a helping tool increases their motivation by satisfying their three basic psychological needs for autonomy, competence, and relatedness. Higher autonomous motivation is expected to improve student performance (Van der Hauwaert et al., 2022).

2.2. Pedagogical Methods for Improving Student Learning Process and Academic Outcomes

The most widely practiced teaching method in accounting is lecturing (Blankley et al., 2017). Nevertheless, research in accounting education has shown that many pedagogical techniques can significantly improve students’ skills and competencies (Akaaboune et al., 2020; Huber et al., 2017; Kern, 2002), learning experience (Diaz, 2016), motivation to study accounting (Sugahara & Dellaportas, 2018), academic outcomes (e.g., Huber et al., 2017; Marshall & Bolt-Lee, 2022), and relatedness to peers (Krause, 1988; Gainor et al., 2014). The challenge addressed in this study was to devise a research instrument to validate a pedagogical method that combines practice testing and self-generated questions in a blended learning environment. For this purpose, we searched scientific databases for articles that analyze the impact of pedagogical methods on student performance in management accounting.
The most relevant articles are summarized in Table 1. We did not include articles dedicated to case studies (which are numerous in the management accounting domain) because we considered that their purpose (to teach a certain topic) was not comparable with the purpose of the present study (to present a pedagogical method to be applied in any of the course topics). Most of the cited contributions present the pedagogical methods applied in class (Frick et al., 2020; Krom, 2012; Lancaster & Strand, 2001; Peek et al., 1995; Peters & Chiu, 2022). Fogarty and Goldwater (2024) introduced practice testing as a component of their assessment method, but their focus was on testing the difficulty of developed items. Potter and Johnston (2006) described an online learning system that included a practice-testing component. To the best of our knowledge, only Frick et al. (2020) used this pedagogical method in two management accounting disciplines, but their method involves flipped classroom and quizzes during lecture. As such, there are no other previous contributions that have analyzed the effect of practice testing combined with student-generated questions in accounting. This is the research gap addressed in the present study.
From the multitude of pedagogical techniques (Bonwell & Eison, 1991; Freeman et al., 2014; Opre et al., 2024; Prince, 2004), we present the two approaches used in this investigation: practice testing and student-generated questions.
We apply the term practice testing in line with Dunlosky et al. (2013, p. 29), meaning low-stakes or no-stakes testing activities taking place outside the class, in which students can get involved on their own. Practice testing may include multiple-choice, short-answer, and fill-in-the-blanks questions (Dunlosky et al., 2013; Fogarty & Goldwater, 2024; Potter & Johnston, 2006). Practice-oriented tests and quizzes allow students to actively update what they have learned rather than passively review materials (Opre et al., 2024). They are effective for the “long-term retention of coherent learning content” (Ebersbach et al., 2020, p. 724) compared to restudying, note taking, concept mapping, and imagery use (Dunlosky et al., 2013). Research shows that more frequent and spaced practice testing, with feedback, especially when brief tests are taken shortly after the end of the lecture (Bonwell & Eison, 1991), has better results than one quiz (Dunlosky et al., 2013; Rawson & Dunlosky, 2011). Practice testing reactivates related information (Carpenter, 2011) and improves its mental organization and idiosyncratic processing (Hunt, 1995). Practice testing supports deep learning (Opre et al., 2024), helping students to consolidate the information transmitted by the teacher and prepare for class (Cheng & Ding, 2021) and summative assessments.
The long-term retention of information is particularly important when the learning content of a discipline is based on previously studied content (as is the case with different accounting disciplines). Testing in the form of online formative assessment in accounting increases student performance and confidence in their knowledge (Blondeel et al., 2022) and decreases test anxiety (Blondeel et al., 2023a). Quizzes are easy for accounting professors to implement if sufficiently capable information systems are available (Fogarty & Goldwater, 2024). Examples of systems used for practice testing in accounting are Duga, MarinaLSTM, and platforms adapted from Moodle (Blondeel et al., 2023a; Mardini & Mah’d, 2022; Potter & Johnston, 2006). Frick et al. (2020) conducted their study on enhancing student engagement in management accounting in two cycles; in the second cycle, they introduced a testing component before class, which had a positive effect on student engagement. Potter and Johnston (2006) used MarinaLSTM for practice testing in intermediate management accounting. Their results showed that the system improved the students’ final exam scores, an effect the authors partly attributed to the system structuring students’ work outside class. Previous research (Frick et al., 2020; Fogarty & Goldwater, 2024; Potter & Johnston, 2006) identified the setting of the study and the reduced generalizability of the results as limitations, along with the indicator used to measure the learning outcome. Additionally, all students had access to all resources, which means that there was no control group.
Student-generated questions are rarely used as a pedagogical method (Aflalo, 2021), although they have been found to have positive effects on student comprehension, recall, and problem solving (Ebersbach et al., 2020; Song, 2016). Student-generated questions can be implemented with or without prior training of the students, and with or without collecting solutions to the generated questions. To write meaningful examples, students must know what was discussed during lectures and must study, select, analyze, and combine information (Opre et al., 2024). This way, their cognitive abilities improve, information is understood and retained (Rosenshine et al., 1996), and a database of examples is created that students can use to learn more (Aflalo, 2021). Better information retention is found when students receive training on how to generate questions (Ebersbach et al., 2020). When they are also required to generate solutions to their exercises, their understanding of the concepts is enhanced (Opre et al., 2024). Improved learning outcomes were recorded for students who generated questions compared to students taking notes (King, 1992; Berry & Chew, 2008) or summarizing the lecture notes (King, 1992). According to Chu and Libby (2010), after engaging in question creation, accounting students can refine their writing skills and creativity, increasing the complexity of their applications. Through writing their own examples and explaining them, students gain accounting knowledge and are more likely to retain it. The results obtained by Chu and Libby (2010) are limited to the duration of a one-term lecture.
Digital pedagogy relies on two aspects: digital technologies (i.e., devices) and digital platforms (i.e., digital learning tools). One type of digital pedagogy is blended learning, defined as an educational approach that combines face-to-face classes with online learning material (Blondeel et al., 2023a; Imran et al., 2023). Blended learning is a key trend in higher education (Adams Becker et al., 2017), responding to the demands of modern teaching strategies in accounting through its flexibility in terms of time and space (Grabinski et al., 2015; Hu et al., 2023; Krasodomska & Godawska, 2021). Its potential was recognized by accounting academics during the COVID-19 crisis (Sangster et al., 2020). The use of blended learning leads to higher grades for students (Abraham, 2007), improves students’ analytical skills (Chen & Jones, 2007) and the retention of information (Harker & Koutsantoni, 2005), and promotes student involvement (Pinto-Llorente et al., 2017). The challenges related to digital pedagogy include costs, financial resources, the delivery of electronic materials and technical content, the digital competencies of students and teachers, student focus and ethical behavior, reduced networking, the availability of equipment, connectivity issues, and the diverse cultural, linguistic, and academic backgrounds of the students (Henadirage & Gunarathne, 2023; Tettamanzi et al., 2023).
Online platforms used in accounting disciplines automate a part of the tasks performed by instructors, such as grading (Blondeel et al., 2023a). Accounting students consider that the platforms offer flexibility, since homework or quizzes can be uploaded or solved in or outside the class, possibly in a given time range (Mardini & Mah’d, 2022). Students can learn at their own pace, engaging deeper with course content (Hu et al., 2023). Online platforms are used to store student responses, grades, teaching materials, and announcements and to access various statistics on the evolution of the student’s learning process. All these features help teachers structure the lectures and connect with their students, as required by the SDT (Ryan & Deci, 2020).
Digital technologies became more prevalent in accounting education in Italy after the pandemic (Tettamanzi et al., 2023). While acknowledging the potential of IT for business, accounting professors in Italy expressed reservations about exclusively digital teaching. Blended learning in accounting can enhance the student experience when enough time is given for planning, preparation, and improvement (Kelly et al., 2023). Moreover, the sudden switch from face-to-face to online learning caused by the pandemic decreased the impact of student engagement on student marks for business students (Azzali et al., 2022).

3. Description of Methods and Hypothesis Development

3.1. The Context

This study was set in the largest university of economic studies in a CEE country. It functions in an open system (Blondeel et al., 2023a) and applies the Bologna system, according to which the bachelor’s degree includes six semesters. The academic year starts on October 1st and is split into two semesters. Each of them includes 14 weeks of classes and 3 weeks of summative evaluations. For most of the disciplines, students attend a lecture and a tutorial every week, each lasting 80 min. The professors have a certain degree of independence but must follow the syllabus (which includes the names of the chapters, with brief instructions regarding the content, and the percentage of the tutorial score in the final mark). Professors create their own examples and select the theoretical framework to be presented. Also, for the exam, the structure is required to be the same for all cohorts, with each professor allowed to create their own questions.
The COVID-19 pandemic forced a sudden switch in education from traditional, face-to-face learning to online learning. This was acknowledged by researchers as a significant change and explored in the academic accounting literature (Blondeel et al., 2022; Sangster et al., 2020; Tartavulea et al., 2020). However, little has been discussed about the opposite transition: students returning to traditional classes. Starting in spring 2022, most university courses around the world returned to hybrid or fully face-to-face lectures and tutorials. The cohort of students described in this study started university in October 2020 and had online classes until February 2022. The first 4 weeks of the first semester included in this study were online, and the remaining 10 weeks were face-to-face. Under these circumstances, it was extremely important that professors help students become comfortable in the physical academic environment and keep up with classes.

3.2. Presentation of the Pedagogical Method

Data used in this study were collected for two related compulsory disciplines: management accounting (MA), which is studied in the second semester of the second year of study, and performance measurement and control (PMC), which is studied in the first semester of the third year of study. The MA and PMC syllabuses are adapted from the ACCA, as the faculty is accredited by this professional body. The main topics included in the syllabus are presented in Appendix A. As illustrated, the two disciplines are connected to each other, representing a common circuit. The second discipline builds on the concepts discussed in the first one, and it is important for the students to be able to maintain the results until the last exam. The sample of students for both disciplines is presented in Table 2.
The data described in Table 3 were obtained for the entire cohort that attended the lectures, allowing the creation of a homogeneous database. As a limitation, the sample size of 107 students is small for generalizing the results beyond the studied population. As in any study, the sample size is a trade-off between reaching statistical power and maintaining the rigorous nature of data collection. This view is supported by Hu et al. (2023, p. 662), who stated that “the use of balanced panel data reduces the noise introduced by participant heterogeneity.” In our case, the pedagogical method was applied to a single cohort of students with the same professor at the lecture, who maintained a rigorous application of the evaluation methods for both disciplines. The academic context did not allow the research to be extended to other cohorts, other disciplines, or other time periods.
The pedagogical method described below was created by the professor in charge of the lectures for the entire cohort and was applied as part of the lecture, not the tutorial. The professor who gave the lectures is one of the authors of the current paper and had more than 20 years of teaching experience at the time this study was conducted. The professor has a master’s degree in finance and performance measurement and a Ph.D. in performance measurement. Most of her teaching activities, conducted at the university or for the Body of Expert and Licensed Accountants of Romania (CECCAR), are related to the MA and PMC disciplines. She published several textbooks in Romanian and English on MA and PMC and has more than 40 publications indexed in the Web of Science.
The pedagogical method was devised as follows. The cohort had two types of assignments established at the beginning of the semester for the MA discipline, with the purpose of stimulating the consistent learning of the students. The assignments consisted of solving a quiz and generating two questions per week. These requirements were separate, and each carried a specific reward (a bonus, described below) that was added to the exam score. The tasks were the same for MA and PMC. The main goal was to encourage students to self-study (Curtis, 2011) and help them acquire the necessary skills to prepare for the exam. The students were not accustomed to this learning strategy and encountered it for the first time.
As recommended by SDT, the lecture was structured in the following manner: All materials and the grading rules were posted at the beginning of the semester on the university’s online platform (adapted from Moodle). The quizzes were available weekly on the platform, immediately after the end of the lecture, to be solved at home, in line with Kibble (2007). The homework had to be uploaded on the same online platform. To meet the autonomy needs, the students could take the quiz and upload homework at their own pace during the week after the lecture. Screenshots with the main functionalities of the platform for quizzes and homework are included in the Supplementary Material.
An advantage of the open-book test is that students can “correct their memory stores” (Ebersbach et al., 2020). There were 11 quizzes on MA, lasting from 8 to 40 minutes (almost 18 minutes on average), and 14 quizzes on PMC, lasting 15 minutes each. The quizzes included four to nine questions (randomly extracted by the platform from a large question bank within a given set of categories, such as multiple-choice questions for the ABC method) based on the aspects discussed in the most recent lecture. The number of attempts for a student was limited to one per quiz; the questions were sequential (the students could not go back to a previous question) and shuffled. The quiz was available using a computer or mobile phone. An example of a quiz for MA and an example for PMC are included in the Supplementary Material. The scoring was automated and disclosed to students when the test expired. Feedback consisted of scores obtained on the platform and a file with detailed solutions and explanations. The most difficult questions were discussed during the next lecture.
The second part of the assignment was weekly homework consisting of two short exercises (multiple-choice, short-answer, or true/false questions) on aspects discussed during the lecture and for a specific domain. Students decided what to include in the created examples. The purpose was to help students understand the aspects discussed each week. Supporting their psychological need for autonomy, students could choose any activity sector they liked for their applications at the beginning of the MA discipline and take into consideration the specific aspects of the manufacturing process when writing the examples during the two semesters. They chose domains such as the production of clothing, juice, bread, furniture, chocolate, ice cream, perfume, or coffee. At the beginning of the semester, the professor presented examples of different types of questions. Sometimes, students were given an additional example to solve. The homework was uploaded to the platform in a Word file that included all the questions generated during the semester. Examples that were plagiarized or did not observe the topic and recommendations were not graded (they received the default score of 1). A score of 1 to 10 was assigned to each homework uploaded. The maximum number of weeks with homework assignments was 12 for both MA and PMC. An example of homework for MA and an example for PMC are included in the Supplementary Material.
The bonus was calculated each week as an average of the scores obtained by the student on the quiz and the homework. The maximum bonus per week was 0.1 points (representing 1% of the final score for the discipline, so it was symbolic), and the maximum bonus per semester was capped at 1 point (since the theoretical total across all weeks exceeded 1, skipping an occasional activity did not preclude obtaining the maximum bonus). If a student obtained, for example, 6.4 on the quiz and 8 for the homework, the bonus for that week was (6.4 + 8)/200 = 0.072. The low weekly value of the bonus is a very important aspect of the method, because students had to work for at least ten weeks to have a chance of obtaining the maximum possible bonus. The bonus was calculated for 12 weeks for MA and 13 weeks for PMC. Furthermore, students received 0.1 points for completing a questionnaire at the end of the semester.
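As an illustration, the weekly bonus arithmetic can be expressed in a few lines of R (a minimal sketch; the function names are ours, and the actual scoring was performed on the university’s platform):

```r
# Weekly bonus: average of the quiz and homework scores (each on a 1-10 scale),
# rescaled so that the weekly maximum is 0.1 points (1% of the final score)
weekly_bonus <- function(quiz_score, homework_score) {
  (quiz_score + homework_score) / 200
}

weekly_bonus(6.4, 8)  # 0.072, the worked example from the text

# Semester bonus: weekly bonuses summed and capped at 1 point
semester_bonus <- function(weekly_bonuses) {
  min(sum(weekly_bonuses), 1)
}
```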
The purpose of this study was to present and validate the results of this strategy. In this regard, a quantitative approach was used. The focal variable used to operationalize student engagement with this learning strategy is consistent learning. We define consistent learning per student per semester as in Equation (1), considering the following factors:
Consistent learning (CL) = Homework index + Quiz index + Log index   (1)
We computed the homework index as the number of homework assignments submitted by a student per semester divided by the total number required per semester. The quiz index was calculated as the number of quizzes solved divided by the total number of quizzes available per semester. The log index was calculated using the following algorithm. The sum of logs per week was extracted from the platform. Examples of logs are logging in or out, accessing a course, viewing an activity within a course, and completing an activity. This represents an objective indicator of student resource use, previously used in accounting education studies (Krasodomska & Godawska, 2021). Each week, the students were divided into four quartiles based on their number of logs: two quartiles above the median and two below. The quartile of a student may differ from week to week. The points were granted weekly as follows:
  • 0 points—no activity;
  • 1 point—the number of logs was in the lowest quartile;
  • 2 points—the number of logs was in the second quartile;
  • 3 points—the number of logs was in the third quartile;
  • 4 points—the number of logs was in the highest quartile (highest number of logs).
The points were summed over the entire semester, separately for MA and PMC, yielding the students’ log points. The student log index per semester was calculated as the student’s log points divided by the maximum value observed in that semester. As can be seen, consistent learning (CL) quantifies the sustained effort of the students and their study habits (Blondeel et al., 2023a), not the actual scores on quizzes and homework.
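A minimal R sketch of this computation, assuming hypothetical per-student vectors and a students × weeks matrix of platform log counts (all object names are ours; the weekly quartile split is approximated by binning the rank range):

```r
# Equation (1): consistent learning = homework index + quiz index + log index

homework_index <- homework_sent / n_homework   # submissions / required per semester
quiz_index     <- quizzes_done / n_quizzes     # quizzes solved / quizzes available

# Weekly points: 0 = no activity; 1-4 = quartile of that week's log counts
quartile_points <- function(logs_week) {
  pts <- rep(0, length(logs_week))
  active <- logs_week > 0
  pts[active] <- cut(rank(logs_week[active], ties.method = "average"),
                     breaks = 4, labels = FALSE)
  pts
}

log_points <- rowSums(apply(weekly_logs, 2, quartile_points))
log_index  <- log_points / max(log_points)     # scaled by the semester maximum

consistent_learning <- homework_index + quiz_index + log_index
```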
Based on the previous explanations, our first hypotheses are as follows:
H1: Learning outcomes (quiz average, bonus, and tutorial and exam scores) are positively correlated with consistent learning.
H2: Consistent learning in the MA discipline is positively correlated with consistent learning in the PMC discipline.
Pearson correlations were calculated between different types of learning outcomes, for the entire sample and for each semester (H1–H2).
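For instance, with a hypothetical data frame df holding the per-student variables (the _CL, _TS, and _EX suffixes follow the naming convention used in the paper’s tables), these tests reduce to calls such as:

```r
# H1: consistent learning vs. learning outcomes, per semester
cor.test(df$MA_CL,  df$MA_EX)    # CL vs. exam score, MA semester
cor.test(df$PMC_CL, df$PMC_TS)   # CL vs. tutorial score, PMC semester

# H2: consistency of CL across the two disciplines
cor.test(df$MA_CL, df$PMC_CL)
```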
One of the strengths of the approach described in this paper is the fact that the results of the learning strategy could be analyzed over a longer period, rather than a semester. This study covers two management accounting disciplines in a matched sample. Therefore, we were able to see the impact of the method on learning outcomes at the end of the second discipline, which builds on the knowledge accumulated by the students for an entire year of study and for a domain in which specific skills are necessary.
For this purpose, the following hypothesis refers to the improvement in learning outcomes over two semesters of studying management accounting.
H3: Maintaining the learning strategy for two consecutive semesters in related disciplines significantly improves the students’ learning outcomes in the second discipline.
To compare learning outcomes in two consecutive semesters, paired t-tests were applied to the results of student participation, average quiz scores, and tutorial and exam scores for the entire group (H3).
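A sketch of these paired comparisons on the matched sample (df and the MA_QS/MA_TS column names are our assumptions, extending the paper’s suffix convention):

```r
# H3: paired t-tests across the two semesters, same 107 students in both
t.test(df$PMC_QS, df$MA_QS, paired = TRUE)  # average quiz scores
t.test(df$PMC_TS, df$MA_TS, paired = TRUE)  # tutorial scores
t.test(df$PMC_EX, df$MA_EX, paired = TRUE)  # exam scores
```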

3.3. Learning Outcomes for the Two Disciplines

Learning outcomes are the results obtained by students by completing tutorials and exams. In MA, the score obtained in the tutorial represents 30% of the final score (the rest being the exam score). In PMC, the tutorial score is 40% of the final score. For the sample groups, all tutorials were taught by experienced academics (associate or tenured professors). The cohort was split into five groups for the tutorials. In this study, only the overall tutorial score was used.
Both the MA and PMC exams were written on campus with proctoring. A student had to obtain a final score of at least 5 (on a 10-point scale) in a discipline to pass. The final score was the sum of the weighted tutorial score and the weighted exam score, including the bonus.
Based on this information, the next hypothesis is as follows:
H4: Learning outcomes are positively correlated for the two disciplines.
Between semesters, Pearson correlations were calculated on different types of learning outcomes for the entire sample (H4).

3.4. Voluntary Adoption of the Learning Strategy

In line with SDT, the activities described in the present research for the bonus points were voluntary, and the students had autonomy in choosing whether to adopt the learning strategy. They could choose to do nothing, do everything the teacher suggested, or do only some of the activities. Yet, some rules had to be followed (e.g., there were deadlines for each activity, and everything had to be submitted in the required format). No penalty was associated with not adopting this learning method. During the two semesters, the professor noticed that a considerable part of the sample voluntarily adopted the suggested learning strategy. At the opposite end of the spectrum, the matched sample included four students in MA and six students in PMC who received a null bonus at the end of the semester. Thus, the next hypothesis checks whether the learning outcomes of the involved students were better than those of their less involved peers.
H5: The voluntary adoption of the learning strategy leads to a significant improvement in the student learning outcomes in both disciplines.
Since this study used a quasi-experimental design, there was no real control group. We did not have a control group because university regulations do not allow the creation of experimental groups for teaching purposes, the requirement being that all students always receive the same learning conditions. This choice was in line with the ethical principle of offering equal educational opportunities to all students. Also, by choosing a matched sample, we eliminated any possible bias associated with the lectures or the teacher (e.g., changes in the syllabus or lecture requirements). Consequently, the students were not split into a treatment group and a control group; instead, the sample was split post hoc, through clustering. A clustering procedure was used to distinguish between the voluntary involvement group and the voluntary non-involvement group (which could serve as a control group). The k-medoids (PAM) method implemented in R was selected to perform the clustering analysis. The clustering variables were the components of the consistent learning (CL) score, as described in Equation (1). The solution with two clusters was deemed appropriate for both semesters, and the results were consistent between the two periods. Independent t-tests (with unequal variances) were applied to learning outcomes between the voluntary involvement group and the non-involvement group (H5).
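A minimal sketch of this step in R, assuming a data frame cl_components with one row per student and the three CL components as columns, and a hypothetical exam_score vector (the cluster package implements PAM):

```r
library(cluster)

# Clustering variables: the three components of the consistent learning score
pam_fit <- pam(scale(cl_components), k = 2)   # k-medoids (PAM) with k = 2

table(pam_fit$clustering)           # cluster sizes: involved vs. less involved
pam_fit$silinfo$clus.avg.widths     # average silhouette width per cluster

# Welch t-test (unequal variances) on an outcome between the two clusters (H5)
t.test(exam_score ~ factor(pam_fit$clustering), var.equal = FALSE)
```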

3.5. Student Perception of the Learning Strategy in Management Accounting Disciplines

One of the objectives of this research was to improve the experience of students in future cohorts. Therefore, to capture the students’ opinion of the MA and PMC learning strategy, a questionnaire was distributed containing four items on a five-point scale ranging from “totally disagree” (−2) to “totally agree” (+2). The items referred to the student experience, namely the importance of quizzes and of question generation. Some of the items were adapted from Pintrich et al. (1991) and Byrne and Flood (2005). The hypothesis addressed using this instrument is as follows:
H6: Consistent learning is correlated with the importance attached to quizzes and homework, and with the students’ choice to solve additional tasks.
An exploratory factor analysis was applied to this questionnaire. Items that decreased the Cronbach’s alpha of their respective factor were eliminated. The four items of the ad hoc instrument loaded on a single factor with an alpha of 0.85. Factor scores were generated using linear regression. A correlation matrix between learning outcomes and the students’ perception of the two disciplines was used to test hypothesis H6.
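A sketch of this step using the psych package (items is a hypothetical data frame holding the four questionnaire responses coded from −2 to +2):

```r
library(psych)

alpha(items)   # internal consistency of the four-item instrument (reported: 0.85)

# Single-factor exploratory factor analysis with regression-based factor scores
efa <- fa(items, nfactors = 1, scores = "regression")
efa$loadings
attitude <- efa$scores   # factor scores entering the correlation matrix (H6)
```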

3.6. Predictors of Exam Scores

Accounting is a technical, content-rich discipline (Blondeel et al., 2023b) that requires significant time investment from practitioners (Mihai et al., 2020). The best way to become a good professional is to constantly learn and develop the appropriate competencies for the labor market (Albu et al., 2016). We had no pre-tests or post-tests in this research, but the result of the MA exam was considered the pre-test for PMC. Thus, our final hypothesis is as follows:
H7: Consistent learning is a direct predictor of exam scores in the second discipline.
We used forward stepwise regression to construct a model predicting exam scores in the PMC discipline. All types of learning outcomes were included in the pool of factors, as well as student attitudes toward the proposed learning strategy. Because several factors were very highly correlated, several specifications of the model were tested in the stepwise analysis. Variance inflation factors (VIFs) are reported in the results to show that the retained predictors were not significantly correlated with each other. All statistical procedures were performed in the software environment R (version 4.2.2).
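A sketch of the model-building step, assuming a data frame preds with candidate predictors named as in the paper’s tables (the exact candidate pool shown here is illustrative, not the full pool used in the study):

```r
library(car)   # for vif()

null_model <- lm(PMC_EX ~ 1, data = preds)
full_model <- lm(PMC_EX ~ MA_EX + PMC_QS + PMC_TS + PMC_QH, data = preds)

# Forward stepwise selection from the null model toward the full model
step_fit <- step(null_model, direction = "forward",
                 scope = formula(full_model))

summary(step_fit)   # e.g., MA_EX and PMC_QS retained in the first specification
vif(step_fit)       # variance inflation factors for the retained predictors
```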

4. Results

We enumerate the research hypotheses as follows:
H1: Learning outcomes (quiz average, bonus, and tutorial and exam scores) are positively correlated with consistent learning.
H2: Consistent learning in the MA discipline is positively correlated with consistent learning in the PMC discipline.
H3: Maintaining the learning strategy for two consecutive semesters in related disciplines significantly improves the students’ learning outcomes in the second discipline.
H4: Learning outcomes are positively correlated for the two disciplines.
H5: The voluntary adoption of the learning strategy leads to a significant improvement in the student learning outcomes in both disciplines.
H6: Consistent learning is correlated with the importance attached to quizzes and homework, and with the students’ choice to solve additional tasks.
H7: Consistent learning is a direct predictor of exam scores in the second discipline.
This section presents the tests of hypotheses H1–H7. Descriptive statistics are presented in Table 4. The consistent learning variable (_CL) and the academic performance scores are included in this descriptive summary per discipline. Average quiz scores are calculated using zero as the score for missed quizzes. The tutorial scores (_TS) are calculated at the end of the semester by the tutors assigned to each group of students. The bonus is calculated by the professor who delivers the lecture, based on the algorithm described in the methodology section. The exam scores (_EX) do not include the bonus from the activities during the semester.
The MA discipline started with a quiz participation rate of 66% in the second week, followed by a steep decline in the second half of the semester. The participation rate for the second half of the MA semester reached a minimum of 29% in the 7th week and a maximum of 36% in the 12th week. For the PMC discipline, the quiz participation rate was relatively steady in the first half of the semester and followed a U-shaped curve in the second half of the semester, eventually reaching the highest participation rate of 66% in the 13th week. The mean quiz participation rate for the PMC discipline (50%) was higher than the mean rate for MA (42%), but the difference was not statistically significant in a nonparametric paired Wilcoxon test for the corresponding weeks (p-value = 0.126). The mean quiz score for the PMC discipline for the entire sample (M = 0.496, SD = 0.217, on 754 observations) was significantly higher than the average MA quiz score for the entire sample (M = 0.447, SD = 0.214, on 506 observations), as observed in a nonparametric Wilcoxon test for two independent groups (V = 169,546, p < 0.001, effect size = 0.095, small effect).
The results regarding the homework scores should be considered in relation to the rate of homework submission. The MA discipline started with a submission rate of 73% in the first week, followed by a steep decline in the second half of the semester. The minimum submission rate for the second half of the MA semester was 17%, in the eighth week. The final homework of the MA semester had a submission rate of 26%. For the PMC discipline, the submission rate was steady, at no more than 46%; the minimum, 19%, was reached in the ninth week. The final homework of the PMC semester had a submission rate of 31%. The mean submission rate for homework was lower for the PMC discipline than for the MA discipline, but the difference was not statistically significant in a paired nonparametric Wilcoxon test for the corresponding weeks (p-value = 0.683). The mean homework score in the PMC discipline (M = 0.767, SD = 0.31, on 456 observations) was lower than the average homework score in MA (M = 0.778, SD = 0.28, on 494 observations), but the difference was not statistically significant in a nonparametric Wilcoxon test for two independent groups (V = 109,986, p = 0.499). The PMC discipline is considered more difficult and builds on the knowledge accumulated in the MA discipline. Therefore, the fact that the average homework scores for MA and PMC are almost the same is a favorable outcome.
Consistent learning is a proxy measure of the student’s effort to keep up with all activities introduced during the semester and participate in the proposed learning strategy. The correlation coefficients presented in Table 5 show that this indicator carries almost the same information as the bonus awarded by the professor at the end of the semester (the respective correlations exceed 0.96). Therefore, the bonus (_BN) is a measure of consistent learning (_CL) and a useful evaluation instrument for the professor. Consistent learning is also hypothesized (H1) to be a predictor of learning outcomes for each semester. This assumption proved to be true. Consistent learning is a strong predictor of tutorial scores in both disciplines (correlations greater than 0.70). The association between consistent learning and exam scores for the MA discipline is greater than 0.50, which is considered a large effect, while the correlation between the same variables for the PMC discipline is 0.44, a medium effect (Hemphill, 2003). Considering the statistical significance and magnitude of these results, hypothesis H1 is validated. Our results show that the adopted pedagogical method is coherent, leading to consistent results.
This study used a quasi-experimental design, with a within-subjects (repeated-measures) comparison on a single group, to check the time effects of the treatment (i.e., the pedagogical method). The design evaluated the application of the learning method during each semester and at the end of each semester (post-tests at two different moments in time). Table 6 shows that this learning strategy slightly improved student participation, but not enough to record a significant difference. Consistent learning in the MA discipline is very strongly correlated with consistent learning in the PMC discipline (as indicated in Table 5), suggesting that this is an individual and stable response from each student. Therefore, H2 is validated.
Maintaining this learning strategy for two consecutive semesters was expected to significantly improve evaluation scores. The average quiz scores improved significantly during the PMC semester compared to the previous discipline. As seen in Table 6, tutorial scores were significantly lower, on average, during the second semester (PMC). Exam scores were also lower in the second semester, but not significantly. This situation can be attributed to the lower participation of students in the tutorial, because many of them had become employed. Another factor that contributes to this situation, and cannot be statistically isolated, is the difference in tutorial requirements and tutors between MA and PMC. Also, PMC is more difficult than MA. On the other hand, the learning strategy was carried out on the online platform for the lecture, so the consistent learning indicator was not affected by students’ working hours. Regardless of the reason for the decline in the evaluation scores, hypothesis H3 is not validated.
The learning outcomes are positively and strongly correlated between the two disciplines (Table 5). For example, the correlation between the scores in the final exam for the two disciplines has a coefficient of 0.701, which is a large effect. These results indicate two aspects: (a) the academic involvement of students is constant from one discipline to another (H2) and (b) the evaluation method for the exam is reliable and predictable. Therefore, hypothesis H4 is confirmed.
The results indicate that the student sample can be divided into two groups: students who voluntarily accepted the learning strategy (group A) and those who chose a lower level of participation (group B). Clustering variables were the components of the consistent learning (CL) score. A clustering solution of k = 2 was found to be appropriate for both disciplines using the silhouette test. For the MA discipline, group A had 34 members (average silhouette width = 0.64), and group B had 73 members (average silhouette width = 0.60). For the PMC discipline, group A had 46 members (average silhouette width = 0.59), and group B had 61 members (average silhouette width = 0.64). The groups were found to be stable: 32 students in group A and 59 students in group B remained the same in both semesters. Group A lost 2 students and gained 14 new members when it changed from the MA to PMC discipline.
Statistical analysis was conducted over the two groups determined by clustering for the two disciplines. Hypothesis H5 posits that the learning strategy can improve the learning outcomes of students, in both semesters. The results are presented in Table 7. Regardless of the type of learning outcome considered, the group that chose to apply the learning strategy had significantly better results at the end of the semester than the group that was less involved during the semester. These results are important because this significant improvement was observed in a larger cluster in the PMC discipline than in the initial cluster in the MA discipline. Therefore, hypothesis H5 is confirmed.
To capture the opinions of students about the learning process in the MA and PMC disciplines, we administered an ad hoc instrument of four items on a five-point scale (“totally disagree” to “totally agree”). The items were submitted to exploratory factor analysis. The instrument is presented in Table 8. The factor was labeled as “attitude towards quizzes and homework (PMC_QH)”.
The results presented in Table 9 indicate that the students’ opinions regarding the usefulness of quizzes and homework are significantly correlated with consistent learning, quiz scores, and tutorial scores for the PMC discipline (medium effect). Therefore, H6 is validated.
For the final statistical procedure, we chose to construct a regression model that would predict exam scores in the PMC discipline. Based on the variables included in Table 9, the method for model building was forward stepwise regression, performed in R. Considering that the average quiz scores (PMC_QS) and consistent learning (PMC_CL) are strongly correlated (0.940), the factor with the highest correlation with PMC_EX was chosen for the first model specification. The results of the stepwise regression are presented in Table 10, panel A. The model has a satisfactory R-squared value of 0.54 on two factors: exam scores in the MA discipline (MA_EX) and the average quiz scores during the PMC semester (PMC_QS). Both factors are positive and significant predictors of exam performance in the PMC discipline. In the second specification (Table 10, panel B), exam scores in the MA discipline (MA_EX) and the tutorial score at the end of the PMC semester (PMC_TS) were the significant predictors of final exam scores in the PMC discipline. It is apparent that consistent learning (PMC_CL) is not a direct predictor of exam scores, but it is a significant predictor of other learning outcomes (such as average quiz scores). In this sense, hypothesis H7 is not validated.
The implications of these results are discussed in the following section. A summary of the tested hypotheses is included in Table 11.

5. Discussion and Conclusions

The research question we explore in this article is the following: How are the student learning process and outcomes (in the field of management accounting) influenced by the application of a pedagogical method consisting of practice testing and student-generated questions in a blended learning environment? The contribution of this study is the statistical validation of the method. The results show that the student learning outcomes are positively correlated with consistent learning for the two disciplines—MA and PMC (H1). The students who adopted the learning strategy considered practice testing and question generation highly relevant (H6) and maintained a consistent learning approach for the second discipline (H2). However, consistent learning was not found to be a direct predictor of exam scores (H7). The application of the strategy during the two consecutive semesters did not significantly improve student learning outcomes in the second discipline, so the exam scores remained at the same level, on average.
The academic results of the students are explained by self-determination theory (SDT) in the present study. This study was based on data collected for two disciplines during two semesters (out of the six semesters required to complete the bachelor’s degree at the university). As such, this research provides an integrated picture of the student learning process. Previous studies are mostly based on introductory accounting (Blondeel et al., 2022; Kern, 2002), but the disciplines in this study are considered relatively advanced.
The present study provides a pedagogical and an empirical contribution. The pedagogical method consists of two elements (practice testing and student-generated questions), influencing more than one type of learning outcome (Prince, 2004). The teaching method was thoroughly planned for the period of two consecutive semesters. Practice testing and question generation were performed weekly, compared to quizzes or homework at irregular intervals (Diaz, 2016; Gainor et al., 2014; Marshall & Bolt-Lee, 2022). Empirically, we contribute to the literature on accounting education with a study on two related disciplines with differing levels of difficulty.
The purpose was to test the effects of applying this pedagogical method in two consecutive semesters, on a matched sample. An indicator was developed to measure the sustained effort of the students. Consistent learning in accounting is recommended because the topics in each discipline are related to each other. For example, if students do not understand cost calculation methods, they will not be able to make a managerial decision. The method was statistically validated, demonstrating its coherence and consistency. Although some students used the quizzes inappropriately (i.e., submitted their answers in less than five minutes), the overall effect of solving the quizzes was positive, confirming the results of Kibble (2007).
Although many researchers have highlighted low participation rates in online formative assessments (Kibble, 2011), we obtained an average of 42% in MA and 50% in PMC. This may be explained by the fact that during the pandemic, students studied online and became accustomed to the digital platform, which made it easy for them to continue using it for learning. Additionally, studies have shown a decrease in participation rates throughout the semester (McNulty et al., 2015). Research is also needed to understand how student interest can be leveraged and maintained. Positive implications point to a decrease in dropout rates and an increase in university financing. For the present sample, a key finding is that consistent learning is correlated with the importance placed on practice testing and student-generated questions (measured through the ad hoc questionnaire proposed in this paper) and with the free choice of students to solve additional tasks.
Academic scores are a key component of learning, with an impact on the career path of students (Flom et al., 2021; Gikandi et al., 2011). A significant finding of this study is that learning outcomes are positively correlated in the two disciplines, which means that self-determined students, who obtained good scores in the first discipline, continue to learn in the same way. Furthermore, the voluntary adoption of this learning strategy leads to a significant improvement in learning outcomes in both disciplines, showing the efficacy of the method and confirming the results of previous research (Berry & Chew, 2008; Dunlosky et al., 2013; King, 1992). Maintaining the learning strategy for two consecutive semesters significantly improves the average quiz scores of the students in the second discipline (H3). As such, our results show that solving quizzes is more effective for this sample than creating questions. An explanation may be related to the difficulty encountered by students when generating examples, as they are not accustomed to this type of task. However, “presenting data and written work” represents one of the competencies that a professional accountant must develop throughout the qualification (IFAC, 2021). The findings have implications for teachers, who should focus on developing student writing skills, but also for university management, which should create incentives and mobilize additional resources to develop this skill.
Our research was designed in line with SDT, focusing on the autonomous motivation of students and on improving their competence and relatedness. Addressing these three psychological needs is especially important at the end of the pandemic, as it fosters student wellness (Ryan & Deci, 2020). According to SDT, after reading the homework, the professor could “understand, acknowledge and, where possible, be responsive to students’ perspectives” (Ryan & Deci, 2020). The homework task addressed all three psychological needs identified by SDT. First, the students created examples autonomously. Second, the students received feedback (the scores were periodically made available on the platform) and could improve their performance during the semester; some of the problems formulated by the students were used by the professor to create future quizzes, addressing the need for competence, as suggested by SDT. Third, the professor came to know the students’ capabilities through their written work and could relate to their ideas, addressing the need for relatedness.
In the present investigation, students were encouraged to ask questions and to understand that they could not do anything “wrong” during the semester; they were aware that a single mistake in a quiz would not affect their academic outcomes. As a result, some students voluntarily switched from the non-involvement group to the involvement group from one discipline to the next. Autonomous motivation leads to student engagement and, consequently, to improved performance, a result that confirms previous findings (Hu et al., 2023; Murayama et al., 2015). Autonomy also supports inclusivity for students in difficult social conditions, such as many young people in Romania. A contribution of our study, and a key takeaway for teachers, is that students will autonomously participate in activities that give them further knowledge. According to psychological theory, autonomy is a desired outcome that must be nurtured in schools and higher education. Therefore, universities and teachers should offer students diverse resources to improve learning.
The results can also be important in supporting students after significant events such as a sabbatical year, a scholarship abroad, maternity leave, or sick leave. Teachers must pay attention to each aspect and should intervene when necessary; however, in the case of a large cohort, catering to individual needs presents a greater challenge. Our study responds to the call from Kibble (2007) for more research on how to encourage student participation. The idea for this study came from the students themselves, who asked for more practical examples in management accounting.
Successful learning methods depend on how teachers support the process, encourage the students, and acknowledge their progress through improved pedagogical methods (Adler & Milne, 1997). The focus should be the student and the learning process rather than the transmission of information (Adler & Milne, 1997). Professors should therefore be able to adapt flexibly, in a “go live” environment, and be prepared to help students stay motivated even during unexpected events.
An ethical strength of our study is the inclusion of a homogeneous sample: all students shared the same conditions and had access to all available educational resources, and the data were collected by the same experienced professor. However, a potential limitation is that cohorts differ from one another, and the same strategy may not work for other students, disciplines, or faculties (Opre et al., 2024). Another limitation is that a method may lose its impact over several years as students adapt to it. Thus, a future research direction is to study the effect of this pedagogical method (or a similar one) over a longer period, to establish how long it can be applied with optimal results or whether a plateau effect appears. A significant challenge in recent years in education in general, and in accounting education in particular, is the rise of artificial intelligence. Therefore, future investigations should consider the influence of artificial intelligence on pedagogical methods.
There are several limitations to the research method applied in this study. First, since the adoption of the learning strategy was voluntary, there could be self-selection bias: the students who chose to participate could differ in certain ways (more motivated, better prepared) from those who did not. This could influence the findings, but these personal characteristics were not measured. Second, this study used a quasi-experimental design without a control group. This choice was determined by the need to comply with the regulations of our university, which require that all students have access to the same learning conditions. Therefore, we resorted to clustering to separate involved learners from non-involved learners, as sketched below. Third, the effectiveness of the learning strategy could depend heavily on the skill and energy of the professor implementing it, which could limit its applicability in different settings. Another limitation is the size of the sample (107 students). We decided to work with a database of this size because, in this way, we could ensure homogeneous conditions for all students. The sample size is comparable to those used in other studies dedicated to education in management accounting (e.g., Downen & Hyde, 2016, with 99 students; Fogarty & Goldwater, 2024, with 108 students; Krom, 2012, with 76 students).
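A minimal Python sketch of this clustering-plus-comparison design follows, on synthetic engagement data. The choice of k-means and the specific input features are assumptions for illustration; the paper names neither the algorithm nor the exact feature set.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic engagement features for 107 students (e.g., proportion of
# quizzes solved, homework submitted, platform activity); the actual
# clustering inputs would be the engagement measures described in Table 3.
X = rng.random((107, 3))
exam = 2 + 6 * X.mean(axis=1) + rng.normal(0, 1, 107)  # synthetic exam scores

# Two-cluster solution separating involved from non-involved learners.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Compare exam scores between the two groups with a Wilcoxon rank sum
# (Mann-Whitney U) test, as in Table 7.
u_stat, p_value = mannwhitneyu(exam[labels == 0], exam[labels == 1])
print(f"Mann-Whitney U = {u_stat:.0f}, p = {p_value:.4f}")
```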
There are many requirements for students and accounting professionals to develop new skills and meet new challenges (IFAC, 2023). The approach presented here is one way to support education in management accounting, helping students participate in a deep learning experience, improve learning outcomes, engage with complex assessment, and develop their writing skills. The intervention described in this study has the potential to help students find motivation and excel in their careers. In a university with an open admission system (such as the one in the present study), the dropout rate is a matter of increasing concern (Blondeel et al., 2023a). Romania is the country with the highest risk of poverty and social exclusion in the European Union, affecting 32.0% of the population in 2023 (Eurostat, 2024). Therefore, the impact of teacher interventions can be even more important in this social and educational context, contributing to SDG 4 “Quality Education”. Learning techniques can create the foundation for a successful professional life.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/educsci15030263/s1.

Author Contributions

Conceptualization, M.D. and V.D.D.; methodology, M.D. and V.D.D.; software, V.D.D.; validation, M.D. and V.D.D.; formal analysis, M.D. and V.D.D.; investigation, M.D.; resources, M.D. and V.D.D.; data curation, M.D. and V.D.D.; writing—original draft preparation, M.D. and V.D.D.; writing—review and editing, M.D. and V.D.D.; visualization, M.D. and V.D.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Regarding the questionnaire at the end of the semester, the formulation was: “Please fill out the attached questionnaire. This activity will last for approximately eight minutes. There will be a bonus score of 0.25 points added to the Classwork score if you fill-in the questionnaire. Your names will be automatically collected, but only for the purpose of awarding the bonus. The results of the questionnaire will only be used for scientific research and teaching purposes”.

Data Availability Statement

The data supporting the reported results are available from the authors upon request by e-mail.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MA: Management Accounting
PMC: Performance Measurement and Control
SDT: Self-Determination Theory
SDG: Sustainable Development Goal

Appendix A

Main topics included in the syllabus for management accounting and performance measurement and control at the host university:
Table A1. A summary of the syllabus contents for the two disciplines in this study.

Management Accounting (MA) | Performance Measurement and Control (PMC)
Management accounting theoretical framework | Advanced cost accounting techniques (e.g., ABC costing, target costing, life cycle costing)
Expenses and types of costs | Sustainable performance measurement
Specific treatment of overheads | Cost–volume–profit analysis
Job order costing | Decision making
Process costing | Standard costing
Standard costing | Budgeting
- | Transfer pricing
- | Performance measurement and reporting (e.g., balanced scorecard)

References

  1. Abraham, A. (2007, December 2–5). Student-centred teaching of accounting to engineering students: Comparing blended learning and traditional approaches. Australian Society for Computers in Learning in Tertiary Education Annual Conference 2007 (pp. 1–9), Singapore. Available online: https://ro.uow.edu.au/cgi/viewcontent.cgi?article=1463&context=commpapers (accessed on 18 November 2024).
  2. Adams Becker, S., Cummins, M., Davis, A., Freeman, A., Hall Giesinger, C., & Ananthanarayanan, V. (2017). NMC HORIZON report (2017 higher education ed.). New Media Consortium. Available online: https://eric.ed.gov/?id=ED582134 (accessed on 7 February 2025).
  3. Adler, R. W., & Milne, M. J. (1997). Commentary. A day of active-learning: An accounting educators’ symposium. Accounting Education, 6(3), 273–280.
  4. Aflalo, E. (2021). Students generating questions as a way of learning. Active Learning in Higher Education, 22(1), 63–75.
  5. Akaaboune, O., Blix, L. H., Daigle, R. J., & Quarles, R. (2020). Data analytics in the financial statement audit: Assessing its active learning effects on student performance. The Accounting Educators’ Journal, 30, 115–135.
  6. Albu, N., Calu, D.-A., & Gușe, R.-G. (2016). The role of accounting internships in preparing students’ transition from school to active life. Accounting and Management Information Systems, 15(1), 131–153.
  7. Apostolou, B., Dorminey, J. W., & Hassell, J. M. (2021). Accounting education literature review (2020). Journal of Accounting Education, 55, 100725.
  8. Azzali, S., Mazza, T., & Tibiletti, V. (2022). Student engagement and performance: Evidence from the first wave of COVID-19 in Italy. Accounting Education, 32(4), 479–500.
  9. Berry, J. W., & Chew, S. L. (2008). Improving learning through interventions of student-generated questions and concept maps. Teaching of Psychology, 35(4), 305–312.
  10. Blankley, A. I., Kerr, D., & Wiggins, C. E. (2017). The state of accounting education in business schools: An examination and analysis of active learning techniques. In T. J. Rupert, & B. B. Kern (Eds.), Advances in accounting education (Vol. 21, pp. 101–124). Emerald Publishing Limited.
  11. Blondeel, E., Everaert, P., & Opdecam, E. (2022). Stimulating higher education students to use online formative assessments: The case of two mid-term take-home tests. Assessment & Evaluation in Higher Education, 47(2), 297–312.
  12. Blondeel, E., Everaert, P., & Opdecam, E. (2023a). A little push in the back: Nudging students to improve procrastination, class attendance and preparation. Studies in Higher Education, 49(11), 2016–2035.
  13. Blondeel, E., Everaert, P., & Opdecam, E. (2023b). Does practice make perfect? The effect of online formative assessments on students’ self-efficacy and test anxiety. The British Accounting Review, 56, 101189.
  14. Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. School of Education and Human Development, George Washington University.
  15. Byrne, M., & Flood, B. (2005). A study of accounting students’ motives, expectations and preparedness for higher education. Journal of Further and Higher Education, 29(2), 111–124.
  16. Cameron, J. (2001). Negative effects of reward on intrinsic motivation—A limited phenomenon: Comment on Deci, Koestner, and Ryan (2001). Review of Educational Research, 71(1), 29–42.
  17. Carpenter, S. K. (2011). Semantic information activated during retrieval contributes to later retention: Support for the mediator effectiveness hypothesis of the testing effect. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37(6), 1547–1552.
  18. Center for Self-Determination Theory. (2023). Applying self-determination theory to education. Available online: https://selfdeterminationtheory.org/topics/application-education/ (accessed on 5 October 2024).
  19. Chen, C. C., & Jones, K. T. (2007). Blended learning vs. traditional classroom settings: Assessing effectiveness and student perceptions in an MBA accounting course. Journal of Educators Online, 4(1), 1–15.
  20. Cheng, P., & Ding, R. (2021). The effect of online review exercises on student course engagement and learning performance: A case study of an introductory financial accounting course at an international joint venture university. Journal of Accounting Education, 54, 100699.
  21. Chu, L., & Libby, T. (2010). Writing mini-cases: An active learning assignment. Issues in Accounting Education, 25(2), 245–265.
  22. Curtis, S. M. (2011). Formative assessment in accounting education and some initial evidence on its use for instructional sequencing. Journal of Accounting Education, 29(4), 191–211.
  23. Deci, E. L., & Ryan, R. M. (2012). Self-determination theory. In P. Van Lange, A. Kruglanski, & E. Higgins (Eds.), Handbook of theories of social psychology (Vol. 1, pp. 416–437). SAGE Publications Ltd.
  24. Diaz, M. C. (2016). Assembling the opinion: An active learning exercise for audit students. Journal of Accounting Education, 34, 30–40.
  25. Downen, T., & Hyde, B. (2016). Flipping the managerial accounting principles course: Effects on student performance, evaluation, and attendance. In T. J. Rupert, & B. B. Kern (Eds.), Advances in accounting education (Vol. 19, pp. 61–87). Emerald Group Publishing Limited.
  26. Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4–58.
  27. Ebersbach, M., Feierabend, M., & Nazari, K. B. B. (2020). Comparing the effects of generating questions, testing, and restudying on students’ long-term recall in university learning. Applied Cognitive Psychology, 34(3), 724–736.
  28. Edmonds, C. T., & Edmonds, T. P. (2008). An empirical investigation of the effects of SRS technology on introductory managerial accounting students. Issues in Accounting Education, 23(3), 421–434.
  29. Eurostat. (2024). Living conditions in Europe—Poverty and social exclusion. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Living_conditions_in_Europe_-_poverty_and_social_exclusion (accessed on 18 November 2024).
  30. Flom, J. M., Green, K. Y., & Wallace, S. (2021). Helping higher education students succeed: An examination of student attributes and academic grade performance. Active Learning in Higher Education, 24(2), 221–235.
  31. Fogarty, T. J., & Goldwater, P. M. (2024). No pain, no gain: The structure and consequences of question difficulty in a management accounting course. Journal of Accounting Education, 68, 100916.
  32. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.
  33. Frick, H., Birt, J., & Waters, J. (2020). Enhancing student engagement in large management accounting lectures. Accounting & Finance, 60(1), 271–298.
  34. Froiland, J. M., & Worrell, F. C. (2016). Intrinsic motivation, learning goals, engagement, and achievement in a diverse high school. Psychology in the Schools, 53(3), 321–336.
  35. Gagné, M., & Deci, E. L. (2005). Self-determination theory and work motivation. Journal of Organizational Behavior, 26(4), 331–362.
  36. Gainor, M., Bline, D., & Zheng, X. (2014). Teaching internal control through active learning. Journal of Accounting Education, 32(2), 200–221.
  37. Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351.
  38. Grabinski, K., Kedzior, M., & Krasodomska, J. (2015). Blended learning in tertiary accounting education in the CEE region—A Polish perspective. Journal of Accounting and Management Information Systems, 14(2), 378–397.
  39. Groen, B. A. C., Wouters, M. J. F., & Wilderom, C. P. M. (2017). Employee participation, performance metrics, and job performance: A survey study based on self-determination theory. Management Accounting Research, 36, 51–66.
  40. Harker, M., & Koutsantoni, D. (2005). Can it be as effective? Distance versus blended learning in a web-based EAP programme. ReCALL, 17(2), 197–216.
  41. Hemphill, J. F. (2003). Interpreting the magnitudes of correlation coefficients. American Psychologist, 58(1), 78–79.
  42. Henadirage, A., & Gunarathne, N. (2023). Retaining remote teaching and assessment methods in accounting education: Drivers and challenges in the post-pandemic era. The International Journal of Management Education, 21(2), 100810.
  43. Hu, Y., Nath, N., Zhu, Y., & Laswad, F. (2023). Accounting students’ online engagement, choice of course delivery format and their effects on academic performance. Accounting Education, 33(5), 649–684.
  44. Huber, M., Law, D., & Khallaf, A. (2017). Active learning innovations in introductory financial accounting. In T. J. Rupert, & B. B. Kern (Eds.), Advances in accounting education (Vol. 21, pp. 125–167). Emerald Publishing Limited.
  45. Hunt, R. R. (1995). The subtlety of distinctiveness: What von Restorff really did. Psychonomic Bulletin & Review, 2(1), 105–112.
  46. IFAC. (2021). Syllabus & competencies matrix for a three-level qualification. A project under the IFAC accountancy capacity building program. Available online: https://www.ifac.org/knowledge-gateway/professional-accountancy-organization-development-paod/publications/syllabus-competencies-matrix-three-level-qualification (accessed on 31 May 2023).
  47. IFAC. (2023). Sustainability education for aspiring and professional accountants—Background information. Available online: https://ifacweb.blob.core.windows.net/publicfiles/2023-10/Sustainability%20Stakeholder%20Outreach%20Package.pdf (accessed on 19 June 2024).
  48. Imran, R., Fatima, A., Elbayoumi, S., & Allil, K. (2023). Teaching and learning delivery modes in higher education: Looking back to move forward post-COVID-19 era. International Journal of Management in Education, 21(2), 100805.
  49. Kelly, O., Hall, T., & Connolly, C. (2023). PACE-IT: Designing blended learning for accounting education in the challenging context of a global pandemic. Accounting Education, 32(6), 626–645.
  50. Kern, B. B. (2002). Enhancing accounting students’ problem-solving skills: The use of a hands-on conceptual model in an active learning environment. Accounting Education, 11(3), 235–256.
  51. Kibble, J. D. (2007). Use of unsupervised online quizzes as formative assessment in a medical physiology course: Effects of incentives on student participation and performance. Advances in Physiology Education, 31(3), 253–260.
  52. Kibble, J. D. (2011). Voluntary participation in online formative quizzes is a sensitive predictor of student success. Advances in Physiology Education, 35(1), 95–96.
  53. King, A. (1992). Comparison of self-questioning, summarizing, and notetaking-review as strategies for learning from lectures. American Educational Research Journal, 29(2), 303–323.
  54. Krasodomska, J., & Godawska, J. (2021). E-learning in accounting education: The influence of students’ characteristics on their engagement and performance. Accounting Education, 30(1), 22–41.
  55. Krause, P. (1988). Active learning for budgeting concepts. Journal of Accounting Education, 6(2), 331–337.
  56. Krom, C. L. (2012). Using FarmVille in an introductory managerial accounting course to engage students, enhance comprehension, and develop social networking skills. Journal of Management Education, 36(6), 848–865.
  57. Kunz, J. (2015). Objectivity and subjectivity in performance evaluation and autonomous motivation: An exploratory study. Management Accounting Research, 27, 27–46.
  58. Lancaster, K. A. S., & Strand, C. A. (2001). Using the team-learning model in a managerial accounting class: An experiment in cooperative learning. Issues in Accounting Education, 16(4), 549–567.
  59. Mardini, G. H., & Mah’d, O. A. (2022). Distance learning as emergency remote teaching vs. traditional learning for accounting students during the COVID-19 pandemic: Cross-country evidence. Journal of Accounting Education, 61, 100814.
  60. Marshall, L. L., & Bolt-Lee, C. E. (2022). A neuroeducation and cognitive learning-based strategy to develop active learning pedagogies for accounting education. The Accounting Educators’ Journal, 32. Available online: https://www.aejournal.com/ojs/index.php/aej/article/view/689 (accessed on 25 May 2023).
  61. McNulty, J. A., Espiritu, B. R., Hoyt, A. E., Ensminger, D. C., & Chandrasekhar, A. J. (2015). Associations between formative practice quizzes and summative examination outcomes in a medical anatomy course. Anatomical Sciences Education, 8(1), 37–44.
  62. Mihai, F., Stan, M., Radu, G., & Dumitru, V. F. (2020). Heavy work investment for the accounting profession in Romania at time of coronavirus pandemic [Special issue]. Amfiteatru Economic, 22(4), 1121–1139.
  63. Murayama, K., Matsumoto, M., Izuma, K., Sugiura, A., Ryan, R. M., Deci, E. L., & Matsumoto, K. (2015). How self-determined choice facilitates performance: A key role of the ventromedial prefrontal cortex. Cerebral Cortex, 25(5), 1241–1251.
  64. Opre, D., Șerban, C., Veșcan, A., & Iucu, R. (2024). Supporting students’ active learning with a computer-based tool. Active Learning in Higher Education, 25, 135–150.
  65. Peek, L. E., Winking, C., & Peek, G. S. (1995). Cooperative learning activities: Managerial accounting. Issues in Accounting Education, 10(1), 111.
  66. Peters, M. D., & Chiu, C. (2022). Interactive spreadsheeting: A learning strategy and exercises for calculative management accounting principles. Issues in Accounting Education, 37(4), 47–60.
  67. Pinto-Llorente, A. M., Sánchez-Gómez, M. C., García-Peñalvo, F. J., & Casillas-Martín, S. (2017). Students’ perceptions and attitudes towards asynchronous technological tools in blended-learning training to improve grammatical competence in English as a second language. Computers in Human Behavior, 72, 632–643.
  68. Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the motivated strategies for learning questionnaire (MSLQ). The University of Michigan. Available online: https://eric.ed.gov/?id=ED338122 (accessed on 18 June 2019).
  69. Potter, B. N., & Johnston, C. G. (2006). The effect of interactive on-line learning systems on student learning outcomes in accounting. Journal of Accounting Education, 24(1), 16–34.
  70. Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.
  71. Rawson, K. A., & Dunlosky, J. (2011). Optimizing schedules of retrieval practice for durable and efficient learning: How much is enough? Journal of Experimental Psychology: General, 140(3), 283–302.
  72. Rosenshine, B., Meister, C., & Chapman, S. (1996). Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66(2), 181–221.
  73. Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67.
  74. Ryan, R. M., & Deci, E. L. (2020). Intrinsic and extrinsic motivation from a self-determination theory perspective: Definitions, theory, practices, and future directions. Contemporary Educational Psychology, 61, 101860.
  75. Sangster, A., Stoner, G., & Flood, B. (2020). Insights into accounting education in a COVID-19 world. Accounting Education, 29(5), 431–562.
  76. Schutte, N. S., & Malouff, J. M. (2019). Increasing curiosity through autonomy of choice. Motivation and Emotion, 43(4), 563–570.
  77. Song, D. (2016). Student-generated questioning and quality questions: A literature review. Research Journal of Educational Studies and Review, 2(5), 58–70.
  78. Sugahara, S., & Dellaportas, S. (2018). Bringing active learning into the accounting classroom. Meditari Accountancy Research, 26(4), 576–597.
  79. Tartavulea, C. V., Albu, C. N., Albu, N., Dieaconescu, R. I., & Petre, S. (2020). Online teaching practices and the effectiveness of the educational process in the wake of the COVID-19 pandemic. Amfiteatru Economic, 22(55), 920.
  80. Tettamanzi, P., Minutiello, V., & Murgolo, M. (2023). Accounting education and digitalization: A new perspective after the pandemic. The International Journal of Management Education, 21(3), 100847.
  81. Van den Brink, H., Kokke, K., De Loo, I., Nederlof, P., & Verstegen, B. (2003). Teaching management accounting in a competencies-based fashion. Accounting Education, 12(3), 245–259.
  82. Van der Hauwaert, E., Hoozée, S., Maussen, S., & Bruggeman, W. (2022). The impact of enabling performance measurement on managers’ autonomous work motivation and performance. Management Accounting Research, 55, 100780.
Table 1. A review of studies presenting pedagogical methods in management accounting.

Study | Pedagogical Method | Validation Method
Downen and Hyde (2016) | Half of the cohort was taught using a traditional approach and half using a flipped classroom approach. The approaches were changed for the second part of the semester. | A total of 99 students in the cohort. Student evaluations of the course and instructor; repeated measures and regression analysis for overall effects; quantile regression for performance-segregated effects.
Edmonds and Edmonds (2008) | A student response system (SRS) was used in class in three of the six introductory management accounting lectures. | A sample of 554 students, 279 in the SRS group and 275 in the non-SRS group. Statistical validation (using an econometric model) of the effect of the SRS on student performance.
Fogarty and Goldwater (2024) | A database containing 17,192 multiple-choice questions was used for practice, quizzes, and exams to test the effect of difficulty levels and the evaluation context on student success. | A total of 108 students participated in the study. Statistical evaluation of the data collected for practice, quizzes, and exams.
Frick et al. (2020) | Two blended learning models were used during lectures to improve student engagement in two management accounting courses (principles and advanced). The first model (implemented in the principles of management accounting course) consisted of an online pre-lecture material and “an audience response system for students to interactively answer multiple-choice questions during class meetings.” In the second course, the model was improved: the pre-lecture material was carefully revised, the students had several quizzes during the lecture, and they were encouraged to discuss the solutions with their colleagues. | A total of 363 students in the first course and 299 students in the second course: perceptions of the usefulness survey in the first course (137 responses); test results; review of statistical records; 96 answers to the student survey in the second course.
Krom (2012) | The use of the free Zynga FarmVille computer game during the class. | A total of 76 students, of which 25 were included in the FarmVille group; data were collected through three formal surveys in class.
Lancaster and Strand (2001) | Team-learning model. | A total of 82 students in the control group and 81 students in the treatment group. Statistical analysis of exam scores and of students’ perception of the course and instructor.
Peek et al. (1995) | Cooperative learning techniques in introductory management accounting. | -
Peters and Chiu (2022) | Interactive spreadsheets for core calculative principles used during management accounting lectures. | Experience in use, 169 responses to student feedback surveys, three senior professors from business school, and three instructors.
Potter and Johnston (2006) | Online learning system (MarlinaLSTM) used in intermediate management accounting. The system consists of three modules: (1) short answer, narrative-type questions, and longer, interactive-style questions to be solved by students prior to the tutorial; (2) online tutor; (3) revision tool. | A total of 1116 students were included in the sample. Statistical validation of the impact of the use of the online learning system and students’ characteristics on learning outcomes.
Van den Brink et al. (2003) | Problem-solving strategy based on the use of case studies. A general questionnaire was designed to guide the students in solving the case. | “Students reacted equally enthusiastically to the new course.”
Table 2. The matched sample selection.

Explanation | Amounts
Students included in the MA cohort | 146
(-) Students who were assigned to another cohort of PMC or have withdrawn from university studies | 30
(-) Students who did not attend the exam for PMC | 9
(=) Number of students included in the sample, matched between MA and PMC | 107
Table 3. Data collected.

Data Collected | Management Accounting (MA) | Performance Measurement and Control (PMC)
Year of studies | 2nd | 3rd
Period | February–June 2022, Semester 2 | October 2022–January 2023, Semester 1
Number of students in cohort | 138 | 125
Number of quizzes during the semester | 11 | 14
Quiz scores | (Score from 0 to 10)/10, for both disciplines
Average quiz score (_QS) | Sum of quiz scores per student/11 | Sum of quiz scores per student/14
Time necessary to take the quiz (per quiz and in total) | Seconds (different variants of max. time allowed) | Seconds (all quizzes, max. 900)
Homework count, provided after the lecture (e.g., problems created) | 1 | 2
Homework score | (Score from 0 to 10)/10, for both disciplines
Number of logs | Number of clicks on the platform, for both disciplines
Consistent learning (_CL) | Calculated based on the proportion of homework submitted, quartiles of platform logs, and proportion of quizzes solved, for each semester
Bonus (_BN) | Points from 0 to 1
Tutorial scores (_TS) | 1–10
Exam (_EX) | Range of 1 to 10 (excluding the bonus)
Students who attended both exams | 107 students (paired sample)
Questionnaire | Ad hoc instrument on the students’ opinion regarding the learning strategy of PMC (95 responses received on the paired sample)
Table 4. Descriptive statistics for the main variables.

Variables | Mean | SD | Min | Max | Skew. | Kurt. | Normality (Shapiro–Wilk W)
MA_CL | 1.27 | 0.92 | 0.02 | 3.00 | 0.47 | −1.16 | 0.91 **
MA_QS | 0.19 | 0.17 | 0 | 0.69 | 0.84 | −0.20 | 0.89 **
MA_BN | 0.46 | 0.32 | 0 | 1 | 0.45 | −1.13 | 0.91 **
MA_TS | 6.28 | 2.11 | 0 | 9.60 | −0.67 | 0.01 | 0.95 **
MA_EX | 4.33 | 2.32 | 1.25 | 9.75 | 0.67 | −0.69 | 0.91 **
PMC_CL | 1.31 | 0.97 | 0.02 | 2.95 | 0.27 | −1.42 | 0.90 **
PMC_QS | 0.25 | 0.21 | 0 | 0.73 | 0.40 | −1.10 | 0.91 **
PMC_BN | 0.42 | 0.33 | 0 | 1 | 0.48 | −1.24 | 0.89 **
PMC_TS | 5.47 | 2.17 | 1.13 | 10 | −0.06 | −0.58 | 0.98
PMC_EX | 4.28 | 1.40 | 1.35 | 8.60 | 0.75 | 0.08 | 0.94 **
Notes. N = 107 students, matched for the two semesters. ** p < 0.01.
Table 5. Spearman correlations (nonparametric) between consistent learning, the bonus at the end of the semester, and tutorial and exam scores for both disciplines.

Indicators | MA_CL | MA_QS | MA_BN | MA_TS | MA_EX | PMC_CL | PMC_QS | PMC_BN | PMC_TS | PMC_EX
MA_CL | 1 | 0.942 ** | 0.961 ** | 0.716 ** | 0.524 ** | 0.851 ** | - | - | - | 0.452 **
MA_QS | | 1 | 0.926 ** | 0.683 ** | 0.602 ** | - | 0.788 ** | - | - | -
MA_BN | | | 1 | 0.733 ** | 0.535 ** | - | - | 0.793 ** | - | -
MA_TS | | | | 1 | 0.525 ** | - | - | - | 0.483 ** | -
MA_EX | | | | | 1 | - | - | - | - | 0.701 **
PMC_CL | | | | | | 1 | 0.953 ** | 0.976 ** | 0.772 ** | 0.441 **
PMC_QS | | | | | | | 1 | 0.962 ** | 0.794 ** | 0.540 **
PMC_BN | | | | | | | | 1 | 0.791 ** | 0.472 **
PMC_TS | | | | | | | | | 1 | 0.531 **
PMC_EX | | | | | | | | | | 1
Notes. ** p < 0.01. N = 107 students.
Table 6. Paired nonparametric tests between the results for the students’ participation, average quiz scores, and tutorial and exam scores in two consecutive semesters.

Measure | MA M (SD) | PMC M (SD) | Wilcoxon Signed Rank Test | Effect Size
Consistent learning (_CL) | 1.266 (0.914) | 1.313 (0.973) | V = 2654 (p = 0.466) | 0.070 (small)
Average quiz score (_QS) | 0.192 (0.173) | 0.249 (0.209) | V = 1310 ** (p < 0.001) | 0.415 (moderate)
Tutorial scores (_TS) | 6.280 (2.111) | 5.468 (2.172) | V = 3983 ** (p < 0.001) | 0.348 (moderate)
Exam scores (_EX) | 4.327 (2.321) | 4.278 (1.403) | V = 2770 (p = 0.838) | 0.020 (small)
Notes. ** p < 0.01. N = 107 students.
Table 7. Independent nonparametric group tests on learning outcomes between the involved learners’ group and the voluntary non-involvement group.

Measure | A: n | A: M (SD) | B: n | B: M (SD) | Wilcoxon Rank Sum Test | Effect Size
MA_QS | 34 | 0.40 (0.12) | 73 | 0.09 (0.08) | V = 2449 ** (p < 0.001) | 0.782 (large)
MA_TS | 34 | 7.98 (1.15) | 73 | 5.49 (1.99) | V = 2154 ** (p < 0.001) | 0.591 (large)
MA_EX | 34 | 6.26 (2.19) | 73 | 3.42 (1.77) | V = 2092 ** (p < 0.001) | 0.551 (large)
PMC_QS | 46 | 0.45 (0.12) | 61 | 0.09 (0.09) | V = 2762 ** (p < 0.001) | 0.828 (large)
PMC_TS | 46 | 7.01 (1.84) | 61 | 4.31 (1.62) | V = 2464 ** (p < 0.001) | 0.646 (large)
PMC_EX | 46 | 4.84 (1.52) | 61 | 3.86 (1.16) | V = 1957 ** (p < 0.001) | 0.337 (moderate)
Notes. A = involved learning group; B = voluntary non-involvement group. ** p < 0.01. Quizzes not submitted are treated as having a zero score. N = 107 students.
Table 8. Factor loadings on the items of an ad hoc instrument regarding the learning strategy, administered at the completion of the PMC discipline.

Items (Ad Hoc Instrument; Scale: −2 → +2) | Factor Loadings (PMC_QH) | Mean | SD | Alpha if Item Removed
IT2.1. The quizzes in both disciplines corresponded to my expectations. | 0.682 | 1.11 | 0.79 | 0.83
IT2.2. I learned many things because I solved the quizzes in both disciplines. | 0.849 | 1.18 | 0.84 | 0.78
IT2.3. I learned many things because I created applications (exercises) in both disciplines. | 0.777 | 0.88 | 1.01 | 0.81
IT2.4. I learned many things because I solved the topics proposed at the lecture. | 0.761 | 1.19 | 0.82 | 0.81
Cronbach alpha (95% confidence boundaries): 0.85 (0.80–0.90)
Correlation of (regression) scores with factor: 0.93
Multiple R square of scores with factors: 0.86
Notes. Oblimin rotation applied. N = 95 students.
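The internal consistency reported in Table 8 (Cronbach alpha of 0.85) follows from the standard formula. A minimal Python sketch on simulated item responses is shown below; the response data are synthetic and serve only to illustrate the computation, not to reproduce the questionnaire results.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated responses of 95 students to four items on a -2..+2 scale,
# driven by a shared latent attitude so that the items correlate.
rng = np.random.default_rng(2)
latent = rng.normal(1.0, 0.8, (95, 1))
items = np.clip(latent + rng.normal(0, 0.5, (95, 4)), -2, 2)

print(f"alpha = {cronbach_alpha(items):.2f}")  # the paper reports 0.85
```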
Table 9. Spearman correlations between consistent learning, academic performance, and student opinions regarding the learning strategy in PMC.

Indicators | PMC_CL | PMC_QS | PMC_TS | PMC_EX | PMC_QH
PMC_CL | 1 | 0.940 ** | 0.720 ** | 0.428 ** | 0.494 **
PMC_QS | | 1 | 0.761 ** | 0.542 ** | 0.437 **
PMC_TS | | | 1 | 0.629 ** | 0.312 **
PMC_EX | | | | 1 | 0.207 *
PMC_QH | | | | | 1
Notes. * p < 0.05, ** p < 0.01. N = 95 students.
Table 10. Linear regression results with the PMC exam score (PMC_EX) as the dependent variable.

PANEL A
Predictor | Coef. | Std. Error | t-Value | p-Value | VIF
(Intercept) | 2.348 ** | 0.212 | 11.060 | <0.001 | -
MA_EX | 0.355 ** | 0.053 | 6.699 | <0.001 | 1.593
PMC_QS | 1.519 * | 0.605 | 2.513 | 0.014 | 1.593
Adjusted R-squared: 0.544
F-statistic: 57.13 ** on 2 and 92 df.

PANEL B
Predictor | Coef. | Std. Error | t-Value | p-Value | VIF
(Intercept) | 1.987 ** | 0.296 | 6.715 | <0.001 | -
MA_EX | 0.369 ** | 0.054 | 6.870 | <0.001 | 1.598
PMC_TS | 0.125 * | 0.061 | 2.056 | 0.043 | 1.598
Adjusted R-squared: 0.534
F-statistic: 54.94 ** on 2 and 92 df.
Notes. * p < 0.05, ** p < 0.01, N = 95 students. Factors determined using forward stepwise regression.
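The specifications in Table 10 can be reproduced with any standard OLS routine. A minimal Python sketch of Panel A on synthetic data follows; the coefficients used to generate the data are seeded to resemble, not reproduce, the reported estimates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic data mirroring Panel A: PMC exam score regressed on the MA
# exam score and the PMC average quiz score for N = 95 students.
ma_ex = rng.uniform(1.25, 9.75, 95)
pmc_qs = rng.uniform(0, 0.73, 95)
pmc_ex = 2.35 + 0.36 * ma_ex + 1.52 * pmc_qs + rng.normal(0, 0.9, 95)

# Ordinary least squares with an intercept, as in Table 10.
X = sm.add_constant(np.column_stack([ma_ex, pmc_qs]))
print(sm.OLS(pmc_ex, X).fit().summary())
```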
Table 11. Validation of hypotheses.

Hyp. | Hypothesis | Statistical Tests | Confirmed
H1 | Learning outcomes (quiz average, bonus, and tutorial and exam scores) are positively correlated with consistent learning. | Nonparametric correlations | Yes
H2 | Consistent learning in the MA discipline is positively correlated with consistent learning in the PMC discipline. | Paired nonparametric tests for two groups | Yes
H3 | Maintaining the learning strategy for two consecutive semesters in related disciplines significantly improves the students’ learning outcomes in the second discipline. | Paired nonparametric tests for two groups | No
H4 | Learning outcomes are positively correlated between the two disciplines. | Nonparametric correlations | Yes
H5 | The voluntary adoption of the learning strategy leads to a significant improvement in the student learning outcomes in both disciplines. | Clustering, independent nonparametric group tests | Yes
H6 | Consistent learning is correlated with the importance attached to quizzes and homework, and with the students’ choice to solve additional tasks. | Factor analysis, nonparametric correlations | Yes
H7 | Consistent learning is a direct predictor of exam scores in the second discipline. | Regression | No
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
