Article

Learn with M.E.—Let Us Boost Personalized Learning in K-12 Math Education!

Department of Informatics, J. Selye University, 945 01 Komárno, Slovakia
*
Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(7), 773; https://doi.org/10.3390/educsci14070773
Submission received: 27 June 2024 / Revised: 4 July 2024 / Accepted: 13 July 2024 / Published: 16 July 2024
(This article belongs to the Special Issue Application of AI Technologies in STEM Education)

Abstract

The traditional educational system, in certain aspects, limits personalized learning. This is mainly evident in the fact that average students, who do not have any learning difficulties, are required to solve the same tasks from the same textbook in the same order. Artificial intelligence and other smart learning tools present great opportunities for implementing a personalized learning system. Our previous surveys and literature reviews also show that educators see personalized education as the area where the integration of artificial intelligence into education holds the greatest potential. In this context, we have developed educational software called Learn with M.E. (Math Educator), which facilitates more personalized teaching of basic mathematical operations. This study presents the structure and operation of this application. We tested the usability of the software in several institutions. Our research target group consists of elementary school students, specifically those aged 11–15. This article provides a detailed overview of the accuracy and educational outcomes of the completed application. We evaluated the application and its effectiveness using both qualitative and quantitative methods. Our research design combined elements of educational technology development and effectiveness assessment. To evaluate student performance, we employed a control-group methodology: data were analyzed by comparing test results between students using the software and those receiving traditional instruction. We examined user satisfaction through survey questionnaires. Teachers’ opinions were gathered through structured interviews, and their responses were categorized using a SWOT analysis. The findings indicated that the use of the software significantly improved students’ mathematics performance compared to the control group. Students provided positive feedback on the software’s user interface, describing it as user-friendly and motivating. Teachers regarded the software as an effective educational tool, facilitating differentiated instruction and enhancing student engagement. The results suggest that digital educational tools, such as the developed software, can provide substantial added value in education.

1. Introduction

Nowadays, education is undergoing significant changes brought about by new technologies, primarily artificial intelligence (AI). The goal of education has always been to prepare individuals to thrive in society and adapt to new challenges. However, the methods of education are constantly evolving and developing. New technologies and scientific discoveries are challenging old perspectives and principles.
A significant disadvantage of traditional teaching is that all students must follow the same learning sequence, despite having different levels of knowledge, learning styles, and needs. Traditional teaching tools encourage students to follow a predetermined curriculum and learning sequence to improve their academic performance. However, these fixed learning sequences are not suitable for every student [1]. Additionally, students do not receive enough opportunities for independent practice and cannot connect real-world experiences with the concepts learned to solve actual life problems [2]. Focusing on mathematics education, the 2019 results from the National Assessment of Educational Progress (NAEP) show that only 40% of fourth-grade students have good mathematical knowledge [3]. According to Nores and Barnett’s study, the situation is worse for children from low socioeconomic backgrounds, who start school with significantly less mathematical knowledge. Their research showed that early mathematical skills development is lacking in these students compared to their middle-class peers [4]. This issue persists in later grades as well. Reardon notes that many children from low socioeconomic backgrounds are almost a full year behind their middle-class peers in mathematical knowledge by the time they start school, and this gap often persists and widens over time [5]. Moreover, several studies highlight the challenge for teachers to appropriately personalize and individualize learning for every student in their class due to varying levels of mathematical knowledge [6,7,8]. These problems were already noticeable in the 2010s. Since the COVID-19 pandemic, the number of students struggling with reading and mathematical weaknesses has increased. Simultaneously, the use of educational applications aimed at supporting learning and improving academic performance has also increased [9]. However, simple online teaching platforms do not fully consider students’ needs, instead asking students to adapt to the system [10]. They lack personalized tutoring, which can be extremely effective in practice [11]. This effectiveness is mainly felt in how such systems allow students to receive personalized, differentiated instruction. Intelligent educational software aims to provide appropriate support to the student at the right time [12]. To achieve this aim, such systems must work with large datasets in real time. The advent of artificial intelligence and big data analysis opens new implementation possibilities [13]. Digital learning environments offer the opportunity for personalized learning paths.
Education is increasingly shifting toward individualism, with personalized learning becoming one of its fundamental principles. The role of artificial intelligence is particularly important in this process. Personalized education tailors educational content and methods to individual learning needs and paces. This allows each student to progress according to their own strengths and weaknesses. The foundation of personalized education is understanding and considering the unique learning styles of students. The goal of adaptive teaching strategies is to meet the learning needs of each student [10]. Consequently, with the help of AI, it becomes easier to identify students’ strengths and weaknesses, create personalized learning plans, and continuously monitor and archive their progress. This, in turn, assists teachers in applying more effective and efficient teaching methods.
Our aim in this study is to present educational software, which we have named Learn with M.E. (Math Educator), equipped with intelligent features designed for more effective and personalized basic mathematics instruction. In the following sections of our study, we will outline the findings of other research addressing similar issues. Subsequently, we will describe the structure and operating principles of our tool, in this case, the educational software. Finally, we will present our educational outcomes, research questions related to the software, our hypotheses, and their evaluation based on feedback from the students and educators involved in the survey.

2. Related Work

Research indicates that children make more informed decisions when using educational applications that provide explanatory feedback, as opposed to those without feedback, where more attempts and incorrect answers are observed [14]. Feedback is crucial for students. However, the type of feedback is also important. Feedback can include cheering or booing sound effects following a calculation, but this is neither highly motivating nor explanatory for the learner. A supportive message, either verbal or non-verbal, can increase motivation. Feedback that includes specific explanations, however, can enhance performance accuracy and reduce the number of future unsuccessful attempts.
Leveling is a method of personalizing learning content that takes into account students’ prior knowledge and development, building upon it [15,16]. According to Kucirkova, leveling can be implemented in three ways in intelligent applications designed to facilitate mathematics education: participatory free-form, programmed static, and programmed dynamic forms [17]. Participatory free-form leveling refers to applications that offer a suggested but not enforced sequence of learning content. Schenke and colleagues’ findings suggest that this form of leveling can be beneficial for young children [18], as it provides them with greater agency, allowing them to guide their learning based on their interests. However, there is a risk that children might choose content that is too easy, thereby hindering effective learning.
In contrast, programmed leveling directs children onto a scaffolded and personalized learning path. In programmed static leveling, the learning content is tailored to the child based on an initial performance assessment or is pre-selected by an adult. With programmed dynamic leveling, the presented learning content adapts to the child’s performance in real time during the use of the application [9,19].
Ismail’s study presents a review and categorization framework that can be used to analyze and classify personalized language-learning systems. His findings indicate that the latest language personalization systems are increasingly incorporating artificial intelligence and focusing on cognitive-based personalization. His study also highlights that designing educational software to facilitate learning is far more complex than merely embedding some learning content into an application, as learning is a complex process [20]. This is true not only for languages but also for STEM subjects, such as mathematics, which is the focus of our current problem.
Tapalova’s study examined the impact of artificial intelligence technologies on education. Most participants acknowledged that AI technologies used in education increased their engagement and interest in learning, helped tailor educational content to their personal needs, accelerated the learning process, and stimulated intellectual activity [1].
Colace proposed a tool based on Java and JSP (Jakarta Server Pages) technologies for creating and managing metadata, as well as for the automatic selection of content to design lesson sequences and compile learning paths. Their study introduced the SCORM (Sharable Content Object Reference Model) model. This software module, which plays an intelligent instructional role on the e-learning platform, is tasked with selecting, organizing, and offering the best training path for the learner based on their profile. Selecting the best training path obviously involves choosing the learning objects that best match the learner’s preferences. The more personalized content generation path is determined by four variables (difficulty, interactivity, size, time), which every student profile includes. The tool constructs a learning module from the most suitable content for the user, some additional filler content, and a final test. After evaluating the final test, the user profile and course structure are updated [21].
Several studies have explored the potential of reinforcement learning. Palma described the development and evaluation of an online system aimed at improving and reinforcing mathematics learning among elementary school students [22]. Nie’s study also focused on reinforcement learning, suggesting that reinforcement learning algorithms can learn adaptive decision policies that link a context to an intervention to optimize expected outcomes. Their results indicated that such algorithms are promising, and educational software should be optimized in this direction to best support individual students’ learning experiences. Measurements were conducted on control groups, considering the students’ gender, level of math anxiety, and pre-test scores [23].
Ruan and colleagues developed educational software to support the learning of mathematical concepts for upper elementary school students. They used reinforcement learning to adaptively learn responses to support student learning. According to their studies, reinforcement learning has achieved positive results in various areas of mathematics, including fractions, discrete mathematics, and linear algebra [12,24,25,26].
Outhwaite and colleagues conducted a comparative analysis to demonstrate how mathematical applications used in education can enhance learning outcomes. The study employed a new three-step framework to analyze the educational value of mathematical applications for children in the first three years of compulsory schooling. The content analysis framework specifically examined the type of application, the mathematical content, and the design features of 23 mathematical educational applications [9]. Griffith and colleagues emphasize that to understand how different mathematical applications function, it is necessary to examine the underlying pedagogy and design features of the applications [27].
Vasalou’s study investigated personalized game-based learning, emphasizing the continuous maintenance of motivation. This can be achieved through embedded games or rewards. Digital games used in primary education are often designed to support children’s learning through the motivated practice of fundamental subjects, such as literacy and mathematics. The study focused on personalized game-based learning, which is centered around automating the educational process through games assisted by artificial intelligence. Specifically, their study introduced the adaptive game Navigo [28]. Effective learning games promote cognitive engagement and motivation through interactivity, adaptive challenges, and continuous feedback [29]. Digital games capable of assessing and generating data on student learning can provide teachers with information to plan instruction and understand how their students learn.
Bang’s study presented the My Math Academy intelligent game-based mathematics education software. The software aims to eliminate performance disparities and motivate students through successful experiences. The program consists of game-based activities with adaptive learning paths. Teachers are provided with performance dashboards and information on offline activities to support learning. Both teachers and parents can extend the in-game learning. With adaptive learning activities, students receive personalized learning experiences that enable them to acquire mathematical skills and concepts. My Math Academy uses current approaches in game-based learning and experience-focused design to help students learn mathematical concepts through playful experiences. The topics range from counting up to ten to adding and subtracting three-digit numbers, covering primary school students in lower grades [8].
As we can observe, in most cases the fundamental concept of intelligent learning systems is that the learner interacts with an adaptive interface, which personalizes the learning process based on the user’s profile and academic achievements. The next generation of intelligent learning systems will therefore require user interfaces that collect students’ behavioral patterns and past usage data in real time to construct appropriate profiles, while the intelligent functions running in the background utilize these historical data to tailor personalized learning pathways.

3. Materials and Methods

Personalized learning is a crucial aspect of education, potentially resulting in students’ smoother progress and comprehension. Therefore, it is important to develop a system capable of synchronously communicating with the learner. Naturally, the presence of a teacher is indispensable in an educational institution; thus, the software should provide feedback to the instructor about the student’s performance. Starting from this easily understandable learning status, the instructor can use personalized examples to help the student catch up in the given area. To develop the application, we examined the specific problem domain and the research findings related to it in order to establish the appropriate design process and implement the functionalities necessary for such an application. Through the practical application of the software, this study aims to address the following research questions:
RQ1: How should the application be designed?
RQ2: What is the feasibility/validity of the application?
RQ3: How effective is the application?
We examined the practical application results of the personalized-education-supporting software we developed from various perspectives. We assessed students’ performance through testing and the application of control groups. Students were required to complete a set of 12 tasks covering topics addressed by our application. The satisfaction of students with the software was assessed using a three-part questionnaire. In addition to demographic data, we measured students’ satisfaction and opinions, which could inform future development directions. We gathered instructors’ feedback through interviews and SWOT analysis. The results of our surveys were subjected to statistical analysis to gain a better understanding of the effectiveness of our research. We utilized the IBM SPSS Statistics version 25.0 software for statistical analysis.

4. Software Development

Within this section, we aim to present the functionality and usability of the developed software. Throughout our work, our endeavor was to develop a system that could facilitate the practice and comprehension of fundamental mathematical operations for students, as well as aid instructors in creating personalized course materials. We envisioned a solution in the form of a computer application. While there are existing online applications capable of generating examples using random numbers and determining the correctness of a student’s answer, beyond simple true/false decision-making, our goal was to design a system capable of identifying the root cause of errors. This would guide the student toward the path of error rectification. To identify errors, we equipped the application with error patterns for comparison. These error patterns were documented through consultations with mathematician colleagues possessing years of pedagogical experience. Furthermore, drawing upon the literature, we examined the causes of errors commonly found in elementary school students’ mathematics and potential corrective measures [30,31].
To answer RQ1, based on the discussion in Section 2, we examined the characteristics that educational software aiming to support individualized teaching should possess. Drawing upon the results and observations of other researchers, during the development process, we aimed to create an application capable of the following functionalities:
  • Generating computational tasks at various difficulty levels.
  • Providing the option for automatic difficulty-level adjustment, allowing the learner to solve tasks matching their own proficiency level, thus ensuring more personalized progression.
  • Automatically and instantly evaluating results, intelligently providing immediate guidance to the student by comparing their results with error patterns, leading them to identify the root causes of errors.
  • Access to a step-by-step derivation of the current example, enabling the student to compare their own reasoning with a possible correct solution.
  • Auxiliary features such as visually representing the given example through simulated fingers, aiding younger age groups and students struggling with computational difficulties.
  • Refreshing theoretical knowledge, encompassing computational rules, priorities, sequences, interchangeable elements, sample problems, and multiplication tables.
  • Sustaining motivation, entailing continuous rewarding of correct calculations. Rewarding with trophies also fosters a healthy competitive environment among students.
  • Tracking computations, providing instructors or parents with quick and clear insights into the student’s progress, identifying their strengths and weaknesses, thus facilitating a rapid catch-up with more personalized tasks.
  • Continuous interaction, which can be both written and voice-based from the software’s side.
  • Multilingual support.

4.1. Graphical User Interface and Features of the Software

Our software is a downloadable desktop application that contains a total of six windows (denoted as 1, 2, …, 6). Furthermore, each window includes objects with various functionalities (denoted as A, B, …, n). In the following sections, we will interpret the notations in the following manner: for example, 1A will denote object A in window 1. Upon launching the program, the main window opens for the user, as demonstrated in Figure 1.
Since the software provides the ability to track studies, it is essential for each student to have a user account. Prior registration is not required for this. The user’s task is to enter a username of their choice (or one assigned by the teacher) (denoted as 1T). After entering the username, the login process (denoted as 1U) occurs. During the login process, the system checks two things: whether the user has filled in the login field (if not, it prompts the user to fill it in) and whether a database is already associated with the provided name. If so, it simply logs the user into their account and loads their previous results. If not, it informs the user that no digital workbook is associated with the entered name, automatically creates one, and, during the first login, alerts the user that they can listen to the software usage guide by clicking the info button in the top left corner (denoted as 1A).
The user can enable or disable the voice assistant of the application at any time (denoted as 1S). The user has the option to operate the program in five languages (denoted as 1R), which applies not only to the captions but also to the voice messages.
After logging in, multiple options are available to the user. We will start introducing the functions with the educational options. Students can choose from a total of five task families: addition, subtraction, multiplication, division, and mixed-bracket tasks. Each task family includes different types of tasks, totaling 34 different types. The program randomly generates tasks from these types according to the selected difficulty level. There are two ways to set the difficulty level. The user can manually choose (denoted as 1D) from an interval ranging from 1 to 6. The difficulty levels affect the generated tasks in the following way: the easier the difficulty level, the smaller the range from which the software generates tasks. At lower levels, decimal or negative results cannot be generated. Additionally, the user can choose to automate the difficulty-level setting (denoted as 1B). The automatic level setting only allows the student to move up to the next level if their progress shows that they are prepared. This assessment takes multiple factors into account:
  • Condition 1: The user must cover 1/6 of the examples generated from the current task family thus far. This ensures that the user spends an adequate amount of time on each difficulty level.
  • Condition 2: The user must correctly solve at least 70% of the tasks generated at the current difficulty level, ensuring they perform at least commendably.
  • Condition 3: The program measures the user’s calculation time, averages it, and establishes an individual threshold value. If the program sees that the student performs worse than the average time, it will not allow them to advance to a higher level. If it sees that the performance is steady and the calculation time is around the average, the condition is considered met after a certain number of examples.
  • Condition 4: If the user has previously been at one level higher than the current difficulty level, it examines the results of the last calculations performed at that higher level. In the case of incorrect calculations, it further examines how many tasks have been correctly solved at the current difficulty level since then. If a sufficient number of tasks have been completed, the level is considered achieved.
If these four conditions are met simultaneously, the user can proceed to the next level. Based on previous assessments, this typically means solving between 20 and 30 examples per difficulty level and approximately 150 tasks per task family if all are solved accurately and promptly. This implies that the student must solve at least around 750 tasks to automatically reach the highest difficulty level in each task family.
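To make the interplay of these conditions concrete, the following minimal Python sketch combines them into a single advancement check. It is our own illustrative reconstruction of the logic described above, not the published implementation: the record structure, the 1.5× time-tolerance factor in Condition 3, and the ten-correct-task threshold in Condition 4 are assumptions made for the example.

```python
def may_advance(history, level, quota_per_family=150, needed_after_demotion=10):
    """Decide whether a student may move from `level` to `level + 1`.

    history: chronological list of records for the current task family,
    each a dict with 'level' (int), 'correct' (bool), and 'seconds' (float).
    """
    at_level = [r for r in history if r["level"] == level]
    if not at_level:
        return False

    # Condition 1: cover at least 1/6 of the family's example quota here,
    # so an adequate amount of time is spent on each difficulty level.
    if len(at_level) < quota_per_family / 6:
        return False

    # Condition 2: at least 70% of the tasks at this level solved correctly.
    if sum(r["correct"] for r in at_level) / len(at_level) < 0.70:
        return False

    # Condition 3: steady pace -- the most recent solutions must not be much
    # slower than the student's own average time (tolerance factor assumed).
    avg_time = sum(r["seconds"] for r in at_level) / len(at_level)
    if any(r["seconds"] > 1.5 * avg_time for r in at_level[-5:]):
        return False

    # Condition 4: if the student was demoted from level + 1, require enough
    # correct solutions at the current level since that demotion.
    last_higher = max((i for i, r in enumerate(history)
                       if r["level"] == level + 1), default=None)
    if last_higher is not None and not history[last_higher]["correct"]:
        correct_since = sum(r["correct"] for r in history[last_higher + 1:]
                            if r["level"] == level)
        if correct_since < needed_after_demotion:
            return False

    return True
```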
Following the selection of the task family and difficulty level, the example is generated (denoted as 1E). Each student receives a unique individual task (denoted as 1F), ensuring there is no risk of copying the solution and result from a neighbor’s screen. The generated task must be solved in every case. During a test, the student cannot exchange the given example. At the easiest level, for addition and subtraction examples, there is an option to graphically display the task using fingers in a separate window (denoted as 1G). Figure 2 demonstrates such an example.
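The generation mechanism itself can be sketched in a few lines. In the following example, the per-level number ranges are illustrative assumptions (the actual intervals are not published); only the general behavior follows the description above: easier levels draw from smaller ranges, and lower levels cannot produce negative or decimal results. The mixed-bracket family is omitted for brevity.

```python
import random

# Assumed upper bounds per difficulty level (1-6); illustrative only.
LEVEL_UPPER_BOUND = {1: 10, 2: 20, 3: 50, 4: 100, 5: 500, 6: 1000}

def generate_task(family, level):
    """Generate one random task for a task family at the given level."""
    hi = LEVEL_UPPER_BOUND[level]
    a, b = random.randint(1, hi), random.randint(1, hi)
    if family == "subtraction" and level < 5:
        a, b = max(a, b), min(a, b)   # no negative results at lower levels
    if family == "division" and level < 5:
        a = a * b                     # force an integer quotient: no decimals yet
    op = {"addition": "+", "subtraction": "-",
          "multiplication": "*", "division": "/"}[family]
    return f"{a} {op} {b}"
```

Because the operands are drawn independently on every call, each student receives a different task, which is what prevents copying from a neighbor’s screen.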
To solve the generated problem, the calculations must be entered in field 1H, and the final result in field 1I. The program monitors the contents of these fields, and if they are left empty, it will alert the user. For the result field, both the decimal point and decimal comma are accepted for decimal values. When the student believes they are ready, they can check their result (denoted as 1J). The software will then examine the input provided by the student and compare it with the correct output for the current problem, as well as with stored error patterns. The program contains 429 error patterns. If the student’s result is incorrect, the software will display a list of identified potential errors in a separate window (see Figure 3A). During analysis, the incorrect result provided by the student may correspond to multiple potential errors. Based on the listed error possibilities, it is the student’s task to review their work and identify the source of the mistake. However, the software offers significant assistance by guiding the student toward the solution with immediate feedback. Unlike simple problem-generating programs, Learn with M.E. not only categorizes user results as correct or incorrect but also indicates the source of the error when mistakes are detected. During assessments, it was observed that students tend to avoid reading and often close pop-up windows without reviewing the information provided. Considering this, a checkbox (denoted as 3B) has been integrated into the software. By checking this box, the student acknowledges the software’s observations. Additionally, a voice message will alert the user upon checking the box.
The software provides students with the opportunity to try again in the case of an incorrect answer. If the student solves it correctly on the retry, the software acknowledges that the student identified and corrected the mistake independently. However, if the student generates a new problem without correcting the current one, the automatic difficulty setting will demote them to the previous level, indicating that they were unable to solve the problem. To advance back to the higher level, the student must meet the previously outlined conditions. Additionally, the software allows students to view the step-by-step solution to the correct answer if needed (denoted as 1K). In this case, the program does not award points for the problem and informs the student accordingly (denoted as 1L). Furthermore, the software enables students to review theoretical concepts and calculation rules (denoted as 1M). This is presented as a menu in a separate window, where students can read the rules applicable to each problem set, as shown in Figure 4 (4C). Each problem set (denoted as 4B) includes built-in sample problems that demonstrate how students can solve a problem, which elements are interchangeable, and which samples require adherence to the calculation order. Additionally, students can review the multiplication table.
If the student correctly solves the task generated for them, the software awards them points. After accruing a sufficient number of points, the student can win trophies from the software. The program contains a total of 12 trophies, initially displayed in black and white. The goal is for the student’s profile to become more colorful throughout the school year. The menu displaying the trophies can be accessed by clicking on the cup (marked as 1P) by the user. This is demonstrated in Figure 5.
The basic requirements for obtaining each trophy can be viewed by the student at any time by clicking on the info button (marked as 4A). One of the software’s goals is to maintain students’ interest. Therefore, in addition to trophies, we have incorporated additional reward systems into our software. If a student achieves four trophies, they receive a bronze medal from the software, which will also be visible to everyone through its main menu. Furthermore, after obtaining four trophies, the student has the opportunity to change their character’s color (marked as 1O), known as a “skin change”, which is a popular feature among children derived from various computer games. If a student earns at least eight trophies, the program rewards them with a silver medal and allows them to accessorize their character with sunglasses in addition to the color they selected. Finally, upon reaching 12 trophies, they receive a gold medal. If the user operates the program in voice mode, the software responds to the student’s trophy collection with different motivational messages for each medal.
Beyond its immediate interactive and tutorial features, one of the most important functions of the software is traceability. Teachers in school and parents at home have the opportunity to monitor the student’s progress through the computation log (marked as 1Q). This opens up a database management window for the user, where they can examine the student’s results with various filtering options, as seen in Figure 6.
During computations, the software saves the computational data precisely to a .csv file by date. The data stored by the software include the task family, the example type, the specific task, the deduction provided by the student, the student’s result, the computational time allocated for the example, the output detected by the program (correct, incorrect, specific reasons for errors), and the viewing of the deduction (yes or no). Instructors can monitor the dynamic changes in the database during teaching sessions. By the end of the session, they gain a clear insight into how many tasks, along with their types and levels of difficulty, each student has individually solved. Last but not least, the results provide a clear overview of each student’s weaknesses. Based on this, instructors can precisely determine what each student needs for rapid catch-up. Personalized task sets can be prepared based on the results.
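As a sketch of how one computation could be appended to a student’s digital workbook, consider the following; the column names and the one-file-per-user layout are our own stand-ins, since the paper specifies which data are stored but not the exact schema.

```python
import csv
from datetime import date

# Columns mirror the stored data listed above; the exact names are assumed.
FIELDS = ["date", "task_family", "example_type", "task", "deduction",
          "result", "seconds", "detected_output", "viewed_deduction"]

def log_computation(username, record):
    """Append one computation record to the student's .csv workbook."""
    with open(f"{username}.csv", "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:          # new workbook: write the header first
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(), **record})
```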

4.2. Description of Each Process

In the previous subsection, we outlined the structure and operational principles of our software. In the following section, we would like to demonstrate the automated functions using process diagrams. Automatic difficulty-level adjustment is an essential feature, as the software examines the student’s results and estimates the difficulty level relative to the student’s own level of knowledge. In the previous subsection, we demonstrated how to access and set up this function through the UI. Now, we would like to discuss the background investigations of this process. The software examines two inputs during automation: the selected example type and the initially set difficulty level (ranging from minimum 1 to maximum 5). It then traverses through the database associated with the student’s profile. Based on the extracted data from the database, it examines whether the student meets the requirements for the next difficulty level. The process of this examination is illustrated using the following process diagram (see Figure 7).
Another important function is automatic assessment. Many software programs are capable of generating tasks from specified numerical intervals. However, we consider it crucial that students not only be confronted with their results classified into correct and incorrect categories but also be provided with immediate, real-time guidance from the software. To achieve this, we relied on consultations with educators and identified error sources during testing. Following several months of testing, we constructed an internal database for our software based on these data, containing error patterns associated with different types of tasks and the pathways leading to them. The software examines the student’s response as input. If it is numerical, it compares it with the database of example sets, substituting the currently generated numerical values. If the user’s result matches the correct output, the program updates the user’s point system and rewards them if necessary. In the case of incorrect calculations, it continues searching in the database. If it detects possible causes of the error, it alerts the student to potential sources of mistakes and provides a path for immediate correction. Additionally, it provides access to view the deduction step by step. If the software fails to recognize the cause of the error, it alerts the user to the incorrect calculation and suggests recalculating the entire task. During development, the goal was to minimize the latter message. Version 1.7, used in this study, recognized the cause of logically deducible errors with 92% accuracy. The operation of automatic assessment is demonstrated in the following process diagram (see Figure 8).
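The pattern-matching step at the heart of this flow can be illustrated as follows. Each stored error pattern is modeled here as a function that reproduces the wrong result a known misconception would yield for the current operands; the two patterns shown are invented examples for multiplication tasks, as the actual database of 429 patterns is not published.

```python
# Two invented error patterns for multiplication; the real database covers
# 429 patterns across all task families.
ERROR_PATTERNS = {
    "added the operands instead of multiplying": lambda a, b: a + b,
    "subtracted the operands instead of multiplying": lambda a, b: a - b,
}

def assess(a, b, correct_result, student_answer):
    """Mirror the assessment flow of Figure 8 for a single task."""
    if student_answer == correct_result:
        return "correct"  # points are awarded and trophies updated here
    causes = [name for name, wrong in ERROR_PATTERNS.items()
              if wrong(a, b) == student_answer]
    if causes:            # guide the student toward the likely mistake
        return "incorrect; possible causes: " + ", ".join(causes)
    return "incorrect; please recalculate the entire task"
```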

5. Results

To facilitate more personalized learning, the Learn with M.E. educational software was developed and implemented in a total of seven educational institutions. Among these were four public schools and three church-affiliated schools. We aimed to select educational institutions from different regions and municipalities. We visited a total of three different districts. The survey included Eötvös Street Elementary School in Komárno, the Marianum—Church School Center, and II. Rákóczi Ferenc Elementary School in Kolárovo, covering the Komárno district (KN). From the district of Šaľa (SA), the Žihárec and Neded Hungarian-language elementary schools participated. Finally, from the district of Levice (LV), the Fegyvernek Ferenc Catholic School with Joint Management in Šahy and the Palásthy Pál Church Elementary School in Plášťovce participated in the survey. The surveys were conducted over a period of 9 months from May 2023 to January 2024, including the school holiday months of July and August, during which we implemented further development and improvements based on the results of the first and second months. Learn with M.E. has approximately 330 users, covering the upper grades of elementary schools. In this study, we evaluated five educational institutions from those listed above. During the questionnaire survey, we worked with a total of N = 164 students. Seventy-six students were absent from the educational institutions due to health reasons, with an average of fifteen absent students per school. The results of this survey are presented in Section 5.1, Section 5.2 and Section 5.3. Section 5.4 processes the data of all participating students so far, N = 327. At the time of writing, investigations are still ongoing in the remaining institutions, and we are working on the participation of additional educational institutions.

5.1. Evaluation of Students’ User Experience

A total of N = 164 students participated in the survey, including 71 girls and 93 boys. Within the framework of a three-section questionnaire, we inquired about the students’ demographic data and their experiences with the software, to which they could respond using Likert scales, among other methods, and they could also express their observations regarding Learn with M.E. in their own words. To complete the questionnaire, each student had to provide their username used during Learn with M.E. sessions, enabling us to identify individuals in compliance with GDPR regulations, allowing for subsequent cross-referencing with their digital workbooks and, in the case of relevant groups, their test results, as elaborated in Section 5.3 later. Furthermore, the first section of our questionnaire inquired about the students’ ages and the grades they attended. As previously mentioned, our software was implemented in the upper grades of primary schools. The distribution of students enrolled there was as follows: 32% of respondents were from the fifth and sixth grades, with 26 students each. Seventh-grade students accounted for 22% of the survey participants, totaling 37 students. The eighth-graders made up 27% of the respondents, with 44 students, while the remaining 19% were ninth-graders, totaling 31 students. The distribution of participating students using the software is depicted in Figure 9.
The second section of our questionnaire examined the use of the software. Our first question aimed to assess the general user experience, for which students could choose from a 4-point Likert scale, ranging from very poor, poor, good, to very good. A total of 97.6% of respondents provided positive feedback, while 2.4% indicated a more negative sentiment. The complete percentage distribution is illustrated in Figure 10.
Our second question inquired about the software’s graphical user interface. Our survey aimed to gauge how easily the software was perceived to be handled. Respondents again had four options to choose from: very difficult to handle, difficult to handle, easy to handle, and very easy to handle. A total of 95.1% of respondents found the software easy to use. Notably, no students found the software very difficult to handle. The result is illustrated in Figure 11.
Prior to the initial usage, students received detailed descriptions and demonstrations of the software. Naturally, we responded to subsequent queries and provided additional guidance, along with the option of utilizing the built-in audio guidance in the software at any time. Consequently, we asked students whether they received sufficient assistance for proper software usage. A total of 92.1% of respondents found the preliminary guidance sufficient, while the remaining 7.9% found it inadequate. The result is illustrated in Figure 12.
The literature supports the notion that students learn better and more effectively in a playful environment and are more inclined to engage with topics that interest them. Regarding the structure of our software, we introduced a trophy system designed to sustain motivation and foster healthy competition. This section of our questionnaire assessed the effectiveness of motivational elements. A total of 82.3% of students found that these contents made math lessons more enjoyable and perceived them more as games than obligations. The complete statistics are depicted in Figure 13.
The subsequent questions in the second section assessed potential improvements in students’ mathematical knowledge. We asked whether students perceived improvements in their knowledge of basic operations and whether they were better able to grasp the rules of computation with the software’s assistance. An 80% majority of respondents felt that Learn with M.E. aided them in better understanding. The result is illustrated in Figure 14.
Consequently, we further investigated whether students noticed any positive changes in themselves during subsequent math lessons where the software was not used. A total of 61% of students believed they performed better. The result is illustrated in Figure 15.
A fundamental concept of Learn with M.E. is to allow students to solve tasks matching their own difficulty level at their own pace. The “auto” function integrated into the software serves to determine the appropriate difficulty level, automatically selecting the most suitable difficulty level based on students’ past performance. Our question focused on this functionality. A total of 50% of respondents utilized this function during calculations, and 80% of these students considered the difficulty level set by the software to be personalized. This is illustrated in Figure 16.
Based on our personal experiences, where several students approached us at the end of the initial demonstration sessions inquiring about how they could use the software at home, we examined in our questionnaire how many students actually utilize the software at home during their free time. Our question aimed to determine whether respondents had downloaded the software, and if not, whether they would seek assistance to download and use the program at home if they were unsure how to do so. In total, 40% of respondents expressed their willingness to learn or, in the words of a few students, “play” with Learn with M.E. at home.
In the final section of our questionnaire, students were invited to express their views on the software in their own words. Three questions were posed to the students, addressing the positive and negative aspects of the software, as well as any missing features desired by the students. Numerous responses were collected and grouped into categories. Most comments regarding the positive features of Learn with M.E. focused on the collection of trophies, interactive audio and text message features, the manual and automatic adjustment of difficulty levels, specific error warnings, access to full step-by-step solutions, and multilingual support. During the surveys, we personally experienced the usefulness of multilingual support, as there were students who preferred to use the software in Slovak because they spoke it at home, while others, such as Far Eastern and Ukrainian students, preferred English due to difficulties with the Hungarian language. Fewer negative comments were directed toward the software, with some specifically mentioning the software’s design, particularly the lack of full-screen mode. Some students, unlike others, were bothered by the fact that the software spoke to them. However, this feature can be toggled on and off, so we do not attribute great significance to it. Additionally, some students perceived the mandatory use of step-by-step solutions as a negative aspect. Nevertheless, we emphasized the importance of this function multiple times with our teaching colleagues. Suggestions for enhancing the software mainly revolved around increasing the number of trophies, but some students also suggested expanding the program with other mathematical topics and online or test functions. Below, we present a few specific responses we received from the students. These data are summarized in Table 1.
In summary, the students positively received the software and its user interface. The motivational elements contributed to an enhanced learning experience. The practical usability of the application’s design features outlined in RQ1 proved adequate in practice. Regarding RQ2, the validation of the software did not necessitate substantial technical preparation. Learn with M.E. was field-tested on local school networks. In institutions lacking pre-established shared network partitions, the teacher’s computer was utilized to locally distribute the application to students’ computers. Work conducted in this manner, along with digital workbooks, was archived on the teacher’s computer.

5.2. Evaluation of Test Results Examined with Control Groups

The effectiveness of Learn with M.E. was examined using control groups. A total of N = 106 students participated in the study (51 using the software and 55 as a control). Participants were selected from the eighth and ninth grades of primary schools, as these age groups have learned all the mathematical concepts and topics covered by Learn with M.E. The learning and teaching methods of the group using the software have been previously described. The control-group students participated in formal education. During theoretical lessons, the entire group followed textbook chapters. In practical sessions, no digital or other technological aids were used. All students completed the same set of tasks, with no personalized assignments applied. Students who asked questions and required teacher assistance naturally received it. Practical sessions were followed by testing. Both groups had to solve a 12-item problem set. The test included a mix of addition, subtraction, multiplication, division, and mixed-bracket problems and also assessed knowledge of remainder division and negative numbers. Students could achieve a maximum of 12 points on the test. Final evaluations were determined according to the ISCED2 standards and local assessments applicable to the upper grades of primary school mathematics as follows [32]. The evaluation we used is illustrated in Table 2.
Unlike in many other countries, our educational policy dictates that lower numerical grades indicate better performance. This plays a significant role in our subsequent analysis.
During data evaluation, we utilized an 18-column table that included the following: student ID, score for each task, total score, percentage score, grade achieved on the test, evaluation, and the variable distinguishing the groups. The analyses were conducted using IBM SPSS Statistics software.
As the first step, we performed a frequency analysis to demonstrate the performance of each group on the test. The control-group students achieved an overall score of 58%, while the students using Learn with M.E. achieved 69%. Among the students using the software, eight submitted excellent tests, with three achieving 100%. Eighteen students performed very well. Twenty-one students achieved good results, while two received satisfactory and two received unsatisfactory evaluations. In the control group, eight students also performed excellently. Twelve students achieved very good evaluations, seventeen students achieved good results, twelve students only achieved satisfactory results, and six students failed. In Table 3 and Table 4, the percentage distribution of each group is illustrated.
Following further investigation of the scores achieved on individual tasks, it was observed that students using Learn with M.E. performed better in calculating bracketed and mixed tasks with 76% accuracy, compared to 56% accuracy by the control-group students in such tasks. In the course of the study, we aimed to examine whether there was a significant difference in grades between the two groups. Accordingly, we first examined the normality of the data. As grades did not follow a normal distribution, the Mann–Whitney U test was applied to examine the difference in grades.
H0: 
There is no significant difference in grades between students using and not using the software.
H1: 
There is a significant difference in grades between students using and not using the software.
Since the p-value from the Mann–Whitney U test was below the conventional 0.05 significance level (p = 0.030), we rejected the null hypothesis: there is a statistically significant difference in grades between students who used the software and those who did not.
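For readers who wish to reproduce this comparison outside of SPSS, the test amounts to a single call in Python. The grade lists below are hypothetical placeholders, since the raw study data are not published.

```python
from scipy.stats import mannwhitneyu

# Hypothetical placeholder grades (1 = best, 5 = worst, as in our scale).
grades_software = [1, 2, 1, 3, 2, 1, 2, 2]
grades_control = [2, 3, 2, 4, 3, 5, 2, 3]

u_stat, p_value = mannwhitneyu(grades_software, grades_control,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")  # the study obtained p = 0.030
```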
Our further aim was to investigate whether there is a statistically demonstrable correlation between the grades of students and usage of the software.
H0: 
There is no correlation between software usage and grades.
H1: 
There is a correlation between software usage and grades.
Since grades did not follow a normal distribution, Spearman’s rho correlation test was employed. The results are illustrated in Table 5 below.
These findings suggest that, although the correlation is weak, there is a significant relationship between students’ grades and their usage of the software. Consequently, we can infer a negative correlation between software use and grades, meaning that students learning with Learn with M.E. achieved lower grades. As previously noted, in our evaluation system, lower grades indicate better performance. Thus, based on the correlation analysis, it can be inferred that students using the software generally achieved better results. To better understand the factors at play in this process, further investigations are necessary, including examining other factors, such as the impact of personalized consultations, the frequency of Learn with M.E. usage by students, and student motivation levels during learning, as well as their personal backgrounds and abilities.
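The same correlation can be computed outside of SPSS as follows; the group coding and grade values are again hypothetical placeholders. Because lower grades mean better performance in this grading system, a negative rho is what indicates that software users tended to earn better grades.

```python
from scipy.stats import spearmanr

usage = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = used the software, 0 = control
grades = [1, 2, 1, 2, 3, 2, 4, 3]  # 1 = best grade, 5 = worst
rho, p_value = spearmanr(usage, grades)
print(f"rho = {rho:.3f}, p = {p_value:.3f}")
```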
In the preceding subsection, we presented students’ feedback regarding the software. The majority, 82%, found the built-in motivational elements enjoyable, with many associating them with play. Consequently, we examined whether there is any measurable relationship between the grades and motivation of students who took the test and used the software.
H0: 
There is no significant relationship between the grades and motivation of students using the software.
H1: 
There is a significant relationship between the grades and motivation of students using the software.
Using Spearman’s rho correlation test, we detected a weak negative correlation (rs = −0.224). However, the p-value exceeded the 0.05 significance level (p = 0.114), rendering the relationship non-significant. Therefore, we retained the null hypothesis. Factors other than motivation also contributed to the grades of students using the software. Similarly, when examining the relationship between personalized consultations and grades, we observed a weak and non-significant correlation (rs = −0.135, p = 0.345). Subsequently, we reviewed and compared the test sheets of students using the software, the Learn with M.E. digital workbook associated with each student, and our archive for those who received personalized consultations. In total, we identified 23 such students. Among them, 17% showed no improvement during testing, 26% showed partial improvement, and the remaining 57% showed complete improvement. The last group correctly solved examples similar to personalized practice tasks. This distribution is presented in Figure 17.
It is important to note that the correlation analysis for the examination of motivation and personalized consultations was conducted on a relatively small sample size, N = 51, consisting of students using the software and participating in the testing. The total dataset comprises over 300 students; however, we did not work with control groups beyond the eighth and ninth grades. One of the future research directions includes conducting tests among younger age groups, which would enable further investigations, such as examining the relationship between the time spent using the software and grades.

5.3. Teachers’ Views on Learn with M.E.

We were also interested in the opinions of the teachers participating in the survey. Among the five educational institutions, we independently conducted student engagement and testing for both the software-using and control groups at one institution. At the remaining four institutions, mathematics instructors who taught the respective students assisted us. The teachers were involved throughout the entire process. Initially, they participated in technical demonstrations prior to the first joint lesson and during the software’s debut lessons, and later, on days when we were not personally present at the institutions, they guided the software-using students. They participated in analyzing feedback on the software and in personalized classroom consultations. With their help, we could better understand the mathematical background of each student, identify those who may struggle with computations, and provide more appropriate suggestions for them during software usage. Therefore, we eagerly welcomed their observations and experiences. The instructors took part in interviews combined with a SWOT analysis, where they responded to various questions focusing on the strengths, weaknesses, opportunities, and potential threats of Learn with M.E. They agreed to allow us to use their names when processing their responses during the evaluation. Accordingly, Table 6 presents the general information of the teachers (referred to as “Ti”) participating in the survey.
Table 7 sets out the questions we asked and the answers we received from each of the teachers.
The results from interviews with educators addressed all three of our research questions effectively. Teachers found the built-in features of Learn with M.E. to be straightforward and clear. Validating the application did not require extensive pre-training from their side, and no significant technical issues were identified that could potentially impact the software’s functionality. The application was recognized as an effective educational tool that tangibly facilitated differentiated instruction.
From the surveys, interviews, and observations, we gleaned various insights. It is important to note that these teachers’ evaluations are subjective. Nonetheless, they are significant to us, as they reflect the results of the program’s implementation in real classroom environments. The surveyed teachers acknowledge Learn with M.E. as a tool that motivates and engages students, in a playful manner, in practicing mathematical operations at personalized difficulty levels. Additionally, it aids in differentiated instruction by highlighting specific deficiencies of individual students through the archives of their learning logs.
Observations and interviews also supported the notion that the motivational tools provided by Learn with M.E. made students more persistent in calculations. When faced with challenging tasks, they did not give up but persisted in their attempts to earn the trophies meant for successful calculations. Excited exclamations such as “yes!” and “got it!” were common among students using Learn with M.E. They eagerly showed their screens to teachers and peers when they received correct answers and unlocked new customizable features in the main menu. The software intentionally incorporates motivational elements that foster a healthy competitive environment, with trophies correlating with achievements. The more trophies a user has, the more vibrant their application’s main screen becomes (featuring colorful, costumed characters, bronze, silver, and gold medals, demonstrating the user’s rank). Teachers acknowledged Learn with M.E. as a valuable resource that motivates children and fosters their commitment to learning mathematics. They appreciated that children could navigate the application easily and independently, enabling them to direct their own learning. This, in turn, allows teachers to engage in other activities with those students who require small-group or individual instruction.

5.4. The Accuracy of Learn with M.E.

Learn with M.E. version 1.0 was introduced and applied for the first time in May and June 2023. In the initial survey, a total of three educational institutions and 92 students participated. Utilizing the software, students attempted to solve a total of 4162 examples. Version 1.0 received positive feedback from both teachers and students; however, we encountered several errors and deficiencies. We endeavored to rectify these errors as quickly as possible, and by the time versions 1.1 and 1.2 were deployed in September, these issues were no longer detected. Subsequent versions expanded with more features for better user experience and easier usability. Furthermore, through analyzing digital workbooks, we identified computational errors from hundreds of students, further enriching the database of patterns. Some errors seemed almost customary, committed by students from various educational institutions who had never met each other and were taught mathematics by different educators. Based on these analyses, version 1.6 was developed in October 2023, capable of handling many more errors than previous versions. As a result, students using Learn with M.E. received immediate feedback and guidance more frequently when making mistakes. Currently, in February 2024, we are working with version 1.7, which only expands on the audio features compared to version 1.6. Therefore, we consider the accuracy of Learn with M.E. from the application of version 1.6 onwards. In the previous sub-sections, we presented the user experience of the latest version, 1.7, from the perspectives of students and teachers. Initially, we aimed to introduce and test the educational software in multiple educational institutions. Later, we spent more time examining the potential of Learn with M.E. for individual students in each institution. At the time of writing this paper, a total of 12,070 tasks have been generated, in addition to the 4162 examples from May and June 2023. Students correctly solved 70% of the generated examples, while in the remaining 30%, they provided incorrect answers. One of Learn with M.E.’s tasks was to analyze these erroneous cases to help students understand the reasons for their mistakes, thus preventing them from repeating those mistakes in the future. However, we and the educators had to further investigate this 30% because, in most cases, students did not struggle with computational difficulties; rather, they intentionally provided incorrect answers, or there was no logically connectable pathway between the generated example and the incorrect result. Summarizing the evaluation of digital workbooks, only 38% (1324 out of 3501) of the incorrect examples were valid, from which the cause of the problem could be deduced. We filtered out invalid incorrect responses based on the following criteria (a simplified sketch of this filtering follows the list):
  • Filtering intentionally incorrect responses: Students who intentionally provided incorrect answers are filtered out.
  • Filtering repeated occurrences of the same errors: In cases of incorrect results, students are allowed to recalculate and correct their answers. However, some students disagreed with the program’s decision and attempted to submit the same incorrect result multiple times.
  • Filtering illogical results: Cases where a student appears to have genuinely attempted the problem, but neither the result nor the deduction process can be explained, even after recalculation. Such results are neither intentionally incorrect nor logically connected to any valid calculation pathway.
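To make these criteria concrete, the following minimal Python sketch illustrates how a log of responses could be screened. The field names and the upstream annotations (whether an answer was flagged as intentional, or whether any logical pathway could be reconstructed for it) are hypothetical; this is not the production code of Learn with M.E.

from collections import Counter

def filter_valid_errors(attempts):
    """Keep only incorrect responses whose cause can be analyzed.

    Each attempt is a dict with assumed fields: student_id, task_id,
    answer, is_correct, flagged_intentional (annotated upstream), and
    has_plausible_pathway (whether any logical route connects the task
    to the given answer, e.g., established during review).
    """
    valid, invalid = [], []
    resubmissions = Counter()  # identical wrong answers per student and task
    for a in attempts:
        if a["is_correct"]:
            continue
        key = (a["student_id"], a["task_id"], a["answer"])
        resubmissions[key] += 1
        if a["flagged_intentional"]:            # criterion 1: deliberately wrong
            invalid.append(a)
        elif resubmissions[key] > 1:            # criterion 2: same wrong answer resubmitted
            invalid.append(a)
        elif not a["has_plausible_pathway"]:    # criterion 3: illogical result
            invalid.append(a)
        else:
            valid.append(a)
    return valid, invalid

In our data, screening of this kind left 1324 of the 3501 incorrect responses as valid.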
Out of the valid incorrect responses, Learn with M.E. provided accurate suggestions to students for error correction 1219 times, accounting for 92% of cases. The remaining 8% represents instances where Learn with M.E. did not recognize the source of the error. Currently, we are working on further reducing this value based on feedback from the latest data. The results of this evaluation are illustrated in Figure 18.
As we can observe, Learn with M.E. assisted students over 1200 times in real time, endeavoring to guide them toward the correct calculation path. This also implies that it provided us, the educators, with over 1200 specific observations, enabling us to personalize advice and tasks for students during consultations. More proficient learners were able to progress at their own pace during the lesson. Some sought challenges, while teachers mainly dealt with those who required assistance.
The workbooks digitally archived by Learn with M.E. clearly showed the computational difficulties that individual students encountered. During data analysis, we categorized the errors of students using the software into three groups (a classification sketch follows the list).
  • Complete improvement: This category includes students who learned the computational rules while using the software. In their case, the errors in question did not recur in subsequent similar tasks. Learn with M.E. provided immediate advice to a total of 61 such students, effectively substituting for the instructor by drawing their attention to inaccuracies and the causes of errors.
  • Improving trend, but with occasional recurring errors: Students in this group followed the software’s warnings to correct their mistakes; however, after some time (for example, k tasks later or in a later session), the same error recurred. Upon recurrence, the software reiterated the cause of the error to them. Through this analysis, we identified a total of 44 such students.
  • No improvement: This category encompasses students who, despite using the software, were unable to grasp the individual rules. Their calculations were incorrect, and after an error they did not attempt to solve a similar problem. It also includes students who, following incorrect calculations, did not examine the causes of their errors or use the step-by-step deduction, and instead simply moved on. Learn with M.E. flagged a total of 17 students in this category.
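For illustration, this categorization can be expressed as a simple rule over each student’s chronological outcomes for a given error type. The data layout below is an assumption for the sketch, not the software’s actual analysis code.

def categorize(error_events):
    """Classify one student's history for a single error type.

    `error_events` is a chronological list of booleans: True means the
    error occurred on a task, False means a similar task was later
    solved without it. (Assumed layout, for illustration only.)
    """
    if True not in error_events:
        return "never made this error"
    first = error_events.index(True)
    later = error_events[first + 1:]
    if not later:
        return "no improvement"  # no further similar task was attempted
    if True not in later:
        return "complete improvement"  # the error never recurred
    if False in later:
        return "improving trend, occasional recurrence"
    return "no improvement"  # the error recurred on every subsequent attempt

# Example: the error appears, is corrected, then recurs once.
print(categorize([True, False, False, True, False]))
# -> "improving trend, occasional recurrence"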
Figure 19 illustrates the frequency of improving, decreasing, and persistent error rates with the help of the application.
Based on observations from Learn with M.E. and reviews by instructors, 25% of students required personalized assistance to catch up with their peers in a timely manner. Over the past few months, we prepared personalized tasks for a total of 84 students. This group includes the 61 students flagged by Learn with M.E., namely those with recurring errors and those who showed no improvement at all; the remaining 23 were selected based on errors not detected by the software but identified by us upon review. The overwhelming majority of computational difficulties involved errors in the sequence of calculations, accounting for 61% of cases. This was followed by issues with negative numbers and omitted signs, as well as inadequate knowledge of multiplication tables. Errors in the mixing of arithmetic operators and confusion in mixed tasks were the least frequent. The complete distribution of error frequencies is illustrated in Figure 20.
In total, 145 students received tutoring using Learn with M.E. Out of these, 61 received real-time assistance from the software, which helped improve their results. Additionally, personalized consultations were conducted with 84 students, of which 72.5% were also identified by Learn with M.E. The remaining 182 students, who did not require significant support, were able to progress at their own pace without being held back by others.
Our findings regarding RQ3 indicate that Learn with M.E. has proven to be an effective tool in the learning and teaching processes. This is supported by the significant differences measured between the test results of the experimental and control groups, by the individual development documented in our digital archive, and by feedback from both students and teachers indicating that it facilitated personalized learning.

6. Discussion

Our study examined the structure, operation, and applicability of the Learn with M.E. educational software, which facilitates personalized learning in basic mathematics. Our surveys indicated that the application could be a promising and much-needed tool for boosting personalized education.
Throughout our work, we placed significant emphasis on the steps of software development. Similar to the comparative analysis by Outhwaite et al., we considered the examination of the application’s design features to be essential [9]. During development, we aimed to integrate the functions essential for software that supports personalized learning: continuous feedback to students and teachers, individual route planning facilitated by leveling, and the maintenance of motivation through the collection of playful trophies in our achievement system.
Several studies have also explored the possibilities offered by learning management systems (LMSs) and other digital educational tools. One of the fundamental principles of personalized and guided learning is reinforcement and continuous feedback. The students’ responses determine the next activity: those who answer correctly proceed to the next activity, while those who answer incorrectly return to the previous material. This allows students to learn at their own pace, ensuring the optimal achievement of learning outcomes [33,34]. Furthermore, research shows that children make more conscious decisions when using educational applications with explanatory feedback compared to those without feedback [14]. This was also borne out with Learn with M.E., as improvement was measurable in 61 students without teacher intervention in the learning process.
The different difficulty levels allowed students to solve tasks corresponding to their knowledge level, providing everyone with the opportunity to experience success while using Learn with M.E. Leveling proved to be an effective method for personalizing learning content, as previously discussed by other researchers [9,15,16,17,19]. During development, we aimed to set the programmed dynamic leveling in Learn with M.E. according to these guidelines so that the presented learning content would adapt to the child’s performance while using the application. Feedback from students confirmed that this was a useful feature.
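As a sketch of this leveling principle (with assumed streak thresholds; the exact rule programmed into Learn with M.E. is the one described by the flowchart in Figure 7), a dynamic leveler might promote a student after a run of correct answers and demote after consecutive errors:

def adjust_level(level, recent, min_level=1, max_level=5, up=3, down=2):
    """Return the next difficulty level from recent answer outcomes.

    `recent` is a list of booleans (True = correct), newest last. The
    streak lengths `up` and `down` are illustrative assumptions.
    """
    if len(recent) >= up and all(recent[-up:]):
        return min(level + 1, max_level)   # reward a streak of correct answers
    if len(recent) >= down and not any(recent[-down:]):
        return max(level - 1, min_level)   # ease off after repeated errors
    return level

# Example: two consecutive errors drop the student from level 3 to 2.
print(adjust_level(3, [True, False, False]))  # -> 2

Under such a rule, correct answers propel the student toward harder tasks, while repeated mistakes return them to easier ones, mirroring the proceed-or-return branching described above.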
Vasalou’s study also demonstrated that continuous motivation can be maintained with built-in games or rewards [28]. Similarly, Learn with M.E. engages students in calculations through challenges necessary to earn trophies and continuous feedback during calculations, making them perceive learning as a game rather than a duty.
During the testing phase, we took notes during on-site visits to schools. Based on these notes, we corrected identified errors and supplemented the software with new features. After each visit, we evaluated the software’s feedback by collecting data from its database, then prepared personalized tasks and carried out personalized work with students during the next on-site visit. We observed that students’ digital literacy increased while using the software. We had to address students’ inexperience, particularly with the placement of characters on the keyboard and with initial login issues, which at first slowed down the deduction process; these difficulties eased as students grew accustomed to the software. The software engaged the students, and teachers supported those who needed help in their learning activities. This is positive in that more adept students were not held back by weaker performers and could seek out ever more challenging tasks. At the same time, weaker students did not feel left behind, because they did not know exactly how many tasks their peers had completed. They could discover the various difficulty levels at their own pace and improve on their weaknesses with the support of the software and the teachers.
Testing with control groups showed significant differences in favor of students learning with Learn with M.E. We are aware that the survey with control groups was conducted on a relatively small scale, so to generalize our results, it is advisable to conduct testing on a larger scale and with different age groups in the future.
In addition to students, teachers also responded positively to the results achieved with Learn with M.E. In their view, the application makes differentiation easier, and every student can solve an unlimited number of tasks at their own pace. Using digital technology is more motivating than performing calculations in a notebook, and with continuous rewards and immediate feedback, it indicates the correctness of solutions to students. It helps in understanding and correcting errors and provides clear feedback to teachers on individual students’ progress, thereby facilitating the creation of more personalized practice tasks.

7. Conclusions

Artificial intelligence and smart learning tools are shaping the future of education. In an era of evolving technologies, education must also adapt to changes and leverage tools that serve the development and success of students. Personalized education creates a future where every student has the opportunity to fully realize their potential. By considering individual needs and skills, students can learn much more effectively.
Our surveys show that both students and teachers gained enriched experiences through the use of the application. Nearly 150 students received personalized tutoring. According to their own accounts, the majority of students ended the survey period with better mathematical knowledge. These self-reports were confirmed by tests, as they achieved better results than their peers in the control groups. Our third hypothesis, that there is a significant difference between students using the software and those who did not, was supported. Furthermore, our fourth hypothesis, that there is a significant relationship between use of the software and the grades achieved on tests, was also confirmed. Learn with M.E. made learning more enjoyable, as some students perceived it as a game. According to teachers, our software facilitated differentiated instruction. Students were able to progress at their own pace: weaker students did not slow down more proficient ones, and they received immediate guidance from the software and from us, the instructors, who could clearly identify struggling students and the precise causes of their computational difficulties.
In summary, Learn with M.E. can identify individual learning deficiencies based on embedded error patterns. Its interactive reward system motivates students. Through embedded summative assessments, it automatically adjusts the difficulty level for each student. By recording and reviewing the complete computational log, it simplifies the understanding of individual student needs, thus enabling easier and faster customization of teaching.
The results presented in this article will constitute a part of our future dissertation.

Author Contributions

Conceptualization, N.A.; methodology, N.A.; software, N.A.; validation, N.A.; formal analysis, N.A.; data curation, N.A.; writing—original draft preparation, N.A.; writing—review and editing, N.A.; visualization, N.A.; supervision, T.K. All authors have read and agreed to the published version of the manuscript.

Funding

The research was supported by grant BGA/145/2024 “Support for the J. Selye University’s research activities—social science research on Hungarians in Slovakia” provided by the Government of Hungary.

Institutional Review Board Statement

All subjects gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the REGULATION (EU) 2016/679 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

We would like to express our gratitude to the leaders of educational institutions, mathematics and computer science teachers, and students who actively participated in the testing of the Learn with M.E. application. We thank J. Selye University and BGA for their support. Last but not least, we appreciate the valuable insights from the publisher’s staff regarding the research, which have enhanced the quality of the completed research article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Tapalova, O.; Zhiyenbayeva, N.; Gura, D. Artificial Intelligence in Education: AIEd for Personalised Learning Pathways. Electron. J. e-Learn. 2022, 20, 639–653.
  2. Sun, N.; Li, K.; Zhu, X. Action Research on Visualization Learning of Mathematical Concepts Under Personalized Education Idea: Take Learning of Geometrical Concepts of Elementary Math for Example. In Blended Learning: Aligning Theory with Practices; Cheung, S., Kwok, L.F., Shang, J., Wang, A., Kwan, R., Eds.; Springer: Cham, Switzerland, 2016; Volume 9757, pp. 367–377.
  3. National Center for Education Statistics (NCES). NAEP Data Explorer. The Nation’s Report Card. Available online: https://www.nationsreportcard.gov/ndecore/landing (accessed on 3 May 2023).
  4. Nores, M.; Barnett, S. Access to High-Quality Early Care and Education: Readiness and Opportunity Gaps in America; Center for Enhancing Early Learning Outcomes and National Institute for Early Education Research: New Brunswick, NJ, USA, 2014.
  5. Reardon, S.F. The widening income achievement gap. Educ. Leadersh. 2013, 70, 10–16.
  6. Dixon, F.A.; Yssel, N.; McConnell, J.M.; Hardin, T. Differentiated instruction, professional development, and teacher efficacy. J. Educ. Gift. 2014, 37, 111–127.
  7. Goddard, Y.; Goddard, R.; Kim, M. School instructional climate and student achievement: An examination of group norms for differentiated instruction. Am. J. Educ. 2015, 122, 111–131.
  8. Bang, H.J.; Li, L.; Flynn, K. Efficacy of an Adaptive Game-Based Math Learning App to Support Personalized Learning and Improve Early Elementary School Students’ Learning. Early Child. Educ. J. 2023, 51, 717–732.
  9. Outhwaite, L.; Early, E.; Herodotou, C.; Van Herwegen, J. Understanding how educational maths apps can enhance learning: A content analysis and qualitative comparative analysis. Br. J. Educ. Technol. 2023, 54, 1292–1313.
  10. Zhou, F. Personalized Learning Network Teaching Model. Phys. Procedia 2012, 24, 2026–2031.
  11. Nickow, A.; Oreopoulos, P.; Quan, V. The Impressive Effects of Tutoring on Prek-12 Learning: A Systematic Review and Meta-Analysis of the Experimental Evidence; Working Paper 27476; National Bureau of Economic Research: Cambridge, MA, USA, 2020.
  12. Ruan, S.; Nie, A.; Steenbergen, W.; He, J.; Zhang, J.; Guo, M.; Liu, Y.; Nguyen, K.; Wang, C.; Ying, R.; et al. Reinforcement Learning Tutor Better Supported Lower Performers in a Math Task. Mach. Learn. 2023, 113, 3023–3048.
  13. Magomadov, V.S. The application of artificial intelligence and Big Data analytics in personalized learning. J. Phys. Conf. Ser. 2020, 1691, 012169.
  14. Blair, K.P. Learning in critter corral: Evaluating three kinds of feedback in a preschool math app. In Proceedings of the 12th International Conference on Interaction Design and Children, New York, NY, USA, 24–27 June 2013; pp. 372–375.
  15. Hsin, C.T.; Wu, H.K. Using scaffolding strategies to promote young children’s scientific understandings of floating and sinking. J. Sci. Educ. Technol. 2011, 20, 656–666.
  16. Magliaro, S.G.; Lockee, B.B.; Burton, J.K. Direct instruction revisited: A key model for instructional technology. Educ. Technol. Res. Dev. 2005, 53, 41–55.
  17. Kucirkova, N. Personalized learning with digital technologies at home and school: Where is children’s agency? In Mobile Technologies in Children’s Language and Literacy: Innovative Pedagogy in Preschool and Primary Education; Oakley, G., Ed.; Emerald Publishing: Bingley, UK, 2018; pp. 133–155.
  18. Schenke, K.; Redman, E.J.; Chung, G.K.; Chang, S.M.; Feng, T.; Parks, C.B.; Roberts, J.D. Does “measure up!” measure up? Evaluation of an iPad app to teach preschoolers measurement concepts. Comput. Educ. 2020, 146, 103749.
  19. Vandewaetere, M.; Clarebout, G. Advanced technologies for personalized learning, instruction, and performance. In Handbook of Research on Educational Communications and Technology; Springer: Berlin/Heidelberg, Germany, 2014; pp. 425–437.
  20. Ismail, H.M.; Harous, S.; Belkhouche, B. Review of personalized language learning systems. In Proceedings of the 2016 12th International Conference on Innovations in Information Technology (IIT), Al Ain, United Arab Emirates, 28–30 November 2016; pp. 1–6.
  21. Colace, F.; De Santo, M.; Vento, M. A Personalized Learning Path Generator Based on Metadata Standards. Int. J. e-Learn. 2005, 4, 317–335.
  22. Palma Rosas, G.A.; Polo, M.J.A.; Hermosilla, A.S.; Delgado, A.; Huamaní, E.L. Development of a Web System to Improve and Reinforce Learning in Mathematics in Primary and Secondary Students in Peru. Int. J. Recent Innov. Trends Comput. Commun. 2023, 11, 51–58.
  23. Nie, A.; Reuel, A.K.; Brunskill, E. Understanding the Impact of Reinforcement Learning Personalization on Subgroups of Students in Math Tutoring. In Proceedings of the Artificial Intelligence in Education; Wang, N., Rebolledo-Mendez, G., Dimitrova, V., Matsuda, N., Santos, O.C., Eds.; Springer: Cham, Switzerland, 2023; Volume 1831, pp. 793–798.
  24. Mandel, T.; Liu, Y.-E.; Levine, S.; Brunskill, E.; Popovic, Z. Offline policy evaluation across representations with applications to educational games. In Proceedings of the 2014 International Conference on Autonomous Agents and Multiagent Systems (AAMAS), Paris, France, 5–9 May 2014; pp. 1077–1084.
  25. Bassen, J.; Balaji, B.; Schaarschmidt, M.; Thille, C.; Painter, J.; Zimmaro, D.; Games, A.; Fast, E.; Mitchell, J.C. Reinforcement learning for the adaptive scheduling of educational activities. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI), Honolulu, HI, USA, 25–30 April 2020; pp. 1–12.
  26. Zhou, G.; Azizsoltani, H.; Ausin, M.S.; Barnes, T.; Chi, M. Hierarchical reinforcement learning for pedagogical policy induction. In Proceedings of the 20th International Conference on Artificial Intelligence in Education (AIED), Chicago, IL, USA, 25–29 June 2019; Springer: Cham, Switzerland, 2019; Part I, Volume 20, pp. 544–556.
  27. Griffith, S.F.; Hagan, M.B.; Heymann, P.; Heflin, B.H.; Bagner, D.M. Apps as learning tools: A systematic review. Pediatrics 2020, 145, e20191579.
  28. Vasalou, A. Reflections on Personalized Games-Based Learning: How Automation Is Shaped Within Everyday School Practices. IEEE Technol. Soc. Mag. 2022, 41, 64–67.
  29. Rupp, A.A.; Gushta, M.; Mislevy, R.J.; Shaffer, D.W. Evidence-centered Design of Epistemic Games: Measurement Principles for Complex Learning Environments. J. Technol. Learn. Assess. 2010, 8, 1–45. Available online: http://www.jtla.org (accessed on 1 April 2022).
  30. Herholdt, R.; Sapire, I. An error analysis in the early grades mathematics—A learning opportunity? S. Afr. J. Child. Educ. 2014, 4, 42–60.
  31. Nelson, G.; Powell, S.R. Computation Error Analysis: Students With Mathematics Difficulty Compared To Typically Achieving Students. Assess. Eff. Interv. 2018, 43, 144–156.
  32. Ministerstvo Školstva, Výskumu, Vývoja a Mládeže Slovenskej Republiky—Vzdelávacie Štandardy pre 2. Stupeň ZŠ [Ministry of Education, Research, Development and Youth of the Slovak Republic—Educational Standards for Lower Secondary Schools]. Available online: https://www.minedu.sk/vzdelavacie-standardy-pre-2-stupen-zs/ (accessed on 10 November 2022).
  33. Munawaroh, N. The Influence of Teaching Methods and Learning Environment to the Student’s Learning Achievement of Craft and Entrepreneurship Subjects at Vocational High School. Int. J. Environ. Sci. Educ. 2017, 12, 665–678.
  34. Suartama, I.K.; Triwahyuni, E.; Suranata, K. Context-Aware Ubiquitous Learning Based on Case Methods and Team-Based Projects: Design and Validation. Educ. Sci. 2022, 12, 802.
Figure 1. The graphical user interface of Learn with M.E.
Figure 2. Graphical assistance for younger students and those struggling with calculations.
Figure 3. Sample of error possibilities detected by Learn with M.E.
Figure 4. The window for theoretical materials and sample problems integrated into Learn with M.E.
Figure 5. Trophies in Learn with M.E.
Figure 6. The interface to load the digital student workbook.
Figure 7. A flowchart describing the automatic difficulty-level setting.
Figure 8. A flowchart describing how the immediate check works.
Figure 9. The distribution of classes using the software and participating in the survey.
Figure 10. Students’ perception of the general user experience of Learn with M.E.
Figure 11. Assessment of software usability.
Figure 12. Assessment of adequacy of pre-usage guidance for software use.
Figure 13. Magnitude of motivation sustained by trophies.
Figure 14. Improvement in students’ mathematical knowledge after using the software.
Figure 15. Positive changes observable in math lessons after using the software.
Figure 16. A description of the automatic difficulty level function.
Figure 17. The results of students who received personalized tutoring.
Figure 18. The accuracy of Learn with M.E. version 1.7.
Figure 19. Rate of transient, diminishing, and permanent errors after using Learn with M.E.
Figure 20. Frequency of errors focusing on personalized consultations based on Learn with M.E. observations.
Table 1. Students’ opinions on Learn with M.E.

Question: Which features or details did you like most about the educational software, and why?
  • “I liked everything very much.”
  • “Basically, I liked the whole software, I can’t highlight any main details. In one word, it’s fantastic.”
  • “The crowns and levels.”
  • “When fireworks appeared when I answered the question correctly.”
  • “The competition for trophies, adjustable difficulties, I can see what I calculated.”
  • “I liked that I had to collect trophies and that I could also speak.”
  • “You could change to several languages.”
  • “The bracket tasks because I can calculate better and faster now.”
  • “The bracket tasks because I can calculate more easily.”
  • “It tells me exactly what my mistake is, so I can correct it more easily.”
  • “I liked everything very much, it’s very well done, I learned to calculate better.”
  • “That it showed my mistakes and that you can get trophies.”
  • “I liked the auto button very much because I got personalized calculations and it also showed what was wrong in the example.”
  • “The auto button because at least I can learn at the difficulty level that matches my own knowledge.”

Question: Is there anything in the educational software that you didn’t like or that you think needs improvement?
  • “They used the “*” 1 character for multiplication, which might be confusing.”
  • “I actually like the whole idea, but I don’t really like learning programs like this, I prefer to sit down with my notebook and write.”
  • “I didn’t like that I had to write down the deduction because it didn’t let me continue without it. It would be good if it allowed you to continue without deduction.”
  • “I didn’t like that you had to fill out the deduction process.”
  • “The speech.”
  • “It should be a bit more minimalistic and the interface should be easier to handle.”
  • “I think there’s nothing that needs improvement, I liked everything.”
  • “I think it’s perfect. Hats off to the creator!”
  • “There was nothing in it that I didn’t like.”

Question: Imagine you are the developer of the educational software. What new features would you add to the software?
  • “Rewards from which you can ask for help.”
  • “It would be nice to be able to earn more trophies.”
  • “If you have more trophies, you can get more things, for example, solve a problem for free.”
  • “There could be a rank order so you have to move up the rank order.”
  • “I couldn’t think of anything else because I like it this way, but maybe playing against others, for example, timed and being able to choose the difficulty level that suits both.”
  • “An animated character next to the voice, it would make it more interesting.”
  • “Two people in one group and they would compete to see who can calculate the most correctly.”
  • “I would add that you could play online in the google search.”
  • “Fractions.”
  • “Pythagorean theorem.”
  • “Algebra, exponentiation.”
  • “Word problems.”

1 We used the “*” symbol to denote multiplication because on the numeric keypad, which is most commonly used for describing the calculation process, the “*” symbol also represents the multiplication operation.
Table 2. The assessment system we use.

Percent     Assessment     Grade
100%–90%    Excellent      1
89%–75%     Laudable       2
74%–50%     Good           3
49%–30%     Satisfactory   4
29%–0%      Failed         5
Table 3. The frequency of results for students using the software.

% Achieved on the Test    Frequency    Percent    Valid Percent    Cumulative Percent
16.67%                    2            3.9        3.9              3.9
33.33%                    1            2.0        2.0              5.9
41.67%                    1            2.0        2.0              7.8
50.00%                    3            5.9        5.9              13.7
58.33%                    12           23.5       23.5             37.3
66.67%                    6            11.8       11.8             49.0
75.00%                    11           21.6       21.6             70.6
83.33%                    7            13.7       13.7             84.3
91.67%                    5            9.8        9.8              94.1
100.00%                   3            5.9        5.9              100.0
Total                     51           100.0      100.0
Table 4. The frequency of results for the control group.

% Achieved on the Test    Frequency    Percent    Valid Percent    Cumulative Percent
8.33%                     1            1.8        1.8              1.8
16.67%                    1            1.8        1.8              3.6
25.00%                    4            7.3        7.3              10.9
33.33%                    11           20.0       20.0             30.9
41.67%                    1            1.8        1.8              32.7
50.00%                    5            9.1        9.1              41.8
58.33%                    7            12.7       12.7             54.5
66.67%                    5            9.1        9.1              63.6
75.00%                    8            14.5       14.5             78.2
83.33%                    4            7.3        7.3              85.5
91.67%                    5            9.1        9.1              94.5
100.00%                   3            5.5        5.5              100.0
Total                     55           100.0      100.0
Table 5. The results of the correlation analysis.

                                     Group
Grade    Correlation Coefficient    −0.212
         Sig. (2-tailed)            0.029
         N                          106
Table 6. Teachers participating in the survey.

Tag    Name                Title                             Pairing                    Teaching Experience (Years)
T1     Holocsi József      Mgr. (Master)                     Mathematics–Informatics    6
T2     Varga Veronika      Mgr., Ing. (Master, Engineer)     Mathematics–Informatics    15
T3     Oroszlány József    PaedDr. (Doctor of Pedagogy)      Mathematics–Geography      23
T4     Fördős Gyula        Mgr. (Master)                     Mathematics–Physics        38
Table 7. Teachers’ opinions on Learn with M.E.

Question: What positive feedback have you received from students during or after using the software?
T1: “The software is very clear and easy to use.” “The students really enjoyed using the program and even asked when they could use it again.”
T2: “Easy to use, understandable.” “Unlimited practice with setting the appropriate level of knowledge, automatic progress. They were motivated by collecting trophies.”
T3: “User-friendly and easy to handle from the user’s perspective.” “The students liked it a lot.”
T4: “Completely clear. The important things are at the top after logging in. It’s great that there are colorful icons on the interface, encouraging kids to click and gather more information. I really like the auto symbol. Special thanks for adding the Hungarian language. The info button is also practical.” “They really like it. They keep asking when we can use it again. They like that they can determine the difficulty of the tasks themselves, and if they’re not sure, there’s the auto button. Trophy collection is also a big hit.”

Question: In what ways does the software assist in personalized teaching of the curriculum?
T1: “It helps everyone work at their own pace.”
T2: “By selecting the difficulty level, everyone can find suitable tasks and tasks become gradually more challenging during practice, allowing them to progress at their own pace.”
T3: “They can practice and review the material more easily, and we can more easily identify their individual mistakes.”
T4: “Differentiated instruction can be implemented in practice. We don’t have to come up with examples, as the program generates them. We can identify those students who may need assistance earlier than expected. It highlights students’ weaknesses or strengths in the given material. Trophy collection helps them realize that you have to work for results, nothing is given for free. This goes beyond mathematics.”

Question: What are the advantages of the software compared to traditional mathematics education?
T1: “The program’s advantages include helping eliminate incorrect calculation sequences, as the program evaluates the error immediately after an incorrect answer and shows the user the mistake.”
T2: “Easier differentiation, own pace, unlimited number of tasks, more interesting and motivating than calculation in notebooks, immediate feedback on whether tasks were solved correctly, assistance in understanding and correcting errors.”
T3: “Both explanation and practice are needed. I see the main advantage in practice.”
T4: “We can solve many more, and qualitatively different, tasks than in class. I can focus on individual students because I can see what they don’t know or what they do know.”

Question: How does the software help increase student motivation?
T1: “The more correct answers a student provides, the more the program rewards progress with trophies, and the user can customize their profile figure in various ways.”
T2: “More interesting than calculation in notebooks, they receive rewards, trophies for correct calculations, and they can see their progress.”
T3: “Increases motivation.”
T4: “Competition is very important to them; they already ask how many trophies you have. Interestingly, they discuss what they didn’t know after the session, so they can work more effectively next time.”

Question: In your opinion, how does the software support teachers’ work and the teaching process?
T1: “Very well.”
T2: “Excellent for unlimited, varied tasks; assists in customization; provides feedback and summaries on solved tasks and errors, making it easier to identify what needs further review.”
T3: “A new educational tool that makes the teacher’s job more modern and easier.”
T4: “I think so, it shows students’ strengths and weaknesses in black and white. It teaches accuracy, attention to detail, and awareness of connections. For example, it displays each step in case of a wrong answer.”

Question: Does the software assist those students who need extra help? If yes, which features?
T1: “Yes. With artificial intelligence-based error checking.”
T2: “Yes, with the theoretical part, available anytime; with visual representation using fingers, with help window and solution derivation in case of incorrect solutions.”
T3: “With the auto function and rewards.”
T4: “Reaching different levels is a big motivation. Additionally, even if they choose easier examples, they still have a sense of accomplishment.”

Question: Does the software provide opportunities for students seeking extra challenges?
T1: “Yes.”
T2: “Yes. They can choose more difficult tasks according to their knowledge.”
T3: “Yes.”
T4: “I believe so. Precisely because of the levels, and also during trophy collection. Moreover, time is also an important factor.”

Question: What technical issues or bugs have you noticed that have not been addressed in previous updates?
T1: “There wasn’t any.”
T2: “I didn’t notice any issues.”
T3: “It failed to start on an older 32-bit computer.”
T4: “What I noticed has been fixed.”

Question: If so, have students reacted to any potential shortcomings or difficulties with the software?
T1: “There wasn’t any.”
T2: “No.”
T3: “No reports.”
T4: “I’m not aware of any.”

Question: Are there any features or content missing from the software that teachers or students deem necessary? If yes, what are they?
T1: “It is completely suitable for teaching basic operations and improving understanding of the rules.”
T2: “To expand with rounding tasks.”
T3: “Fractions, equations, units of measurement.”
T4: “Basic operations are covered. I’m not sure if there’s an option for others, like expressions, exponents, or roots.”

Question: What design or content limitations hinder the effective implementation of the software?
T1: “There isn’t any.”
T2: “There isn’t any.”
T3: “There isn’t any.”
T4: “So far, I haven’t observed any.”

Question: How does the software handle a large number of students or different student profiles?
T1: “Completely adequate.”
T2: “Good.”
T3: “Good.”
T4: “The problem for us is rather the lack of computers.”

Question: Do you believe there are any training needs for teachers to effectively use the software?
T1: “No.”
T2: “The introduction before use was sufficient for me.”
T3: “Very minimal.”
T4: “I think this question will be answered with broader application.”

Question: How might a weak internet connection affect the use of the software?
T1: “There is no effect of the internet on the program’s usage.”
T2: “It didn’t influence it.”
T3: “Low.”
T4: “No effect.”

Question: What further development opportunities do you see in the software to make mathematics education more effective?
T1: “More topics could be created (e.g., percentage calculations, combinatorics).”
T2: “To expand with rounding tasks.”
T3: “To expand the topics.”
T4: “To expand the topics.”

Question: How could the software be applied to or expanded for use in other subjects?
T1: “I can’t answer that right now.”
T2: “I don’t know offhand.”
T3: “To broaden the software program’s offerings.”
T4: “To answer this, more time is needed.”

Question: How could the software be more integrated into the school curriculum or educational processes?
T1: “I can’t answer that right now.”
T2: “It can be excellently used in every grade level in the appropriate topics.”
T3: “If every student in every school had access to digital devices.”
T4: “I can’t answer that yet.”

Question: How could the software be personalized to meet the needs of individual schools or teachers?
T1: “I can’t answer that right now.”
T2: “It’s adequate for me.”
T3: “The difficulty levels of the software are appropriate.”
T4: “Its versatility in usability, I think, requires more practice.”

Question: The software has been tested in multiple classes and grade levels. How could it be more effectively used for different age groups?
T1: “I can’t answer that right now.”
T2: “I don’t know.”
T3: “Generating worksheets and tests in printable and electronic formats.”
T4: “For instance, I will use it in plus activity.”

Question: If necessary, how could the software be used in distance learning?
T1: “I can’t answer that right now.”
T2: “No idea.”
T3: “To create an app for various user interfaces (Android, iOS, Windows devices).”
T4: “I don’t know yet.”

Question: What competitors or alternatives are there in the market that could be competitive with the software? Have you heard of or used similar software with similar features in your career? If yes, could you name these software?
T1: “I’m not aware of any.”
T2: “matek.ide.sk”
T3: “No.”
T4: “No.”

Question: What negative changes may occur in the future in mathematics education that could affect the software?
T1: “I’m not sure.”
T2: “Basic operations should always be known.”
T3: “Restrictions on digital devices in schools.”
T4: “The merger of the mathematics subject and the decrease in class hours.”

Question: Could the software have any negative effects on students’ studies? If you think so, please explain what they might be.
T1: “No.”
T2: “No.”
T3: “Not typical.”
T4: “No.”

Question: How could technological changes affect the current model or usability of the software?
T1: “I can’t answer that right now.”
T2: “With constant development, any changes can be addressed.”
T3: “The current model of the software is efficient, modern, and user-friendly.”
T4: “The development is so fast that it’s hard to predict.”

Question: Are there any data security or privacy concerns when using the software?
T1: “No.”
T2: “No.”
T3: “No.”
T4: “No.”