Article

Evaluating the Impact of an Educational Intervention Using Project-Based Learning on Postpandemic Recovery in Rural Colombia

by Mercedes Carmen Arrieta-Cohen 1, Luz Angela Torres-Arizal 1 and Ricardo León Gómez-Yepes 2,*

1 Center of Science and Technology of Antioquia (CTA), Medellín 050013, Colombia
2 College of Education, University of Antioquia, Medellín 050034, Colombia
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(12), 1341; https://doi.org/10.3390/educsci14121341
Submission received: 18 September 2024 / Revised: 5 November 2024 / Accepted: 8 November 2024 / Published: 9 December 2024
(This article belongs to the Section Curriculum and Instruction)

Abstract: This study evaluates the impact of a Project-Based Learning (PBL) intervention on postpandemic educational recovery in rural Colombia, focusing on student competencies in mathematics, language, science, and 21st-century skills. Conducted in rural schools, the intervention aimed to address significant learning gaps exacerbated by the COVID-19 pandemic by providing teacher training and direct student support. A pretest–posttest single-group design was used to assess the effectiveness of the intervention, with standardized tests measuring academic competencies and an analytical rubric evaluating 21st-century skills. The results indicate significant improvements in math, language, and science test scores, with notable gains in problem-solving, collaborative work, communication, and critical thinking. However, a decline in creativity scores highlights the need for a stronger emphasis on fostering creativity within the PBL framework. Gender differences were observed, with female students generally outperforming males, suggesting the need for tailored instructional approaches. This study’s limitations, including the absence of a control group, nonrandom sampling, and the use of subjective assessment methods, are acknowledged, with recommendations for future research to address these issues. Despite these limitations, the findings underscore the potential of PBL to enhance student learning outcomes in rural settings, offering valuable insights for educators and policymakers aiming to support educational recovery and development in similar contexts. Further research is recommended to explore the long-term effects of PBL and to refine the intervention for broader implementation.

1. Introduction

In the wake of the COVID-19 pandemic, education in Latin America and the Caribbean has faced a severe crisis, reflecting a loss of more than a decade of progress in learning. Even before the pandemic, over one-third of the region’s children and adolescents (35 million) were not reaching the minimum proficiency level in reading, and more than half (50 million) were failing to meet learning standards in mathematics [1]. The pandemic has only exacerbated these challenges, making the need for effective educational interventions more urgent than ever.
The 2022 PISA assessment highlights the ongoing challenges in Colombian students’ academic proficiency, showing both progress and areas where improvement is still needed. Although there have been gains since 2006 in mathematics, science, and reading, significant gaps persist when compared to OECD averages. In mathematics, Colombia’s average score increased by 13 points, yet it continues to fall substantially below the OECD benchmark. Similarly, while there have been improvements in science, recent trends suggest that progress has stalled, with scores still not reaching the OECD average. Reading scores have improved but continue to lag behind the standards set by the OECD [2].
These statistics underscore the formidable challenges faced by the educational system. In response to these learning gaps, the Center of Science and Technology of Antioquia (CTA), a nonprofit organization based in Medellín, Colombia, with funding from Fundación Fraternidad Medellín, developed and implemented an educational intervention called Alianza por la Educación con Calidad y Equidad—AECE (Alliance for Education with Quality and Equity). The intervention ran from 2010 to 2023 and focused on improving educational outcomes in rural areas by offering training to primary school teachers and a series of remedial sessions specifically tailored for third graders.
The teacher training covered the design, implementation, and evaluation of Project-Based Learning (PBL) activities aimed at enhancing children’s competencies in Spanish language, mathematics, science, and 21st-century skills [3], and the remedial activities were designed to elevate students’ performance to grade-level standards in Spanish language, math, and science. By focusing on both teacher training and direct student support, the intervention sought to address educational deficiencies holistically, fostering an environment where students could recover lost ground and develop essential academic skills. This dual approach not only equipped teachers with the necessary tools and methodologies to implement effective PBL but also provided immediate, targeted support to students who had fallen behind due to the disruptions caused by the pandemic.
This article focuses on the results from the 2022 cohort of third graders, who were the first group to return to in-person learning following the prolonged school closures caused by the COVID-19 pandemic. The primary purpose of this study is to evaluate the impact of PBL on student performance in the abovementioned areas and the development of 21st-century skills within this unique postpandemic educational context. Specifically, this study seeks to address the following research questions:
  • How does Project-Based Learning (PBL) impact children’s competencies in Spanish language, mathematics, science, and 21st-century skills in the postpandemic context?
  • What changes in student performance across these subjects and skills are observed between the pretest and posttest after the implementation of PBL?
  • To what extent does gender influence the effectiveness of PBL in enhancing these competencies?
Assessing the effectiveness of educational interventions is essential for enhancing the basic competencies of children in rural areas and for informing decisions about resource allocation and policy development [4]. This cohort presents a critical opportunity to examine how PBL can address learning losses and facilitate skill development during a period of recovery and adjustment. The analysis aims to contribute to the existing literature on the effectiveness of PBL, particularly in fostering academic resilience and adaptability in the face of unprecedented educational challenges.

2. The Intervention

The AECE intervention was designed to improve the pedagogical practices of primary school teachers in rural areas of the Antioquia region of Colombia, with the goal of strengthening student competencies in core subjects such as mathematics, Spanish, and science. Over the 13 years of the program’s implementation, it reached approximately 10,500 students and 2500 teachers. The intervention focused on training and supporting teachers in designing, implementing, and evaluating PBL strategies in their classrooms. Additionally, the program promoted the development of 21st-century skills to enhance the quality of education in rural schools.

2.1. Pedagogical Foundations of the Intervention

The educational intervention was founded on an active learning approach [5,6,7] that emphasized the development both of foundational competencies in language, mathematics, and science and of essential 21st-century skills. This dual focus aimed to prepare students not only with the academic knowledge needed for success but also with the soft skills critical for navigating the complexities of modern life. Central to the intervention’s design and implementation was PBL, a pedagogical strategy that actively engages students in real-world projects. PBL encourages learners to take ownership of their education by fostering critical thinking, problem-solving, and collaboration skills, which are integrated with core academic content [8,9,10].
Research on PBL has demonstrated its effectiveness in improving foundational skills across subjects such as language [11,12], mathematics [13,14], and science [15,16]. PBL engages students in real-world projects and has been associated with increased student engagement and improved learning outcomes [17,18]. A key element of PBL is its alignment of assessment strategies with project objectives, allowing students to be evaluated on both academic understanding and their ability to apply these skills in practical contexts [19,20,21].
Similarly, the integration of 21st-century skills into education has become increasingly recognized as essential. These skills, which include critical thinking, collaboration, and digital literacy, are necessary for students’ personal development and their ability to succeed in an ever-changing world [3,22,23,24]. The incorporation of these competencies into educational frameworks aligns with the goals of PBL, as both approaches emphasize the importance of preparing students to meet the complex challenges and opportunities they will face in various aspects of life. Both PBL and 21st-century skills share a focus on preparing students not only academically but also in developing the practical abilities needed for lifelong learning and problem-solving [23]. PBL supports the acquisition of 21st-century skills by promoting an active learning environment where students are encouraged to think critically, collaborate effectively, and apply their knowledge in diverse situations [25].
However, integrating 21st-century skills into classrooms presents challenges. There is often a gap between recognizing their importance and fully incorporating them into practice [26,27]. Traditional educational approaches, which tend to compartmentalize subjects like mathematics and science, can hinder interdisciplinary learning and limit real-world application opportunities [28].
In contrast, PBL offers a holistic approach, embedding these skills into thematic learning that fosters both academic and interpersonal growth. It encourages students to engage in collaborative projects that require critical thinking, problem-solving, and effective communication, all essential to 21st-century competencies [27]. Research has shown that PBL not only enhances student motivation and engagement but also improves their ability to work collaboratively and think critically [29]. By linking theory to real-world applications, PBL can bridge the gap between knowledge and practice [30].

2.2. Evaluative Foundations

The evaluation of learning outcomes in the intervention draws upon the Evidence-Centered Design (ECD) framework developed by Mislevy and colleagues [31,32,33,34], which provides a systematic and structured approach to educational assessment. This framework was adapted to meet the specific requirements of the program. At the core of the ECD framework are three interconnected models, collectively forming what is referred to as the Conceptual Assessment Framework (CAF): the proficiency model, the task model, and the evidence model (Figure 1) [32].
The proficiency model forms the basis of the assessment strategy, identifying the knowledge, skills, and abilities (KSA) targeted by the intervention. These competencies are aligned with national standards for math, reading, and science education [35]. The proficiency model breaks down these competencies into specific skills and subskills, providing a clear framework for what students need to achieve. This structure guides the design of teacher training and student remediation activities and the subsequent evaluation of student learning.
The task model translates latent competencies into observable behaviors by defining the key features of assessment tasks and items. It ensures that these tasks effectively elicit student behaviors that reflect their competencies. In this intervention, the task model followed Norman Webb’s Depth of Knowledge (DOK) framework [36,37,38], which categorizes cognitive demands into various levels and guides task creation. This approach ensures assessments address a range of cognitive processes, including recall, reasoning, application of skills, and deeper conceptual understanding. For example, tasks designed to assess data interpretation from graphs focus on both recalling relevant information and applying interpretation and problem-solving skills.
The evidence model connects the latent competencies defined in the proficiency model with the observable behaviors elicited by the task model. It encompasses key elements of measurement, including the development of tests, rubrics, scoring methods, and statistical analyses for establishing students’ performance. The tests and rubrics are designed to align with the competencies outlined in the proficiency model and the tasks specified by the task model. These assessments incorporate multiple item types—such as multiple-choice, short answer, and performance-based tasks—to provide a comprehensive evaluation of student abilities. Clear criteria for evaluating student responses were established to ensure consistency and objectivity by defining specific performance levels. As shown in Figure 2, the intervention assessed students using two instruments: a standardized test measuring competencies and cognitive processes in language, mathematics, and science, and an analytical rubric [39] assessing 21st-century skills. For teachers, an observation rubric was used to track the implementation of PBL-based learning strategies in the classroom.

2.3. Implementation of the Intervention

The implementation of the intervention was designed to be comprehensive, involving both teacher development and student support. This approach was organized around three distinct stages. The program ran from 2010 to 2023, with a new cohort of teachers and students participating each year. This study specifically focuses on the 2022 cohort, offering insights into the program’s impact during this period, particularly in the context of postpandemic educational recovery. A detailed timeline of the intervention stages is included in Appendix A.

2.3.1. Stage 1: Planning

In the planning phase, participating schools were identified and engaged to establish the foundation for the intervention. The intervention was designed using a Theory of Change (TOC) framework [40,41], which provided a structured roadmap linking specific inputs, activities, and outputs to intended outcomes, as summarized in the logic model shown in Figure 3. This TOC approach enabled the identification of key performance indicators and milestones, allowing for effective tracking of progress and adjustments to improve program management. By clearly defining how each component contributed to the overall goals, the TOC model ensured that the intervention remained focused on measurable outcomes aligned with student and teacher development objectives.
The TOC was then refined to align with the program’s educational goals, and assessment instruments were developed and piloted to measure both student competencies and teacher practices accurately. Instructional materials were created to address the specific needs of the students and teachers participating in the intervention. The selection and training of educational advisors supported the implementation process. Additionally, remediation materials were revised to target specific learning gaps among the students, ensuring the intervention addressed their educational needs.

2.3.2. Stage 2: Baseline Assessment

In the second phase, baseline assessments of students’ competencies in math, science, and the Spanish language were conducted using cognitive instruments that were specifically designed to align with the intervention’s learning objectives. These baseline assessments, while similar in structure to the posttest instruments, included different items to maintain test validity while measuring the same competencies and standards targeted in the intervention. Like the posttest, 21st-century skills were assessed using a structured rubric to capture students’ proficiency across critical skills like problem-solving and collaboration.
Additionally, teachers’ instructional and evaluation practices were evaluated through a multimethod approach, including structured interviews, classroom observations, self-reported questionnaires, and samples of teaching plans and student work. These assessments provided a comprehensive understanding of both students’ and teachers’ starting points, enabling more accurate measurement of progress and targeted support throughout the intervention.

2.3.3. Stage 3: Delivery

The delivery phase encompassed three main components: professional development sessions for teachers, the implementation of PBL in classrooms, and targeted remedial sessions for students facilitated by CTA advisors. This phase began with eight professional development sessions for teachers, focusing on PBL. These sessions covered both theoretical and practical aspects of designing, implementing, and evaluating PBL-based teaching and learning activities within instructional units. Topics included curriculum alignment, project design, student engagement strategies, formative assessment techniques, and integrating 21st-century skills into classroom activities.
Following the initial training, teachers implemented the PBL methodology over a period of approximately nine months, structuring instruction into two or three PBL cycles. Each cycle included a series of steps designed to engage students in real-world, interdisciplinary projects that reinforced core academic skills and competencies, as explained below:
Initial project selection and planning. Teachers began each cycle by identifying themes that integrated math, science, and language content. Themes were aligned with curriculum standards and chosen based on their relevance to the students, allowing for meaningful learning objectives in each subject.
Launching the projects. To spark curiosity and highlight the interdisciplinary nature of each project, teachers introduced projects with a “driving question.” For example, a question on environmental conservation, “How can we protect local plants and animals?”, framed learning in science, math (data analysis), and Spanish language (reporting and presentations).
Guided inquiry and research. Teachers facilitated structured inquiry activities, guiding students through research to deepen their understanding of the project topics. This phase was supplemented with classroom resources and field trips, with teachers helping students form hypotheses and explore critical questions from multiple disciplinary perspectives.
Collaborative work sessions. Students were allocated collaborative work periods to pursue project goals. Each subject was addressed in targeted sessions—math for data collection, science for experiments, and language for written or oral presentations. Teachers supported students by providing guidance, addressing questions, and promoting diverse approaches to problem-solving.
Regular formative assessments. Throughout each project, formative assessments allowed teachers to observe and provide feedback on students’ academic and skill development. Peer and self-assessments were also used to track growth in critical areas, such as collaboration and communication.
Reflection and self-assessment. Teachers regularly set aside time for students to reflect on their progress, discuss challenges, and identify areas for improvement. This step reinforced students’ sense of ownership over their learning and encouraged self-assessment skills.
Final presentations and evaluations. At the conclusion of each project, students presented their work, integrating math, science, and language elements. Teachers evaluated students’ final outputs and interdisciplinary skills using structured rubrics.
Continuous adjustment and support. Teachers refined each PBL cycle based on observations of student progress. Six in situ follow-up sessions with PBL facilitators and three meetings with school administrators provided additional guidance and alignment support throughout the year, ensuring teachers could effectively tailor their strategies to meet the needs of their students.
The remedial sessions for students were designed to address specific learning gaps in reading, math, and science, aiming to bring participants up to grade-level standards. These sessions were structured to offer targeted support through individualized instruction, group activities, and problem-solving exercises that reinforced core academic concepts. The curriculum focused on strengthening foundational skills that had been disrupted due to the COVID-19 pandemic, with a particular emphasis on interactive learning and real-world applications.
A total of 27 remedial sessions were conducted, with 24 sessions planned and facilitated by experienced pedagogical advisors hired by the CTA. These sessions employed a combination of direct instruction, hands-on learning activities, and formative assessments to monitor student progress. The remaining three sessions were designed and implemented by the teachers who participated in the professional development program, allowing them to apply their newly acquired skills in PBL. Throughout these sessions, students were encouraged to engage in collaborative tasks, critical thinking, and applied learning to help them build both subject-specific competencies and 21st-century skills. The remedial program aimed not only to improve academic performance but also to foster greater confidence and engagement in the learning process.
Based on the program’s Theory of Change (Figure 3), the inputs and activities were expected to increase literacy, numeracy, and science skills among students. Additionally, teachers were expected to become more aware of and skilled in using PBL and active methodologies in their classrooms, leading to improved instructional and evaluation practices (Figure 2).
At the end of the intervention, students and teachers were assessed to identify the improvements in student competencies and the effectiveness of the professional development for teachers. These assessments aimed to measure the overall impact of the program and to gather insights for future educational initiatives.

3. Materials and Methods

3.1. Participants

As mentioned before, this study focused on the 2022 cohort of third-grade students from the Antioquia region of Colombia. The program engaged a diverse group of participants, comprising a total of 287 students, with 150 boys and 137 girls from nine rural schools across seven different municipalities. This distribution ensured a balanced representation of gender and a wide geographic reach, aligning with the program’s goal of improving educational outcomes in underserved rural areas. To maintain ethical standards, informed consent was obtained from all participating students’ parents or guardians. The names of the schools have been anonymized to protect the privacy of participants and institutions, as shown in Table 1.

3.2. Instruments

A multifaceted assessment strategy was implemented to evaluate program outcomes, measuring students’ mastery of core competencies and cognitive abilities in language, mathematics, and science (Table 2). To align assessments with the program’s learning objectives, an evidence model was developed based on the competencies outlined in the proficiency model and their corresponding observable behaviors in the task model [32]. This model incorporated a suite of assessments, including standardized pre- and posttests, a rubric for assessing 21st-century skills, scoring methodologies for each instrument, and statistical analysis to determine the competency level of each student (the instruments used in the intervention are available at https://github.com/ricardophd/cta-pbl-evaluation/blob/main/PBL-assessment-instruments (accessed on 17 September 2024)).
Language assessment focused on inferential and critical reading, using a 12-item test to evaluate comprehension, analysis, and critical thinking skills. The science assessment centered on ecological concepts, with a 20-item test assessing students’ understanding of abiotic and biotic factors and their interactions. The mathematics assessment measured students’ ability to estimate, calculate, and justify their reasoning through 12 questions evaluating computational skills, problem-solving, and logical reasoning (Table 2).
Test items were aligned with Webb’s Depth of Knowledge (DOK) framework [36,37,38], encompassing recall, application, and strategic thinking levels. Recall items tested basic knowledge, application items assessed practical application, and strategic thinking items evaluated the students’ capacity to analyze, synthesize, and make decisions based on their understanding. By integrating these components, the evidence model created an assessment system that provided a detailed picture of student learning and competencies.
To ensure the reliability and validity of these assessments, two key processes were implemented: cognitive laboratories and pilot testing. Cognitive labs involved engaging students with similar characteristics to the target population to understand how they interpreted and responded to the test items [42,43]. Researchers observed their thought processes and provided feedback on potential ambiguities or misinterpretations, ensuring the questions were clear and effectively measured the intended skills. Following the cognitive labs, pilot tests were conducted with a broader group of students. This allowed for the analysis of item difficulty and discrimination, further refining the tests to enhance reliability and validity [44,45,46].
To complement the quantitative data from standardized tests, the program incorporated an assessment of 21st-century skills, prioritizing problem-solving, collaboration, creativity, communication, and critical thinking (Table 3), as outlined in the Partnership for 21st-Century Skills framework [3]. These competencies are important for navigating the complexities of contemporary society.
A structured analytical rubric [39,47] was used to assess these skills in real-world contexts (Table 4), providing a comprehensive framework for evaluating student performance across the five skill dimensions. To ensure reliability, calibration sessions were conducted with trained evaluators, establishing inter-rater reliability (IRR) by standardizing scoring and minimizing subjective variation [48,49]. While the calibration process ensured consistency, the final application of the rubric was performed by single raters in each group. This pragmatic approach balanced the consistency gained through calibration with the efficiency needed for large-scale assessments, a common practice when resources and logistics limit the use of multiple raters. Research supports that careful calibration can mitigate biases in single-rater assessments, maintaining acceptable reliability levels [39,49].
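As an illustration of this calibration step (a sketch on assumed data, not the study’s calibration records), the base R snippet below computes Cohen’s kappa, one common IRR statistic, for two hypothetical raters scoring the same students on the rubric’s 0–3 performance scale.

```r
# Hypothetical calibration ratings on the rubric's 0-3 performance scale.
rater_a <- c(2, 3, 1, 0, 2, 3, 2, 1, 3, 2)
rater_b <- c(2, 3, 1, 1, 2, 3, 1, 1, 3, 2)

# Cohen's kappa: chance-corrected agreement between two raters.
cohens_kappa <- function(r1, r2) {
  lv  <- sort(unique(c(r1, r2)))                 # shared rating levels
  tab <- table(factor(r1, lv), factor(r2, lv))   # confusion matrix
  p_obs <- sum(diag(tab)) / sum(tab)             # observed agreement
  p_exp <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # chance agreement
  (p_obs - p_exp) / (1 - p_exp)
}

cohens_kappa(rater_a, rater_b)
# A common rule of thumb treats kappa of roughly 0.6-0.8 as substantial agreement.
```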
The integration of these assessment components provided a comprehensive perspective on student capabilities, extending beyond cognitive skills to include essential competencies for modern learning. This approach aligns with the program’s objectives, emphasizing the development of critical thinking, problem-solving, and collaborative abilities as prerequisites for success in education, the workforce, and life.

3.3. Data Collection Procedures

The intervention employed a pretest–posttest single-group design, incorporating standardized tests and classroom observations. For the language component, n = 264 students completed the pretest, n = 226 took the posttest, and n = 196 had both sets of results for analysis. In science, n = 265 students completed the pretest, n = 221 took the posttest, with n = 190 providing both pretest and posttest results. For mathematics, n = 267 students completed the pretest, n = 233 took the posttest, and n = 201 had both results available for analysis. The tests were administered by CTA advisors in a controlled environment to ensure consistency and reliability. Students were provided with clear instructions, monitored for compliance, and given breaks as needed. The tests were administered in a structured, uniform manner, with strict adherence to time limits.
Additionally, n = 268 students were observed at the beginning of the intervention, and n = 273 were observed at its conclusion, with n = 238 having both pre- and postintervention observation results. These observations were conducted by CTA advisors who filled out analytical rubrics to assess the application of 21st-century skills during the implementation of PBL activities. The difference in the number of students between the pretest and posttest was due to factors such as absences, changes in enrollment, or logistical challenges that prevented some students from completing both assessments.
Upon completion, tests and rubrics were collected and entered into LimeSurvey [50] to streamline data management, formatting, and the creation of data dictionaries.
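To make the sample-size bookkeeping above concrete, the following base R sketch shows one way the paired analysis samples could be assembled: pretest and posttest exports are merged by student identifier, retaining only students present in both files. The file names and column names are hypothetical, not the project’s actual exports.

```r
# Hypothetical sketch: build the paired analysis sample for one subject.
# merge() performs an inner join by default, so students lacking either a
# pretest or a posttest record are dropped, which is why the paired n is
# smaller than either administration's n (e.g., 196 vs. 264/226 in language).
pre  <- read.csv("language_pretest.csv")    # assumed columns: student_id, score
post <- read.csv("language_posttest.csv")

paired <- merge(pre, post, by = "student_id",
                suffixes = c("_pre", "_post"))
nrow(paired)  # number of students with both assessments
```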

3.4. Data Preparation and Analysis

3.4.1. Cognitive Tests

Student responses were processed to generate a global score for each student in each subject—language, mathematics, and science—using a weighted approach based on the cognitive demands of the assessment items. Each item was categorized as measuring recall, application, or strategic thinking, with weights assigned to reflect the complexity of each item type. After applying these weights, the scores were normalized using z-scores to standardize the distribution of responses.
To facilitate comparison across subjects, the resulting global scores were converted to a T-scale with a mean (M) of 50 and a standard deviation (SD) of 10 [51,52]. This transformation ensured consistency in measuring student performance across subjects and provided a standardized metric for evaluating individual student achievement within the intervention.
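A minimal sketch of this scoring pipeline is shown below, assuming illustrative DOK weights and a toy response matrix (the article does not publish the exact weights): each item is weighted by its cognitive demand, summed per student, standardized as a z-score, and rescaled to a T-score.

```r
# Minimal sketch of the scoring pipeline (assumed weights and toy data).
library(dplyr)

# One row per student x item; 'correct' is 0/1, 'dok' is the item's
# Depth of Knowledge category.
responses <- data.frame(
  student = rep(1:4, each = 3),
  dok     = rep(c("recall", "application", "strategic"), times = 4),
  correct = c(1, 1, 0,  1, 0, 0,  1, 1, 1,  0, 1, 0)
)
dok_weights <- c(recall = 1, application = 2, strategic = 3)  # assumed values

global_scores <- responses |>
  mutate(weighted = correct * dok_weights[dok]) |>  # weight by cognitive demand
  group_by(student) |>
  summarise(raw = sum(weighted), .groups = "drop") |>
  mutate(z = (raw - mean(raw)) / sd(raw),  # standardize the distribution
         t_score = 50 + 10 * z)            # T-scale: M = 50, SD = 10
global_scores
```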
In addition to global scores, a measure of competency level was calculated based on students’ performance on each task or item within the tests. The criteria used to determine each performance level are shown in Table 5. These competency levels were established to offer a clearer understanding of students’ progression and areas of improvement, enabling the identification of specific strengths and weaknesses and informing targeted interventions and support.
Changes in global scores between pre- and posttests were evaluated using paired t-tests for each subject area (reading comprehension, mathematics, and science) to determine whether there were differences in mean scores. To account for potential non-normality in the data, Wilcoxon signed-rank tests were conducted as complementary analyses to the t-tests. These nonparametric tests verified the results obtained from the t-tests and provided additional robustness to the findings. Additionally, a repeated measures ANOVA was implemented to assess overall score changes and explore potential gender-based differences in performance trajectories. This analysis provided insights into the general trends in score changes and identified any significant interactions between gender and measurement conditions. All data analyses and visualizations were conducted using R [53].
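The sketch below mirrors these analyses on simulated scores (the study’s data are not reproduced): a paired t-test, its Wilcoxon signed-rank counterpart, a paired-samples Cohen’s d, and a repeated measures ANOVA with measurement condition as the within-subject factor and gender as a between-subjects factor, all in base R.

```r
# Illustrative analysis steps on simulated T-scores (not the study's data).
set.seed(1)
n      <- 30
pre    <- rnorm(n, mean = 50, sd = 10)
post   <- pre + rnorm(n, mean = 2, sd = 5)   # simulated gain of ~2 points
gender <- factor(rep(c("F", "M"), length.out = n))

t.test(post, pre, paired = TRUE)             # paired t-test
wilcox.test(post, pre, paired = TRUE)        # Wilcoxon signed-rank test
mean(pre - post) / sd(pre - post)            # paired-samples Cohen's d
                                             # (negative when scores rise)

# Repeated measures ANOVA: condition within student, gender between.
d <- data.frame(
  student   = factor(rep(seq_len(n), times = 2)),
  gender    = rep(gender, times = 2),
  condition = factor(rep(c("pre", "post"), each = n)),
  score     = c(pre, post)
)
summary(aov(score ~ gender * condition + Error(student/condition), data = d))
```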

3.4.2. Evaluation of Students’ 21st-Century Skills

The 21st-century skills rubric scores were used to generate a global score through a multistep process. First, raw scores were calculated for each of the five dimensions: problem-solving, collaboration, creativity, communication, and critical thinking. These raw scores were then standardized into z-scores to allow comparability across the different dimensions.
Subsequently, each z-score was converted to a 0–10 scale using a normalization function that adjusted values based on the minimum and maximum scores within each dimension. This ensured that all dimensions were on the same scale, facilitating direct comparisons between them.
Following this, an overall score for 21st-century skills was calculated by summing the standardized scores for each dimension. To enhance interpretability, the overall score was also converted to a 0–10 scale, aligning it with the individual dimension scores. Additionally, students’ performance levels for each 21st-century skill dimension were determined by first aggregating individual assessment items into total scores for each competency and then assigning a performance level based on the total score: “Beginning” for a score of 0, “In Progress” for a score of 1, “Achieved” for a score of 2, and “Extended” for a score of 3.
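A compact sketch of this multistep process, on hypothetical rubric data, is shown below: each dimension is z-standardized and min-max rescaled to 0–10, the overall score is derived from the dimension scores, and raw totals are mapped to the four performance levels using the cut points stated above.

```r
# Hypothetical rubric totals per student (0-3 scale) for two dimensions.
rubric <- data.frame(
  problem_solving = c(2, 1, 3, 0),
  collaboration   = c(3, 2, 2, 1)
)

# z-standardize a dimension, then min-max rescale it to 0-10.
to_0_10 <- function(x) {
  z <- (x - mean(x)) / sd(x)
  10 * (z - min(z)) / (max(z) - min(z))
}

scaled  <- as.data.frame(lapply(rubric, to_0_10))  # per-dimension 0-10 scores
overall <- to_0_10(rowSums(scaled))                # overall score, also 0-10

# Map raw totals to performance levels: 0 = Beginning, 1 = In Progress,
# 2 = Achieved, 3 = Extended.
levels_ps <- cut(rubric$problem_solving, breaks = c(-Inf, 0, 1, 2, 3),
                 labels = c("Beginning", "In Progress", "Achieved", "Extended"))
levels_ps
```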

4. Results

4.1. Results of Pretest and Posttest Assessment

This section presents the results of statistical tests conducted to evaluate changes in student performance between the pretest and posttest conditions. The analyses include paired t-tests and Wilcoxon signed-rank tests to determine whether there were significant differences in global test scores, as well as repeated measures ANOVA to examine overall score changes and any gender-based differences in performance.

4.1.1. Language

The assessment results indicate an increase in language test scores from the pretest to the posttest. For girls, the mean score increased by approximately 2 points, rising from 52.1 (SD = 10.2) on the pretest to 54.2 (SD = 8.9) on the posttest. Similarly, boys’ mean scores increased from 47.6 (SD = 10.5) on the pretest to 49.6 (SD = 9.3) on the posttest (Figure 4).
Table 6 below presents the results of the paired samples t-test comparing the language test scores between the pretest and posttest conditions.
The paired t-test produced a t-statistic of −3.5515 with 195 degrees of freedom and a highly significant p-value (<0.001). The effect size, as indicated by Cohen’s d, is −0.254 with a standard error of 0.057, suggesting a small-to-moderate effect.
The Wilcoxon test produced a test statistic of 5440.000 with a z-value of −3.161 and a significant p-value (0.002). The effect size for the Wilcoxon test, given by the matched rank biserial correlation, is −0.277 with a standard error of 0.087, also indicating a moderate effect. Overall, the results of both statistical tests for the language test suggest a significant change in test scores between the pretest and posttest conditions.
The results of the analysis of variance (ANOVA) assessing the impact of gender and measurement condition (pretest–posttest) on language test scores are presented in Table 7.
The analysis revealed statistically significant main effects of both gender and measurement condition on language test scores. Specifically, gender demonstrated a significant effect (F = 21.313, p < 0.001), indicating that gender significantly influenced test scores. The measurement condition also exhibited a significant main effect (F = 4.148, p = 0.0424). However, the interaction between gender and measurement condition was not statistically significant (F = 0.015, p = 0.9014), suggesting that the way scores changed from pretest to posttest did not differ significantly between genders.

4.1.2. Mathematics

The assessment results indicate an increase in math test scores from the pretest to the posttest for both girls and boys (Figure 5).
As shown in Table 8, the paired t-test yielded a t-statistic of −6.44 with 200 degrees of freedom and a p-value of <0.001, providing evidence of a significant difference in mean test scores between the two measurement conditions. The Wilcoxon signed-rank test yielded a test statistic (W) of 4355 and a p-value of <0.001, confirming the results of the paired t-test and providing strong evidence of a significant difference in test scores between the two measurement conditions.
The repeated measures ANOVA showed that the measurement condition (pretest or posttest) had a significant impact on math test scores, with an F-value of 19.312 and a p-value of less than 0.001, indicating that the difference in scores before and after the intervention is statistically significant (Table 9). This result suggests that the intervention had a meaningful effect on the students’ math performance.

4.1.3. Science

The assessment of scientific skills showed a significant change between pretest and posttest scores (Figure 6).
The paired t-test produced a t-statistic of −2.903 with 189 degrees of freedom and a p-value of 0.004135, indicating a significant difference in mean test scores between the two measurement conditions. Both the paired t-test and the Wilcoxon signed-rank test thus provide evidence of a significant change in science test scores between the pretest and posttest conditions (Table 10).
The results of the analysis of variance (ANOVA) indicate a significant shift in science test scores between the pretest and posttest conditions, as evidenced by a highly significant F-value (F(1, 49.96) = 17.997, p < 0.001). Specifically, there is a significant variation in scores between the two measurement conditions (Table 11).
For female students, the scores increased from an average of 47.5 in the pretest to 52.7 in the posttest, while male students experienced an increase in scores from 48.2 in the pretest to 51.5 in the posttest. These changes were observed within each gender group.
However, when considering the influence of gender on test scores, the ANOVA results indicate that gender did not significantly contribute to the observed variation (F(1, 1.892) = 0.077, p = 0.781).
The ANOVA results demonstrate a significant change in science test scores between the pretest and posttest conditions, with improvements observed for both female and male students. However, gender itself did not emerge as a statistically significant factor affecting these score differences.
These statistics provide a detailed view of the test score distribution within the different groups. Notably, the mean scores for both genders are slightly higher under the posttest condition than at baseline, with the corresponding standard deviations indicating the variability in scores.

4.2. Assessment of Students’ Competency Level by Subject

4.2.1. Language Competency Level

The results of the language component of the intervention reveal changes in student performance levels from the pretest to the posttest (Figure 7). At the baseline (pretest), the distribution of students was as follows: 27.0% of students were at the Extended level, 22.4% had Achieved the expected competencies, 30.6% were at the Beginning level, and 19.9% were In Progress.
By the endline assessment, the distribution showed a shift towards higher performance levels. The percentage of students at the Extended level increased slightly to 28.1%, while those in the Achieved category saw a more significant rise to 30.1%. Meanwhile, the proportion of students at the Beginning level decreased to 18.4%, and those In Progress increased to 23.5%.
These results suggest an overall improvement in language competencies, with more students advancing to the higher performance levels (Achieved and Extended) by the endline assessment. The decrease in the Beginning level indicates that fewer students remained at the lowest level of performance, while the increase in the In Progress level suggests that some students were still transitioning toward achieving proficiency.

4.2.2. Mathematics Competency Level

As shown in Figure 8, prior to the intervention, 31.3% of students had Achieved the expected competencies, 28.9% were In Progress, 23.9% were at the Beginning level, and 15.9% were at the Extended level.
By the posttest assessment, the distribution of performance levels showed notable changes. The percentage of students at the Extended level increased significantly to 39.3%, reflecting a considerable improvement. However, the percentage of students who Achieved the expected competencies decreased to 19.4%. Meanwhile, the proportion of students In Progress decreased to 18.4%, and those at the Beginning level also saw a slight reduction to 22.9%.
These results suggest a significant shift towards higher performance levels, particularly with the large increase in students reaching the Extended level by the posttest assessment. The decrease in both the Achieved and In Progress categories may indicate that some students who were previously meeting or nearly meeting the expected standards advanced further to the Extended level, while the slight reduction in the Beginning level suggests an overall improvement in foundational skills.

4.2.3. Science Competency Level

The results of the science component of the intervention, as shown in Figure 9, indicate changes in student performance across the four competency levels. At the baseline, the distribution of students was as follows: 22.6% of students were at the Beginning level, 32.6% were In Progress, 24.2% had Achieved the expected competencies, and 20.5% were at the Extended level. By the endline assessment, the distribution shifted: 18.9% of students remained at the Beginning level, a decrease from the baseline; the percentage of students In Progress decreased to 24.2%; those in the Achieved category increased slightly to 25.8%; and, notably, the percentage of students at the Extended level rose significantly to 31.1%. These results suggest a positive shift in student performance, with a notable increase in the proportion of students reaching the higher performance levels (Achieved and Extended) by the endline assessment. The decrease in the Beginning level indicates that fewer students remained at the lowest level of performance, reflecting overall improvement in the science competencies targeted by the intervention.

4.3. Assessment of 21st-Century Skills

As illustrated in Figure 10, the changes in participants’ standardized scores in 21st-century skills indicate that both gender and the educational intervention had a significant impact on student outcomes. The analysis, using both the Wilcoxon signed-rank test and the paired samples t-test, indicated statistically significant improvements in students’ overall scores between the pretest and posttest.
For the Wilcoxon signed-rank test, the results revealed a significant difference between the overall scores at baseline and endline (W = 7829.500, p < 0.001, 95% CI −0.461 to −0.189). This suggests that there was a shift in the distribution of scores between the two time points, indicating that the students’ overall 21st-century skills improved significantly after the intervention.
Similarly, the paired t-test results corroborate this finding (Table 12). The test showed a statistically significant mean difference of −0.641 (t = −3.506, df = 237, p < 0.001, 95% CI −0.357 to −0.099). This indicates a significant increase in the overall score for 21st-century skills between the baseline and endline assessments, further confirming the positive impact of the intervention on students’ skill development.
The ANOVA results further confirm this, revealing significant main effects for both gender and measurement condition on the overall standardized scores. As shown in Table 13, there was a significant main effect of gender on the overall scores, F(1, 472) = 7.075, p = 0.008. This result indicates that there were statistically significant differences in the overall performance between male and female students.
Additionally, the analysis showed a significant main effect of measurement condition, F(1, 472) = 9.874, p = 0.002, suggesting that the intervention had a substantial impact on student performance, with overall scores differing significantly between the pretest and posttest conditions.
Furthermore, the interaction between gender and measurement condition approached significance, F(1, 472) = 3.532, p = 0.061. Although this interaction was not statistically significant at the conventional 0.05 level, it indicates a possible trend where the effect of the intervention on overall scores may vary between male and female students. This finding suggests that gender may play a role in how students respond to the intervention, warranting further investigation.
Figure 11 illustrates the changes in mean scores for the five 21st-century skills targeted by the intervention—problem-solving, collaborative work, creativity, communication, and critical thinking—comparing baseline and endline assessments. Problem-solving saw an increase in the mean score from 6.40 at baseline to 6.72 at endline, accompanied by a slight rise in the standard error, indicating increased variability among students. Similarly, the mean score for collaborative work improved from 6.48 to 6.99, with a corresponding increase in standard error from 0.15 to 0.22, suggesting both better average performance and greater variation in outcomes.
In contrast, creativity showed a decrease in the mean score from 6.78 at baseline to 6.13 at endline, while the standard error remained relatively stable, increasing slightly from 0.16 to 0.22. This suggests a decline in overall performance with a modest increase in variability. Communication demonstrated a significant improvement, with the mean score rising from 5.35 to 6.37 and an increase in standard error from 0.19 to 0.21, indicating better average performance and more diverse outcomes among students. Finally, critical thinking experienced a substantial increase in mean score, from 4.57 to 6.57, with the standard error also rising from 0.16 to 0.20, reflecting an overall enhancement in critical thinking skills alongside greater variability. Overall, these changes suggest that except for creativity, students improved in most assessed skills, though with varying degrees of increased variability.

5. Discussion

The findings of this study provide evidence supporting the effectiveness of Project-Based Learning (PBL) in enhancing student competencies across language, mathematics, science, and 21st-century skills. This aligns well with existing research indicating the benefits of PBL on academic performance and skill development across multiple domains [54,55,56]. However, the current study also highlights areas where PBL’s impact varies, offering valuable insights into both its strengths and its limitations.
In the language domain, the intervention led to significant improvements in student scores, with a moderate effect size. Gender differences were noted, with girls generally outperforming boys, echoing findings from previous studies that report similar gender-based trends in language learning outcomes [57,58]. These findings underscore the need to consider gender dynamics within PBL frameworks, as girls and boys may engage with and benefit from PBL differently. Nonetheless, the absence of an interaction between gender and measurement condition suggests that PBL is effective for both genders, positioning it as a potentially equitable teaching approach that fosters improvement across student groups [59].
Similarly, in mathematics, this study demonstrated significant gains from pretest to posttest scores, confirming PBL’s role in bolstering mathematical competencies. These results align with the existing literature showing that PBL can drive meaningful improvements in math through active learning and real-world application [54,55,56]. While gender differences were not significant in this area, sustaining student engagement throughout PBL projects remains an ongoing challenge. Previous research highlights the importance of maintaining student motivation to maximize the benefits of PBL, especially in disciplines that require continuous skill application and engagement [59].
For science, the intervention yielded comparable gains, with statistically significant improvements in scores. Consistent with studies that show PBL’s effectiveness in fostering scientific inquiry and critical thinking, this study found that PBL can engage students in meaningful scientific learning without significant gender disparities in outcomes [55,60]. These results reinforce PBL’s versatility as an instructional method that enhances students’ understanding and engagement in science by promoting inquiry and problem-solving skills.
The assessment of 21st-century skills offered further insights into the intervention’s broader impact. While improvements were observed in problem-solving, collaborative work, communication, and critical thinking, a decline in creativity scores suggests an area for future refinement. This aligns with existing concerns that while PBL encourages critical thinking and collaborative skills, it may require additional support to nurture creativity fully [28,61]. Increased variability in postintervention scores across these skills suggests that while average performance improved, the effects of PBL may differ significantly among student groups, indicating diverse individual responses to the intervention [62].
Overall, these findings highlight both the potential and complexity of PBL as an instructional strategy. The gains observed across most competencies support PBL’s relevance in postpandemic educational recovery, especially in rural contexts. However, the variability in outcomes and specific challenges related to fostering creativity suggest that PBL interventions may need to be adapted or supplemented to address all facets of student development. The observed gender differences also call for tailored approaches that account for the unique ways in which male and female students interact with PBL, indicating a need for further research on how gender influences the effectiveness of such interventions.

6. Limitations

This study employed a pretest–posttest single-group design, which, while valuable for assessing the impact of the educational intervention, comes with certain limitations. One of the primary limitations is the absence of a control group. The use of a control group would allow for a clearer attribution of observed changes in student performance to the intervention itself rather than other external factors. However, in educational research, particularly when working with children, it is not always feasible or ethical to withhold potentially beneficial interventions from a control group. The decision to implement the intervention across the entire group of participants was made to ensure that all students had access to the educational benefits it provided, in line with ethical considerations that prioritize the well-being and development of children.
Another limitation is the lack of random sampling, which affects the generalizability of the findings. The participants in this study were not randomly selected, meaning that the results may not be representative of the broader population. This nonrandomized approach limits the extent to which the findings can be generalized beyond the specific group of students who participated in the intervention. However, despite this limitation, this study provides valuable insights into the potential impact of PBL on student competencies, specifically in math, language, and science, as well as 21st-century skills, offering a foundation for further research in more diverse settings.
While the calibration of the rubric was carried out using inter-rater reliability (IRR) to ensure consistency across different evaluators, the actual data collection during the assessment of 21st-century skills was completed by single observers for each group. This presents a potential limitation, as the use of only one observer introduces the possibility of subjective bias in the evaluation process. Despite the calibration efforts to standardize scoring, the absence of multiple raters during data collection limits the ability to measure IRR during the final assessments. Future studies could benefit from incorporating multiple observers to allow for IRR analysis in the data collection phase, thereby further enhancing the reliability of the evaluations.
Despite these limitations, this study provides valuable insights into the effects of educational intervention on student learning outcomes. However, the limitations related to the study design, sampling, and assessment methods should be considered when interpreting the results. Future studies should address these limitations to provide a more comprehensive and generalizable understanding of the impact of educational interventions, particularly through the inclusion of control groups, random sampling, and more robust assessment methodologies.

7. Conclusions and Recommendations

The results of this study demonstrate the effectiveness of the Project-Based Learning (PBL) intervention in improving student competencies in language, mathematics, and science, as well as enhancing essential 21st-century skills among third graders in rural Colombia. The intervention led to significant gains in standardized test scores and observable improvements in critical areas such as problem-solving, collaboration, communication, and critical thinking. These findings align with the growing body of literature that supports the use of PBL as a powerful instructional strategy that fosters both academic achievement and the development of key skills necessary for success in the 21st century.
The significant improvements observed in student performance, particularly the shift from lower to higher competency levels, indicate that the PBL approach effectively addressed the learning gaps exacerbated by the pandemic. The intervention not only helped students recover lost ground but also positioned them to excel in more advanced cognitive and practical tasks. The positive impact on 21st-century skills further underscores the value of integrating these competencies into the curriculum, ensuring that students are better equipped to navigate future challenges in education and beyond.
However, this study also revealed areas that require further attention. While most students showed improvement in various competencies, the decrease in creativity scores suggests that this area may require additional focus within PBL frameworks. Creativity is a critical component of 21st-century learning, and future iterations of the intervention should explore strategies to better support and enhance creative thinking among students. Additionally, the observed gender differences in overall standardized scores highlight the need for more tailored approaches that address the unique needs of male and female students within PBL environments.
Based on these findings, several recommendations can be made:
Firstly, enhance the focus on creativity. Given the observed decline in creativity scores, future interventions should include more targeted activities and assessments aimed at fostering creative thinking, which is essential for innovation and problem-solving. Open-ended projects in which students are encouraged to explore various approaches and solutions can provide a platform for creative expression; such projects could be interdisciplinary, allowing students to combine knowledge from different subject areas to develop innovative outcomes. Teaching strategies that explicitly promote divergent thinking, in which students generate multiple ideas and solutions to a single problem, should also be integrated: brainstorming sessions, mind mapping, and design challenges could be employed regularly to encourage creativity. Teachers could also provide more opportunities for students to reflect on their creative process, helping them recognize and refine their creative thinking skills. Moreover, assessments of creativity should focus not only on the product but also on the creative process itself, rewarding originality, adaptability, and the willingness to take risks. Encouraging a classroom environment where experimentation and unconventional ideas are valued will help cultivate a culture of creativity.
Secondly, tailor interventions to address gender differences. This study’s findings revealed significant gender differences in overall standardized scores, suggesting that male and female students may respond differently to PBL. To ensure that all students benefit equally from the intervention, future programs should consider gender-sensitive approaches. This could involve differentiated instruction that caters to the specific learning styles and needs of each gender, as well as providing gender-specific support strategies where necessary. Further research into the influence of gender on learning within PBL contexts is also recommended to better understand these dynamics and to inform the development of more inclusive educational practices.
Thirdly, expand the use of PBL in rural education. The success of this intervention in rural Colombia indicates that PBL is a viable and effective approach for improving educational outcomes in underserved areas. Given the challenges faced by students in rural regions, including limited access to resources and educational support, the expansion of PBL could provide significant benefits. Policymakers and educators should consider implementing PBL on a broader scale across rural schools, ensuring that students in these areas have access to high-quality, engaging, and relevant educational experiences. This expansion should be supported by ongoing training for teachers and administrators to effectively implement and sustain PBL practices.
Fourthly, incorporate more objective assessment methods. To strengthen the evaluation of 21st-century skills, future studies should include more objective measures alongside observation rubrics. While rubrics provide valuable insights, their subjective nature can introduce bias, particularly when they are completed by a single observer. Incorporating performance-based assessments and technology-assisted tools, and involving multiple observers, can enhance the reliability and validity of the findings. Additionally, conducting inter-rater reliability (IRR) analyses would ensure consistency in the assessment process and contribute to the robustness of the results; a minimal example of such an analysis is sketched after these recommendations.
Finally, conduct longitudinal studies to better understand the long-term impact of PBL on student learning and development. While this study provides valuable insights into the immediate effects of the intervention, it is essential to track students over time to assess how PBL influences their academic trajectories, skill development, and future success. Longitudinal studies would provide a more comprehensive understanding of the sustained impact of PBL, allowing educators and policymakers to refine and adapt the approach to maximize its benefits for students in diverse educational contexts.
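To make the inter-rater reliability recommendation concrete, the following minimal sketch shows how Cohen's kappa could be computed in base R (the environment used for this study's analyses [53]) for two raters who scored the same students on the 0–3 rubric scale. The rater vectors are simulated for illustration; this is not the authors' code, nor data from this study.

```r
# Cohen's kappa for two raters in base R: agreement on a 0-3 ordinal rubric.
cohen_kappa <- function(rater1, rater2) {
  tab <- table(factor(rater1, levels = 0:3), factor(rater2, levels = 0:3))
  n <- sum(tab)
  p_observed <- sum(diag(tab)) / n                      # observed agreement
  p_expected <- sum(rowSums(tab) * colSums(tab)) / n^2  # chance agreement
  (p_observed - p_expected) / (1 - p_expected)
}

# Simulated scores standing in for two observers rating 50 students.
set.seed(1)
rater1 <- sample(0:3, 50, replace = TRUE)
rater2 <- ifelse(runif(50) < 0.7, rater1, sample(0:3, 50, replace = TRUE))
cohen_kappa(rater1, rater2)  # values near 1 indicate strong agreement
```

The same approach extends naturally to weighted kappa or intraclass correlations when the 0–3 scores are treated as ordinal rather than nominal.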

Author Contributions

Conceptualization, M.C.A.-C., L.A.T.-A. and R.L.G.-Y.; methodology, M.C.A.-C., L.A.T.-A. and R.L.G.-Y.; validation, M.C.A.-C., L.A.T.-A. and R.L.G.-Y.; formal analysis, R.L.G.-Y.; data curation, R.L.G.-Y.; writing—original draft preparation, M.C.A.-C., L.A.T.-A. and R.L.G.-Y.; writing—review and editing, M.C.A.-C., L.A.T.-A. and R.L.G.-Y.; visualization, R.L.G.-Y.; project administration, M.C.A.-C.; funding acquisition, M.C.A.-C. and L.A.T.-A. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by funding from Fundación Fraternidad Medellín (Grant # CTA-2022). The funders had no role in the design of the study, data collection and analysis, decision to publish, or preparation of the manuscript.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and received ethical approval from CTA’s Ethical Review & Data Protection Board, ensuring adherence to ethical standards for research involving minors. The Board granted approval under the code 01-2022 on 28 January 2022.

Informed Consent Statement

Informed consent was obtained from the parents or legal guardians of all participating children prior to the intervention. The children were also informed about the nature of the educational intervention in age-appropriate language, and their assent was obtained. Participation in the intervention was entirely voluntary, and participants were free to withdraw at any time without any consequences. All data collected during the intervention were anonymized to protect the privacy and confidentiality of the participants.

Data Availability Statement

The original data, assessment instruments, and code files supporting the conclusions of this article are openly available on GitHub at https://github.com/ricardophd/cta-pbl-evaluation.git (accessed on 17 September 2024).

Acknowledgments

We thank Fundación Fraternidad Medellín for providing the resources that made this project and research possible. We are also grateful to the team of professionals from the Alliance for Education with Quality and Equity for their support in data collection. Lastly, we extend our sincere appreciation to the CTA management team for their continuous encouragement and support in promoting knowledge generation.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Appendix A. Timeline of the Intervention

Table A1. Timeline of the intervention.

Stage 1: planning (late 2021)
- Identification of schools and participants: selected schools and teachers for the 2022 cohort.
- Theory of Change (TOC) development: developed a TOC to outline the program's sequence of inputs, activities, outcomes, and outputs, supporting effective monitoring and adaptive management.
- Creation and piloting of assessment tools: developed and piloted tools to assess student competencies and teacher practices in language, math, science, and 21st-century skills.
- Preparation of instructional and remediation materials: created instructional materials and revised remediation resources tailored to the needs of participating students and teachers.
- Selection and training of educational advisors: selected advisors to support teachers and deliver remedial sessions.

Stage 2: baseline assessment (early 2022)
- Student competency assessments: conducted baseline assessments in language, math, and science to establish starting competency levels.
- Teacher evaluation: evaluated instructional and assessment practices to identify areas for improvement.

Stage 3: delivery (January–October 2022)
- Professional development for teachers: eight sessions on PBL design, curriculum alignment, project planning, student engagement, formative assessment, and 21st-century skills integration.
- Implementation of PBL in classrooms: teachers conducted 2–3 PBL cycles (each 2–3 months) in math, science, and language.
- Initial project selection and planning: identified real-world themes integrating math, science, and language.
- Launching projects: introduced each project with a "driving question" to promote interdisciplinary learning.
- Guided inquiry and research: facilitated structured research and exploration with resources and field trips.
- Collaborative work sessions: allocated group work time for students, focusing on subject-specific goals in math, science, and language.
- Regular formative assessments: conducted assessments with feedback, using peer and self-assessment tools.
- Reflection and self-assessment: incorporated reflection sessions for students to assess their own progress and areas for improvement.
- Final presentations and evaluations: students presented projects integrating math, science, and language elements, evaluated by teachers using structured rubrics.
- Continuous adjustment and support: ongoing follow-up sessions (six with facilitators, three with administrators) to refine and support teaching strategies.
- Remedial sessions for students (throughout 2022): 27 sessions to address learning gaps in reading, math, and science, focusing on foundational skills disrupted by the pandemic. Of these, 24 sessions were led by CTA advisors, providing targeted individualized and group instruction to reinforce academic skills through hands-on learning and real-world applications, and 3 by teachers, who applied their PBL training to address core academic competencies and 21st-century skills.

End of intervention assessment (late 2022)
- Final student and teacher assessments: measured student progress in competencies and evaluated the impact of teacher professional development.
- Outcome evaluation: analyzed results to assess program effectiveness and gather insights for future improvements.

References

1. UNICEF. Education in Latin America and the Caribbean at a Crossroads; Regional Monitoring Report SDG4—Education 2030; UNESCO: Paris, France, 2022.
2. ICFES. Programa Para La Evaluación Internacional de Alumnos (PISA); Informe Nacional de Resultados Para Colombia 2022; ICFES: Bogotá, Colombia, 2024.
3. P21 Framework for 21st Century Learning 2009. Available online: https://static.battelleforkids.org/documents/p21/P21_Framework_DefinitionsBFK.pdf (accessed on 20 November 2021).
4. Wu, Y.; Wahab, A. Research on the Material and Human Resource Allocation in Rural Schools Based on Social Capital Theory. Int. J. Educ. Humanit. 2023, 11, 46–49.
5. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active Learning Increases Student Performance in Science, Engineering, and Mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415.
6. Prince, M. Does Active Learning Work? A Review of the Research. J. Eng. Educ. 2004, 93, 223–231.
7. Vergara, D.; Paredes-Velasco, M.; Chivite, C.; Fernández-Arias, P. The Challenge of Increasing the Effectiveness of Learning by Using Active Methodologies. Sustainability 2020, 12, 8702.
8. Beckett, G. Teacher and Student Evaluations of Project-Based Instruction. TESL Can. J. 2002, 19, 52.
9. Guslyakova, A.; Guslyakova, N.; Valeeva, N.; Veretennikova, L. Project-Based Learning Usage in L2 Teaching in a Contemporary Comprehensive School (on the Example of English as a Foreign Language Classroom). Rev. Tempos Espaços Educ. 2021, 14, e16754.
10. Haatainen, O.; Aksela, M. Project-Based Learning in Integrated Science Education: Active Teachers’ Perceptions and Practices. LUMAT 2021, 9, 149–173.
11. Padmadewi, N.N.; Suarcaya, P.; Artini, L.P.; Munir, A.; Friska, J.; Husein, R.; Paragae, I. Incorporating Linguistic Landscape into Teaching: A Project-Based Learning for Language Practices at Primary School. Int. J. Elem. Educ. 2023, 7, 467–477.
12. Yamil, K.; Said, N.; Swanto, S.; Din, W.A. Enhancing ESL Rural Students’ Speaking Skills Through Multimedia Technology-Assisted Project-Based Learning: Conceptual Paper. Int. J. Educ. Psychol. Couns. 2022, 7, 249–262.
13. Sagita, L.; Putri, R.I.I.; Zulkardi; Prahmana, R.C.I. Promising Research Studies between Mathematics Literacy and Financial Literacy through Project-Based Learning. J. Math. Educ. 2023, 13, 753–772.
14. Ndiung, S.; Menggo, S. Project-Based Learning in Fostering Creative Thinking and Mathematical Problem-Solving Skills: Evidence from Primary Education in Indonesia. Int. J. Learn. Teach. Educ. Res. 2024, 23, 289–308.
15. Lavonen, L.; Loukomies, A.; Vartianen, J.; Palojoki, P. Promoting 3rd Grade Pupils’ Learning of Science Knowledge through Project-Based Learning in a Finnish Primary School. NorDiNa 2023, 19, 181–199.
16. Jabr, E.A.H.A.-S.; Zaki, S.Y.; Mohamed, M.A.S.; Ahmed, M.F. The Effectiveness of Curriculum Developer in Science in View of Project Based Learning to Developing Problem Solving Skills for Primary Stage Pupils. Educ. Res. Innov. J. 2023, 3, 97–131.
17. Rijken, P.E.; Fraser, B.J. Effectiveness of Project-Based Mathematics in First-Year High School in Terms of Learning Environment and Student Outcomes. Learn. Environ. Res. 2024, 27, 241–263.
18. TeGrootenhuis, J. The Effectiveness of Project-Based Learning in the Science Classroom. Master’s Thesis, Northwestern College, Orange City, IA, USA, 2018.
19. Zhang, L.; Ma, Y. A Study of the Impact of Project-Based Learning on Student Learning Effects: A Meta-Analysis Study. Front. Psychol. 2023, 14, 1202728.
20. Fadhil, M.; Kasli, E.; Halim, A.; Evendi; Mursal; Yusrizal. Impact of Project Based Learning on Creative Thinking Skills and Student Learning Outcomes. J. Phys. Conf. Ser. 2021, 1940, 012114.
21. Biazus, M.d.O.; Mahtari, S. The Impact of Project-Based Learning (PjBL) Model on Secondary Students’ Creative Thinking Skills. Int. J. Essent. Competencies Educ. 2022, 1, 38–48.
22. Valtonen, T.; Sointu, E.; Kukkonen, J.; Kontkanen, S.; Lambert, M.C.; Mäkitalo-Siegl, K. TPACK Updated to Measure Pre-Service Teachers’ Twenty-First Century Skills. Australas. J. Educ. Technol. 2017, 33, 15–31.
23. Anagün, Ş.S. Teachers’ Perceptions about the Relationship between 21st Century Skills and Managing Constructivist Learning Environments. Int. J. Instruct. 2018, 11, 825–840.
24. Herfina, E.S. Feasibility Test of 21st Century Classroom Management Through Development Innovation Configuration Map. J. Pendidik. Dan Pengajaran Guru Sekol. Dasar (JPPGuseda) 2022, 5, 101–104.
25. Fitria, D.; Lufri, L.; Elizar, E.; Amran, A. 21st Century Skill-Based Learning (Teacher Problems in Applying 21st Century Skills). Int. J. Humanit. Educ. Soc. Sci. 2023, 2, 1366–1373.
26. Varghese, J.; Musthafa, M.N.M.A. Investigating 21st Century Skills Level among Youth. GiLE J. Ski. Dev. 2021, 1, 99–107.
27. Bell, S. Project-Based Learning for the 21st Century: Skills for the Future. Clear. House J. Educ. Strateg. Issues Ideas 2010, 83, 39–43.
28. Usmeldi, U. The Effect of Project-Based Learning and Creativity on the Students’ Competence at Vocational High Schools. In Proceedings of the 5th UPI International Conference on Technical and Vocational Education and Training (ICTVET 2018), Bandung, Indonesia, 11 September 2018; Atlantis Press: Bandung, Indonesia, 2019.
29. Saputra, I.G.N.H.; Joyoatmojo, S.; Harini, H. The Implementation of Project-Based Learning Model and Audio Media Visual Can Increase Students’ Activities. Int. J. Multicult. Multireligious Underst. 2018, 5, 166–174.
30. Ana, A.; Subekti, S.; Hamidah, S. The Patisserie Project Based Learning Model to Enhance Vocational Students’ Generic Green Skills. In Proceedings of the 3rd UPI International Conference on Technical and Vocational Education and Training (ICTVET 2014), Bandung, Indonesia, 13–14 November 2014; Atlantis Press: Bandung, Indonesia, 2015; pp. 24–27.
31. Mislevy, R.J.; Riconscente, M. Evidence-Centered Assessment Designs: Layers, Concepts and Terminology. In Handbook of Test Development; Downing, S., Haladyna, T., Eds.; Erlbaum: Mahwah, NJ, USA, 2006; pp. 61–90.
32. Almond, R.G.; Mislevy, R.J.; Steinberg, L.S.; Yan, D.; Williamson, D. (Eds.) Bayesian Networks in Educational Assessment; Statistics for Social and Behavioral Sciences; Springer: New York, NY, USA, 2015; ISBN 978-1-4939-2124-9.
33. Mislevy, R.J.; Haertel, G.D. Implications of Evidence-Centered Design for Educational Testing. Educ. Meas. Issues Pract. 2007, 25, 6–20.
34. Mislevy, R.J.; Almond, R.G.; Lukas, J.F. A Brief Introduction to Evidence-Centered Design; University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing (CRESST): Los Angeles, CA, USA, 2004.
35. MEN. Estándares Básicos de Competencias; MEN: Bogotá, Colombia, 2006.
36. Webb, N.L. Criteria for Alignment of Expectations and Assessments in Mathematics and Science Education: Research Monograph No. 6; Council of Chief State School Officers: Washington, DC, USA, 1997.
37. Webb, N.L. Identifying Content for Student Achievement Tests. In Handbook of Test Development; Downing, S., Haladyna, T., Eds.; LEA: London, UK, 2011; pp. 155–180.
38. Webb, N.L. Web Alignment Tool; Council of Chief State School Officers: Washington, DC, USA, 2005.
39. Jonsson, A.; Svingby, G. The Use of Scoring Rubrics: Reliability, Validity and Educational Consequences. Educ. Res. Rev. 2007, 2, 130–144.
40. Weiss, C. Nothing as Practical as Good Theory: Exploring Theory-Based Evaluation for Comprehensive Community Initiatives for Children and Families. In New Approaches to Evaluating Community Initiatives: Concepts, Methods, and Contexts; Connell, J.P., Kubisch, A.C., Schorr, L., Weiss, C., Eds.; Aspen Institute: Washington, DC, USA, 1995; pp. 65–92.
41. Weiss, C. Evaluation: Methods for Studying Programs and Policies, 2nd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 1998; ISBN 978-0-13-309725-2.
42. Willis, G.B. Cognitive Interviewing; SAGE Publications, Inc.: Washington, DC, USA, 2005; ISBN 978-1-4129-8365-5.
43. Willis, G.B.; Royston, P.; Bercini, D. The Use of Verbal Report Methods in the Development and Testing of Survey Questionnaires. Appl. Cognit. Psychol. 1991, 5, 251–267.
44. DeVellis, R.F. Scale Development: Theory and Applications, 4th ed.; SAGE Publications: Los Angeles, CA, USA, 2017; ISBN 978-1-5063-4158-3.
45. AERA; APA; NCME. Standards for Educational and Psychological Testing; American Educational Research Association: Washington, DC, USA, 2014; ISBN 978-0-935302-35-6.
46. Downing, S.M.; Haladyna, T.M. (Eds.) Handbook of Test Development; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2006; ISBN 0-8058-5264-6.
47. TeachThought Staff. A Depth of Knowledge Rubric for Reading, Writing, and Math; TeachThought: Louisville, KY, USA, 2014.
48. Newell, J.; Newell, H.; Dahm, K. Rubric Development and Inter Rater Reliability Issues in Assessing Learning Outcomes. In Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition, Montréal, QC, Canada, 16 June 2002; pp. 7.991.1–7.991.8.
49. Stemler, S.E. A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability. Pract. Assess. Res. Eval. 2004, 9, 1–11.
50. Limesurvey GmbH. LimeSurvey: An Open Source Survey Tool. 2024. Available online: https://www.limesurvey.org/ (accessed on 20 January 2024).
51. McDowell, I. Measuring Health: A Guide to Rating Scales and Questionnaires; Oxford University Press: Oxford, UK, 2006.
52. De Vaus, D.A. Surveys in Social Research; Routledge: New York, NY, USA, 2014; ISBN 978-0-415-53015-6.
53. R Core Team. R: A Language and Environment for Statistical Computing (Version 4.4.1); R Foundation for Statistical Computing: Vienna, Austria, 2022.
54. Craig, T.T.; Marshall, J. Effect of Project-Based Learning on High School Students’ State-Mandated, Standardized Math and Science Exam Performance. J. Res. Sci. Teach. 2019, 56, 1461–1488.
55. Kokotsaki, D.; Menzies, V.; Wiggins, A. Project-Based Learning: A Review of the Literature. Improv. Sch. 2016, 19, 267–277.
56. Uluçınar, U. The Effect of Problem-Based Learning in Science Education on Academic Achievement: A Meta-Analytical Study. Sci. Educ. Int. 2023, 34, 72–85.
57. Alenezi, A. Using Project-Based Learning Through the Madrasati Platform for Mathematics Teaching in Secondary Schools. Int. J. Inf. Commun. Technol. Educ. 2023, 19, 1–15.
58. Kurniadi, D.; Cahyaningrum, I.O. Exploring Project-Based Learning for Young Learners in English Education. NextGen 2023, 1, 10–21.
59. Fang, C.Y.; Zakaria, M.I.; Iwani Muslim, N.E. A Systematic Review: Challenges in Implementing Problem-Based Learning in Mathematics Education. Int. J. Acad. Res. Progress. Educ. Dev. (IJARPED) 2023, 12, 1261–1271.
60. Kurt, G.; Akoglu, K. Project-Based Learning in Science Education: A Comprehensive Literature Review. Interdiscip. J. Environ. Sci. Educ. 2023, 19, e2311.
61. Usmeldi, U.; Amini, R. Creative Project-Based Learning Model to Increase Creativity of Vocational High School Students. Int. J. Eval. Res. Educ. (IJERE) 2022, 11, 2155.
62. Belwal, R.; Belwal, S.; Sufian, A.B.; Al Badi, A. Project-Based Learning (PBL): Outcomes of Students’ Engagement in an External Consultancy Project in Oman. Educ. + Train. 2020, 63, 336–359.
Figure 1. The components of the Evidence-Centered Design. Reprinted with permission from Ref. [34]. Copyright the National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
Figure 2. Evaluation framework of student learning and teacher development.
Figure 3. Program’s logic model.
Figure 4. Change in language scores from pretest to posttest.
Figure 5. Change in math scores from pretest to posttest.
Figure 6. Change in science scores from pretest to posttest.
Figure 7. Language component, general results by competency level (n = 196).
Figure 8. Mathematics component, general results by competency level (n = 201).
Figure 9. Science component, general results by competency level (n = 190).
Figure 10. Comparison of overall 21st-century skills scaled scores by gender.
Figure 11. Assessment of students’ 21st-century skills, general results by competency (n = 238).
Table 1. Anonymized distribution of program participants by location, school, and gender (n = 287).

Municipality | School | n | Percentage (%)
Municipality A | School A | 31 | 10.80
Municipality B | School B | 58 | 20.21
Municipality C | School C | 59 | 20.56
Municipality D | School D | 29 | 10.10
Municipality E | School E | 17 | 5.92
Municipality E | School F | 22 | 7.67
Municipality F | School G | 9 | 3.14
Municipality G | School H | 24 | 8.36
Municipality G | School I | 38 | 13.24
Total | | 287 | 100
Table 2. Selected standards addressed by the program [35].

Area | Standard | Evidence | Number of Items
Language | Reading comprehension and critical analysis | The student can understand the content of a text by analyzing its structure and employing inferential and critical reading processes. | 12
Language | Text production and composition | The student can produce various types of texts (expository, narrative, informative, descriptive, argumentative) while considering grammatical and orthographic conventions. |
Science | Understanding abiotic and biotic interactions | The student can explain the influence of abiotic factors (light, temperature, soil, air) on the development of biotic factors (fauna, flora) within an ecosystem. | 20
Science | Ecological relationships and survival | The student can understand and explain the intra- and interspecific relationships between organisms and their environment, emphasizing their importance for survival. |
Mathematics | Estimation and mathematical strategies | The student can propose, develop, and justify strategies for making estimates and performing basic operations to solve problems effectively. | 12
Mathematics | Data interpretation and problem-solving | The student can read and interpret information from frequency tables, bar graphs, and pictograms with scales to formulate and solve questions related to real-world situations. |
Table 3. Twenty-first-century skills addressed by the intervention [3].

Skill | Description
Problem-solving | The ability of the student to identify a contextual problem and propose and implement alternative solutions.
Collaboration | The ability of the student to take on established roles, listen to peers, and contribute ideas in an organized manner.
Creativity | The ability of the student to represent the solution to the problem through a product, using various resources for its creation and strategies for its presentation.
Communication | The ability of the student to convey ideas or opinions clearly and coherently, both verbally and nonverbally.
Critical thinking | The ability of the student to analyze, evaluate, and synthesize information in a thoughtful and systematic way. It involves the capacity to question assumptions, identify biases, assess evidence, and consider alternative perspectives.
Table 4. Twenty-first-century skills assessment rubric.

21st-Century Skill | Assessment Criteria | Score (from 0 to 3)
Problem-solving | Identifies a problem; searches for possible solutions; implements a solution | ___
Collaborative work | Assumes established roles; listens to peers; contributes ideas in an organized manner | ___
Creativity | Represents the solution through a product; utilizes various resources in product creation; applies strategies to present the product | ___
Communication | Conveys ideas verbally or nonverbally; communicates clear ideas; expresses coherent solutions | ___
Critical thinking | Analyzes information; interprets arguments; supports solutions | ___

Score definitions: 0 = Beginning; 1 = In Progress; 2 = Achieved; 3 = Extended.
Table 5. Established performance levels based on students’ task completion.

Performance Level | Probability Threshold | Description
Beginning | Probability of answering recall-level questions correctly is less than 50%. | Students at this level are just starting to grasp basic concepts and often struggle with foundational knowledge.
In Progress | Probability of answering recall-level questions correctly is greater than 50% and application-level questions correctly is less than 50%. | Students at this level show some understanding of basic concepts but need further development in applying their knowledge to practical scenarios.
Achieved | Probability of answering application-level questions correctly is greater than 50% and strategic thinking-level questions correctly is less than 50%. | Students at this level have met the standards for their age and grade, demonstrating a good grasp of both basic and applied concepts.
Extended | Probability of answering strategic thinking-level questions correctly is greater than 50%. | Students at this level exhibit a strong understanding of both basic and advanced concepts, applying their knowledge effectively and engaging in higher-order thinking tasks.
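As an illustration only, the decision rule in Table 5 can be written as a short R function; the probability inputs are hypothetical estimates of a student’s chance of answering each item type correctly, not quantities reported by this study.

```r
# Maps estimated success probabilities on recall-, application-, and
# strategic thinking-level items to the performance levels of Table 5.
performance_level <- function(p_recall, p_application, p_strategic) {
  if (p_recall < 0.5) {
    "Beginning"
  } else if (p_application < 0.5) {
    "In Progress"
  } else if (p_strategic < 0.5) {
    "Achieved"
  } else {
    "Extended"
  }
}

performance_level(0.8, 0.6, 0.4)  # returns "Achieved"
```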
Table 6. Paired samples t-test language test scores.

Test | Statistic | z | df | p | Effect Size | SE Effect Size
Student | −3.551 | | 195 | <0.001 | −0.254 | 0.057
Wilcoxon | 5440.000 | −3.161 | | 0.002 | −0.277 | 0.087

Note. For the Student t-test, effect size is given by Cohen’s d. For the Wilcoxon test, effect size is given by the matched rank biserial correlation.
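For readers who want to reproduce this style of analysis, the sketch below shows how the paired Student t-test, the Wilcoxon signed-rank test, paired Cohen’s d, and the matched rank-biserial correlation can be obtained in base R. The data frame and its `pre`/`post` columns are simulated placeholders; the authors’ actual code and data are available in the repository cited in the Data Availability Statement.

```r
# Simulated pre/post scores standing in for the real language data (n = 196).
set.seed(42)
language <- data.frame(pre = rnorm(196, mean = 50, sd = 10))
language$post <- language$pre + rnorm(196, mean = 2, sd = 8)

d <- language$post - language$pre

# Paired Student t-test; paired Cohen's d = mean difference / SD of differences.
t_res <- t.test(language$post, language$pre, paired = TRUE)
cohens_d <- mean(d) / sd(d)

# Wilcoxon signed-rank test; the statistic V is the sum of positive ranks.
w_res <- wilcox.test(language$post, language$pre, paired = TRUE)

# Matched rank-biserial correlation: (T+ - T-) / (T+ + T-) = 2V/total - 1.
n_nonzero <- sum(d != 0)
rank_total <- n_nonzero * (n_nonzero + 1) / 2
rank_biserial <- 2 * unname(w_res$statistic) / rank_total - 1
```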
Table 7. Impact of gender and measurement on language scores.

Source of Variation | Df | Sum of Squares | Mean Square | F-Value | p-Value
Error: id | 1 | 112 | 112 | |
Error: id: measurement | 1 | 1.999 | 1.999 | |
Error: within | 386 | 36,851 | 95.5 | |
Gender | 1 | 2035 | 2034.8 | 21.313 | 0.0000 *
Type of test | 1 | 396 | 396.0 | 4.148 | 0.0424 *
Gender: type of test | 1 | 1 | 1.5 | 0.015 | 0.9014

The asterisk (*) in the table indicates statistical significance at p < 0.05.
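The stratified layout of Tables 7, 9, and 11 matches the output of R’s aov() with an Error() term for repeated measures. A minimal sketch follows, assuming a hypothetical long-format data frame with one pretest and one posttest row per student; it illustrates the model structure rather than reproducing the authors’ code.

```r
# Simulated long-format data: 100 students, each measured at pretest and posttest.
set.seed(7)
scores <- expand.grid(id = factor(1:100), test = factor(c("pretest", "posttest")))
scores$gender <- rep(factor(sample(c("F", "M"), 100, replace = TRUE)), times = 2)
scores$score <- rnorm(200, mean = 50, sd = 10)

# Mixed ANOVA: gender is between-subjects, test (pre/post) is within-subjects.
fit <- aov(score ~ gender * test + Error(id / test), data = scores)
summary(fit)  # prints the Error: id, Error: id:test, and Error: within strata
```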
Table 8. Paired samples t-test math test scores.

Test | Statistic | z | df | p | Effect Size | SE Effect Size | 95% CI Lower | 95% CI Upper
Student | −6.44 | | 200 | <0.001 | −0.454 | 0.068 | −0.599 | −0.308
Wilcoxon | 4355 | −5.824 | | <0.001 | −0.494 | 0.085 | −0.609 | −0.358

Note. For the Student t-test, effect size is given by Cohen’s d. For the Wilcoxon test, effect size is given by the matched rank biserial correlation.
Table 9. Impact of gender and measurement on math scores.

Source of Variation | Df | Sum of Squares | Mean Square | F-Value | p-Value
Error: student | 1 | 1.779 | 1.779 | |
Error: student: measurement | 1 | 49.96 | 49.96 | |
Error: within | 396 | 38,003 | 96.0 | |
Gender | 1 | 192 | 191.8 | 1.999 | >0.05
Type of test | 1 | 1853 | 1853.3 | 19.312 | <0.001
Gender: type of test | 1 | 189 | 189.4 | 1.974 | >0.05
Table 10. Summary of test statistics, effect sizes, and confidence intervals for science test scores.

Test | Statistic | z | df | p | Effect Size | SE Effect Size | 95% CI Lower | 95% CI Upper
Student | −2.903 | | 189 | 0.004 | −0.211 | 0.085 | −0.354 | −0.067
Wilcoxon | 6322 | −2.277 | | 0.023 | −0.197 | 0.086 | −0.354 | −0.03

Note. For the Student t-test, effect size is given by Cohen’s d. For the Wilcoxon test, effect size is given by the matched rank biserial correlation.
Table 11. Summary of ANOVA results for science scores.

Source of Variation | Df | Sum of Squares | Mean Square | F-Value | p-Value
Error: studentid | 1 | 1.892 | 1.892 | 0.077 | 0.781
Error: studentid: test | 1 | 49.96 | 49.96 | 17.997 | 2.77 × 10−5 *
Error: within | 390 | 37,800 | 96.9 | |
Gender | 1 | 7 | 7.5 | 0.077 | 0.781
Test | 1 | 1744 | 1744.3 | 17.997 | 2.77 × 10−5 *
Gender: test | 1 | 104 | 103.5 | 1.068 | 0.302

The asterisk (*) in the table indicates statistical significance at p < 0.05.
Table 12. Summary of test statistics, effect sizes, and confidence intervals for 21st-century skills scores.

Test Type | Statistic | z | df | p | Effect Size | SE Effect Size | 95% CI Lower | 95% CI Upper
Student t-test | −3.506 | | 235 | <0.001 | −0.228 | 0.078 | −0.357 | −0.099
Wilcoxon test | 7829.500 | −4.229 | | <0.001 | −0.332 | 0.078 | −0.461 | −0.189

Note: For the Student t-test, effect size is given by Cohen’s d. For the Wilcoxon test, effect size is given by the matched rank biserial correlation.
Table 13. Analysis of variance (ANOVA) results for the effects of gender and measurement condition on overall standardized scores in 21st-century skills.

Source | Df | Sum Sq | Mean Sq | F-Value | Pr(>F)
Gender | 1 | 35.1 | 35.09 | 7.075 | 0.00808
Measurement condition | 1 | 49.0 | 48.96 | 9.874 | 0.00178
Gender: measurement | 1 | 17.5 | 17.51 | 3.532 | 0.06082
Residuals | 472 | 2340.7 | 4.96 | |
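Table 13 corresponds to a standard two-way ANOVA with gender and measurement condition as crossed factors. The sketch below, using simulated data with the same residual degrees of freedom (472), illustrates how such a table is produced; it is not the authors’ code.

```r
# Simulated scaled scores: 476 observations crossed by gender and condition.
set.seed(3)
skills <- data.frame(
  gender       = factor(sample(c("F", "M"), 476, replace = TRUE)),
  measurement  = factor(sample(c("pretest", "posttest"), 476, replace = TRUE)),
  scaled_score = rnorm(476, mean = 10, sd = 2)
)

fit <- aov(scaled_score ~ gender * measurement, data = skills)
summary(fit)  # Df, Sum Sq, Mean Sq, F value, Pr(>F), as in Table 13
```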
