Article

Perceptions of Learning Assessment in Practicum Students vs. Initial Teacher Education Faculty in Chilean Physical Education: A Comparative Study of Two Cohorts

by Francisco Gallardo-Fuentes 1,*, Bastian Carter-Thuillier 1, Sebastián Peña-Troncoso 2,3, Samuel Pérez-Norambuena 4 and Jorge Gallardo-Fuentes 4,5

1 Departamento de Educación, Universidad de los Lagos, Osorno 5290000, Chile
2 Facultad de Educación, Universidad Austral, Valdivia 5091000, Chile
3 Facultad de Educación y Cultura, Universidad SEK, Santiago 7520317, Chile
4 Departamento de Ciencias de la Educación, Universidad del Bío-Bío, Chillán 3780000, Chile
5 Escuela de Medicina Veterinaria, Facultad de Medicina y Ciencias de la Salud, Universidad Mayor, Temuco 4780000, Chile
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(4), 459; https://doi.org/10.3390/educsci15040459
Submission received: 19 January 2025 / Revised: 10 March 2025 / Accepted: 14 March 2025 / Published: 8 April 2025
(This article belongs to the Section Higher Education)

Abstract

An education oriented towards learning must include assessment techniques and instruments that effectively serve this goal. Assessment has become a crucial element of educational practice, prompting the promulgation of legal regulations to govern it. This study aimed to explore perceptions of assessment among faculty members and practicum students in Chilean Initial Teacher Education in Physical Education (ITEPE) at three universities, comparing two cohorts sampled at moments four years apart. The study followed a quantitative, cross-sectional, and comparative approach. The sample consisted of 458 participants: Group 1 comprised n = 162 practicum students (S1) from 2019–20 (M = 22.5, SD = 3.1) and 44 faculty members (FM1) from the same cohort (M = 42.3, SD = 11.2); Group 2 comprised n = 197 practicum students (S2) from 2023–24 (M = 23.6, SD = 2.2) and 55 faculty members (FM2) from that cohort (M = 40.4, SD = 10.4). Data were collected using the “Questionnaire for the Study of the Assessment System”. The results revealed significant differences between students and faculty in the perceived use of assessment instruments in ITEPE. Students in the 2023–24 cohort perceived a greater presence of portfolios than their peers from four years earlier, while faculty from the latest cohort reported a higher perceived use of traditional exams. In conclusion, decision-making still falls predominantly to the faculty, as evidenced by discrepancies regarding feedback and student participation in grade determination; this reinforces the idea that the process remains teacher-centered.

1. Introduction

A student-centered education is fundamental for advancing toward the achievement of the objectives that schools, as primary educational institutions, seek to accomplish (Ngatia, 2022; Nikoladze, 2023). In this context, the components that constitute the teaching–learning process must be explicitly structured and oriented towards the promotion of student learning, with assessment playing a central role, thereby requiring a redefinition of its function (Wang, 2023). Assessment unfolds in the teaching–learning processes (T-L) through feedback as a fundamental practice that fosters interaction during the process with the student and their learning (Barrientos & López-Pastor, 2015; Guerra-Sialer et al., 2023).
Despite this premise, traditional practices still exert a significant influence on classroom assessment methods. It is evident that assessment, historically conceived as a tool primarily intended for certification, continues to be a normalized practice (Berlanga-Ramírez & Juárez-Hernández, 2020; Gedda-Muñoz & Guerrero-Azócar, 2021). Educational processes have also focused on changing behavioral patterns, privileging the use of assessment tools to verify such changes (Mutiawati et al., 2023). This perspective has led to the conceptualization of learning as predominantly observable and quantifiable entities (Giuliano, 2020; Jiménez-Moreno, 2019).
In response to the weight of tradition, a movement has emerged advocating for assessment focused on learning, emphasizing an approach that accounts for the comprehension and application of knowledge in real-life situations (Porlán, 2018). Within this framework, the practicum is positioned as a privileged space for the development of assessment competencies, allowing prospective teachers to experience and reflect on authentic and contextualized assessment practices (Zeichner, 2010). This has even led to discussions about the need for educational contexts to work on teacher assessment literacy and on defining a teaching identity related to assessment (Looney et al., 2018).
Despite efforts to incorporate assessment methods that promote learning and emphasize the T-L process, the field of Physical Education continues to face significant challenges in adopting these approaches (López-Pastor & Pérez-Pueyo, 2017). This is partly due to its close association with the sports domain and practices that prioritize the measurement of observable and quantifiable outcomes (Moreno-Doña et al., 2014; Patel, 2024). This approach contrasts with its role within the school curriculum, where the aim is to promote learning aligned with the discipline from a more pedagogical perspective (Barba-Martín et al., 2020b; Castillo-Retamal et al., 2020; Pérez-Pueyo et al., 2021).
Learning assessment has gained significant relevance, supported by substantial evidence of its impact on learning processes (Magdalena et al., 2023). This has prompted some countries to adopt specific regulations governing what is assessed, how it is assessed, and when it is assessed in different educational contexts (Barreyro & Rothen, 2007; Dill, 2003). Particularly in regions like Europe, there is a push for the implementation of common regulatory frameworks that, in addition to standardizing assessment practices, facilitate more effective dialogue among students, especially in contexts of academic exchange and international internships (Rauner & Wittig, 2010).
In Chile, recent changes in school assessment practices over the past decade have provided robust evidence that assessment is an essential component of the T-L process, establishing new normative and pedagogical guidelines to replace previous regulations. In 2018, the Ministry of Education (MINEDUC) enacted Decree 67 on “assessment, grading, and school promotion” (MINEDUC, 2018). This decree fundamentally promotes assessment with a formative intent, aligning with the principles of formative and shared assessment (FSA), as explicitly reflected in the regulation, which states that assessment should be configured as a comprehensive process enabling both teachers and students to “obtain and interpret information about learning” (MINEDUC, 2018, p. 56). Broadly, the new regulation establishes principles to transform assessment practices, distinguishing between assessment and grading, prioritizing significant evidence when making grading decisions, incorporating students as active agents in the assessment process, and consolidating formative assessment as a central axis of the T-L process. Additionally, it underscores the need to move away from an exclusively certifying perspective (Ruz-López et al., 2024), promoting tools that allow teachers to adjust their planning and focus on authentic learning, positioning formative assessment as an integral framework that supports students in their comprehensive learning development (Herrera-Bravo et al., 2023).
The tension generated by such normative changes lies in their impact, which should not be confined solely to the school system. Documented evidence supports the implementation of formative assessment practices in initial teacher education, highlighting their significant effect on the future adoption and application of these approaches in various educational contexts (Brandt et al., 2018; Herrero-González et al., 2021). This is particularly evident and necessary in Initial Teacher Education in Physical Education (ITEPE), where repeated reliance on fitness tests has shaped the assessment orientations of the discipline (Beltrán-Véliz et al., 2020). Consequently, adequate curriculum planning in ITEPE concerning assessment can influence the future assessment practices of teachers (Villegas, 2024). Understanding the perceptions of advanced ITEPE students regarding learning assessment is essential for implementing improvements aimed at their future performance in educational contexts (Torkildsen & Erickson, 2016). In this sense, the practicum serves as a crucial instance for reorienting traditional sports-focused practices towards more inclusive and reflective pedagogical models (Pascual-Arias & Molina-Soria, 2020). This necessity is underscored by comprehensive curriculum changes, which often generate resistance, uncertainty, and surprise, highlighting the need for strategies that facilitate adaptation to these new paradigms in the short term (Aasland et al., 2024).
In light of the scientific evidence on the relevance of assessment in ITEPE and the tensions arising from the implementation of new regulations in the Chilean context, the objective of this study is:
To examine the perceptions of learning assessment among practicum students and university faculty in Initial Teacher Education in Physical Education across two cohorts sampled at moments four years apart.
Additionally, the study included the following hypothesis:
The perceptions of assessment in the initial training of physical education teachers have changed with recent regulations, promoting formative assessment, although discrepancies between teachers and students in grading persist.

2. Materials and Methods

(a) Study Design: The study followed a quantitative, cross-sectional, and comparative approach, analyzing perceptions of learning assessment among practicum students and faculty members from two cohorts (graduates from 2019–20 vs. 2023–24) within ITEPE. To this end, the “Questionnaire for the Study of the Assessment System in Initial Teacher Education in Physical Education” was utilized, which had been validated for both international and national contexts (Ruiz-Gallardo et al., 2013).
(b) Participants: The sample consisted of 458 participants from three Chilean universities. Group 1 comprised n = 162 practicum students (S1) from the 2019–20 cohort (M = 22.5, SD = 3.1) and 44 faculty members (FM1) from the same cohort (M = 42.3, SD = 11.2); Group 2 comprised n = 197 practicum students (S2) from the 2023–24 cohort (M = 23.6, SD = 2.2) and 55 faculty members (FM2) from that cohort (M = 40.4, SD = 10.4).
(c) Measure: To measure perceptions, the questionnaire included 63 items distributed across ten questions, organized into four groups of variables: (1) planning and assessment system, (2) cognitive abilities and assessment system, (3) assessment instruments, and (4) beliefs about aspects related to the assessment system. For the validation of the Spanish-language version, 171 students and 26 Spanish teachers participated, all of whom were involved in the teaching–learning process; content validity was established through a thorough review of previous research and evaluation by expert judges. Regarding reliability, Cronbach’s Alpha coefficients were calculated, yielding values of 0.866 for students and 0.821 for teachers. In the present study, responses to questions 2, 3, 4, 6, and 8 were analyzed, as they were closely related to the objective of this research. The questionnaire employed a Likert scale to record responses (where 0 = none; 1 = few or low; 2 = some or medium; 3 = many or high; and 4 = all or very high).
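For readers unfamiliar with the reliability coefficient reported above, the following is a minimal illustrative sketch (not the authors’ code) of how Cronbach’s Alpha is computed from a matrix of Likert-coded responses; the data are hypothetical stand-ins, not the study’s dataset.

```python
# Minimal sketch (not the authors' code): Cronbach's Alpha for a matrix of
# Likert-coded responses (0-4). The data below are hypothetical; the study
# reports alpha = 0.866 for students and 0.821 for teachers.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """responses: (n_respondents, n_items) matrix of Likert scores."""
    k = responses.shape[1]                          # number of items
    item_var = responses.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical example: 20 respondents answering 10 Likert items (0-4)
rng = np.random.default_rng(42)
demo = rng.integers(0, 5, size=(20, 10))
print(f"Cronbach's alpha = {cronbach_alpha(demo):.3f}")
```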
Furthermore, in adherence to ethical and methodological considerations, the research process was approved by an accredited Scientific Ethics Committee (CEC–Ulagos), which reviewed the informed consent forms signed by each participant.
(d) Statistical analysis: Data were processed using descriptive statistics, yielding the arithmetic mean (M) and standard deviation (SD). Subsequently, inferential analyses were conducted using the Mann–Whitney U test, comparing intra-cohort perceptions between students and faculty members, as well as inter-cohort perceptions (students vs. students and faculty members vs. faculty members). A significance level (Sig.) of p ≤ 0.05 was used for all analyses, which were performed using SPSS statistical software, version 24.0 (IBM Corp., Armonk, NY, USA, 2017).
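The authors ran these tests in SPSS 24.0; purely for illustration, the sketch below reproduces an equivalent two-sided Mann–Whitney U comparison in Python (scipy.stats.mannwhitneyu), with hypothetical rating vectors standing in for the real data.

```python
# Illustrative sketch only: a two-sided Mann-Whitney U test comparing
# hypothetical student vs. faculty Likert ratings (0-4). The study used
# SPSS 24.0; scipy implements the same test.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
students = rng.integers(0, 5, size=162)  # stand-in for S1 cohort ratings
faculty = rng.integers(0, 5, size=44)    # stand-in for FM1 cohort ratings

u_stat, p_value = mannwhitneyu(students, faculty, alternative="two-sided")
print(f"U = {u_stat:.1f}, Sig. = {p_value:.3f}, "
      f"significant at p <= 0.05: {p_value <= 0.05}")
```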

3. Results

The results include perceptions from students and faculty members within the same cohort, as well as between groups from different cohorts, regarding a series of statements about how assessment is conducted in ITEPE. For the question, “How often did the faculty, through the assessment system used in the different courses, inform you about your learning?”, students graduating in 2019–20 reported that this occurred “sometimes” (M = 2.1, SD = 1.0), while the faculty members teaching these students indicated that it happened “often” to “always” (M = 3.2, SD = 0.7), a significant difference with faculty providing higher ratings (U = 1476, Sig. = *0.00). For the 2023–24 cohort, the student ratings (M = 2.5, SD = 1.0) and the faculty members’ ratings (M = 3.3, SD = 0.6) also showed significant differences, with faculty members again providing higher ratings (U = 2682.5, Sig. = *0.00).
When comparing student ratings between cohorts (U = 12,643, Sig. = *0.00), significant differences were found, with higher ratings reported by students from the 2023–24 cohort. However, no significant differences were found when comparing faculty members’ ratings between cohorts (U = 1072, Sig. = 0.28).
Table 1 and Table 2 present the responses regarding the presence and importance that students and faculty members assign to various cognitive abilities, which include: Remembering (refers to memorizing content); Applying (refers to using theoretical knowledge); Understanding (refers to grasping ideas and relationships); Analyzing (refers to breaking content into parts and attributing meanings); Synthesizing (refers to combining parts into a meaningful whole); and Evaluating (refers to making judgments).
In Table 1, it can be observed that students rate the presence of the listed cognitive abilities in ITEPE courses lower than faculty members on almost all items, except for the ability to “Remember”, where student ratings are significantly higher. This pattern is consistent across both cohorts.
When comparing inter-cohort perceptions among faculty members (2019–20 vs. 2023–24), faculty members from the 2023–24 cohort provided higher scores for all assessed cognitive abilities, with statistically significant differences on almost all items. Likewise, for inter-cohort perceptions among practicum students (graduates from 2019–20 vs. 2023–24), ratings were significantly higher across all items, with the 2023–24 cohort giving the highest scores.
In Table 2, the ratings regarding the importance assigned to cognitive abilities by both practicum students and ITEPE faculty members are presented.
The ratings from faculty members and students regarding the importance of cognitive abilities in ITEPE training, in general, tend to be higher for faculty members in the 2019–20 cohort and lower for the 2023–24 cohort. For the cognitive ability of “Remember”, statistically significant differences were observed, with higher ratings from students in both cohorts.
For the abilities “Apply”, “Understand”, “Analyze”, and “Synthesize”, both faculty members and students gave high ratings (ranging between considerable and great importance); significant differences between students and faculty members appeared only in the most recent cohort (2023–24), for “Remember”, “Synthesize”, and “Evaluate”.
When analyzing inter-cohort ratings among faculty, significantly higher mean differences were observed in the most recent cohort (2023–24) for “Analyze”, “Synthesize”, and “Evaluate”. Similarly, when comparing student ratings, significantly higher ratings were observed in the 2023–24 cohort for “Apply”, “Understand”, and “Analyze”.
Table 3 presents the ratings regarding the use of a series of assessment instruments and procedures in ITEPE.
Regarding the frequency of use or experience with various assessment instruments and procedures in ITEPE, it was observed that, in the 2019–20 cohort, faculty members provided higher ratings than students for open-ended question exams, while students gave significantly higher ratings for closed-ended question exams. For the 2023–24 cohort, significant differences were identified in seven items, with consistently higher ratings from students in all cases.
In inter-cohort analyses comparing faculty members vs. faculty members, significant differences were detected, with lower ratings from the 2023–24 faculty cohort for multiple-choice exams and open-ended question exams. Conversely, when comparing student vs. student means, seven items showed significantly higher ratings in the 2023–24 student cohort, suggesting a positive evolution in their perception of the use of these instruments.
The ratings regarding how grades are determined in ITEPE courses are presented in Table 4.
The results regarding how grades are defined in ITEPE courses indicate that, in the 2019–20 cohort, no significant differences were found between student and faculty members’ ratings. However, for the item “Grades are determined through dialogue and consensus”, faculty members provided slightly higher ratings. In the 2023–24 cohort, only for the item “The grade is decided by the professor based on the assessment”, student ratings were significantly higher.
When comparing faculty ratings between cohorts, significant differences were identified for the items “Students self-assess”, “Grades are determined based on self-assessment”, and “Grades are determined based on peer-assessment”, where faculty members from the 2023–24 cohort provided higher ratings. Regarding inter-cohort student comparisons, significant differences were observed in all items, with consistently higher ratings from practicum students in the 2023–24 cohort.

4. Discussion

The aim of the study was to examine the perceptions of learning assessment among two cohorts of practicum students and university faculty in Initial Teacher Education in Physical Education, sampled at moments four years apart.
Regarding the frequency with which faculty provide information about learning through the assessment system, practicum students from both cohorts reported receiving information about their learning “sometimes”, while faculty gave significantly higher ratings. As highlighted by studies such as Gallardo-Fuentes et al. (2022), this discrepancy between the two groups can be attributed to faculty occasionally claiming to use assessment with formative characteristics while students do not share this view (Arribas-Estebaranz et al., 2015; Silva-Rodríguez & López-Pastor, 2015). The students’ ratings may indicate a perceived scarcity of feedback in the teaching–learning process (TLP). This is concerning because, where feedback is not observed, formative assessment is unlikely to exist (Barrientos & López-Pastor, 2015).
A particularly concerning aspect is that, when analyzing inter-cohort student ratings (2019–20 vs. 2023–24), significantly lower differences were identified in the most recent cohort. This could be associated with various factors, such as greater emphasis on feedback in older cohorts, which reinforces the idea that there is a limited presence of this element in the more recent cohort. Feedback is an essential component for the effective implementation of formative assessment (Molina-Moreira et al., 2023; Molina-Soria et al., 2023), suggesting a concerning deviation from the objectives outlined in Decree 67.
The results show differences in perceptions between faculty and students regarding the presence of certain cognitive abilities, such as applying, analyzing, and evaluating, with higher ratings given by faculty. This aligns with reports from other studies where faculty, aware of the relevance of these abilities in formative processes, claim to use them, even though students do not share the same opinion (Fraile-Aranda et al., 2018; Gallardo-Fuentes et al., 2022).
When comparing inter-cohort perceptions between faculty vs. faculty and students vs. students, it was observed that both groups in the 2023–24 cohort gave higher scores for all assessed cognitive abilities, with statistically significant differences in nearly all items. This pattern reveals a greater sensitivity to assessment in recent cohorts, with a particular emphasis on the ability to apply, analyze, and evaluate (Stravakou, 2024). It also highlights the importance these educational actors place on clarifying the knowledge, skills, and attitudes that students must demonstrate in executing or solving specific formative tasks (Troya-Santillán et al., 2024).
These differences reflect a trend towards an increasing concern among Initial Teacher Education (ITE) students about learning assessment (Arriaga-Costa & Petiz-Pereira, 2022), especially in disciplines like Physical Education, which bears the legacy of traditional practices focused on measuring sports performance (Moreno-Doña et al., 2014; Patel, 2024). These practices diverge from the objectives proposed by Decree 67.
Regarding the importance students and faculty assign to cognitive abilities, the results indicate that practicum students give significantly higher importance to the cognitive ability of “remembering” compared to faculty. This has been documented in previous studies, which suggest that despite the low impact of this ability on learning, its constant use in formative contexts leads students to consider it a relevant element in their educational processes (Azhimatova & Kutpidin-Uulu, 2024; Barba-Martín et al., 2020a). This is also related to the student’s perception of its greater presence in assessment systems and ITEPE programs (Fraile-Aranda et al., 2018). This finding is concerning given the transferability of assessment practices experienced by teacher trainees to their future professional practices.
Studies have demonstrated that formative assessment oriented toward learning can positively impact future teaching practices, especially when effectively incorporated into practicum processes (Herrero-González et al., 2021). Moreover, when comparing student vs. student and faculty vs. faculty ratings across both cohorts, it was observed that in the 2023–24 cohort, mean scores were significantly higher for abilities such as applying, understanding, analyzing, synthesizing, and evaluating. This is promising, as these data indicate the relevance that these educational actors assign to higher-order cognitive abilities, which should be developed in university training (Cantón-Mayo, 2012).
Some authors suggest this could be related to greater formative intentionality linked to learning assessment (Asenjo-Paredes et al., 2024). However, this does not guarantee that in practice, appropriate and genuinely learning-centered assessment systems will be implemented, as institutional processes may overshadow them (Carney et al., 2022).
In terms of the frequency of use of various assessment instruments and procedures, significant differences were observed in perceptions of the presence of “closed-ended question exams” and “open-ended question exams”, where faculty reported a significantly lower frequency compared to students. The faculty’s lower valuation of closed-ended questions aligns with the findings of Fraile et al. (Fraile-Aranda et al., 2018). This finding is consistent with previous studies that reinforce students’ perception of the predominance of traditional cognitive abilities and instruments in evaluative processes (Pereira et al., 2022).
For inter-cohort comparisons of faculty vs. faculty, no significant differences were observed for most assessment instruments, except for reports or written assignments and field journals, which received higher ratings in the 2023–24 cohort. Faculty in more recent cohorts may be more inclined to adopt such instruments due to regulatory changes encouraging formative assessment and more diversified tools in the Chilean educational context (Gallardo-Fuentes et al., 2023).
In terms of student vs. student inter-cohort comparisons, differences were more pronounced, with significantly higher ratings in the 2023–24 cohort for tools such as portfolios, recognized in the literature for their positive impact on learning (Pagone et al., 2024; Rodrigues dos Santos, 2024), and physical practical tests, commonly used in disciplines like Physical Education due to their contextual relevance (Cantón-Mayo, 2012; Quennerstedt et al., 2024) and their role as a formative experience (Stieg et al., 2024). However, the latter often follow a traditional logic, focused on measurement and contributing to social control (Sarni et al., 2021).
Higher and consistent ratings in both cohorts and groups corresponded to “reports or written assignments”, contrasting with numerous studies that generally report a high prevalence of various types of exams (Castejón et al., 2018). These results reflect an evolution in the use of assessment instruments in ITE programs, with tools like “essays” and “portfolios” gaining prominence, especially in the 2023–24 cohort from students’ perspectives. However, instruments like “field journals” and “classroom observations” received the lowest ratings, despite being associated with more complex cognitive abilities and being most recommended for formative assessment processes (Caleachina, 2023). This demonstrates inconsistency with the data reported about the previously valued cognitive abilities.
The results regarding how grades are determined in ITEPE courses show that, even in the most recent cohort (2023–24), students gave significantly higher ratings, compared to faculty and students from the 2019–20 cohort, to the statement “Grades are decided by faculty based on the evaluation”. This finding aligns with studies emphasizing the need for greater student participation in defining grades, and promoting more collaborative and learning-centered approaches (Fuentes-Diego & Salcines-Talledo, 2018; Romero-Martín et al., 2014). University students positively perceive grading based on dialogue, indicating that it improves interaction with faculty (Otero-Saborido et al., 2022).
The results highlight a shift in perception towards practices such as self-grading, which is positive as it is considered a democratic strategy that positively contributes to learning (Campillo-Ferrer & Miralles-Martínez, 2023; Marbun et al., 2024). There is also an openness in the most recent cohort (2023–24) to formative and participatory assessment practices, particularly among students. This could reflect an evolution in the pedagogical strategies implemented in ITE programs, consistent with findings from other studies (Gallardo-Fuentes et al., 2018). However, it is concerning that, even in the most recent cohort, the statement “faculty decide grades…” continues to receive higher ratings, associated with the persistent presence of traditional evaluative practices (Canrinus et al., 2019; Richmond et al., 2019). This deviates from students’ demands to understand and discuss their grades (Landrum, 2019).
Among the study’s limitations, it should be noted that because the sample pertains to Chilean ITEPE, the generalization of the results to other educational contexts and disciplines may be limited.

5. Conclusions

In conclusion, the study identified significant differences in the ratings related to “perceptions on practices” regarding learning assessment in ITEPE across three Chilean universities. These differences highlight progress in the implementation of formative and participatory approaches, as evidenced by the differentiated ratings between the 2019–20 and 2023–24 cohorts. Despite the evolution of assessment practices, traditional assessment processes remain prevalent. This is particularly evident in the higher value placed by students on the presence of the “remember” ability compared to faculty members’ ratings, a pattern observed in both cohorts. However, abilities such as applying, analyzing, and evaluating received increased attention, especially in the 2023–24 cohort, reflecting a potential shift toward more complex and reflective pedagogical practices. Strategies should be promoted to encourage the effective use of these practices in the ITEPE context.
Students from the most recent cohort (2023–24) reported a greater presence of tools such as portfolios and essays, which are notable for their positive impact on learning. However, instruments like field journals and classroom observations, which are associated with more advanced cognitive abilities, received low ratings, highlighting a challenge for ITEPE administrators. In line with this, grade determination is still predominantly perceived by students as being under the faculty’s control. Lastly, the study reveals that while progress has been made toward more participatory and learning-centered practices, tensions remain due to the marked presence and influence of traditional assessment practices. This is particularly evident in students’ perceptions of the limited feedback they receive.
Regarding possible implications of the study, these are divided into:
(a) Theoretical implications: comparing different cohorts allows for the identification of changes in students’ and teachers’ perceptions in response to regulatory and pedagogical modifications, highlighting the evolution of assessment in educational contexts.
(b) Practical implications: the results underscore the importance of designing teacher training strategies that reinforce formative assessment and encourage student participation in evaluation processes.
Given these findings, there is a need to strengthen training in learning assessment for both students and faculty members in ITEPE, promoting a shift that positions assessment as a tool for learning rather than merely a mechanism for verification and certification of achievements. Additionally, it is crucial to continue investigating how assessment perceptions and practices evolve in response to normative and pedagogical changes. Such research will ensure that these transformations positively impact the training and professional performance of future teachers graduating from ITEPE and working in school contexts.

Author Contributions

All authors contributed to the design, writing, and final review of the study. F.G.-F. and B.C.-T. were responsible for the design of the initial draft. J.G.-F. contributed to the methodological design and data analysis. S.P.-T. and S.P.-N. provided the first comprehensive review and preparation of the original draft. Finally, all authors contributed to a round of review and editing of the final version of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by National Agency for Research and Development (ANID), Programa Fondecyt N° 1230609 and N° 1240883.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and the published data were collected as part of the project “ANID, FONDECYT REGULAR 2023 (1230609)”, titled “When school regulations change, should Teacher Training also change? The impact of the new decree on assessment, grading, and school promotion in the preparation of future Physical Education teachers.” This project was approved under Code “H009/2023”, issued by the Scientific Ethics Committee of the University of Los Lagos (CEC-Ulagos) on 28 March 2023.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data from this study are available upon request from the corresponding author in a version that maintains the anonymity of the participants.

Acknowledgments

This article is linked to the project “ANID, FONDECYT REGULAR 2023 (1230609)”, titled “When school regulations change, should Teacher Training also change? The impact of the new decree on assessment, grading, and school promotion in the preparation of future Physical Education teachers” and “ANID, FONDECYT REGULAR 2024 (1240883)”, titled “What Are the Articulation Processes in Knowledge and Disciplinary Didactics in Physical Education Teacher Training? The Impact of Teaching Profession Standards on Disciplinary Knowledge through Reciprocal Assessment”.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Aasland, E., Nyberg, G., & Barker, D. (2024). Enacting a new physical education curriculum: A collaborative investigation. Sport, Education and Society, 1–14.
2. Arriaga-Costa, C., & Petiz-Pereira, O. (2022). Students’ and teachers’ perception of the teaching-learning process: What brings them together or apart. International Journal of Education Economics and Development, 13(2), 119–136.
3. Arribas-Estebaranz, J. M., Manrique-Arribas, J. C., & Tabernero-Sánchez, B. (2015). Assessment tools used in initial teacher training and their relevance to the development of professional competencies in students: Vision of students, graduates and faculty. Revista Complutense de Educación, 27(1), 237–255.
4. Asenjo-Paredes, C., Gallardo-Fuentes, F., Carter-Thuillier, B., & Peña-Troncoso, S. (2024). Concepts and assessment practices in initial teacher training for physical education. Revista Ciencias de la Actividad Física, 25(1), 1–17.
5. Azhimatova, E., & Kutpidin-Uulu, E. (2024). The use of cognitive tasks in teaching mathematics to future primary school teachers of a pedagogical college. Вестник Иссык-Кульского университета, 51(07), 141–147.
6. Barba-Martín, R., Hernando-Garijo, A., Hortigüela-Alcalá, D., & González-Calvo, G. (2020a). After nearly a decade of Bologna: Have we really improved the quality of teaching? Journal Espiral Teachers’ Notebooks, 13(26), 97–108.
7. Barba-Martín, R., Hortigüela-Alcalá, D., & Pérez-Pueyo, Á. (2020b). To evaluate in physical education: Analysis of the existing tensions and justification of the employment of the formative and shared assessment. Educación Física y Deporte, 39(1), 23–46.
8. Barreyro, G. B., & Rothen, J. C. (2007). Avaliação e regulação da educação superior: Normativas e órgãos reguladores nos 10 anos pós LDB. Avaliação: Revista da Avaliação da Educação Superior (Campinas), 12(01), 133–147.
9. Barrientos, E., & López-Pastor, V. (2015). Formative assessment in higher education. An international review. Peer-Reviewed Journal of CIEG, 21, 272–284.
10. Beltrán-Véliz, J., Barros-Ketterer, J., & Carter-Thuillier, B. (2020). Technical-instrumental rationality in physical education. A qualitative study in the Chilean context. Journal Espacios, 41(4), 1–11.
11. Berlanga-Ramírez, M., & Juárez-Hernández, L. G. (2020). Evaluation paradigms: From the traditional to the socio-formative. Diálogos Sobre Educación. Temas Actuales en Investigación Educativa, 11(21), 1–14.
12. Brandt, A., Nascimento, F., & Magalhães, N. (2018). Avaliação compartilhada entre professores formadores e estudantes dos cursos de licenciatura. Revista on line de Política e Gestão Educacional, 22(2), 507–523.
13. Caleachina, O. (2023). The referential framework of the management concept of the assessment of school results. Competitiveness and Innovation in the Knowledge Economy, 100, 199–203.
14. Campillo-Ferrer, J. M., & Miralles-Martínez, P. (2023). Impact of the flipped classroom model on democratic education of student teachers in Spain. Education, Citizenship and Social Justice, 18(3), 280–296.
15. Canrinus, E., Klette, K., & Hammerness, K. (2019). Diversity in coherence: Strengths and opportunities of three programs. Journal of Teacher Education, 70(3), 192–205.
16. Cantón-Mayo, I. (2012). La Universidad, un espacio para el aprendizaje. Más allá de la calidad y la competencia. Madrid: Narcea, 345 pp. Bordón. Revista de Pedagogía, 64(3), 153–154.
17. Carney, E. A., Zhang, X., Charsha, A., Taylor, J. N., & Hoshaw, J. P. (2022). Formative assessment helps students learn over time: Why aren’t we paying more attention to it? Intersection: A Journal at the Intersection of Assessment and Learning, 4(1), 1–12.
18. Castejón, F. J., Santos Pastor, M. L., & Cañadas, L. (2018). Development of teaching competencies in initial physical education teacher training: The relationship with assessment instruments. Estudios Pedagógicos (Valdivia), 44(2), 111–126.
19. Castillo-Retamal, F., Almonacid-Fierro, A., Castillo-Retamal, M., & de Oliveira, A. A. B. (2020). Physical Education teacher training in Chile: A historical view. Retos, 38, 317–324.
20. Dill, D. D. (2003). The regulation of academic quality: An assessment of university evaluation systems with emphasis on the United States. Department of Public Policy Paper, University of North Carolina.
21. Fraile-Aranda, A., Catalina-Sancho, J., De Diego-Vallejo, R., & Aparicio-Herguedas, J. L. (2018). The cognitive abilities in the evaluation of the initial formation of the faculty of physical education. Sportis. Scientific Journal of School Sport, Physical Education and Psychomotricity, 4(1), 77–94.
22. Fuentes-Diego, V., & Salcines-Talledo, I. (2018). Study on the implementation of formative and shared assessment in a higher education training cycle. RIEE. Ibero-American Journal of Educational Evaluation, 11(2), 91–112.
23. Gallardo-Fuentes, F., Carter-Thuillier, B., López-Pastor, V., Ojeda-Nahuelcura, R., & Fuentes-Nieto, T. (2022). Assessment systems in Physical Education teacher training: A case study in the Chilean context. Retos, 43, 117–126.
24. Gallardo-Fuentes, F., Carter-Thuillier, B., Peña-Troncoso, S., Martínez-Angulo, C., & López-Pastor, V. (2023). Critically analyzing the incorporation of the current regulations on «evaluation, grading and school promotion» in the initial training of physical education teachers in Chile. Interciencia, 48(10), 544–551.
25. Gallardo-Fuentes, F., López-Pastor, V., & Carter-Thuillier, B. (2018). Effects of the application of a formative assessment system on the self-perception of acquired competencies in initial teacher training. Pedagogical Studies (Valdivia), 44(2), 55–77.
26. Gedda-Muñoz, R., & Guerrero-Azócar, R. (2021). The educational curriculum as an epistemological field of education: Its construction through classroom research. Journal Current Research in Education, 21(2), 501–528.
27. Giuliano, F. (2020). Evaluative reason and educative judgment scenarios. Pensamiento Actual, 20(34), 74–90.
28. Guerra-Sialer, J., Paiva-Jurupe, K., Casas-Montenegro, J., & Rodas-Torres, L. (2023). Formative assessment: A major challenge in early education. Journal of Climatology Special Edition Social Sciences, 23, 466.
29. Herrera-Bravo, C. A., Aguirre-Zúñiga, P., Honores-Barrios, F., & Riveros-Diegues, N. (2023). How is the current regulation on school evaluation, qualification and promotion articulated in the evaluation regulations in Antofagasta? Revista Enfoques Educacionales, 20(2), 1–26.
30. Herrero-González, D., Manrique-Arribas, J., & López-Pastor, V. (2021). Incidence of pre-service and in-service teacher education in the application of formative and shared assessment in physical education. Retos, 41, 533–543.
31. Jiménez-Moreno, J. A. (2019). Epistemological approaches to educational evaluation: Between must be and relativity. Foro de Educación, 27, 185–202.
32. Landrum, B. (2019). ‘See me as I see myself’: A phenomenological analysis of grade bump requests. Qualitative Research in Education, 8(3), 315–340.
33. Looney, A., Cumming, J., Van Der Kleij, F., & Harris, K. (2018). Reconceptualising the role of teachers as assessors: Teacher assessment identity. Assessment in Education: Principles, Policy & Practice, 25(5), 442–467.
34. López-Pastor, V., & Pérez-Pueyo, Á. (2017). Formative and shared assessment in education: Successful experiences at all educational stages. Universidad de León.
35. Magdalena, I., Nurchayati, A., & Apriliyani, D. (2023). Pentingnya peran evaluasi dalam pembelajaran di sekolah dasar. MASALIQ, 3(5), 833–839.
36. Marbun, R., Siringo-ringo, J., Nadeak, D., & Siburian, I. (2024). Implementing democratic learning through independent learning. Advances in Social Humanities Research, 2(8), 1015–1023.
37. MINEDUC. (2018). Guidelines for the implementation of Decree 67/2018 on assessment, grading, and school promotion. Ministerio de Educación, Santiago de Chile.
38. Molina-Moreira, A., Velásquez-Orellana, O., Zambrano-Murillo, D., & Zambrano-Villamil, M. (2023). Importance of feedback in the student evaluation process. International Journal of Social Sciences, 6(3), 168–172.
39. Molina-Soria, M., López-Pastor, V., Hortigüela-Alcalá, D., Pascual-Arias, C., & Fernández-Garcimartín, C. (2023). Formative and shared assessment and feedback: An example of good practice in physical education in pre-service teacher education. Cultura Ciencia Deporte (CCD), 18(55), 157–169.
40. Moreno-Doña, A., Gamboa-Jiménez, R., & Poblete-Gálvez, C. (2014). Physical education in Chile: A critical analysis of ministerial documentation. Revista Brasileira de Ciências do Esporte, 36(2), 411–427.
41. Mutiawati, M., Mailizar, M., Johar, R., & Ramli, M. (2023). Exploration of factors affecting changes in student learning behavior: A systematic literature review. International Journal of Evaluation and Research in Education (IJERE), 12(3), 1315.
42. Ngatia, L. W. (2022). Student-centered learning: Constructive alignment of student learning outcomes with activity and assessment. In Experiences and research on enhanced professional development through faculty learning communities (pp. 72–92). IGI Global.
43. Nikoladze, M. (2023). Student-centered educational process and protection of children’s rights. Language and Culture, 30, 141–144.
44. Otero-Saborido, F., Estrada, F., & Devia, C. P. (2022). Perception of university students of Physical Education on the dialogue mark. Retos: Nuevas Tendencias en Educación Física, Deporte y Recreación, 43, 300–308.
45. Pagone, B., Primogerio, P. C., & Dias Lourenco, S. (2024). Pedagogic and assessment innovative practices in higher education: The use of portfolio in economics. Journal of International Education in Business, 17(2), 228–245.
46. Pascual-Arias, C., & Molina-Soria, M. (2020). Assessing to learn in the “Practicum”: A proposal for formative and shared assessment during pre-service teacher education. Publicaciones, 50(1), 183–206.
47. Patel, J. (2024). Assessment methods in physical education: Advancements, challenges, and best practices. Innovations in Sports Science, 1(1), 22–25.
48. Pereira, D., Cadime, I., & Flores, M. A. (2022). Investigating assessment in higher education: Students’ perceptions. Research in Post-Compulsory Education, 27(2), 328–350.
49. Pérez-Pueyo, Á., Hortigüela-Alcalá, D., Fernández-Fernández, J., Gutiérrez-García, C., & Santos-Rodríguez, L. (2021). More hours yes, but how can they be implemented without losing the pedagogical approach of Physical Education? Retos, 39, 345–353.
50. Porlán, R. (2018). University teaching: How to improve it. Ediciones Morata.
51. Quennerstedt, M., Barker, D., Johansson, A., & Korp, P. (2024). Teaching with the test: Using fitness tests to teach paradoxically in physical education. European Physical Education Review, 1356336X241283796.
52. Rauner, F., & Wittig, W. (2010). Differences in the organisation of apprenticeship in Europe: Findings of a comparative evaluation study. Research in Comparative and International Education, 5(3), 237–250.
53. Richmond, G., Bartell, T., Carter Andrews, D., & Neville, M. L. (2019). Reexamining coherence in teacher education. Journal of Teacher Education, 70, 188–191.
54. Rodrigues dos Santos, A. R. D. (2024). Reflective portfolios: A learning and self-assessment tool. South Florida Journal of Development, 5(10), e4511.
55. Romero-Martín, R., Fraile-Aranda, A., López-Pastor, V., & Castejón-Oliva, F. (2014). The relationship between formative assessment systems, academic performance and teacher and student workloads in higher education. Journal for the Study of Education and Development, 37(2), 310–341.
56. Ruiz-Gallardo, J., Ruiz-Lara, E., & Ureña-Ortín, N. (2013). The assessment in initial teacher training: What we do and what students perceive. Cultura, Ciencia y Deporte, 8(22), 17–29.
57. Ruz-López, D., Pérez-Arredondo, C., & Bañales-Faz, G. (2024). Evaluation, grading and school promotion in Decree 67 in Chile: Discursive analysis of the conceptual perspectives and roles of educational actors. Discurso & Sociedad, 18(2), 337–367.
58. Sarni, M., Luis-Corbo, J., & Noble, J. (2021). Constellations of evaluation in ISEF-Udelar programs. Inter-Cambios Dilemas y Transiciones de la Educación Superior, 8(2).
59. Silva-Rodríguez, I., & López-Pastor, V. M. (2015). How do students experience assessment in initial teacher education? Available online: http://hdl.handle.net/10550/44766 (accessed on 20 December 2024).
60. Stieg, R., Gama, J. C. F., Sarni, M., & Santos, W. D. (2024). Evaluative experiences of students in the teachers training courses of physical education in Colombia and Uruguay. Revista de Ensino, Educação e Ciências Humanas, 25(1), 40–50.
61. Stravakou, P. A. (2024). Assessment and evaluation in higher education—The case of Greek university students. Scholars Bulletin, 10(03), 70–79.
62. Torkildsen, L. G., & Erickson, G. (2016). ‘If they’d written more…’—On students’ perceptions of assessment and assessment practices. Education Inquiry, 7(2), 27416.
63. Troya-Santillán, B. N., Troya Santillán, C. M., Guamán Santillán, R., Boza Aspiazu, H. P., Arzube Plaza, D. M., Nivela Cedeño, A. N., & Bernal Párraga, A. P. (2024). Evaluation: An opportunity to facilitate learning. Ciencia Latina Revista Científica Multidisciplinar, 8(5), 7019–7035.
64. Villegas, L. A. (2024). The influence of evaluation system in professional growth on basic education teachers. International Journal of Innovative Science and Research Technology (IJISRT), 1212–1215.
65. Wang, L. (2023). The impact of student-centered learning on academic motivation and achievement: A comparative research between traditional instruction and student-centered approach. Journal of Education, Humanities and Social Sciences, 22, 346–353.
66. Zeichner, K. (2010). Rethinking the connections between campus courses and field experiences in college- and university-based teacher education. Journal of Teacher Education, 61(1–2), 89–99.
Table 1. Presence of cognitive abilities in ITEPE courses.

In How Many Courses You Have Taken/Taught Have the Following Cognitive Abilities Been Present in the Assessment Systems? Values are M (SD); comparisons report the Mann–Whitney U statistic (Sig.).

| Cognitive ability | FM1 | S1 | FM1 vs. S1, U (Sig.) | FM2 | S2 | FM2 vs. S2, U (Sig.) | FM1 vs. FM2, U (Sig.) | S1 vs. S2, U (Sig.) |
|---|---|---|---|---|---|---|---|---|
| Remembering | 2.5 (0.7) | 2.7 (0.8) | 2781 (*0.02) | 2.7 (1.0) | 3.1 (0.8) | 3965 (*0.00) | 1033.5 (0.19) | 11,822.5 (*0.00) |
| Applying | 3.2 (0.6) | 2.9 (0.8) | 2770 (*0.01) | 3.6 (0.6) | 3.4 (0.6) | 4470.5 (*0.03) | 815 (*0.00) | 10,460 (*0.00) |
| Understanding | 3.2 (0.6) | 2.8 (0.8) | 2720 (*0.01) | 3.5 (0.6) | 3.3 (0.7) | 4696 (0.10) | 872 (*0.01) | 10,484.5 (*0.00) |
| Analyzing | 2.9 (0.9) | 2.6 (0.9) | 3026 (0.11) | 3.3 (0.8) | 3.1 (0.8) | 4502.5 (*0.04) | 888.5 (*0.02) | 11,621.5 (*0.00) |
| Synthesizing | 2.8 (1.0) | 2.6 (0.9) | 3002.5 (0.09) | 3.2 (0.8) | 3.0 (0.8) | 4708 (0.11) | 912.5 (*0.03) | 11,406 (*0.00) |
| Evaluating | 2.9 (1.2) | 2.5 (1.0) | 2899 (*0.05) | 3.3 (0.9) | 3.0 (1.0) | 4251.5 (*0.01) | 952 (*0.05) | 12,461 (*0.00) |

U = Mann–Whitney U test; FM = Faculty Members; S = Students; * p ≤ 0.05.
Table 2. Importance of cognitive abilities in ITEPE training.

How Important Do You Consider the Following Cognitive Abilities for Your Training/for Your Students’ Training? Values are M (SD); comparisons report the Mann–Whitney U statistic (Sig.).

| Cognitive ability | FM1 | S1 | FM1 vs. S1, U (Sig.) | FM2 | S2 | FM2 vs. S2, U (Sig.) | FM1 vs. FM2, U (Sig.) | S1 vs. S2, U (Sig.) |
|---|---|---|---|---|---|---|---|---|
| Remembering | 2.5 (0.9) | 2.9 (1.1) | 2685.5 (*0.01) | 2.8 (1.0) | 3.2 (0.9) | 4382 (*0.02) | 982 (0.09) | 14,220.5 (0.06) |
| Applying | 3.6 (0.6) | 3.7 (0.6) | 3439 (0.65) | 3.8 (0.4) | 3.8 (0.5) | 5342.5 (0.82) | 1035 (0.10) | 14,460.5 (*0.04) |
| Understanding | 3.6 (0.6) | 3.7 (0.6) | 3368.5 (0.48) | 3.8 (0.4) | 3.8 (0.4) | 5197.5 (0.48) | 1028.5 (0.10) | 13,873.5 (*0.00) |
| Analyzing | 3.5 (0.6) | 3.5 (0.7) | 3531 (0.91) | 3.8 (0.4) | 3.7 (0.6) | 4774 (0.07) | 899.5 (*0.01) | 13,753 (*0.01) |
| Synthesizing | 3.4 (0.7) | 3.4 (0.7) | 3517 (0.88) | 3.8 (0.5) | 3.5 (0.7) | 4419 (*0.01) | 894.5 (*0.01) | 14,509 (0.09) |
| Evaluating | 3.5 (0.8) | 3.5 (0.7) | 3411 (0.60) | 3.8 (0.4) | 3.6 (0.8) | 4644.5 (*0.04) | 930 (*0.01) | 15,360.5 (0.46) |

U = Mann–Whitney U test; FM = Faculty Members; S = Students; * p ≤ 0.05.
Table 3. Presence of assessment instruments used/experienced.

How Often Have You Used/Has the Professor Used the Following Assessment Instruments and Procedures in the Courses You Have Taken? Values are M (SD); comparisons report the Mann–Whitney U statistic (Sig.).

| Instrument/procedure | FM1 | S1 | FM1 vs. S1, U (Sig.) | FM2 | S2 | FM2 vs. S2, U (Sig.) | FM1 vs. FM2, U (Sig.) | S1 vs. S2, U (Sig.) |
|---|---|---|---|---|---|---|---|---|
| Teacher Observation in Class (Observation Sheets) | 2.2 (1.2) | 1.9 (1.1) | 3003.5 (0.10) | 2.3 (1.3) | 2.1 (1.1) | 4726 (0.14) | 1131.5 (0.57) | 14,454.5 (0.11) |
| Monitoring Classroom Participation (in Groups and Debates) | 2.2 (1.0) | 2.2 (1.0) | 3536 (0.93) | 2.5 (1.1) | 2.6 (1.0) | 5354.5 (0.89) | 970 (0.08) | 12,547.5 (*0.00) |
| Multiple-Choice Exam | 2.4 (1.1) | 2.6 (1.1) | 3126 (0.20) | 1.7 (1.5) | 2.6 (1.0) | 3568.5 (*0.00) | 882.5 (*0.02) | 15,437.5 (0.58) |
| Open-Ended Question Exam | 2.8 (0.9) | 2.2 (1.1) | 2457.5 (*0.00) | 2.0 (1.3) | 2.4 (1.1) | 4652.5 (0.10) | 798 (*0.00) | 14,648 (0.16) |
| Short-Answer Exam (Brief Explanations) | 2.1 (0.9) | 2.2 (1.0) | 3367 (0.56) | 1.8 (1.3) | 2.4 (1.1) | 4159.5 (*0.01) | 1066 (0.29) | 14,275 (0.07) |
| Closed-Ended Question Exam (Definitions) | 1.8 (1.0) | 2.5 (0.9) | 2205 (*0.00) | 1.6 (1.4) | 2.5 (1.1) | 3333.5 (*0.00) | 1067 (0.30) | 15,100 (0.36) |
| Written Exams Allowing Access to Documents | 2.0 (1.2) | 2.0 (1.1) | 3490 (0.83) | 2.0 (1.4) | 2.6 (1.1) | 4190 (*0.01) | 1189.5 (0.88) | 10,883 (*0.00) |
| Oral Exams | 2.1 (1.3) | 1.7 (1.2) | 2993 (0.09) | 2.1 (1.6) | 2.1 (1.3) | 5401 (0.97) | 1169.5 (0.77) | 12,993.5 (*0.00) |
| Practical Tests of Physical Nature (Physical Exercises, Game Situations, etc.) | 2.7 (1.3) | 2.9 (0.9) | 3549.5 (0.97) | 2.9 (1.4) | 3.0 (1.1) | 5119.5 (0.51) | 1050 (0.23) | 14,166 (*0.05) |
| Portfolios | 2.0 (1.3) | 2.1 (1.1) | 3408 (0.65) | 2.4 (1.5) | 2.7 (1.0) | 4885.5 (0.25) | 1036.5 (0.21) | 11,253.5 (*0.00) |
| Field Journals | 1.5 (1.4) | 1.5 (1.1) | 3488 (0.82) | 1.7 (1.4) | 2.2 (1.3) | 4298.5 (*0.02) | 1093.5 (0.40) | 10,451 (*0.00) |
| Reports or Written Assignments | 3.1 (0.8) | 3.1 (0.9) | 3506.5 (0.86) | 3.3 (0.8) | 3.7 (0.6) | 3719 (*0.00) | 1046.5 (0.22) | 9193.5 (*0.00) |
| Essays Based on Written Texts or Audiovisual Materials | 2.5 (1.1) | 2.6 (1.0) | 3474 (0.79) | 2.7 (1.1) | 3.3 (0.7) | 3664 (*0.00) | 1072 (0.31) | 9106 (*0.00) |

U = Mann–Whitney U test; FM = Faculty Members; S = Students; * p ≤ 0.05.
Table 4. How grades are defined in ITEPE.

What Is Your Level of Agreement with the Following Statements Regarding How Grades Have Been Determined in the Courses You Have Taken/Taught? Values are M (SD); comparisons report the Mann–Whitney U statistic (Sig.).

| Statement | FM1 | S1 | FM1 vs. S1, U (Sig.) | FM2 | S2 | FM2 vs. S2, U (Sig.) | FM1 vs. FM2, U (Sig.) | S1 vs. S2, U (Sig.) |
|---|---|---|---|---|---|---|---|---|
| The grade is decided by the professor based on the assessment | 2.5 (1.1) | 2.7 (0.9) | 3252 (0.35) | 2.7 (1.1) | 3.1 (0.9) | 4150 (*0.01) | 1128 (0.55) | 12,241 (*0.00) |
| Students self-assess (partially or completely) | 2.1 (1.1) | 2.1 (1.0) | 3513 (0.88) | 2.8 (0.9) | 2.7 (1.0) | 5101 (0.49) | 778.5 (*0.00) | 10,853.5 (*0.00) |
| Grades are determined through dialogue and consensus (between faculty and students) (partially or completely) | 2.3 (1.0) | 2.0 (1.1) | 3020 (0.11) | 2.5 (1.1) | 2.4 (1.1) | 5254 (0.72) | 1034.5 (0.20) | 12,224.5 (*0.00) |
| Grades are determined based on self-assessment (partially or completely) | 1.9 (1.0) | 2.0 (1.0) | 3345 (0.51) | 2.7 (1.0) | 2.6 (0.9) | 5030 (0.39) | 677.5 (*0.00) | 10,554 (*0.00) |
| Grades are determined based on peer assessment (among classmates) (partially or completely) | 2.0 (1.2) | 2.2 (1.1) | 3345 (0.52) | 2.7 (1.1) | 2.6 (1.1) | 5165.5 (0.59) | 832.5 (*0.01) | 12,601 (*0.00) |

U = Mann–Whitney U test; FM = Faculty Members; S = Students; * p ≤ 0.05.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
